All Activity
  2. In 2015, PPC was a game of direct control. You told Google exactly which keywords to target, set manual bids at the keyword level, and capped spend with a daily budget. If you were good with spreadsheets and understood match types, you could build and manage 30,000-keyword accounts all day long.

Those days are gone. In 2026, platform automation is no longer a helpful assistant. It's the primary driver of performance. Fighting that reality is a losing battle. Automation has leveled the playing field and, in many cases, given PPC marketers back their time. But staying effective now requires a different skill set: understanding how automated systems learn and how your data shapes their decisions.

This article breaks down how signals actually work inside Google Ads, how to identify and protect high-quality signals, and how to prevent automation from drifting into the wrong pockets of performance.

Automation runs on signals, not settings

Google's automation isn't a black box where you drop in a budget and hope for the best. It's a learning system that gets smarter based on the signals you provide. Feed it strong, accurate signals, and it will outperform any manual approach. Feed it poor or misleading data, and it will efficiently automate failure. That's the real dividing line in modern PPC.

AI and automation run on signals. If a system can observe, measure, or infer something, it can use it to guide bidding and targeting. Google's official documentation still frames "audience signals" primarily as the segments advertisers manually add to products like Performance Max or Demand Gen. That definition isn't wrong, but it's incomplete. It reflects a legacy, surface-level view of inputs, not how automation actually learns at scale.

Dig deeper: Google Ads PMax: The truth about audience signals and search themes

What actually qualifies as a signal?

In practice, every element inside a Google Ads account functions as a signal.
Structure, assets, budgets, pacing, conversion quality, landing page behavior, feed health, and real-time query patterns all shape how the AI interprets intent and decides where your money goes. Nothing is neutral. Everything contributes to the model's understanding of who you want, who you don't, and what outcomes you value.

So when we talk about "signals," we're not just talking about first-party data or demographic targeting. We're talking about the full ecosystem of behavioral, structural, and quality indicators that guide the algorithm's decision-making. Here's what actually matters:

- Conversion actions and values: These are 100% necessary. They tell Google Ads what defines success for your specific business and which outcomes carry the most weight for your bottom line.
- Keyword signals: These indicate search intent. Based on research shared by Brad Geddes at a recent Paid Search Association webinar, even "low-volume" keywords serve as vital signals. They help the system understand the semantic neighborhood of your target audience.
- Ad creative signals: This goes beyond RSA word choice. I believe the platform now analyzes the environment within your images. If you show a luxury kitchen, the algorithm identifies those visual cues to find high-end customers. I base this hypothesis on my experience running a YouTube channel, where I've watched the algorithm serve content based on visual environments, not just metadata.
- Landing page signals: Beyond copy, elements like color palettes, imagery, and engagement metrics signal how well your destination aligns with the user's initial intent. This creates a feedback loop that tells Google whether the promise of the ad was kept.
- Bid strategies and budgets: Your bidding strategy is another core signal for the AI. It tells the system whether you're prioritizing efficiency, volume, or raw profit. Your budget signals your level of market commitment. It tells the system how much permission it has to explore and test.
In 2026, we've moved beyond the daily-cap mindset. With the expansion of campaign total budgets to Search and Shopping, we are now signaling a total commitment window to Google. In the announcement, UK retailer Escentual.com used this approach to signal a fixed promotional budget, resulting in a 16% traffic lift because the AI was given permission to pace spend based on real-time demand rather than arbitrary 24-hour cycles.

All of these elements function as signals because they actively shape the ad account's learning environment. Anything the ad platform can observe, measure, or infer becomes part of how it predicts intent, evaluates quality, and allocates budget. If a component influences who sees your ads, how they behave, or what outcomes the algorithm optimizes toward, it functions as a signal.

The auction-time reality: Finding the pockets

To understand why signal quality has become critical, you need to understand what's actually happening every time someone searches. Google's auction-time bidding doesn't set one bid for "mobile users in New York." It calculates a unique bid for every single auction based on billions of signal combinations at that precise millisecond. This considers the user, not simply the keyword.

We are no longer looking for "black-and-white" performance. We are finding pockets of performance: users who are predicted to take the outcomes we define as our goals in the platform. The AI evaluates the specific intersection of a user on iOS 17, using Chrome, in London, at 8 p.m., who previously visited your pricing page. Because the bidding algorithm cross-references these attributes, it generates a precise bid.

This level of granularity is impossible for humans to replicate. But this is also the "garbage in, garbage out" reality. Without quality signals, the system is forced to guess.

Dig deeper: How to build a modern Google Ads targeting strategy like a pro
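The campaign-total-budget idea described above, pacing spend to demand across a commitment window instead of a fixed daily cap, can be sketched in a few lines. This is a minimal illustration, not Google's actual pacing algorithm; the function name, demand index, and all figures are hypothetical.

```python
from datetime import date

def pacing_target(total_budget: float, spent: float,
                  today: date, end: date,
                  demand_index: float = 1.0) -> float:
    """Suggested spend for today under a campaign-total budget.

    Instead of a fixed daily cap, divide the remaining budget over the
    remaining days and scale by a real-time demand index (1.0 = average
    demand). Purely illustrative: real auction-time pacing is far more
    granular than this daily heuristic.
    """
    days_left = (end - today).days + 1  # include today
    if days_left <= 0:
        return 0.0
    remaining = max(total_budget - spent, 0.0)
    return round((remaining / days_left) * demand_index, 2)

# A 10,000 promotional window with 3,000 already spent, 4 days left,
# and demand running 50% above average:
print(pacing_target(10_000, 3_000,
                    today=date(2026, 3, 7), end=date(2026, 3, 10),
                    demand_index=1.5))  # 2625.0
```

The point of the sketch: with a total-budget signal, a high-demand day is allowed to spend well above the old arithmetic average, and a dead day can spend below it, without ever breaching the promotional total.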
The signal hierarchy: What Google actually listens to

If every element in a Google Ads account functions as a signal, we also have to acknowledge that not all signals carry equal weight. Some signals shape the core of the model's learning. Others simply refine it. Based on my experience managing accounts spending six and seven figures monthly, this is the hierarchy that actually matters.

Conversion signals reign supreme

Your tracking is the most important data point. The algorithm needs a baseline of 30 to 50 conversions per month to recognize patterns. For B2B advertisers, this often requires shifting from high-funnel form fills to down-funnel CRM data. As Andrea Cruz noted in her deep dive on Performance Max for B2B, optimizing for a "qualified lead" or "appointment booked" is the only way to ensure the AI doesn't just chase cheap, irrelevant clicks.

Enhanced conversions and first-party data

We are witnessing a "death by a thousand cuts": browser restrictions from Safari and Firefox, coupled with aggressive global regulations, have dismantled the third-party cookie. Without enhanced conversions or server-side tracking, you are essentially flying blind, because the invisible trackers of the past are being replaced by a model where data must be earned through transparent value exchanges.

First-party audience signals

Your customer lists tell Google, "Here is who converted. Now go find more people like this." Quality trumps quantity here. A stale or tiny list won't be as effective as a list that is updated in real time.

Custom segments provide context

Using keywords and URLs to build segments creates a digital footprint of your ideal customer. This is especially critical in niche industries where Google's prebuilt audiences are too broad or too generic. These segments help the system understand the neighborhood your best prospects live in online.
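The 30-to-50-conversions-per-month baseline above can be turned into a quick triage check before switching a campaign to automated bidding. The thresholds mirror the rule of thumb in the text; the function name and the labels it returns are my own illustration, not a Google specification.

```python
def bidding_readiness(monthly_conversions: int,
                      min_baseline: int = 30,
                      strong_baseline: int = 50) -> str:
    """Rough readiness check for handing a campaign to automated bidding.

    Thresholds follow the 30-50 conversions/month rule of thumb; the
    recommendations are illustrative shorthand, not platform guidance.
    """
    if monthly_conversions >= strong_baseline:
        return "ready: enough signal density for value-based bidding"
    if monthly_conversions >= min_baseline:
        return "borderline: consider pooling campaigns or optimizing to down-funnel CRM events"
    return "too sparse: import offline conversions or consolidate before automating"

print(bidding_readiness(12))
# too sparse: import offline conversions or consolidate before automating
```

For B2B accounts, the same check would be run against qualified-lead or appointment-booked counts rather than raw form fills, for the reasons given above.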
To simplify this hierarchy, I've mapped out the most common signals used in 2026 by their actual weight in the bidding engine:

Signal category     | Specific input (the "what")          | Weight/impact | Why it matters in 2026
Primary (Truth)     | Offline conversion imports (CRM)     | Critical      | Trains the AI on profit, not just "leads."
Primary (Truth)     | Value-based bidding (tROAS)          | Critical      | Signals which products actually drive margin.
Secondary (Context) | First-party customer match lists     | High          | Provides a "Seed Audience" for the AI to model.
Secondary (Context) | Visual environment (images/video)    | High          | AI scans images to infer user "lifestyle" and price tier.
Tertiary (Intent)   | Low-volume/long-tail keywords        | Medium        | Defines the "semantic neighborhood" of the search.
Tertiary (Intent)   | Landing page color and speed         | Medium        | Signals trust and relevance feedback loops.
Pollutant (Noise)   | "Soft" conversions (scrolls/clicks)  | Negative      | Dilutes intent; trains the AI to find "cheap clickers."

Dig deeper: Auditing and optimizing Google Ads in an age of limited data

Beware of signal pollution

Signal pollution occurs when low-quality, conflicting, or misleading signals contaminate the data Google's AI uses to learn. It's what happens when the system receives signals that don't accurately represent your ideal client, your real conversion quality, or the true intent you want to attract in your ad campaigns.

Signal pollution doesn't just "confuse" the bidding algorithm. It actively trains it in the wrong direction. It dilutes your high-value signals, expands your reach into low-intent audiences, and forces the model to optimize toward outcomes you don't actually want. Common sources include:

- Bad conversion data, including junk leads, unqualified form fills, and misfires.
- Overly broad structures that blend high- and low-intent traffic.
- Creative that attracts the wrong people.
- Landing page behavior that signals low relevance or low trust.
- Budget or pacing patterns that imply you're willing to pay for volume over quality.
- Feed issues that distort product relevance.
- Audience segments that don't match your real buyer.

These sources create the initial pollution. But when marketers try to compensate for underperformance by feeding the machine more data, the root cause never gets addressed. That's when soft conversions like scrolls or downloads get added as primary signals, and none of them correlate to revenue.

Like humans, algorithms focus on the metrics they are fed. If you mix soft signals with high-intent revenue data, you dilute the profile of your ideal customer. You end up winning thousands of cheap, low-value auctions that look great in a report but fail to move the needle on the P&L. Your job is to be the gatekeeper, ensuring only the most profitable signals reach the bidding engine.

When signal pollution takes hold, the algorithm doesn't just underperform. The ads start drifting toward the wrong users, and performance begins to decline. Before you can build a strong signal strategy, you have to understand how to spot that drift early and correct it before it compounds.

How to detect and correct algorithm drift

Algorithm drift happens when Google's automation starts optimizing toward the wrong outcomes because the signals it's receiving no longer match your real advertising goals. Drift doesn't show up as a dramatic crash. It shows up as a slow shift in who you reach, what queries you win, and which conversions the system prioritizes. It looks like a gradual deterioration of lead quality. To stay in control, you need a simple way to spot drift early and correct it before the machine locks in the wrong pattern.

Early warning signs of drift include:

- A sudden rise in cheap conversions that don't correlate with revenue.
- A shift in search terms toward lower-intent or irrelevant queries.
- A drop in average order value or lead quality.
- A spike in new-user volume with no matching lift in sales.
- A campaign that looks healthy in-platform but feels wrong in the CRM or P&L.
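Several of these warning signs can be checked mechanically by comparing a baseline week of platform data against the current week's CRM-matched revenue. The sketch below is one way to do that; the thresholds, field names, and example figures are all hypothetical, and real monitoring would use your own reporting exports.

```python
from dataclasses import dataclass

@dataclass
class WeekStats:
    conversions: int        # in-platform conversions
    revenue: float          # CRM-matched revenue for those conversions
    avg_order_value: float

def drift_flags(baseline: WeekStats, current: WeekStats,
                aov_drop_pct: float = 15.0,
                rev_per_conv_drop_pct: float = 20.0) -> list[str]:
    """Flag early signs of algorithm drift. Thresholds are illustrative."""
    flags = []
    base_rpc = baseline.revenue / baseline.conversions
    cur_rpc = current.revenue / current.conversions
    if cur_rpc < base_rpc * (1 - rev_per_conv_drop_pct / 100):
        flags.append("cheap conversions: revenue per conversion dropped sharply")
    if current.avg_order_value < baseline.avg_order_value * (1 - aov_drop_pct / 100):
        flags.append("AOV decline: the system may be chasing low-value buyers")
    if current.conversions > baseline.conversions and current.revenue <= baseline.revenue:
        flags.append("volume up, revenue flat: campaign looks healthy in-platform only")
    return flags

baseline = WeekStats(conversions=100, revenue=50_000, avg_order_value=500)
current = WeekStats(conversions=160, revenue=48_000, avg_order_value=300)
for flag in drift_flags(baseline, current):
    print(flag)
```

In this made-up example all three flags fire: conversions are up 60% while revenue is flat, which is exactly the in-platform-healthy, P&L-sick pattern described above.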
These are all indicators that the system is optimizing toward the wrong signals. To correct drift without resetting learning:

- Tighten your conversion signals: Remove soft conversions, misfires, or anything that doesn't map to revenue. The machine can't unlearn bad data, but you can stop feeding it.
- Reinforce the right audience patterns: Upload fresh customer lists, refresh custom segments, and remove stale data. Drift often comes from outdated or diluted audience signals.
- Adjust structure to isolate intent: If a campaign blends high- and low-intent traffic, split it. Give the ad platform a cleaner environment to relearn the right patterns.
- Refresh creative to repel the wrong users: Creative is a signal. If the wrong people are clicking, your ads are attracting them. Update imagery, language, and value props to realign intent.
- Let the system stabilize before making another change: After a correction, give the campaign 5-10 days to settle. Overcorrecting creates more drift.

Your job isn't to fight automation in Google Ads; it's to guide it. Drift happens when the machine is left unsupervised with weak or conflicting signals. Strong signal hygiene keeps the system aligned with your real business outcomes. Once you can detect drift and correct it quickly, you're finally in a position to build a signal strategy that compounds over time instead of constantly resetting. The next step is structuring your ad account so every signal reinforces the outcomes you actually want.

Dig deeper: How to tell if Google Ads automation helps or hurts your campaigns

Building a signal strategy that actually works in 2026

If you want to build a signal strategy that becomes a competitive advantage, you have to start with the foundations.

For lead gen

Implement offline conversion imports. The difference between optimizing for a "form fill" and a "$50K closed deal" is the difference between wasting budget and growing a business.
When "journey-aware bidding" eventually rolls out, it will be a game-changer because we can feed more data about the individual steps of a sale.

For ecommerce

Use value-based bidding. Don't just count conversions. Differentiate between a customer buying a $20 accessory and one buying a $500 hero product.

Segment your data

Don't just dump everyone into one list. A list of 5,000 recent purchasers is worth far more than 50,000 people who visited your homepage two years ago. Stale data hurts performance by teaching the algorithm to find people who matched your business 18 months ago, not today.

Separate brand and nonbrand campaigns

Brand traffic carries radically different intent and conversion rates than nonbrand. Mixing these campaigns forces the algorithm to average two incompatible behaviors, which muddies your signals and inflates your ROAS expectations. Brand should be isolated so it doesn't subsidize poor nonbrand performance or distort bidding decisions in the ad platform.

Don't mix high-ticket and low-ticket products under one ROAS target

A $600 product and a $20 product do not behave the same in auction-time bidding. When you put them in the same campaign with a single 4x ROAS target, the algorithm gets confused. This trains the system away from your hero products and toward low-value volume.

Centralize campaigns for data density, but only when the data belongs together

Google's automation performs best when it has enough consistent, high-quality data to recognize patterns. That means fewer, stronger campaigns are better, as long as the signals inside them are aligned. Centralize campaigns when products share similar price points, margins, audiences, and intent. Decentralize campaigns when mixing them would pollute the signal pool.

The competitive advantage of 2026

When everyone has access to the same automation, the only real advantage left is the quality of the signals you feed it.
Your job is to protect those signals, diagnose pollution early, and correct drift before the system locks onto the wrong patterns. Once you build a deliberate signal strategy, Google's automation stops being a constraint and becomes leverage. You stay in the loop, and the machine does the heavy lifting.
  3. A reader writes:

I work in healthcare IT. Recently, our organization made the decision to switch to a new Electronic Medical Record (EMR) system. I, along with dozens of colleagues, am responsible for building this new EMR to meet our organization's needs. It's a months-long process that involves lots of coordinated decision-making across the entire organization. The tentative go-live date for this new system is well over a year from now.

Our leadership is telling the entire IT department that no PTO requests will be approved during this time. None of this has been communicated to the department en masse, but it has trickled down to managers, who then relay it to their respective teams. The message from my manager has been, "No PTO will be approved." When I asked about booking a vacation this summer, the response was, "The go-live date is [specific 2027 date]." Since then, I've confirmed that no PTO means no PTO. They've said they might be able to grant a day off here or there, depending on project needs. But those decisions would only be made closer to the dates we would want to take off.

I have a spouse and small children. The thought of zero vacation for over a year seems really awful to me. (I do think this is only about vacation and not sick time. I don't think they're saying if we get sick that we can't take time off. And we are salaried, so we have been told that we can generally flex our schedules to go to one-off appointments without using PTO. But PTO for vacations is a no-go.)

Many folks in our department are quietly seething, but it doesn't seem like anyone is willing to bring it up in a large group. Is this something that my company can do? PTO is a part of our compensation package, and we accrue leave every pay period. I am new to this organization, so it's entirely possible that I am way off-base in thinking that this is a bizarre policy.

No, this is absurd.
The idea that people should work a full year through with no time off to recharge is ridiculous. And no one can attend a family wedding? A funeral? Be at the birth of their grandchild? All trips of any sort for the year are off the table?

Legally, in most states, they can probably do it. California is the exception, because California treats vacation time as earned wages and prohibits extreme blackout periods that prevent you from having practical access to the time off.

Assuming you're not in California, the best thing you and your coworkers can do is to push back as a group, pointing out that this is an unacceptable restriction on your use of earned benefits and a massive hit to very routine quality-of-life expectations, that you have lives and commitments outside of work, and that it's in the organization's best interest to have well-rested and recharged employees. You said no one seems willing to do that, but why? This is an incredibly normal thing for a group of employees to take issue with and push back on ... and if you don't, you're going to be stuck with no vacation for a year. Create some friction for your company and make it harder for them to do this. There's a very good chance that if you push back as a group, they'll budge.

The post my company says no one can take any time off for a full year appeared first on Ask a Manager.
  4. Anthropic is drawing the line against advertising in AI chatbots. Claude will remain ad-free, the company said, even as rival AI platforms experiment with sponsored messages and branded placements inside conversations. Ads inside AI chats would erode trust, warp incentives, and clash with how people actually use assistants like Claude (for work, problem-solving, and sensitive topics), Anthropic said in a new blog post.

Why we care. Anthropic's position removes Claude, and its user base of 30 million, from the AI advertising equation. Brands shouldn't expect sponsored links, conversations, or responses inside Claude. Meanwhile, ChatGPT is about to give brands the opportunity to reach an estimated 800 million weekly users.

What's happening. AI conversations are fundamentally different from search results or social feeds, where users expect a mix of organic and paid content, Anthropic said:

- Many Claude interactions involve personal issues, complex technical work, or high-stakes thinking. Dropping ads into those moments would feel intrusive and could quietly influence responses in ways users can't easily detect.
- Ad incentives tend to expand over time, gradually optimizing for engagement rather than genuine usefulness.

Incentives matter. This is a business-model decision, not just a product preference, Anthropic said:

- An ad-free assistant can focus entirely on what helps the user, even if that means a short exchange or no follow-up at all.
- An ad-supported model, by contrast, creates pressure to surface monetizable moments or keep users engaged longer than necessary.
- Once ads enter the system, users may start questioning whether recommendations are driven by help or by commerce.

Anthropic isn't rejecting commerce. Claude will still help users research, compare, and buy products when they ask. The company is also exploring "agentic commerce," where the AI completes tasks like bookings or purchases on a user's behalf.
Commerce should be triggered by the user, not by advertisers, Anthropic said. The same rule applies to third-party integrations like Figma or Asana. These tools will remain user-directed, not sponsored.

Super Bowl ad. Anthropic is making the argument publicly and aggressively. In a Super Bowl debut, the company mocked intrusive AI advertising by inserting fake product pitches into personal conversations. The ad closed with a clear message: "Ads are coming to AI. But not to Claude." The campaign appears to be a direct shot at OpenAI, which has announced plans to introduce ads into ChatGPT.

Anthropic's blog post: "Claude is a space to think."
  5. Since then, AI Overviews have rolled out to more countries and more languages, and many websites have continued to see clicks from organic search decline month over month. Today, it seems likely that the impact of AI Overviews has worsened. As…
  6. We live in a world of increasing change. The international order is shifting and political certainties are evaporating day by day. Technological shifts are changing how we experience the world and interact with others. And in the workplace, AI is poised to unleash what might be the most revolutionary set of changes humanity has experienced since the first hunter-gatherers settled down to grow crops and build cities.

But while change is everywhere, we still find it hard to manage. The statistics around organizational change have always been brutal. For at least the last quarter century, corporate transformation efforts have failed at a remarkable rate: only three out of ten are brought to something approaching a successful conclusion. The age of AI will make things even more challenging. We will need to adapt more rapidly and more comprehensively, and we will need to manage multiple layers of continuous change at any one time. How will we cope?

Many different factors contribute to making change hard, but one in particular stands out: change is tiring. At the human level, constant transformation depletes our energy, attention, and commitment. At the organizational level, this depletion translates into stalled initiatives, institutional resistance, and a diminishing capacity for further adaptation. To make the process of change navigable for real humans, rather than the compliant ideals who often appear in strategy decks, we need to rethink how we understand change. We need to find the stable foundations that persist amidst the maelstrom of transformation.

The adaptation fallacy

The standard response to the reality of ever-increasing change is to insist that individuals and organizations simply adapt to it. "Everything flows," as the ancient Greek philosopher Heraclitus is reported to have said. The world is in flux, nothing is fixed, and we should all get used to the idea that the stability of the past was just a temporary illusion.
This ancient wisdom has become something of a cliché, the "It is what it is" of the business world. It is offered up as a slogan to hold onto, a manifesto that distils the increasingly rapid change of the Fourth Industrial Revolution. But it doesn't do much to help people stay afloat.

Human beings are not infinitely malleable. There is a psychological and physical toll to constant, chaotic change that compounds in two distinct ways.

The first is the sheer quantity of simultaneous initiatives: the burden of switching between half a dozen transformation efforts at once. People find themselves juggling competing priorities, each with its own vocabulary, metrics, and demands on their attention. It becomes hard to see the big picture because the parts never stay still long enough to focus. The cognitive overhead of keeping track of everything crowds out the close attention that each single initiative requires.

The second is the exhausting length of individual change processes, which can stretch over many months or even years. The reasons for the change, once vivid and urgent, become abstract and distant. Champions move on, new people arrive who weren't part of the original vision, and maintaining momentum becomes harder with each passing quarter.

The demand that team members "adapt to the new reality" addresses neither problem. The flux pushes and pulls them in different directions with no coherence, giving them no stable ground to stand on. Expecting people to "get used to it" amounts to expecting people not to be human. Leaders who demand adaptation without addressing the underlying human experience are not solving the problem. They are adding to it.

The other Heraclitus

Heraclitus has some real wisdom that can help here, but we need to move past the most common versions of his sayings.
Heraclitus' most famous aphorism is usually rendered as "You cannot step into the same river twice." The idea is that when you step into the river, the water flows on, and so it is not the same when you take your next step. "Panta rhei." "Everything flows."

But there is another version of this saying that comes closer to capturing what Heraclitus actually meant: "We step and do not step into the same river twice." The difference is small, but it matters. Yes, the water flows. Yes, the river is never the same from one moment to the next. But the river itself remains. The river has an identity that persists through its constant flow.

There is an important lesson here for organizations seeking to manage change. Recognizing that things flow is important. But we also have to identify and spotlight what it is that persists through that change. Finding, defining, and celebrating the order that underlies the chaos is essential if we do not want to be swept away. The task is not to eliminate flux, which is neither possible nor desirable, but to identify and preserve the stability that gives change its meaning.

Purpose, identity, strategic clarity: these define the organization and give it its identity. They provide the stable vessel that allows people to float happily along on the flowing water rather than being pulled under by the constant motion. Providing this stability is the leader's responsibility. The constants that allow people to navigate change do not maintain themselves. They must be deliberately established, clearly communicated, and actively protected.

What leaders must do

If change fatigue is not a failure of individual resilience but a failure of organizational design, then leaders must take responsibility for building organizations in which change happens more easily. Here are five principles that can help provide stability in a changing world.

Be discerning about what you change. Not every transformation deserves equal energy.
The familiar danger of chasing shiny objects, constantly running from one initiative to the next, undermines the efforts that actually matter. Every proposed change should be tested against the organization's strategic purpose. If it does not clearly advance the core mission, it should not be adding to the cognitive burden on your teams.

Communicate the why, not just the what. Much of change fatigue comes not from the pace of change itself but from the cognitive burden of not understanding how changes connect. When people cannot see how a new process, tool, or structure contributes to an outcome, changes feel arbitrary and exhausting. Often, even C-suite executives are not fully aligned on precisely why things matter. That confusion cascades downward, multiplying fatigue at every level. Leaders must articulate the purpose behind each initiative and show how it fits into a coherent whole.

Build a unified narrative. When organizations pursue multiple change initiatives simultaneously, a unified story eases the cognitive burden by holding the pieces together. Rather than experiencing six disconnected transformations, people can understand themselves as participating in a single journey with multiple dimensions. The narrative does not eliminate the work, but it reduces the sense of fragmentation.

Create systemic anchors that survive turnover. Long-term change efforts may see key leaders depart before the work is complete. If the change depends entirely on individual champions, it will falter when those individuals leave. There must be a process core, including governance structures, documentation, and embedded practices, that can survive personnel changes and maintain momentum independent of any single person.

Co-design the change with the humans who must live with it. Change fatigue intensifies when transformations are handed down fully formed, only to collide with realities about which leaders were unaware. Co-design reverses this logic.
Frontline staff know the constraints they face; customers know what the change feels like on the receiving end. When the people involved help shape the new way of working that will affect them, compliance turns into ownership, and the change arrives already adapted to the world it must survive in.

Principles in practice

A clear illustration of these principles being put into successful practice comes from Gold Coast Mental Health and Specialist Services in Queensland, Australia, which undertook a sustained transformation to support the adoption of a Zero Suicide approach. The case is instructive both for the ambition of the goal, to permanently shift the culture of a whole health system, and for the care that was taken to make the change sustainable over time.

From the outset, the program was framed as a system-wide approach rather than as "the heroic efforts of individual practitioners." This distinction matters. When success depends on personal endurance, organizations quietly burn through their people. By treating transformation as a collective endeavor supported by organizational structures rather than individual willpower, the program avoided placing impossible burdens on staff already working in an emotionally demanding field.

The new practices were institutionalized, not merely announced. Training reached more than 500 staff and was then embedded into orientation for new hires and supported by online modules, face-to-face sessions, and custom-produced materials designed with the local culture in mind. The change was designed to outlast the people who initiated it: the kind of systemic anchor that keeps momentum alive even as personnel turn over.

Crucially, the service built feedback loops to prevent drift. Staff received timely data on adherence to the new pathway, followed by supervision and coaching to embed skills.
This continuous improvement cycle meant that standards did not have to be constantly re-litigated; the system itself kept reinforcing what good practice looked like.

Co-design was an essential component of the Gold Coast approach. The service's culture change strategy explicitly integrates suicide attempt and loss survivors in leadership and planning roles, recognizing that effective prevention requires perspectives beyond those of clinicians. The result is that change is shaped by those most exposed to its failure modes.

Gold Coast's transformation success did not depend on asking already-stretched professionals to simply try harder. Instead, the program leaders ensured that their teams could see the unifying structures that provided stability, meaning, and identity through change.

Conclusion

The 70% failure rate is not a law of nature. It is the predictable result of asking people to navigate constant change without giving them anything constant to hold onto. Purpose, identity, strategic clarity: these are not luxuries to be addressed once the "real work" of transformation is complete. They are the vessel that keeps people afloat. Without that vessel, you are asking your people to swim through every change. And eventually, swimmers tire.

Build the structures. Communicate the purpose. Shine a clear and steady light on what endures. That is how transformation succeeds: not by demanding more adaptation, but by providing stable foundations on which to build something new.
  7. A few weeks ago, we completed our seventh profit share at Buffer, distributing $377,005 to 75 teammates. This represents more than a dollar amount. It reflects a real turnaround in our growth and in where we are headed. Just a few years ago, we were not in a position to do this at all. We went through two years of net losses, made hard decisions, and focused deeply on rebuilding a healthy, sustainable business. There were moments when profitability and profit sharing felt far away. So returning to this practice in 2025, with an average share of $5,095 per teammate, feels especially meaningful. It reflects the resilience of our team, the care people have put into their work, and the progress we have made together. In this post, I’ll share more about this milestone, why profit sharing is so meaningful at Buffer, and explain how our formula works.

Why we share profits

At Buffer, profitability is not just a financial milestone. It’s what gives us the freedom to operate as an independent, long-term company. We are now in our 15th year of business. Staying profitable allows us to continue investing in our product, our customers, our team, and our mission without compromising our values. Our founder and CEO, Joel Gascoigne, often talks about how, if Buffer grows and succeeds, we should grow and succeed together. Profit sharing is one of the most tangible ways we put that belief into practice. When the company does well, our teammates benefit directly. That creates a strong connection between our collective effort and real, meaningful outcomes in people’s lives. It also reinforces something we care deeply about: building a company where success is shared. For me personally, profit sharing is a reminder that sustainable growth and people-first values do not have to be in tension. When done thoughtfully, they strengthen each other.

Our profit-sharing history

We started profit sharing in 2017.
Here's a look at our profit shares since then:

2017: Net income: $1,418,604 | Profit share pool: $300,000
2018: Net income: $3,418,352 | Profit share pool: $394,997
2019: Net income: $2,392,409 | Profit share pool: $368,051
2020: Net income: $1,940,828 | Profit share pool: $243,047
2021: Net income: $216,557 | Profit share pool: $89,828
2022: Net income: -$1,109,745 | Profit share pool: $0
2023: Net income: -$743,936 | Profit share pool: $0
2024: Net income: $156,244 | Profit share pool: $30,369
2025: Net income: $2,513,367 | Profit share pool: $377,005

Looking at this history, you can see a clear story. We had several strong growth years, followed by a sharp decline in 2021. In 2022 and 2023, we were not profitable and did not issue a profit share. In 2024, we began rebuilding, but at a modest pace. 2025 represents something different. It is our second-largest profit share to date and a meaningful milestone after two challenging years. It reflects the work our team has done to stabilize, refocus, and rebuild a strong foundation for the future.

How Our Profit-Sharing Formula Works

We distribute 15 percent of our annual net profit using a formula designed to balance fairness, transparency, and impact. Here’s how it works:

40 percent distributed equally: Every teammate receives the same amount from this portion, regardless of role or tenure.

40 percent based on salary: Each teammate receives a share proportional to their salary as a percentage of total company payroll. This reflects our broader compensation philosophy and our investment in building a thoughtful, transparent salary system.

20 percent based on tenure: This portion is based on how long each person has worked at Buffer relative to the total months worked by all teammates.

For 2025, this resulted in:

Net profit: $2,513,367
Profit share pool (15 percent): $377,005
Team members: 75
Average profit share: $5,095

Each part of the formula serves a clear purpose.
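As a rough sketch, the 40/40/20 split described above can be expressed in a few lines of code. This is illustrative only: the function and variable names are my own, not Buffer's actual implementation, and real payroll math involves rounding and eligibility rules the post doesn't cover.

```python
def profit_share(net_profit, headcount, salary, total_payroll,
                 months_tenure, total_months):
    """Illustrative 40/40/20 split of a 15 percent profit pool."""
    pool = 0.15 * net_profit
    equal_part = 0.40 * pool / headcount                        # same for everyone
    salary_part = 0.40 * pool * (salary / total_payroll)        # proportional to pay
    tenure_part = 0.20 * pool * (months_tenure / total_months)  # proportional to tenure
    return equal_part + salary_part + tenure_part

# Made-up numbers: a teammate earning $50k of a $200k payroll, with 12 of the
# company's 48 total tenure-months, at a 4-person company with $1M net profit.
share = profit_share(1_000_000, 4, 50_000, 200_000, 12, 48)
```

Because each component is a fraction of the same pool, the shares across all teammates always sum back to the full 15 percent pool; the salary and tenure components only shift how it is distributed.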
The equal distribution ensures that everyone shares in our success. The salary-based component reflects responsibility and scope. The tenure component recognizes long-term commitment and contribution. We have learned over time that compensation systems only work when people understand them and trust them. That is why we prioritize simplicity and transparency. Our goal is to create a system that feels fair, consistent, and aligned with our values.

Looking Ahead

Completing our seventh profit share has been a meaningful moment for me. It reflects the dedication of our teammates, the strength of our culture, and the belief that building a healthy company is a collective effort. It also reminds me that setbacks do not define us. How we respond to them does. Profit sharing is not just a financial outcome. It is a reflection of trust, resilience, and shared ownership in what we are building together.

Over to you

If this post sparked questions for you about profit sharing or how we operate at Buffer, I would love to hear them. We are always learning, and we are grateful to be building this company alongside such a thoughtful community. View the full article
  8. Much to the chagrin of investors, the value of Bitcoin continues to slide. As of Wednesday morning, the world’s most popular cryptocurrency was trading around $75,000 per coin. That’s down more than 10% over the past five days, down 18% over the past month, and down a whopping 34% over the past six months. It’s a far cry from October of last year, when Bitcoin’s price topped out at nearly $125,000. Values are now roughly where they were in early April 2025, and before that, in November 2024. From bump to slump Notably, Bitcoin’s value is now lower than it was when President Trump took office last January, effectively giving up all of the “Trump bump” gains that it and other crypto assets saw over much of 2025. That bump was real, too: Bitcoin returns were down around 12% during Q1 2025, but jumped to nearly 30% during Q2, and then 6.3% in Q3, according to data from Coinglass. The downturn seemingly took hold at some point during Q4, when returns were down 23%. Ethereum, the second-largest crypto on the market, saw a similar trajectory: It saw huge increases (37% during Q2 and 67% during Q3), followed by a big drop in Q4 (down 28%). Here are a few notable crypto slumps year to date as of this writing:

Bitcoin (BTC): Down 13.86%
Ether (ETH): Down 25.56%
XRP (XRP): Down 15.13%

Why are crypto values declining? As for what’s causing the sell-off? A lot of things, but mostly, investors looking for off-ramps from riskier assets. This can partially be blamed on a government shutdown delaying the release of important economic data (the jobs report was supposed to come out on Friday, but has been delayed), geopolitical tensions rising in many parts of the world, on-again off-again tariff threats, and even the fear of increased regulation on the crypto industry, according to reporting from CNBC. The crypto markets are also caught in the maelstrom of other downturns.
The stock market has been flat for the past month, with some earnings announcements—notably Microsoft’s—causing fervor and worry among investors. President Trump has also named a new Fed Chair, who, if confirmed, will take up the position in mid-May, adding another variable of uncertainty into the mix. As a cherry on top, precious metals have seen a steep sell-off, too. Gold and silver prices had shot to the moon in recent months, but last week silver experienced its worst day ever, falling nearly 30%. The confluence of all of these factors is what’s led to massive crypto volatility. This story is developing… View the full article
  9. Small business owners across the country may want to pay attention to a significant legal development concerning Oregon’s Extended Producer Responsibility (EPR) law. The National Federation of Independent Business (NFIB) has filed an amicus brief in the case National Association of Wholesaler-Distributors v. Leah Feldon, et al. in the U.S. District Court for the District of Oregon, challenging this law. This pivotal case could have implications for small businesses, particularly those that operate or sell products across state lines. Oregon’s EPR law aims to shift the burden of disposal and recycling to producers, but NFIB argues that it crosses constitutional boundaries by imposing regulatory requirements on businesses based outside the state. As Beth Milito, Vice President and Executive Director of NFIB’s Small Business Legal Center, pointed out, “Oregon’s EPR regime goes beyond the authority granted to states in the Constitution by attempting to regulate businesses outside of Oregon.” This assertion suggests that if the court finds the law unconstitutional, it could set a precedent that protects small businesses from similar regulations in other states. One of the primary arguments in NFIB’s brief is that Oregon’s EPR law is an unconstitutional extraterritorial regulation, which threatens the operational capabilities of small businesses engaged in interstate commerce. The group contends that imposing additional compliance costs not only burdens businesses financially but also interferes with their ability to compete effectively. Milito emphasizes, “Allowing one state to impose significant burdens on producers outside of its borders will negatively impact any small business that does commerce in the region.” This case underscores the broader implications of regulatory overreach. For many small businesses, compliance with state-specific regulations can be challenging and costly, especially when those regulations extend beyond the state line. 
Business owners may find themselves navigating a complex patchwork of laws that vary by location, complicating their logistics and operational strategies. The NFIB’s brief also critiques how Oregon’s EPR law grants regulatory authority to a private entity, which can exacerbate these challenges. This delegation could lead to inconsistent enforcement and added layers of confusion for small producers, who may struggle to understand the requirements they must meet to sell their products in Oregon. The risk of running afoul of such regulations may force businesses to divert resources away from growth and innovation toward compliance efforts. For small business owners, the key takeaways are clear. They should remain vigilant about the potential for regulatory changes that could affect their operations and be prepared for the possibility of costs associated with compliance. The outcome of this case could either embolden other states to enact similar laws or serve as a wake-up call for regulatory reconsideration across the country. As the legal battle unfolds, small business owners should monitor the discussions surrounding the case, especially how it may clarify the extent of state powers over out-of-state businesses. The NFIB’s ongoing involvement in litigation aims to protect not just its members but the broader interests of small businesses nationwide. For those looking for more detailed information, the full text of NFIB’s amicus brief and further updates can be found on their website. Engaging with community and industry partners about these issues could provide valuable insights and strengthen advocacy efforts aimed at protecting small business rights, particularly in terms of navigating complex regulatory landscapes. As developments arise, small business stakeholders should remain informed and proactive in their approach to adapting to any regulatory changes that may come their way. 
For further details on the NFIB’s position and the ongoing legal context, visit NFIB’s official page. Image via Google Gemini This article, "NFIB Challenges Oregon’s EPR Law, Citing Interstate Commerce Risks" was first published on Small Business Trends View the full article
  10. The U.S. Justice Department and a coalition of states plan to appeal a federal judge’s remedies ruling in the Google search antitrust case. The appeal challenges a decision that found Google illegally monopolized search but stopped short of imposing major structural changes, such as forcing a divestiture of Chrome or banning default search deals outright. What’s happening. The DOJ and state attorneys general filed notices of appeal yesterday, challenging U.S. District Judge Amit Mehta’s September remedies ruling, Bloomberg and Reuters reported. Mehta ruled in August 2024 that Google unlawfully maintained its search monopoly through default search agreements with Apple, Samsung, and other device makers. Those deals cost Google more than $20 billion a year and blocked rivals from key distribution channels. After a second remedies trial in 2025, Mehta rejected the government’s request to force Google to divest Chrome or prohibit payments for default search status. Instead, he ordered Google to rebid its default search and AI app contracts annually. Why we care. The appeal means we still don’t know how much Google will keep controlling where search gets placed. And that control basically decides who wins traffic. If stricter fixes happen, it could change default search settings, open the door to rival search engines, and shift how people use search across devices. Yes, but. The DOJ and states haven’t detailed their legal arguments. Court filings didn’t specify which parts of the ruling they will challenge, though attention is expected to focus on Chrome and Google’s default search deal with Apple. What to watch. The U.S. Court of Appeals for the D.C. Circuit is expected to hear the case later this year. For now, it’s business as usual for Google — though its most important contracts now face annual review, and the risk of tougher remedies remains firmly on the table. What they’re saying. David Segal, Yelp’s vice president of public policy, welcomed the appeal. 
In a statement shared with Search Engine Land, Yelp said the trial court’s remedies do not go far enough to restore real competition in search: “Unfortunately, the measures put forth in the trial court’s remedy decision are unlikely to restore competition — for instance, it allows for Google to continue to pay third parties for default placement in browsers and devices, which was the primary mechanism by which Google unlawfully foreclosed competition to begin with. “Internet users, online advertisers and others who rely on and seek to compete in the industry deserve a level playing field with more, higher quality, and fairer search options — and the need for a more competitive space is all the more clear as Google seeks to leverage its vast power over the web, especially search indexing and ranking, to come to dominate the GenAI space.” View the full article
  11. Private banker quietly became paedophile financier’s money manager in his final yearsView the full article
  12. Today
  13. President Donald Trump says history is on his side. He wants to build a towering arch near the Lincoln Memorial and argues that the nation’s capital first clamored for such a monument two centuries ago — even going so far as to erect four eagle statues as part of the project before being derailed by the attack on Fort Sumter. “It was interrupted by a thing called the Civil War, and so it never got built,” The President said aboard Air Force One as he flew to Florida last weekend. “Then, they almost built something in 1902, but it never happened.” The President’s history is off — the eagles he references are actually part of a bridge connecting Virginia and Washington that was built decades after the Civil War. The closest Washington came to an arch was a wood and plaster construction built in 1919 to mark the end of World War I — and even that was always meant to be temporary. “For 200 years they’ve wanted to build an arc,” The President said, meaning an arch. “They have 57 cities throughout the world that have them. We’re the only major city – Washington, D.C. – that doesn’t.” Chandra Manning, a history professor at Georgetown University, said Washington was still a fledgling city in the 19th century, dealing with a housing shortage, a lack of boarding houses for visitors, roads that went nowhere and an incomplete U.S. Capitol. “Washington coming into the Civil War was still this unfinished city,” Manning said. “There’s no push for decorative memorialization in Antebellum Washington because it’s still such a place that doesn’t even have all the functional buildings it needs yet.” The President has offered a similar historical rationale for the $400 million ballroom he demolished the White House’s East Wing to begin building — arguing that officials for 150 years have wanted a large event space. That claim, too, is dubious. While space at the White House has indeed long been an issue, there’s no record of public outcry for a ballroom.
The President nonetheless is employing a similar argument to justify the arch. “I think it will be the most beautiful in the world,” he said. ‘Biggest one of all’ The arch would stand near the Arlington Memorial Bridge, which spans the Potomac River. The President first unveiled the idea at an October dinner for top donors to his ballroom. Without divulging how much the arch would cost, who would pay for it or whether he’d seek approval from planning officials, the president showed off three different-sized arch models, all featuring a statue of Lady Liberty on top. The president acknowledged then that the largest one was his favorite, and The Washington Post reported that The President is mulling building an arch standing 250 feet (76 meters) tall. Asked about that aboard Air Force One, The President didn’t confirm the exact height he desires, but offered: “I’d like it to be the biggest one of all.” “We’re setting up a committee, and the committee is going to be going over it,” The President said. “It’ll be substantial.” The president says he’d like the new structure to be reminiscent of the Arc de Triomphe, at the end of the Champs-Élysées in Paris, which was built to honor those who fought for France during the French Revolution and Napoleonic wars. But that monument stands only 50 meters (164 feet) high. A 250-foot Washington arch would dwarf the Lincoln Memorial and White House, and even rival the Capitol, which stands 288 feet (88 meters). The finished arch would be part of a building boom The President has personally triggered, anxious to use his background as a onetime New York construction mogul to leave a lasting physical mark on the presidency. In addition to the ballroom, The President is closing the Kennedy Center for two years of renovations amid backlash from artists over changes he’s made at the nation’s premier performing arts venue. 
He replaced the lawn in the Rose Garden with a patio area reminiscent of his Mar-a-Lago estate in Florida, and redecorated the Lincoln Bathroom and Palm Room in the White House’s interior. The President also installed a Walk of Fame featuring portraits of past presidents along the Colonnade, massive flagpoles on the north and south lawns, and golden flourishes, cherubs and other flashy items to the substantially overhauled Oval Office. The arch would extend the president’s influence into Washington, where he has talked of beautifying “tired” grassy areas and broken signage and street medians and also deployed the National Guard to help break up homeless encampments. Harrison Design, a local firm, is working on the project, though no construction start date has been announced. The President wants to unveil the new structure as part of celebrations marking America’s 250th birthday. The bridge actually came after the Civil War Pressed on what The President meant by the four eagles, the White House sent a photo showing eagle sculptures at the four corners of the Arlington Memorial Bridge, but no further details. “President Trump is right. The American people for nearly 200 years have wanted an Arch in our Nation’s capital to showcase our great history,” White House spokesperson Davis Ingle said in a statement. “President Trump’s bold vision will be imprinted upon the fabric of America and be felt by generations to come. His successes will continue to give the greatest Nation on earth — America — the glory it deserves.” The president’s timing is off, though. The Arlington Memorial Bridge was first proposed in 1886, but it wasn’t approved by Congress until 1925. According to the National Park Service, the bridge was conceived after the Civil War and meant to memorialize the symbolic reunification of the North and South. It was originally built to link the site of the Lincoln Memorial with the home of Confederate Gen. Robert E.
Lee — where Arlington National Cemetery now stands. At the time, the direction the eagles would face — right or left, meant to symbolize inward toward the city or outward facing visitors — sparked controversy. The park service says the bridge was constructed between 1926 and 1931, and an engineer’s report lists only slightly different dates — still decades after The President’s timeline. Washington also once had a Victory Arch built near the White House in 1919, to commemorate the end of World War I. It was wood and plaster, however, and meant to be temporary. That structure was torn down in the summer of 1920. A 2000 proposal called for a peace arch in Washington, but those plans were abandoned after the Sept. 11 attacks the following year. Manning, who is also a former National Park Service ranger, said that, Washington aside, “I don’t know of a long U.S. tradition of building arches for things.” “That sounds like an import from elsewhere to me,” she said. —Will Weissert, Associated Press View the full article
  14. We may earn a commission from links on this page. Deal pricing and availability subject to change after time of publication. At $99 on Woot, the factory reconditioned Bowers & Wilkins Px7 S2 headphones are nearly $60 cheaper than the next best price on Amazon and the lowest they’ve been to date, according to price trackers. While “refurbished” might scare off some shoppers, this deal includes all accessories in retail packaging and a professional inspection that guarantees like-new functionality. Cosmetic wear should be minimal and only noticeable up close, if that. You get a 90-day Woot warranty, and free shipping if you’re a Prime member (with a $6 fee for everyone else). This deal runs for about three weeks or until the stock runs out. Bowers & Wilkins Px7 S2 headphones $99.00 at Woot Get Deal Get Deal $99.00 at Woot The Px7 S2’s knit fabric ear cups and metallic accents look sharp, and the padding feels good against the skin. Clamp force is on the firmer side but generally comfortable, though long sessions can create pressure points for some head shapes. They work well for commuting and travel, thanks to active noise cancelling that handles voices and higher-frequency noise better than low-end rumbles. Physical buttons handle power, volume, playback, and noise control. They’re tactile and easy to use, though there’s no audible feedback for volume changes. As for portability, the ear cups swivel flat but don’t fold, so you’ll be relying on the included hard case, which is well-made but bulky. Battery life is a strong point. With ANC on, they can stretch past 30 hours, and a 15-minute charge gets you several more hours of playback. Sonically, these headphones skew bass-heavy. You get a punchy, energetic profile that works well for electronic music, though it can feel muddy for anything more detailed, like acoustic or jazz. There’s a two-band EQ in the companion app, but no deep customization beyond that.
Codec support includes aptX HD and aptX Adaptive, which is great if your devices support them. All said, if you're not chasing absolute top-tier ANC and can live with the stock sound signature, you're getting flagship-level comfort and design here, at a mid-range price point. Our Best Editor-Vetted Tech Deals Right Now Apple AirPods 4 Active Noise Cancelling Wireless Earbuds — $139.99 (List Price $179.00) Apple Watch Series 11 [GPS 46mm] Smartwatch with Jet Black Aluminum Case with Black Sport Band - M/L. Sleep Score, Fitness Tracker, Health Monitoring, Always-On Display, Water Resistant — $329.00 (List Price $429.00) Samsung Galaxy Tab A9+ 10.9" 64GB Wi-Fi Tablet (Graphite) — $149.99 (List Price $219.99) Apple iPad 11" 128GB A16 WiFi Tablet (Blue, 2025) — $299.99 (List Price $349.00) Blink Mini 2 1080p Security Camera (White) — $23.99 (List Price $39.99) Ring Outdoor Cam Pro Plug-In With Outdoor Cam Plus Battery (White) — $189.99 (List Price $259.99) Amazon Fire TV Stick 4K Plus — (List Price $24.99 With Code "FTV4K25") Deals are selected by our commerce team View the full article
  15. Here is a recap of what happened in the search forums today, through the eyes of the Search Engine Roundtable and other search forums on the web. Google's search rankings are highly volatile...View the full article
  16. If your CPCs keep climbing, the cause may not be your bid strategy, your budget, or even your competitors. You might be suffering from low ad quality. Let’s break down the most foundational — and most misunderstood — metric in your Google Ads account. If you want to stop overpaying Google and start winning auctions on merit, you need to understand how the 1-to-10 Quality Score actually works. The difference between Quality Score, Ad Strength, and Optimization Score Before we dive in, let’s clear up the confusion. Google shows a lot of “scores” and “diagnostics,” and you can safely ignore most of them. Quality Score is the exception. Ad strength is an ad-level diagnostic. It checks whether your responsive ad follows best practices, like having enough headlines and descriptions. It has zero impact on auction performance. Optimization score is a sales metric. It measures how many Google recommendations you’ve reviewed. It does not reflect real campaign performance. Quality Score is different. It’s foundational. This keyword-level diagnostic summarizes the quality of your ads. Along with your bid, it determines Ad Rank. Ad Rank determines whether your ad appears at all, where it appears on the SERP, and how much you pay per click. The formula is simple: Ad Rank = price × quality. The 1–10 score you see is only a summary, but it reflects the real-time quality calculation Google runs on every single search. Setting up your dashboard: How to find your Quality Score You can’t fix what you can’t see. To get started, go to your Keywords report in Google Ads and add these four columns: Quality Score Exp. CTR Ad Relevance Landing Page Exp. When you analyze Quality Score, don’t judge keywords in isolation. You’ll drive yourself crazy. Look for patterns at the ad group level instead. If most keywords have a Quality Score of 7 or higher, you’re in good shape. If most are at 5 or below, that’s your cue to roll up your sleeves and improve ad quality. 
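The Ad Rank formula above can be made concrete with a toy sketch. The numbers, the function name, and the "+ $0.01" second-price step are illustrative assumptions drawn from Google's long-published simplified explanation of the auction; the live auction factors in more signals, so treat this as a model, not the real math:

```python
def ad_rank(bid, quality_score):
    """Simplified model of Ad Rank: price x quality."""
    return bid * quality_score

# A lower bid with a higher Quality Score can outrank a higher bid.
yours = ad_rank(bid=2.00, quality_score=8)    # 16.0
rivals = ad_rank(bid=3.00, quality_score=4)   # 12.0
assert yours > rivals

# Simplified actual CPC: pay just enough to beat the rank below you.
actual_cpc = rivals / 8 + 0.01                # ~1.51, well under the 2.00 bid
```

This is what "winning auctions on merit" looks like in practice: in this sketch, doubling your Quality Score lifts Ad Rank as much as doubling your bid, and the higher-quality advertiser wins the position while paying less than their maximum bid.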
The three core components of Quality Score and how to fix them 1. Ad Relevance: The ‘message match’ This is the only part of Quality Score fully within your control. It asks one simple question: Does the keyword match the ad and the landing page? If your ad relevance is generally “Below average,” the fastest fix is Dynamic Keyword Insertion. It automatically inserts your keywords into the ad text. If you prefer a manual approach, make sure the keywords in the ad group actually appear in both the ad copy and the landing page. 2. Landing Page Experience: The “Delivery” When Google sends users to your site, do they find what they’re looking for? Or do they bounce after two seconds and head back to Google for a better result? If your landing page experience score is low, start with the PageSpeed Insights tool. A “Below average” rating often points to slow load times, a poor mobile experience, generic content, weak navigation, or all of the above. 3. Expected CTR: The “Popularity Contest” Google only makes money when users click, so it favors ads people are most likely to click. If your expected CTR is lagging, start with competitive research: Check Auction Insights to see who you’re competing against. A “Below average” expected CTR means their ads are earning higher click-through rates than yours. Next, visit the Google Ads Transparency Center and review your competitors’ ads. Are their offers more enticing? Is their copy more clickable? Borrow what works and update your own ads. If your ads are great but CTR is still low, review the Search terms report. You may be showing for irrelevant queries, which explains why users aren’t clicking on an otherwise awesome ad. What’s a realistic Quality Score goal? I’ll be honest: chasing a 10/10 Quality Score everywhere is a waste of time. It’s unrealistic and usually unnecessary. Instead, do a quick check-up every few months. 
Find one or two ad groups with lower Quality Scores, identify the most “Below Average” component, and fix that first. Improving ad quality takes more effort than raising budgets or bids. But it pays off with more clicks at the same — or even lower — cost. This article is part of our ongoing Search Engine Land series, Everything you need to know about Google Ads in less than 3 minutes. In each edition, Jyll highlights a different Google Ads feature, and what you need to know to get the best results from it – all in a quick 3-minute read. View the full article
  17. Shore Capital Partners, a Chicago-based private equity firm founded by billionaire Justin Ishbia, has raised more than $400 million for its second industrial fund. View the full article
  18. A figure resembling Italy’s prime minister on a Rome church wall blurred the lines between religion and politicsView the full article
  19. Despite a weak ADP jobs print, Treasury yields went nowhere, reinforcing a growing bearish, defensive case for rates, according to the CEO of IF Securities. View the full article
  20. Sales closing techniques are vital for your success in selling. By honing various methods, such as the Assumptive Close or the Scarcity Close, you can markedly improve your chances of sealing the deal. Each technique serves a unique purpose, whether it’s creating urgency or reinforcing value. Understanding these strategies can transform your approach to sales. Let’s explore these fundamental types of closes and see how they can enhance your performance in the field. Key Takeaways

Assumptive Close: Presumes the prospect is ready to buy, guiding the conversation towards next steps and reinforcing confidence in their decision.
Puppy Dog Close: Offers free trials to create emotional attachment, reducing perceived risk and increasing likelihood of commitment to purchase.
Scarcity Close: Utilizes limited-time offers to instill urgency, prompting quicker decision-making from prospects interested in the product.
Summary Close: Recaps key benefits to reinforce value, encouraging commitment by reminding prospects of what they stand to gain.
Takeaway Close: Implements reverse psychology by suggesting the product may not be a perfect fit, prompting prospects to reaffirm their interest.

Understanding Sales Closing Grasping sales closing is essential for any salesperson who wants to succeed in their role, as it represents the final step in the sales process where discussions and demonstrations culminate in a definitive agreement. Understanding sales closing involves knowing the different types of sales closes, like the assumptive close, which assumes the prospect is ready to buy. This method can be effective when you’ve built rapport and addressed their needs throughout the process. Remember, effective closing isn’t about pushing; it’s about patience and persistence. Prospects may initially resist, but your continuous engagement helps them feel supported.
Tailoring your approach and adapting closing techniques to individual customer characteristics will improve your sales effectiveness and boost your closing rates considerably. Common Sales Closing Techniques Understanding various sales closing techniques can greatly improve your ability to secure deals. The Assumptive Close involves confidently proceeding as if the prospect is ready to buy, which can instill confidence. Alternatively, the Puppy Dog Close lets prospects try the product for free, nurturing emotional attachment. The Scarcity Close plays on the fear of missing out by emphasizing limited-time offers, prompting quicker decisions. The Summary Close recaps the key benefits discussed, reinforcing value to encourage commitment. Finally, the Now-or-Never Close introduces urgency, motivating prospects to act swiftly with special incentives. Effective Closing Techniques When you’re closing a sale, using effective techniques can make a big difference in your success. Strategies like the Assumptive Close, which presumes the prospect is ready to buy, and the Scarcity Close, which creates urgency, can prompt quicker decisions. Furthermore, the Puppy Dog Approach lets prospects try the product, increasing their confidence and likelihood of purchase. Assumptive Close Strategy The Assumptive Close strategy stands out as a potent technique in sales, effectively guiding prospects toward a purchase by confidently presuming their readiness to buy. This approach involves asking questions about details like delivery timing or the quantity needed, which encourages the prospect to align with your expectation of a sale. It’s crucial to understand the prospect’s needs and demonstrated interest to ensure your confidence doesn’t come across as aggressive. Pay attention to client reactions and any objections they may have, as this will help you reinforce their readiness and address concerns.
When executed correctly, the Assumptive Close can streamline the sales process, making it feel like a natural progression rather than a hard sell.

Scarcity and Urgency

Utilizing scarcity and urgency in sales can greatly improve your closing techniques, as they tap into the natural human tendency to fear missing out on valuable opportunities. The Scarcity Close highlights limited-time offers or low stock levels, prompting prospects to act quickly and secure a deal. Research indicates that 70% of consumers are more inclined to purchase when they perceive limited availability. Effective urgency tactics involve clearly communicating deadlines, such as “only available until the end of the month” or “only two spots left.” The Now-or-Never Close emphasizes time-sensitive incentives to encourage immediate action. Nevertheless, employing these techniques requires honesty and transparency, as misleading claims can damage trust and long-term customer relationships.

Puppy Dog Approach

Building on the effectiveness of scarcity and urgency, the Puppy Dog Close offers an alternative approach that engages prospects in a hands-on way. This technique allows potential buyers to experience your product or service risk-free, often through free trials or demos. By immersing themselves in the offering, prospects can form a stronger emotional attachment, greatly increasing the likelihood of a purchase. The familiarity gained during this trial period encourages a sense of ownership and commitment, leading to higher conversion rates. Nevertheless, successful implementation requires careful timing and rapport-building, ensuring that prospects feel valued and comfortable. In the end, the Puppy Dog Close minimizes perceived risk, making the shift to purchase more seamless and effective for both the salesperson and the prospect.

The Assumptive Close

When you assume a prospect is ready to buy, you’re employing a strong technique known as the Assumptive Close.
This method encourages quicker decisions by framing the conversation around next steps, showing confidence in your prospect’s readiness. Consider these key aspects when using the Assumptive Close:

- Establish a strong rapport with your prospect.
- Ask questions about delivery timing or product specifics.
- Monitor their interest and objections closely.
- Use their feedback to align with their buying intent.

The Puppy Dog Close

The Puppy Dog Close is a potent sales technique that lets prospects try a product or service for free or at a low cost, nurturing a sense of ownership. By allowing customers to experience the benefits firsthand, this approach creates an emotional attachment that often leads to a commitment to purchase after the trial period. As a result, it effectively minimizes perceived risk and can greatly increase closing rates when done correctly.

Trial Experience Benefits

Trial experiences, often referred to as the Puppy Dog Close, offer significant advantages for both salespeople and prospects. By allowing prospects to try a product or service, you can effectively boost their confidence in making a purchase. Here are some key benefits:

- Increased Emotional Connection: Experiencing the product firsthand creates a bond.
- Higher Conversion Rates: Familiarity with the product often leads to more sales.
- Reduced Perceived Risk: A no-obligation trial makes prospects feel more secure about their decision.
- Immediate Satisfaction: Products that show quick results encourage commitment.

This approach leverages reciprocity, making customers feel inclined to buy after enjoying a free trial. Ultimately, the Puppy Dog Close is a practical tactic that benefits everyone involved.

Emotional Attachment Creation

Creating an emotional attachment is a key aspect of the Puppy Dog Close, which involves offering prospects the chance to experience a product or service without any commitment.
This technique allows potential customers to interact with your offering, nurturing familiarity that can lead to a purchase. Here’s a breakdown of the Puppy Dog Close benefits:

Benefit | Description
Experience | Allows prospects to use the product for free.
Familiarity | Builds a connection, increasing comfort with the product.
Reduced Risk | Lowers perceived risk, making the decision easier.
Increased Value Perception | Customers recognize the product’s benefits firsthand.
Higher Closing Rates | Engaged customers are more likely to buy after the trial.

Commitment After Trial

Once you’ve given prospects the chance to try out your product or service through the Puppy Dog Close, their experience can greatly influence their commitment to purchase. By allowing them to interact with your offering risk-free, you encourage emotional attachment, which motivates them to buy. Research shows that customers who test products before purchasing are 60% more likely to commit. This technique works well in industries like:

- Software, where functionality is essential
- Fitness equipment, demonstrating tangible benefits
- Automotive sales, providing real-life driving experience
- Retail, showcasing product quality and fit

Offering a trial minimizes perceived risk and leverages reciprocity, making prospects feel inclined to reciprocate your trust by committing to a purchase after enjoying the trial.

The Scarcity and Takeaway Close

The Scarcity and Takeaway Close are potent techniques in the salesperson’s toolkit, designed to prompt quicker decisions from prospects. The Scarcity Close leverages the fear of missing out (FOMO) by creating urgency through limited-time offers or exclusive benefits, nudging interested prospects to decide faster. Conversely, the Takeaway Close employs reverse psychology, suggesting the product or service may not be the best fit, prompting prospects to reconsider and reaffirm their interest.
Both techniques shine when dealing with prospects who’ve shown genuine interest but need an additional push to finalize their commitment. Successful implementation requires careful monitoring of cues and readiness to guarantee the approach feels natural rather than overly aggressive, as urgency can greatly boost conversion rates.

Improving Your Closing Techniques

Improving your closing techniques is essential for increasing your sales success, especially as prospects progress through the buying process. To improve your effectiveness, consider these strategies:

- Adapt your techniques based on the prospect’s personality and buying stage.
- Regularly analyze past sales interactions to identify what worked and where you can grow.
- Use sales technology to automate follow-ups and gain insights into customer needs.
- Engage in role-playing with peers to practice handling objections in a supportive environment.

Frequently Asked Questions

What Are the 7 Essential Selling Skills Every Sales Person Should Know?

To excel in sales, you should master several crucial skills. First, effective communication helps you connect with clients. Next, active listening allows you to understand their needs. Building emotional intelligence aids in establishing rapport. Problem-solving skills enable you to address unique challenges. Furthermore, strong product knowledge boosts credibility. Finally, adaptability guarantees you can respond to varying situations. By developing these skills, you’ll improve your sales performance and client relationships considerably.

What Are the Three Types of Closes of a Sale?

There are three primary types of sales closes you should know: hard closes, soft closes, and trial closes. A hard close involves directly asking for the sale, often creating urgency. Conversely, a soft close encourages conversation, allowing you to understand the prospect’s feelings about the product.
Meanwhile, a trial close assesses readiness by asking questions to clarify any remaining objections. Each type serves a specific purpose, enhancing your ability to finalize sales effectively.

What Are the Sales Closing Techniques?

Sales closing techniques are strategies designed to guide prospects toward making a purchase decision. For instance, the Assumptive Close involves acting as if the sale is already made, encouraging the prospect to move forward. The Puppy Dog Close lets them try the product without risk, nurturing attachment. The Scarcity Close plays on urgency by highlighting limited availability, whereas the Summary Close reviews key benefits, reinforcing the product’s value and addressing the prospect’s needs effectively.

Which of the Following Is a Common Sales Closing Technique?

A common sales closing technique is the Assumptive Close. In this approach, you assume the prospect is ready to buy, which can streamline the closing process. You might ask logistical questions, such as when they’d like the product delivered or how many units they need. This technique can create a natural shift into finalizing the sale, as it encourages the prospect to visualize their purchase without feeling pressured.

Conclusion

Achieving proficiency in these seven crucial sales closing techniques can greatly improve your effectiveness as a salesperson. By comprehending methods like the Assumptive Close and the Scarcity Close, you can better engage potential customers and increase conversion rates. Each technique serves a unique purpose, from creating urgency to reducing perceived risks. Continuously practicing and refining these skills will elevate your overall sales strategy, leading to more successful outcomes in your sales endeavors. Focus on incorporating these techniques into your daily interactions.
Image via Google Gemini and ArtSmart

This article, "7 Essential Types of Sales Closes Every Salesperson Should Know," was first published on Small Business Trends. View the full article
  21. Bill Hunt shows how disciplined reasoning transforms SEO escalations into resolvable system behaviors rather than endless debates. The post The Real SEO Skill No One Teaches: Problem Deduction appeared first on Search Engine Journal. View the full article
  22. We may earn a commission from links on this page. For me, third-party "Joy-Cons" were an absolute necessity for comfortable portable play on the original Switch. But because the Switch 2 upgraded its Joy-Cons to a more complicated magnetic connection, we haven't seen many third-party options come out for it yet. It hasn't been as much of an issue for me as on the original Switch, since I do think the Joy-Con 2 are more comfortable than the original Joy-Cons. But they're still not ideal, especially when compared to other portable gaming options like the Steam Deck. That's why controller company Abxylute's recent announcement is so interesting to me. The company announced on Tuesday that it will soon launch Kickstarters for two different portable controller options for the Nintendo Switch 2, both of which use a novel solution. Instead of sliding onto the console's sides like the Joy-Cons, these are instead shells you slide the console itself into. Just plug your console into the USB-C dock at the bottom of the shell, and you'll get a wired connection to the controller halves that sit on either side of the shell, while still being able to use the system's built-in screen. These Joy-Con 2 alternatives offer gamers a bit more grip, and, as a bonus, one of them also essentially turns your Switch 2 into a portable GameCube.

Credit: Abxylute

Part of the fun of third-party Joy-Cons for the original Switch is that they took advantage of the console's modularity to offer wacky form factors. Take the NYXI Wizard, for example. This controller looked like a classic Nintendo GameCube WaveBird controller out of the box, but you could also take both halves of the controller off a central connector piece and slap them onto the sides of the original Switch for authentic portable Smash Bros. or Super Mario Sunshine gameplay. I missed seeing this kind of innovation on the Switch 2, especially since that system now offers even more GameCube games via Switch Online.
Thankfully, though, Abxylute is picking up the slack. While its N6 portable controller is more traditional, its N9C is for retro gamers. It looks more like an original GameCube controller than a WaveBird, but it has the same familiar stick and button layout, and even has the right colors. There are modern niceties here, like home buttons and a more fully-featured D-Pad and right stick than on an actual GameCube controller. But this is shaping up to be the best way to play GameCube games on the Switch 2 on the go. The N6, meanwhile, has all the same features, but in a more common, Xbox 360-style layout, similar to the CRKD Nitro Deck 2. There are also some power user features in the fine print, like drift-proof sticks (capacitive on the N9C and Hall effect on the N6), a gyroscope, optional macro buttons, and on the N9C, clicky micro-switch buttons. You could use either controller for any type of game, if you'd like. But it's really the GameCube form factor that's convincing me here. Yes, this design will likely be more comfortable than the Joy-Con 2, but it's also more fun.

Credit: Abxylute

Unfortunately, because the Kickstarter campaigns for these have yet to launch, we don't yet know what they'll cost. Competitors like the Nitro Deck 2 cost $99, though, so you can probably expect to pay around that much. Abxylute's HandheldDIY J6 for the original Switch also costs $95 (when not on sale), which is another good barometer. That's on the pricey end: My favorite third-party Joy-Cons for the original Switch only cost $50. But for the extra features here, plus the pricing on the competition, it's probably a cost I'd be willing to eat—especially because alternatives are still rare. The only third-party Switch 2 Joy-Cons you can buy right now that offer the same magnetic plug-and-play as the official ones are the NYXI Hyperion 3, and they have yet to be released and cost $126.99 (and they'll jump to $170 once early bird deals end).
Abxylute's new reveals may not have that same satisfying snap-on tech, but they're feature-rich, and should still be a good upgrade over the official Joy-Con 2. View the full article
  23. Google may finally be starting to address a popular SEO and AI visibility “tactic”: self-promotional “best of” listicles. That’s according to new research by Lily Ray, vice president, SEO strategy and research at Amsive. Across several SaaS brands hit hard in January, a pattern emerged. Many relied heavily on review-style content that ranked their own product as the No. 1 “best” in its category, often updated with the current year to trigger recency signals.

What’s happening. After the December 2025 core update, Google search results showed increased volatility throughout January, according to Barry Schwartz. Google hasn’t announced or confirmed any updates this year, but the timing aligns with steep visibility losses at several well-known SaaS and B2B brands. According to Ray:

- In multiple cases, organic visibility dropped 30% to 50% within weeks.
- The losses were not domain-wide. They were concentrated in blog, guide, and tutorial subfolders.
- Those sections often contained dozens or hundreds of self-promotional listicles targeting “best” queries. In most cases, the publisher ranked itself first.
- Many of the articles were lightly refreshed with “2026” in the title, with little evidence of meaningful updates.

“Presumably, these drops in Google organic results will also impact visibility across other LLMs that leverage Google’s search results, which extends beyond Google’s ecosystem of AI search products like Gemini and AI Mode [and AI Overviews], but is also likely to include ChatGPT,” Ray wrote.

Why we care. Self-promotional listicles have been a shortcut for influencing rankings and AI-generated answers. If Google is now reevaluating how it treats this content, any strategies built around “best” queries are in danger of imploding.

The gray area. Ranking yourself as the “best” without independent testing, clear methodology, or third-party validation has been considered (by most) to be a sketchy SEO tactic.
It isn’t explicitly banned, but it definitely conflicts with Google’s guidance on reviews and trust. Google has repeatedly said that high-quality reviews should show first-hand experience, originality, and evidence of evaluation. Self-promotional listicles often fall short, especially when bias is not disclosed.

Yes, but. Self-promotional listicles likely weren’t the only factor impacting organic visibility. Many affected sites also showed signs of rapid content scaling, automation, aggressive year-based refreshes, and other tactics tied to algorithmic risk. That said, the consistency of self-ranking “best” content among the hardest-hit sites suggests this signal could now carry more weight, especially when used at scale.

What to watch. Whether self-promotional listicles earn citations and organic visibility. Google rarely applies changes evenly or instantly. If this volatility reflects updates to Google’s reviews system, the direction is clear. Content designed primarily to influence rankings, rather than to provide credible and independent evaluation, is becoming a liability. For brands chasing visibility in search and AI, the lesson is familiar: SEO shortcuts work until they don’t.

The analysis. Is Google Finally Cracking Down on Self-Promotional Listicles? View the full article
  24. Efficiencies chief had previously told FT no services were cut, in apparent contradiction of party’s claim of vast waste. View the full article
  25. Norwegian skier Nikolai Schirmer on Wednesday handed the International Olympic Committee a petition signed by more than 21,000 people and professional athletes who want to stop fossil fuel companies from sponsoring winter sports. Schirmer delivered the “Ski Fossil Free” petition to the IOC’s head of sustainability, Julie Duffus, at a hotel in the Italian city of Milan two days before the Milan Cortina Winter Olympics kick off. The petition asks the IOC and the International Ski and Snowboard Federation, FIS, to publish a report evaluating the appropriateness of fossil fuel marketing before next season. Schirmer, a filmmaker and two-time European Skier of the Year, spoke exclusively with The Associated Press outside the hotel, and said the IOC informed him that it would not allow media to witness their meeting. “It seems like the Olympics aren’t ready to be the positive force for change that they have the potential to be,” Schirmer told the AP afterward. “So I just hope this can be a little nudge in the right direction, but we will see.”

Nikolai Schirmer

Retreating winters spurred the skier to take action

Schirmer is a freeride skier who documents his adventures exploring Europe’s steep terrains. While freeride skiing is not currently an Olympic event, he said he felt like he needed to bring attention to fossil fuel marketing. “The show goes on while the things you depend on to do your job — winter — is disappearing in front of your very eyes,” he said. “Not dealing with the climate crisis and not having skiing be a force for change just felt insane. We’re on the front lines.” Burning fossil fuels – coal, oil and gas – is the largest contributor to global climate change by far. As the Earth warms at a record rate, winters are shorter and milder and there is less snow globally, creating clear challenges for winter sports that depend on cold, snowy conditions.
Researchers say the list of locales that could reliably host a Winter Games will shrink substantially in the coming years. Schirmer launched his petition drive in January. He surpassed his goal of 20,000 signatures in one month, and people continue to sign. It’s a first step, he argues, much like a campaign nearly 40 years ago that led to a ban of tobacco advertising at the Games. United Nations Secretary-General António Guterres has urged every country to ban advertising from fossil fuel companies. In his meeting on Wednesday, Schirmer said, the IOC’s head of sustainability pointed to the organization’s commitments to renewable energy. He said he feels that isn’t enough. The IOC told the AP in a statement that climate change is one of the most significant challenges facing sport and society. It didn’t say whether it will review fossil fuel marketing, as demanded by the petition. Olympic partners play an important role in supporting the Games, and they include those investing in clean energy, the statement said. FIS welcomes mobilization campaigns like this one, spokesperson Bruno Sassi said. He noted that no fossil fuel companies are partners of the FIS World Cup and FIS World Championships.

Athletes say the petition is the start of a conversation

Athlete-driven environmental group “Protect Our Winters” supported the petition drive. This is the first coordinated campaign about fossil fuel advertising centered around an Olympic Games, POW’s CEO Erin Sprague told the AP. American cross-country skier and Team USA member Gus Schumacher said he signed because it starts the conversation. “It’s short-sighted for teams and events to take money from these companies in exchange for helping them hold status as good, long-term energy producers,” he wrote in a text message. American cross-country skier Jack Berry said he’s hopeful this is an influential step toward a systemic shift away from the industry.
Berry is seeking a spot on Team USA for the Paralympics in March.

An Italian oil and gas company is sponsoring these Olympics

Italy’s Eni, one of the world’s seven supermajor oil companies, is a “premium partner” of these Winter Games. Other oil and gas companies sponsor Olympic teams. Eni said it’s strongly committed to the energy transition, as evidenced by how it’s growing its lower carbon businesses, reducing emissions and aiming for carbon neutrality by 2050. And the company defended its role in the Winter Games. “Through the partnership with the biggest event hosted by Italy in the next 20 years, Eni wants to confirm its commitment to the future of the country and to a progressively more sustainable energy system through a fair transition path,” spokesperson Roberto Albini wrote in an e-mail. A January report found that promoting polluting companies at the Olympics will grow their businesses and lead to more greenhouse gas emissions that warm the planet and melt snow cover and glacier ice. Albini disputed the emissions calculations for Eni in the Olympics Torched report. Published by the New Weather Institute in collaboration with Scientists for Global Responsibility and Champions for Earth, the report also looks at the Games’ own emissions. “They have lots of sponsors that aren’t in these sectors,” said Stuart Parkinson, executive director at Scientists for Global Responsibility. “You can get the sponsorship money you’re after by focusing on those areas, much lower carbon areas. That reduces the carbon footprint.”

McDermott reported from Cortina D’Ampezzo, Italy. AP Olympics: https://apnews.com/hub/milan-cortina-2026-winter-olympics The Associated Press’ climate and environmental coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
—Jennifer McDermott and Fernanda Figueroa Associated Press View the full article
  26. Shares of Chipotle Mexican Grill are down over 6% in premarket trading following a relatively humdrum fourth-quarter earnings report. The report, released on Tuesday, February 3, showed a 2.5% decrease in comparable restaurant sales from the third quarter and a 1.7% drop year-over-year. However, it appears Chipotle has a plan to fix all that: more limited-time offerings. Yes, the company’s secret weapon of choice is to bump up its number of fresh menu options. This shift will include four limited-time offers throughout the year, Chipotle CEO Scott Boatwright said in an earnings call. He described the move as an increase in Chipotle’s “menu innovation cadence.” The limited-time offers (or LTOs) will start next week with the return of Chicken al Pastor, which Boatwright called “the most celebrated limited-time offer in history, with two times the requests on social media to bring it back compared to any other LTO.” Boatwright adds that Chipotle’s data shows a “core guest is more likely to choose a restaurant that has a new menu item.”

Protein, rewards, and of course AI

Chipotle has also recently rolled out its “high-protein line,” with Boatwright nodding to the increased use of weight-loss drugs. It includes a $3.50 taco with 15 grams of protein as an addition to an 80-gram double-protein bowl. There’s also a $3.80 high-protein cup “that is inspired by hacks that our guests rely on to boost their intake and offers a solution to those looking for smaller portions, which is a fast-growing trend with the adoption of GLP-1s,” Boatwright stated. Furthermore, the fast-casual chain is relaunching its rewards program and using AI to create more “personalized and impactful” experiences. Even with these steps, Chipotle predicts its comparable restaurant sales for 2026 will be flat. The company did report some wins for the fourth quarter. It reached $2.98 billion in revenue, beating Wall Street’s expected $2.96 billion, according to consensus estimates cited by CNBC.
What happened to that Chipotle boycott?

The first quarter of 2026 has brought its own uncertainties to the fast-casual chain thanks to misinformation spreading online. Chipotle faced boycott calls in January after Bill Ackman, the billionaire CEO and founder of Pershing Square Capital Management, donated $10,000 to a GoFundMe campaign for Jonathan Ross, the ICE agent who shot and killed Renee Nicole Good as she turned her vehicle away from him. In 2016, Ackman bought a 9.9% stake in Chipotle, valued at about $1 billion, Newsweek reports. At the time, Pershing Square Capital was one of Chipotle’s top shareholders, but the company sold all of its shares as of November 2025. In response to the boycott, Chipotle took to social media to clarify that Ackman is no longer connected to the brand. Chipotle’s stock price (NYSE: CMG) was down more than 33% over 12 months when the market closed on Tuesday. View the full article
  27. Your beauty and skincare products are full of fats and oils. They’re what makes that cream so moisturizing or that emollient so good at repairing your skin barrier. Often, those lipids come from palm oil or even animal fats, both of which are environmentally damaging to produce. But soon, the lipids in your personal care products could come from upcycled carbon, skipping the agriculture industry entirely. Savor, a tech company that makes fats and oils directly out of carbon, has already proven this technology through the launch of its butter, which began commercial production in 2025. Now, Savor is announcing a personal care and beauty division, bringing its plant- and animal-free fats beyond food to what it calls a “new era” of clean beauty.

How Savor makes fats without plants or animals

Savor turns the typical production of fats on its head. The usual formula to create fat starts with energy (from the sun or even grow lights), which grows plants, which can then be turned into oils—or be fed to livestock, which produce milk that becomes butter or fat that goes into skincare, such as beef tallow. Those processes require lots of land and have intense climate consequences. Both livestock farming and palm oil, which is used in a majority of beauty and personal care products, drive deforestation, leading to biodiversity loss, greenhouse gas emissions, and more. Savor, however, skips all those agricultural steps. Instead, the company turns energy—like captured carbon dioxide, methane, or green hydrogen—directly into fats through a thermochemical process. That carbon is combined with hydrogen, oxygen, and heat to create fatty acids, which can then be composed and rearranged into chains that mimic different fats, from butter to palm oil and cocoa butter. “Technically we’re making beautiful ingredients from thin air,” says Jennifer Halliday, an advisor across the biotechnology, beauty, and life sciences industries who is working with Savor.
It’s a replica of ancient chemistry. Billions of years ago, hydrothermal vents at the bottom of the ocean created a chemical reaction that formed fatty acids out of hydrogen and carbon dioxide.

Opportunities in the beauty industry

Savor’s butter has already been adopted by chefs and restaurants, including Michelin-starred SingleThread, in Healdsburg, California, and Jane the Bakery, in San Francisco. It launched commercially in March 2025. Expanding from food to personal care makes sense for Savor, says Kathleen Alexander, cofounder and CEO of the startup, because the two industries overlap in terms of ingredients, environmental impact, and opportunity for change. “Two of the main pillars associated with our platform are sustainability and versatility, or tunability,” she says. “Those wind up being very important in food, and they’re very important in the beauty space as well.” By using its animal- and plant-free lipids, Savor says beauty companies could reduce their products’ emissions by more than 90%, compared to tropical oils like coconut or palm. “Palm and tropical oils wind up showing up a lot in the beauty sector, and those are products that we can really only grow in some of the most rich and biodiverse areas of the world,” Alexander adds. The agricultural industry at large takes up half of the world’s habitable land, and produces 25% to 30% of global greenhouse gas emissions. Savor skirts this entirely; the company says it requires 800 times less land to make its fats and oils than the agricultural industry. Currently, Savor has a 25,000-square-foot pilot facility outside of Chicago, with plans for a large-scale commercial plant by 2029. The startup, founded in 2022, has raised $33 million, according to PitchBook. Its Series A, funded in 2024, was led by food tech VC firm Synthesis Capital and Bill Gates’s Breakthrough Energy.

Vegan tallow and more

To launch its beauty and personal care division, Savor created three unique products.
First, a Vegan Tallow, a colorless and odorless alternative to beef tallow, which has become a recent skin care craze. “We first made that for food customers, and we absolutely still have food customers that are interested in that,” Alexander says. “But the market pull in food for vegan tallow, it turns out, is a little bit lower than the pull we’re seeing in beauty and cosmetics.” Savor also created what it calls Climate Conscious Triglycerides, a palm-free emollient; and Mimetic, made to mimic the skin barrier’s structure to nourish and repair it. Don’t expect to see Savor-branded beauty products on store shelves, though. The startup created these three products to show what is possible, but ultimately, it’s a B2B company that will give its ingredients to brand formulations. Savor says it’s actively engaged with beauty brands, ingredient distributors, and personal care formulators to bring these materials to market, but can’t yet share names. And there’s lots of room for interest to grow, it adds, as brands adapt to regulatory pressure around their supply chains. Traditional feedstocks from plants and animals are also subject to increasing volatility, because of climate change’s effects on crops, geopolitics, traceability concerns, and general price swings. “We’ve actually just had a change in the GHG Protocol Standard to require corporations to start including land use in their accounting, which is just huge,” Alexander says as an example. “That is one of the biggest advantages from an environmental perspective of our platform, that we require less land to make our fats and oils.” Humans have always had an inherently extractive relationship with the planet, she adds. It’s how our food chain works; it’s how we make all sorts of products. “What we’re doing at Savor is rethinking, what if humans could make molecules ourselves?” she says. 
“What would it mean to really exist on this planet in a way where we can actually not necessarily have to have to make use of other creatures in order to nourish ourselves.” View the full article



