Everything posted by ResidentialBusiness
-
Starmer calls for India to have permanent UN Security Council seat
UK prime minister discusses Ukraine and fossil fuels in meeting with Indian counterpart Narendra Modi. View the full article
-
Novo Nordisk to acquire Akero Therapeutics for its experimental liver disease drug
Novo Nordisk said on Thursday it would buy U.S.-based Akero Therapeutics for up to $5.2 billion to add its promising experimental liver disease drug, in the first major deal by the Danish drugmaker’s new CEO to boost growth. The deal underscores new Novo Nordisk CEO Mike Doustdar’s efforts to revive sales growth and fend off intense competition from U.S. rival Eli Lilly. Doustdar, who took over the reins in July, last month also announced the company would cut 9,000 jobs. Akero is testing its drug, efruxifermin, in a late-stage trial of patients with severe liver scarring, or cirrhosis, due to a type of fatty liver disease known as metabolic dysfunction-associated steatohepatitis (MASH). Efruxifermin could be a potential breakthrough in treatment of fatty liver disease and become a “cornerstone” treatment either on its own or in combination with Wegovy, Doustdar said in a statement on Thursday. Under the deal, Novo would pay Akero shareholders $54 per share upfront in cash, which represents a premium of about 16.2% to Akero’s last close of $46.49 on Wednesday. The Danish drugmaker will also pay an additional $6 per share to Akero’s shareholders if efruxifermin secures a full U.S. approval for the condition by June 30, 2031, the companies said. Shares of Akero jumped more than 19% in premarket trading, while Novo’s Denmark-listed shares were down nearly 2%. Some Novo investors have recently told Reuters that they want to see the company invest heavily in research and development to build out its future drug pipeline and revive investor confidence with a growth story. Some have also said they would prefer the company diversify beyond weight loss and diabetes. Doustdar has said the company will focus on developing the next generation of highly effective obesity and diabetes drugs that can also treat related cardiometabolic conditions such as MASH, rather than expand into other disease areas. —Anna Ringstrom, Mariam Sunny and Maggie Fick, Reuters View the full article
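The per-share deal terms quoted above can be sanity-checked with a few lines of arithmetic. This is a sketch using only the figures in the article; the total deal value also depends on the share count, which isn't given here.

```python
# Figures quoted in the article; per-share math only.
upfront = 54.00      # cash per share paid at close
last_close = 46.49   # Akero's closing price the day before the announcement
cvr = 6.00           # contingent payment if efruxifermin wins full U.S. approval by mid-2031

# Premium of the upfront offer over the prior close
premium = (upfront / last_close - 1) * 100
print(f"Upfront premium: {premium:.1f}%")             # ~16.2%, matching the article
print(f"Max per-share payout: ${upfront + cvr:.2f}")  # $60.00 if the milestone is hit
```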
-
IBM Unveils Granite 4.0: Efficient Language Models Set to Transform Workflows
IBM has just unveiled Granite 4.0, a breakthrough in enterprise-level language models designed to make advanced AI more accessible for small businesses. Boasting enhancements that increase efficiency while slashing memory requirements, this new generation of models promises affordable AI solutions for everyday tasks, from customer support to process automation. Granite 4.0 introduces a range of models tailored for varying operational needs: Granite-4.0-H-Small, Granite-4.0-H-Tiny, and Granite-4.0-H-Micro. Each is engineered to operate within diverse hardware constraints, which is particularly advantageous for small businesses often operating on tighter budgets. The Granite-4.0-H-Small model, for instance, supports robust enterprise workflows, ideal for tasks that require coordination across various tools, such as customer service automation. Meanwhile, the Tiny and Micro models cater to edge applications, allowing small enterprises to execute essential tasks swiftly and efficiently. “Granite 4.0’s remarkable increase in inference efficiency translates into cost savings and faster task execution, allowing small businesses to remain agile in a competitive marketplace,” said an IBM spokesperson, emphasizing how the new models can be integrated into existing operations with ease. Small business owners stand to gain substantially from these new models. First, their decreased memory requirements mean reduced hardware costs. Businesses can expect to save on both operational expenses and initial investments in technology. This is especially relevant for start-ups and smaller firms that need to stretch their budgets while incorporating advanced technologies. Benchmarks show a noteworthy improvement in performance; even the smallest models in the Granite 4.0 collection outperform their predecessors despite being less than half their size. 
This efficiency becomes even more pronounced when handling complex tasks that require extensive memory, such as analyzing large data sets or running multiple customer inquiries simultaneously. Enhanced speed in processing means quicker responses, translating to better customer experiences. The models are built with versatility in mind. They not only operate effectively as stand-alone solutions but can also function as integral components within larger systems. This flexibility means that businesses can gradually incorporate AI capabilities into their operations without overwhelming their existing infrastructure. That said, small business owners might face challenges during the integration and deployment of these new models. While the models are designed for low latency and edge applications, ensuring compatibility with existing software and systems may require additional effort. There’s also the steep learning curve that often accompanies new technologies, meaning businesses may need to invest time in training staff to utilize these solutions effectively. Moreover, although the models are cost-efficient, initial setup costs can still be a hurdle for some smaller businesses. They may need to conduct thorough assessments to determine the best fit for their unique workflows before making a decision on which model to deploy. Granite 4.0 represents a significant step forward in IBM’s commitment to providing affordable, high-performance AI solutions for enterprises of all sizes. The advancements not only lower barriers to entry but also empower small business owners with tools that enhance efficiency and customer engagement. As AI continues to permeate various industries, small businesses that adopt these models early on may find themselves ahead of the curve, benefiting from improved operational capabilities and overall productivity. 
For those interested in learning more about the Granite 4.0 models and exploring how they could fit within their business, visit the original IBM announcement here. In this rapidly evolving technological landscape, often those who adapt based on the available resources can outperform their competition. It appears that Granite 4.0 may just be the tool many small businesses need to thrive. Image via IBM. This article, "IBM Unveils Granite 4.0: Efficient Language Models Set to Transform Workflows," was first published on Small Business Trends. View the full article
-
How to Stop Umming and Erring When Presenting
You know that moment. You’re standing in front of an audience, your mind races ahead, and suddenly every second word seems to be “um” or “er”. It’s normal. But when it happens too often, it can distract your audience and quietly chip away at your confidence. In our recent skills booster, How to Stop Umming and Erring When Presenting, Presenting Expert and Productivity Ninja Dom Kracmar explored why it happens, what it reveals about how our minds work, and how to use silence as one of the most powerful tools in communication.

Why we do it

Those little “umms” and “errs” aren’t mistakes. They’re a sign that your brain is working hard to find the right words. The challenge isn’t getting rid of them, it’s learning to manage them. When we feel under pressure, silence feels risky. But in reality, silence builds authority. The pause tells your audience, I’m thinking. I’m in control.

How to pause with purpose

Rather than trying to sound flawless, focus on creating space. A clear pause between thoughts helps you slow down, breathe, and give your listeners time to process what you’ve said. Try this in your next meeting or presentation:

- Start speaking after a short pause instead of rushing in.
- When you finish a key point, pause again before moving on.
- If you lose your train of thought, take a breath rather than filling the gap.

It feels uncomfortable at first, but soon those pauses become your superpower.

The balance that sounds natural

A few fillers are fine; they make you sound human. The goal is balance: enough warmth to sound natural, but enough calm to sound confident. Dom summed it up perfectly during the session: “Clarity doesn’t come from speaking faster. It comes from giving your words room to land.”

Want to take it further?

If this idea resonated with you, you’ll love our full-day workshop Love Your Inner Presenter. It’s designed to help you find your authentic voice, manage nerves, and bring more calm and confidence to every presentation. 
The post How to Stop Umming and Erring When Presenting appeared first on Think Productive UK. View the full article
-
Royalties, Residuals, and Reality Checks | ARC-SLC
CPAs translate fame into cash flow, contracts, and accountability. Accounting ARC With Harshita Multani Center for Accounting Transformation Go PRO for members-only access to more Center for Accounting Transformation. View the full article
-
Daily Search Forum Recap: October 9, 2025
Here is a recap of what happened in the search forums today... View the full article
-
Forget the Great Decoupling – SEO’s Great Normalization has begun
For years, SEO success was measured by charts pointing up and to the right. We celebrated traffic spikes, shared screenshots, and treated rising sessions as proof of strategy. But those days are gone.

The Great Decoupling represents the decline of informational traffic

The SEO playbook that was glorified for the last 10 years prioritized generating useless hordes of informational traffic. Marketing teams and agencies alike bamboozled executives by pumping up traffic numbers to obscene levels. People on LinkedIn celebrated programmatic SEO case studies and “SEO heists” that racked up irrelevant clicks – encouraging copycat behavior that ultimately led to the further enshittification of our beloved SEO profession. Here’s the aftermath of the SEO heist: In 2024, Google escalated its war on generic, low-value content marketing by deploying AI Overviews and updating its algorithm to target scaled content abuse.

The Great Normalization is here

It turns out that impressions were inflated all along thanks to SEO tool scraping. (Source: iPullRank.) But we can’t blame it all on SEO tool scraping. The B2B SEO playbook became a cookie-cutter, formulaic bastardization of HubSpot’s inbound marketing model, which encouraged the prioritization of high-volume definitional search terms, often via glossary pages and FAQ directories. That’s why, if you examine any cybersecurity company, they all have the same SEO strategy. And let’s be real: most websites never deserved those clicks in the first place. They were either topically irrelevant or played a questionable role in influencing conversions.

Real-world examples of the top-of-funnel collapse

A B2B cybertech client saw a 70% drop in traffic to its glossary pages. A martech company experienced a similar 70% decline in its FAQ content. A business operations brand lost around 60% of visits to its definitions section. And a B2B data operations client reported a 55% decrease in glossary traffic. 
AI is the great equalizer

Top-of-funnel clicks are being eaten alive due to AI’s ability to easily summarize consensus information based on its own training data, driving even more zero-click searches. Because so many categories are topically mature, ChatGPT, Claude, Gemini, etc., have already scraped the answers to most generic queries, which are pre-loaded in their index of training data. For pre-loaded AI answers that don’t involve LLM grounding, there’s no chance of getting your brand recommended, and no chance of earning a citation. AI Overviews targeting informational queries are simply a continuation of what we previously saw with featured snippets and People Also Ask boxes, reducing traffic opportunities.

Why educational blogs no longer work in the AI era

Across dozens of GSC accounts, the trend is clear – educational blogging as a marketing strategy has stopped delivering results.

Top-of-funnel informational pages have become irrelevant. These clicks rarely influenced conversions and now account for most traffic losses. Companies that have pumped up vanity traffic are paying the price.

Middle-of-funnel content remains valuable. Pages comparing solutions or providing evaluation content still attract users who are actively considering purchases. Optimizing for AI-driven search and grounded prompts still pays off.

Bottom-of-funnel intent is critical. High-intent queries like “best insider threat platforms” require citations and grounded answers. AI cannot fully replace these pages, making them high-value for organic search and revenue attribution.

Branded content holds strong. Homepage traffic is rising in the AI era, Siege Media found. Companies that invest in reputation, commercial intent content marketing, and ecosystem management will maintain visibility, even as AI-generated answers dominate consensus topics. 
AI is changing metrics and measurement

Prompt monitoring in AI search is quickly becoming the new keyword rank tracking. Irrelevant prompts tracking generic AI answers without citations or brand mentions are useless. Only prompts requiring LLM grounding or real-time references are worth monitoring. This shift is the next evolution beyond the Great Decoupling: it’s the Great Normalization, where top-of-funnel traffic is rapidly declining and brands are forced to compete for a lower inventory of available clicks but more meaningful engagement from higher-intent searches. A recent study by Semrush found that “AI search visitors tend to convert better because LLMs can equip users with all the information they need to make a decision.”

Actionable ideas

Here’s how to adapt your SEO strategy for the Great Normalization – and focus on metrics that truly reflect impact.

- Avoid chasing top-of-funnel traffic: Focus on bottom-of-funnel, branded, and middle-of-funnel content that influences conversions.
- Invest in AI-optimized content: Target queries requiring LLM grounding, citations, and brand mentions. Use GSC and Semrush to track meaningful AI SEO metrics.
- Measure what matters: Track organic search, attribution, and performance of grounded prompts. Ignore vanity impressions and outdated rankings.
- Diversify channels: Don’t rely solely on organic search. Integrate LinkedIn, YouTube, webinars, social media advertising, and ecosystem visibility to hedge against zero-click searches and AI dominance.

Embrace the Great Normalization

The Great SEO Normalization is a much-needed strategic correction. Top-of-funnel clicks are disappearing, but branded traffic and middle- and bottom-of-funnel optimization matter more than ever. Companies that adapt their SEO strategy to this new reality will thrive. Focus on what influences conversions and embrace a new era where traditional search and AI search coexist. View the full article
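The "measure what matters" advice can be made concrete with a small script that buckets Search Console queries by funnel stage, so you can see where traffic losses concentrate. This is a minimal sketch: the keyword lists and the row/column names are illustrative assumptions, not part of any tool's API; rows are dicts as produced by, say, `csv.DictReader` over a GSC performance export.

```python
from collections import defaultdict

# Illustrative intent markers; tune these to your own vertical.
TOF = ("what is", "definition", "meaning", "glossary")   # informational
MOF = ("vs", "comparison", "alternatives", "review")     # evaluation
BOF = ("best", "pricing", "buy", "platform")             # high intent

def stage(query: str) -> str:
    """Classify a search query into a funnel stage (crude substring match)."""
    q = query.lower()
    if any(k in q for k in BOF):
        return "bottom"
    if any(k in q for k in MOF):
        return "middle"
    if any(k in q for k in TOF):
        return "top"
    return "branded/other"

def clicks_by_stage(rows):
    """Sum clicks per funnel stage; rows are dicts with 'query' and 'clicks' keys."""
    totals = defaultdict(int)
    for row in rows:
        totals[stage(row["query"])] += int(row["clicks"])
    return dict(totals)
```

Comparing this breakdown month over month shows whether your losses are concentrated in the top-of-funnel bucket, as the article argues, or eating into higher-intent pages.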
-
The Fed released the minutes from its September rate cut meeting. Here’s what they reveal
Most members of the Federal Reserve’s interest-rate setting committee supported further reductions to its key interest rate this year, according to minutes from last month’s meeting released Wednesday. A majority of Fed officials felt that the risk unemployment would rise had worsened since their previous meeting in July, while the risk of rising inflation “had either diminished or not increased,” the minutes said. As a result, the central bank decided at its Sept. 16-17 meeting to reduce its key rate by a quarter-point to about 4.1%, its first cut this year. Rate cuts by the Fed can gradually lower borrowing costs for things like mortgages, auto loans, and business loans, encouraging more spending and hiring. Still, the minutes underscored the deep division on the 19-person committee between those who feel that the Fed’s short-term rate is too high and weighing on the economy, and those who point to persistent inflation that remains above the central bank’s 2% target as evidence that the Fed needs to be cautious about reducing rates. Only one official formally dissented from the quarter-point cut: Stephen Miran, who was appointed by President Donald Trump and was approved by the Senate just hours before the meeting began. He supported a larger, half-point cut instead. But the minutes noted that “a few” policymakers said they could have supported keeping rates unchanged, or said that “there was merit” in such a step. The differences help explain Chair Jerome Powell’s statements during the news conference that followed the meeting: “There are no risk-free paths now. It’s not incredibly obvious what to do.” Miran said in remarks Tuesday that he thinks inflation will steadily decline back toward the Fed’s 2% target, despite Trump’s tariffs, and as a result he doesn’t think the Fed’s rate needs to be nearly as high as it is. 
Rental costs are steadily declining and will bring down inflation, he said, while tariff revenue will reduce the government’s budget deficit and reduce longer-term interest rates, which gives the Fed more room to cut. Yet many other Fed officials remain concerned about stubbornly high inflation, the minutes showed. Jeffrey Schmid, president of the Federal Reserve’s Kansas City branch, said in a speech Monday that “inflation is too high” and argued that the Fed should keep rates high enough to cool demand and prevent inflation from worsening. And Austan Goolsbee, president of the Fed’s Chicago branch, said in an interview Friday with The Associated Press that he supported a cautious approach toward more cuts, and wanted to see evidence that inflation would cool further. “I am a little uneasy with front loading rate cuts, presuming that those upticks in inflation will just go away,” he said. The minutes provide insight into how the Fed’s policymakers were thinking last month about inflation, interest rates, and hiring. Since then, however, the federal government shutdown has cut off the flow of economic data that the Fed relies on to inform its decisions. The September jobs report wasn’t issued as scheduled last Friday, and if the shutdown continues, it could also delay the release of the inflation report set for next Wednesday. —Christopher Rugaber, AP Economics Writer View the full article
-
2026: When AI Assistants Become The First Layer via @sejournal, @DuaneForrester
Google took 9 years to scale. ChatGPT does it in 4. The adoption slope is the wake-up call. The post 2026: When AI Assistants Become The First Layer appeared first on Search Engine Journal. View the full article
-
Domino’s tasty new tagline is a jingle and logo rolled into one
For its first brand refresh in 13 years, Domino’s updated its color palette, packaging, and font to look more engaging. When it came to making a new tagline, though, the Michigan-based pizza chain tried something unique: it just added more Ms to its wordmark. Mmm. Domino’s announced a rebrand Wednesday that includes brighter reds and blues and a new font called Domino’s Sans that was designed to “be thicker and doughier” and proves that using sans serifs doesn’t have to be bland. Team members will get new branded gear to wear in the kitchen and out delivering orders, and there’s a reimagined suite of new pizza boxes, including one black-and-metallic-gold box designed for premium menu items, like Domino’s Handmade Pans, to better upsell pricier pizzas. It’s a brand refresh optimized for craveability, like recent rebrands for Burger King and Papa Johns that used squishy type and color palettes chosen to convey freshness and ingredients. This is graphic design meant to look delicious and make you hungry. For its new tagline—which Domino’s is calling its “cravemark”—the company tapped Shaboozey, who draws out the “m” sound when saying “Domino’s” in campaign ads (like “dommmino’s,” get it?). On screen, the musician’s jingle is visually reinforced with an animation that adds the extra Ms to the Domino’s wordmark before the letters snap into the Domino’s domino logo. “Rather than launching a more traditional tagline, we’re baking craveability right into our name and every aspect of our brand as a reminder of this relentless focus,” Domino’s executive vice president Kate Trumbull said in a statement. “You literally can’t say ‘Domino’s’ without saying ‘mmm.'” The cravemark is a wordmark, tagline, and jingle all rolled up into one, and judging by who the company got to sing it, Domino’s has big ambitions for its new brand asset. Shaboozey’s “A Bar Song (Tipsy)” tied the record for the longest-running No. 
1 in Billboard Hot 100 chart history last November, and he was the only artist to be featured on Beyoncé’s Cowboy Carter twice. In a statement, he called pizza “that one food that brings everyone together” across generations and cultures. Domino’s is aiming for an asset with wide appeal and recognition. Domino’s is the leading U.S. pizza chain by revenue, bringing in $4.78 billion annually, ahead of Papa John’s, Yum! Brands-owned Pizza Hut, and Little Caesars. Same as its competitors, though, Domino’s has also seen its year-over-year same-store sales plummet since the pandemic as fewer people seek out delivery options. Like Peloton or Zoom, delivery pizza brands are pandemic darlings that are readjusting to new norms. Pizza has been hit especially hard by inflation, with median restaurant meal prices rising 12% for pizza, more than any other food category, according to data from Datassential. Other pizza chains have responded to shifting quick-service restaurant trends with redesigns of their own, like Papa Johns, which introduced a food-inspired brand refresh last year, and Pizza Hut, which is experimenting with its own new store concepts and premium menu items at home and a new, fun, retro logo abroad. Trumbull, the Domino’s executive, denies that their brand refresh is due to the company struggling, but it does have something to do with repositioning the brand. “Over the past decade, we became known as a technology company that happens to sell pizza,” she said. The pizza chain remembered for its pizza tracker and website would like to start being remembered for “making and delivering the most delicious products and experience,” as Trumbull put it. Though the brand refresh will show up in digital advertising and on Domino’s app and website, the impetus behind it is human. View the full article
-
University of Phoenix IPO: All eyes on the stock price as education partners go public during shutdown
With no end in sight to the political impasse in Washington that has shut down the government, the U.S. IPO market is expected to experience a significant slowdown just as it was beginning to show signs of life again. Some companies are nevertheless forging ahead with their listings. Phoenix Education Partners, parent company of the for-profit University of Phoenix, which announced its IPO plans one day before the shutdown began, said on Wednesday that it has priced its shares at $32. That’s the midpoint of its earlier targeted range of between $31 and $33 a share. The company intends to list on the New York Stock Exchange (NYSE) under the “PXED” ticker symbol. Selling shareholders will offer roughly 4.3 million shares of its common stock, with Morgan Stanley, Goldman Sachs, BMO Capital Markets, and Jefferies serving as lead book runners. At its offering price, Phoenix Education has a valuation of roughly $1.14 billion, Bloomberg reported.

“An attractive and growing sub-segment”

University of Phoenix is almost 50 years old and has been accredited since 1978, according to a company prospectus filed with the Securities and Exchange Commission (SEC). The school is geared toward online adult education, with most of its students already in the workforce and seeking to advance their careers in some way. “Adult learners represent an attractive and growing sub-segment of the higher education market,” Phoenix Education writes in its prospectus. “However, they face unique challenges that are not addressed by traditional programs designed for 18- to 22-year-olds, including the time constraints and responsibilities of work, community and caring for dependents.” The school says it had an average total enrollment of 82,700 degree-seeking students as of the first nine months of this fiscal year. Roughly 70% of its students are seeking bachelor’s degrees, with business and IT being its most popular areas of study, followed by healthcare and behavioral and social sciences. 
Is University of Phoenix profitable?

Unlike some of the high-profile tech startups that have gone public this year, University of Phoenix is already profitable. Last year, it generated net income of $115 million on revenue of $950 million, according to its SEC filings, up from net income of $52 million on revenue of $801 million in 2022.

What else is there to know?

Phoenix Education is backed by Apollo Global Management and investment firm Vistria Group, with the former being a majority shareholder. The company is expected to list its shares today (October 9, 2025) at some point after the opening bell. View the full article
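The filing figures quoted above can be restated as margins and growth with a quick sketch (using only the article's numbers; fiscal-year conventions and share counts are not given here, so this is per-figure arithmetic only):

```python
# Figures quoted from Phoenix Education's SEC filings, per the article.
rev_2022, ni_2022 = 801e6, 52e6      # 2022: revenue, net income
rev_last, ni_last = 950e6, 115e6     # last year: revenue, net income

margin_2022 = ni_2022 / rev_2022 * 100        # net margin, percent (~6.5%)
margin_last = ni_last / rev_last * 100        # ~12.1%
rev_growth = (rev_last / rev_2022 - 1) * 100  # ~18.6%

print(f"Net margin: {margin_2022:.1f}% -> {margin_last:.1f}%; revenue growth: {rev_growth:.1f}%")
```

In other words, the company roughly doubled its net margin on less than 20% revenue growth over the period.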
-
Inside Intel’s big bet to save US chipmaking — and itself
Claims of a breakthrough at the chips company’s new Arizona facility will be tested by sceptical Big Tech customers. View the full article
-
Use the ‘Von Restorff Effect’ When Studying Difficult Concepts
Memory is incredibly complex, which is probably why it’s been studied and researched so much—and why researchers who make big discoveries about how it functions get phenomena named after them. A great example of this: the Von Restorff effect, which refers to our brains’ acuity for remembering distinctive pieces of information over more uniform ones, a fact discovered by a German psychiatrist named Hedwig von Restorff. It’s an interesting fact of biology, but it can also be very useful to know when you’re studying a difficult concept. Here’s why, and how to put the Von Restorff effect to good use.

What is the Von Restorff effect?

In previous articles, I’ve discussed various memory effects you can capitalize on when you study, like the primacy effect, the Proust effect, and the production effect. All of these begin with a nice, round-looking “P" and the consonant blend “Pr-.” If I add the Von Restorff effect to that list, it will stand out solely because of the variety in its name—the rigid angle of that “V" and the hard sound of “Restorff.” Because of this, you’re more likely to remember the Von Restorff effect when asked to name the four memory phenomena we just went over—and that’s the Von Restorff effect in action. In the simplest terms, it is an “isolation effect” that ensures your brain will remember an item in a list if it has a distinctive feature that sets it apart from other entries on the list. When studying, you can achieve this effect by changing the appearance or meaningfulness of an item or a piece of information in some way, whether by color, size, font choice, or another approach.

How to use the Von Restorff effect to remember what you study

The good news is that, unlike when you try to harness the power of other memory-based study techniques, this one is pretty straightforward. 
It’s almost too simple, which is why you might not be familiar with it or may never have considered finding ways to use it. Once you realize the simplicity of this concept, it’s easy to find ways to work it into your study habits. In your notes, use a highlighter specifically to highlight words or concepts you need to remember for a test. Color-coding your notes is a supremely helpful studying technique on its own because even taking the time to choose the colors and their meanings helps you focus. When using flashcards, write concepts and words you’re having a hard time grasping in a different color. If you’re using the Leitner method (which you should be), you can rewrite or recreate cards every time you drill them. With that technique, you move your cards into piles every time you get them right or wrong, then study the piles at different frequencies, reviewing the ones you know well less often than the ones giving you trouble. Every few times you move a card up or down into a pile, try rewriting or recreating it with a different text color. If a card is red but your color for easy ones is blue, you’ll be instantly reminded that this is one you recently struggled with, which can help you lock in and pull the information from your memory using active recall. When studying with others and talking through the material (or reading aloud from your notes), try associating a unique motion—like a specific hand gesture—with the concepts you’re discussing. Use a shape, like a circle or box, to outline key parts of your notes or textbook materials. Do this especially when you’re mind-mapping, so all your mind maps are uniform. Dates can be in squares, vocab words in circles, and so on. To make quick and efficient mind maps, I recommend an app like Xmind. Also when mind-mapping, move the circle containing the most important concept (or the one you’re struggling to grasp) to a more removed part of your page, differentiating it from the rest of the clusters visually.
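The pile-shuffling workflow described above is essentially the Leitner box algorithm, which fits in a few lines of code. Here is a minimal sketch, assuming review intervals of 1, 2, and 4 days and hypothetical card names; it is an illustration of the mechanic, not a full study app:

```python
class LeitnerDeck:
    """Toy Leitner system: cards move between boxes based on recall success."""

    def __init__(self, cards, boxes=3):
        self.boxes = boxes
        # Every card starts in box 0, the most frequently reviewed pile.
        self.box_of = {card: 0 for card in cards}

    def review(self, card, correct):
        """Promote a card one box when answered correctly; demote to box 0 on a miss."""
        if correct:
            self.box_of[card] = min(self.box_of[card] + 1, self.boxes - 1)
        else:
            self.box_of[card] = 0

    def due_today(self, day):
        """Box 0 is reviewed daily, box 1 every 2nd day, box 2 every 4th day."""
        intervals = [1, 2, 4]
        return sorted(c for c, b in self.box_of.items() if day % intervals[b] == 0)


deck = LeitnerDeck(["mitosis", "osmosis"])
deck.review("mitosis", correct=True)   # known card climbs to a less frequent pile
print(deck.due_today(1))               # only the struggling card is due daily
```

The rewrite-in-a-new-color tip slots in naturally at each `review` call: a demotion back to box 0 is the moment to recreate the card in your "struggling" color.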
I do a version of this with my to-do list, which is written on a whiteboard on the fridge. Unusual things, like a doctor's visit or something I need to buy but don't usually get on my weekly trip to the grocery store, get written off to the side. They're more notable that way, drawing my attention away from the larger block of text and reminding me, over and over, that I have to do them. By the time doctor's appointment day rolls around or I'm in the store trying to remember my list, those are seared into my brain. If you're revising your notes, use different text styles. This is something I did all the time in college, even when taking notes in class. I would bold anything a professor led into by announcing, "This will be on the test." I would turn vocab words purple. Even if you're hand-writing (which you should be!), you can draw squiggly underlines under certain ideas, circle others ... you get the point. Get zany with it, but keep it uniform. If you draw a big box around one date, do it to all of them. If you draw arrows pointing to a state law and a circle around a federal one, do it to all of them. The goal is to differentiate the things you need or struggle to remember using color, motion, shape, or even orientation. Setting certain ideas or words apart will make them stick in your brain better, no matter how you do it. View the full article
-
AI search relies on brand-controlled sources, not Reddit: Report
A dominant share – 86% to be specific – of AI citations comes from sources brands already control, according to a new analysis of 6.8 million citations across ChatGPT, Gemini, and Perplexity conducted by brand visibility platform Yext. Why we care. The findings challenge the perception that forums like Reddit dominate AI answers. Yext’s analysis also suggests that brands can directly influence visibility in generative results by keeping website content accurate, structured, and crawlable. (Yes, SEO.) By the numbers. Across all industries, 86% of AI citations came from sources brands owned or managed. Websites lead: 44% of citations came from first-party sites. Listings close behind: 42%. Reviews and social: 8%. Forums: Just 2%. AI model bias: Gemini favors websites (52.1%), OpenAI leans on listings (48.7%), and Perplexity spreads across sources like MapQuest and TripAdvisor. Industry variation: Retail: 47.6% from brand websites. Finance: 48.2% from authoritative local pages. Healthcare: 52.6% from listings like WebMD and Vitals. Food service: 13.3% from reviews and social (the most of any sector). What they’re saying. The most impactful sources are the ones brands can already control or influence, according to Christian J. Ward, Chief Data Officer at Yext: “Discussions about measuring AI visibility are missing the most important factor. The consumer. AI generates answers based on a person’s real-world location and context, not a generic brand view. This has led to more confusion than clarity about what really powers AI. Our research clears this up. We start with the consumer and their specific query. That is what determines visibility.” About the analysis. Yext analyzed 6.8 million AI citations from 1.6 million queries across ChatGPT, Gemini, and Perplexity between July and August, using its Scout platform to test four intent types across four industries. The report. You can read it here. View the full article
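Mechanically, a share breakdown like the one above is a grouped count over citation records. The following toy sketch shows the aggregation; the record format, field names, and category labels are hypothetical and not Yext's actual schema or methodology:

```python
from collections import Counter


def citation_shares(citations):
    """Return each source type's share of total citations, as percentages."""
    counts = Counter(c["source_type"] for c in citations)
    total = sum(counts.values())
    return {src: round(100 * n / total, 1) for src, n in counts.items()}


# Hypothetical sample of 100 records mirroring the reported proportions.
sample = (
    [{"source_type": "website"}] * 44
    + [{"source_type": "listing"}] * 42
    + [{"source_type": "review_social"}] * 8
    + [{"source_type": "forum"}] * 2
    + [{"source_type": "other"}] * 4
)

shares = citation_shares(sample)
# Websites plus listings together form the "brand-controlled" share.
brand_controlled = shares["website"] + shares["listing"]
```

Note that the 86% figure is exactly websites (44%) plus listings (42%); reviews, social, and forums sit outside the brand-controlled bucket in this framing.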
-
Russia destroys 60% of Ukraine’s gas production ahead of winter
Moscow’s intensified bombing campaign seeks to ‘break our spirit’, says head of state energy groupView the full article
-
IRS announces most operations are closed and nearly half of its workforce will be furloughed
The IRS will furlough nearly half of its workforce on Wednesday as part of the ongoing government shutdown, according to an updated contingency plan posted to its website. Most IRS operations are closed, the agency said in a separate letter to its workers. The news comes after President Donald Trump and Congress failed to strike an agreement to fund federal operations, and the government shutdown has entered its second week with no discernible endgame in sight. The agency’s initial Lapsed Appropriations Contingency Plan, which provided for the first five business days of operations, stated that the department would remain open using Democrats’ Inflation Reduction Act funds. Now, only 39,870 employees, or 53.6% of the workforce, will remain working as the shutdown continues. It is unclear which workers will remain on the job. Doreen Greenwald, president of the National Treasury Employees Union, said in a statement that taxpayers should expect increased wait times, backlogs and delays implementing tax law changes as the shutdown continues. “Taxpayers around the country will now have a much harder time getting the assistance they need, just as they get ready to file their extension returns due next week,” she said. “Every day these employees are locked out of work is another day of frustration for taxpayers and a growing backlog of work that sits and waits for the shutdown to end.” She urged the Trump administration and Congress to “reach an agreement that reopens government and restores the services that Americans need and deserve.” The notice to workers states that furloughed workers and those who remain on the job will receive back pay once the shutdown ends. This is notable since the Republican administration on Tuesday warned that back pay is not guaranteed for federal workers affected by a government shutdown. 
Last week, Trump said roughly 750,000 federal workers nationwide were expected to be furloughed across agencies, with some potentially fired by his administration. Representatives from the IRS, the Treasury and the White House did not comment on the furlough plans. Earlier this year, the IRS embarked on mass layoffs, spearheaded by the Department of Government Efficiency, affecting tens of thousands of workers. At the end of 2024, the agency employed roughly 100,000 workers; that number currently hovers around 75,000. —Fatima Hussein, Associated Press View the full article
-
Pulte hints at how Fannie, Freddie may spur builder activity
President Trump wants the two government-sponsored loan buyers to help speed the creation of lower-cost homes, and their oversight agency's director is following up. View the full article
-
Preparing C-Level For The Agentic Web via @sejournal, @TaylorDanRW
AI-driven environments demand content that is human-readable yet machine-interpretable, redefining how brands approach design and SEO performance metrics. The post Preparing C-Level For The Agentic Web appeared first on Search Engine Journal. View the full article
-
Use 'Close Reading' to Retain Every Detail When You Study
When you’re assigned a text, you’re supposed to take away the big-picture ideas. The problem is that sometimes you can miss the forest for the trees, skimming passages just to get it all over with. To get to that broad understanding you need, you should use a method that is all about being detail-oriented. It’s called “close reading,” and it will help you grasp whatever you read so you understand not only your assigned passage or chapter, but the concept you're studying overall.

What is close reading?

Close reading is a method that requires you to focus on every detail in a text and operate on the assumption that no detail is there by accident or for no reason. That's not an assumption so much as a fact. Think about the process required for that content to be in front of you. A subject matter expert had to research it all, write it all down, and pass it through perhaps an agent and definitely an editor, if not multiple editors. If it's for school, that material had to be reviewed by some kind of board and approved for use. Everything that made it to the final version had multiple sets of eyes on it. All of it is important and deserves to be there; anything irrelevant would likely have been cut. In Teaching Literature, literary critic Elaine Showalter says that “close reading” is “slow reading, a deliberate attempt to detach ourselves from the magical power of storytelling and pay attention to language, imagery, allusion, intertextuality, syntax, and form.” The idea is to break the habit of reading casually or taking everything in at once and instead form a new habit wherein you examine every tiny element of what you’re reading and question why the author included it. 
The method is even included in the United States’ Common Core standards for K-12 education. There, it’s defined as “the methodical investigation of a complex text through answering text dependent questions geared to unpack the text’s meaning.” Essentially, you’re not just looking at what the text says, but how it says it—and that’s beneficial: Per Literacy in Focus, that means you’ll eventually be able to truly grasp what the text means. Educators have researched the use of the method and found it increases reading comprehension, for instance. Using close reading helps learners understand what a text says, but also why it’s important.

How to use the close reading method

Close reading involves reading everything twice (or more than twice). The first time is to get what it says, but the following times are to understand why it says it and why it says it that way. The goal is to be able to apply that deeper thinking and understanding to what you learned the first time through.

Read for understanding

On your first pass through the material, just get the gist. Figure out the main argument or story, who is speaking, and to whom. Don't worry about details. Just take it all in and make a mental note of what stands out, confuses you, or seems relevant.

Read for details

On your second time through the content, you'll actually do the "close reading" part. Highlight key terms, circle new words, and look all of those up. Pay special attention to formatting, like subtitles, section breaks, and visuals like charts or photos. Remember: None of it is in there by accident. Think, too, about how the information is ordered. Does the author start broad, describing an issue before zeroing in on examples? Or do they start with a specific example and then explain the broader context?

Ask questions

Read it again, but this time, ask questions. Write them down on a separate paper or even in the margins. 
It's likely you'll be reading something digital, so just open up a blank word processing doc if you have to. Some questions to ask, though these will vary depending on your content, grade level, author, and more: Why is this section titled the way it is? Why did the author use that [evocative/obscure/flippant/etc.] word? Why introduce this idea before that one? Why did the author start the chapter with a character-driven story before explaining the broader context? Who is the intended audience? Students? Educators? The general public? Plumbers? Politicians? Crucially, ask questions even—or especially—if you're unsure of the answer. Maybe you don't know why the author used a particular phrase or for whom the content was written. That doesn't matter. By asking, you become more curious and think more deeply. It keeps you engaged.

Read for meaning

Once you've read it enough times that you've identified the main idea and found some questions to ask, you've made it to your final read-through. Here, zoom out. Stop thinking about individual words, unusual punctuation, or why the author included a particular graph. Think instead about the bigger picture again, like you did on that first read-through. What's the author's purpose? How do the things you noticed the second time through, or the questions you asked the third time through, contribute to that message? Maybe a recurring reference in a short story hints at a theme. Maybe the structure of an essay is meant to mirror the thoughts of its subject. That's what this exercise is about. You're moving beyond what the text says and understanding how it delivers its message. Obviously, this is a useful practice when studying literature, but it works for a variety of subjects. If you're reading history, you might wonder why an author frames a particular battle or policy in a negative or positive way and wonder, too, how the people on the other side of that battle or policy felt—and why their view isn't represented. 
Even asking those questions gets you thinking deeper and relating more to the material, which makes it not only easier to understand, but easier to remember. View the full article
-
IT Leaders Urged to Prioritize Employee Experience for Competitive Edge
Small business owners looking for an edge in today’s competitive landscape might find that the key to unlocking productivity and innovation lies not just in customer-facing strategies, but in prioritizing employee experience. Recent findings from HP’s research highlight that while budgetary constraints are always a concern, the real hurdles to adopting Digital Employee Experience (DEX) systems stem from strategic and cultural barriers. The study—entitled “Fulfillment at Work: IT View”—surveyed 1,825 IT decision-makers across a range of countries, including the US, UK, and Japan. A striking 35% of respondents identified lack of prioritization as the main reason for not implementing DEX, and 32% viewed these systems as merely “nice to have.” Yet, for small businesses, leveraging tools that enhance employee satisfaction could be the foundation of business growth and a distinct competitive advantage. “Every organization talks about digital transformation, but most focus on customer-facing technology,” noted HP’s representatives. “Companies that will dominate moving forward will also prioritize employee technology, which will allow them to attract the best talent, increase productivity, and adapt faster to change.” This realization is crucial for small business owners who may still be caught in the cycle of viewing employee experience as secondary to customer needs. Investing in DEX solutions has tangible benefits. A work environment that eliminates barriers and enhances connectivity positions employees to deliver their best work, regardless of location. This is particularly vital in a time when flexible work arrangements are becoming the norm. Small businesses adopting these systems can expect not only to boost morale but also to see improved operational efficiency. Moreover, the positive impact on IT teams themselves is noteworthy. 
When technology simplifies processes rather than complicating them, IT leaders find fulfillment in solving meaningful problems rather than merely addressing crises. The study emphasizes that when IT focuses on enhancing employee experience, the entire organization benefits from increased visibility and a stronger sense of purpose. However, small business owners should also consider potential challenges when implementing DEX solutions. A lack of strategic prioritization can lead to pushback or indifference among stakeholders. Ensuring the buy-in of leadership is vital, as their support will facilitate smoother integration and acceptance of new technologies. Furthermore, small businesses must align DEX strategies with their overall mission and values, so that any digital tools adopted feel seamless and relevant to their workforce. Additionally, while investing in technology can lead to improvements, it is equally important to foster a company culture that values employee input. Ensuring that employees feel heard and included in decision-making processes can reinforce the effectiveness of DEX systems. Organizations that overlook this aspect may find themselves with high-tech solutions that don’t resonate with staff, negating the benefits of any investment. In a nutshell, embracing DEX is not merely about acquiring the latest technology; it’s about recognizing that employee satisfaction directly influences productivity and engagement. For small businesses aiming to thrive in a volatile market, prioritizing employee experience can unlock human potential and enhance performance. As HP’s research underscores, there’s a competitive advantage waiting to be tapped. Small business owners must evaluate where they stand on the DEX spectrum. Will they step up to prioritize employee satisfaction, or will they miss the opportunity to elevate their organizations into a new era of productivity? For more insights, you can read the full report on HP’s website here. 
Image via Envato This article, "IT Leaders Urged to Prioritize Employee Experience for Competitive Edge" was first published on Small Business Trends View the full article
-
Google and Figma partner in bid to remake real-time design with Gemini 2.5
Creativity has always been governed by time—not just how long it takes to bring an idea to life, but how long a creator can stay “in flow.” Every designer knows the frustration of an idea hanging in digital limbo. But those pauses, once accepted as inevitable, are now starting to vanish. Figma, the cloud-based interface design tool, and Google Cloud, the computing and storage platform, have announced the integration of Google’s Gemini 2.5 Flash directly into Figma’s design platform. The collaboration aims to let designers generate visuals and make edits almost instantly, eliminating the lag between an idea and its execution. For users, that means faster collaboration, smoother iteration, and a more natural creative flow. “The economic significance of latency in AI is far greater than just speed—it’s about changing the commercial viability and product experience for every application built on a generative model,” says Matt Renner, president of global revenue at Google Cloud. Lower latency, in turn, decreases computational and financial expenses, allowing the AI tools to become more scalable and efficient for high-volume tasks, he added. Gemini 2.5 Flash (also known as “Nano Banana”) rose to prominence for its ability to merge multiple images, keep characters consistent across edits, and generate a wide range of styles, from lifelike portraits to classic art, in mere seconds. Google claims that in early integrations of Gemini 2.5 Flash, Figma users saw a 50% reduction in latency for the platform’s “Make Image” feature, unlocking faster image generation capabilities for its users. “In Figma, every second AI can return to the user, whether it’s time saved renaming layers, editing images or even generating multiple images at the same time, frees them up to focus on the kind of higher-level problem solving and deep iteration that’s at the root of all great design,” says Abhishek Mathur, vice president of platform engineering at Figma. 
The partnership signals a deeper strategic shift for Google Cloud. Rather than competing for user attention, the company aims to embed its AI models, including Gemini 2.5 Flash, Gemini 2.0, and Imagen 4, directly into third-party creative ecosystems like Figma to make Gemini an unseen accelerant that enhances existing tools instead of forcing users to switch platforms. “Our focus is on helping users go from idea to production, and we see AI as core to how this workflow will evolve moving forward,” Mathur adds.

Google’s ecosystem strategy to scale Gemini

Over the past year, Google has woven Gemini into a broad range of partner products, from workspace tools to data analytics suites, positioning the speed and security of its AI models and ease of integration as its defining edge. Salesforce has integrated Gemini into its Agentforce platform to power AI agents across Google Cloud and Salesforce environments. Oracle now supports Gemini models on Oracle Cloud Infrastructure, enabling enterprises to build multimodal AI agents that can process text, images, motion, and audio data. Google’s underlying bet is simple: If AI feels secure and frictionless, widespread adoption will feel inevitable. Released in June, Gemini 2.5 Flash is known for generating high-quality visuals quickly and affordably (one-third the price of Gemini Pro). The model can deliver its first reply in under half a second, making it ideal for fast-moving creative apps, chatbots, or customer support systems. Moreover, the latest Gemini 2.5 Flash updates have improved its accuracy in following instructions, made responses more concise, and boosted speed by up to 40%, making it one of the fastest, most efficient AI models available. Experts caution, however, that faster performance alone may not be enough for Figma to win over creative professionals. 
“Drops in latency will certainly encourage tool usage and support the kind of cross-discipline collaboration Figma has been building toward for years,” says AJ Joplin, senior analyst at Forrester, who focuses on experience design, design organizations, and design leadership. “But taste still matters. The efficiencies gained from generative tools can quickly disappear if teams don’t pair that time saved with the ability to critically assess what the AI produces.”

The AI design race is heating up

The Google-Figma partnership comes amid an escalating AI-design race. Adobe has integrated AI models from Google Cloud, OpenAI, and others like fal.ai, Ideogram, Luma, Pika, and Runway to power its Firefly platform and Sensei AI features, including generative fill, AI video editing, 3D design, and smart stock tools. And Canva has become an AI-first platform. Announced at OpenAI’s Developer Day on Oct. 6, Canva is now a pilot partner for ChatGPT app integration, allowing users to create and edit designs directly within ChatGPT. The move aims to bring visual design tools to the chatbot’s 800 million weekly users. If Gemini 2.5 Flash delivers on its promise, the future of design won’t just be more intelligent; it will feel instantaneous. And in the new economy of creativity, that sense of speed may prove to be the ultimate edge. View the full article
-
30-Year SEO Expert: Why AI Search Isn’t Overhyped & What To Focus On Right Now via @sejournal, @theshelleywalsh
Thirty years of industry insight from Carolyn Shelby reveals how AI, LLMs, and Google’s strategy are redefining search models and long-term marketing outcomes. The post 30-Year SEO Expert: Why AI Search Isn’t Overhyped & What To Focus On Right Now appeared first on Search Engine Journal. View the full article