All Activity
-
Why 'Retro' Photography Is Back (and How to Get Started With It)
They say the best camera is the one you have with you. But if everyone has at least a pretty good camera in their phone, why would Gen Z (and really, everyone) be drawn to retro photography? Despite the downsides of bulky, standalone film cameras, the aesthetics and tangibility of old-school photography still have a lot to offer.

When we talk about “retro” photography, there’s a lot we could mean, but there’s a distinct revival trend around ’80s- and ’90s-style camera gear and aesthetics we want to focus on. Think Polaroid cameras and standalone point-and-shoots. And if you’re not already drawn in by the appeal of tangible photos and nostalgic vibes, allow me to make the case for why you should be.

What is the appeal of retro photography?

There’s a tendency to think of camera technology as steadily progressing in a linear fashion. But for creative purposes, it’s often more helpful to think of it in terms of aesthetic eras. Every type of camera has distinct physical qualities that contribute to the appearance of the images it creates. And those qualities, over time, become associated with the times and images they capture.

For one very obvious example, consider film grain. As this video essay from YouTube channel Nerdwriter explores, film grain was initially just an artifact of how film cameras work. As it became possible to eliminate film grain, however, our brains started associating this grain with older cinema, or a more generic “film look.” With more control over how (and whether) film grain appears in an image, it can be used to deliberately create a chaotic energy in otherwise still footage.

This same principle applies to the aesthetics of every era of camera technology. ’90s point-and-shoots, for example, were characterized by harsh, unflattering lighting. They typically had poor low-light performance, so a blinding flash was sometimes the only viable source of light.
Now, with better cameras and lighting equipment, that look can become a deliberate stylistic choice.

Film and instant cameras also provide a tangible experience that forces more deliberate choices. You might notice a lot of your old family photos have a kind of awkward, staged vibe, and there’s a reason for that. When you only had a dozen or so chances to take a picture, you had to be more careful to make sure everyone was posed and in frame, eyes were open, and so on. Now, it’s easy to take dozens of photos until you get the right one, but going back to limited-use cameras can force you back into thinking ahead about the image you want.

And when it comes to instant cameras, there’s nothing quite like the experience of having a physical memento immediately. Everyone has piles of photos in their camera roll that they’ll never look at again, but if someone hands you a Polaroid of you and your loved ones, there’s a solid chance it’s going on your fridge or in your scrapbook.

If nothing else, there’s also something to be said for photography without all the AI nonsense that’s so unavoidable now. We have guides on how to take photos on Android or iOS without all the post-processing junk, but that can only go so far. On some level, every modern smartphone is doing some kind of digital processing to create a look that’s appealing to the vast majority of users. That can result in a smoothed-over, generic look that might not actually be what you want.

These are the best retro photography options

So, okay, you’re sold on the idea of experimenting with nostalgic aesthetics. Where can you get started? The great news here is that you have well over a century of camera history to play around with. Used camera gear is going to be your friend, and you can often find great tools for relatively cheap, either online or at local photography shops. In general, there are a few interesting categories of dedicated cameras to check out:

Polaroid-style instant cameras.
Through a convoluted process of bankruptcy, acquisitions, and relaunches, Polaroid is back, but it’s not the only game in town anymore. Fujifilm, Kodak, and Lomography all offer their own brands of instant cameras that can snap photos and immediately print them out.

Classic digital point-and-shoots. Today’s point-and-shoots are geared a bit more toward professional photographers who want a high degree of control. But you can find a lot of cheap, used digital cameras from the last couple of decades that still take surprisingly good photos. In many cases, the digital noise or lens artifacts that would’ve been considered flaws when these cameras were new can offer creative opportunities to get a specific nostalgic look.

Ancient smartphones. The earliest smartphones from the late 2000s had some pretty atrocious cameras by modern standards. But they also lacked a lot of the AI and post-processing that’s come to dominate the landscape today. You can find cheap, used smartphones on sites like eBay for as little as $50, which can be a handy way to get some authentically aughts-era photos without having to fake it.

Some camera gear (particularly DSLR and mirrorless lenses and systems) can retain its value for long periods of time. But there’s a wide range of used or outdated camera gear circulating that provides distinct looks and feels. As you explore older cameras, pay attention to the unique aspects of the photos they create, and experiment with how you can use those traits to convey a different story.

How to get started with retro photography

Camera and smartphone manufacturers will never let you forget about their latest and greatest hardware, but where do you go to find the best gear from a decade or two ago?
The used market for photography equipment can be scattered and fractured, but here are some tips to get started with your hunt:

Check out your local photography or thrift stores. Few things can be more useful to a photographer than that one shop in town that always seems to have the used lens or proprietary power adapter you need for your camera. And sometimes, if they trade in used camera gear, you can find unique devices that you wouldn’t find anywhere else. If you establish an ongoing relationship with your local camera shop, you can even get the opportunity to try out gear you might otherwise have to buy to experiment with.

Search specialty camera gear sites. You can always find used cameras on generic auction sites like eBay, but for my money, I like checking out specialty sites like Adorama and Precision Camera. These sites offer a selection of used camera gear, and sometimes receive a better selection than you’ll find on eBay. Every once in a while, I’ll sort the used camera section by lowest price and scroll to see what kind of budget options are on offer.

Join a local photography group. Camera gear is expensive, but you don’t always need to invest a ton of money just to explore different aesthetics. In many cities, groups of photography enthusiasts get together for photo walks or just to meet up and trade tips. Making friends with other photographers is a great way to learn from others and even share experience with each other’s equipment.

Even if you don’t want to invest in camera gear specifically, it can be a helpful exercise to look back through photos from past eras and observe what they have in common. Pull out your old family photo albums and compare them to the photos on your phone. Grab a movie from your childhood and examine how it looks different from the polished reboot that just dropped a couple of years ago. In photography, and all art, the details matter.
A difference in color saturation, noise texture, or even how an image is framed can convey a world of meaning. As you explore the aesthetics of retro photography, your grasp of contemporary visual media can grow, making you better able to express yourself through visual art.
-
Oracle Launches AI-Powered Smart Assistant to Revolutionize Restaurant Support
Oracle has unveiled a significant technological advancement that promises to streamline operations for small restaurants and food services: the new Smart Assistant integrated into the Oracle Simphony Cloud Point of Sale (POS) system. This generative AI tool aims to empower restaurant teams to handle technical and operational challenges with unprecedented efficiency. For small business owners navigating the complexities of restaurant management, this innovation could be a game changer.

With the Smart Assistant, restaurant staff can ask straightforward questions like, “Why isn’t the workstation printer working?” or “Why can’t I log in to Simphony?” The AI delivers immediate, actionable insights tailored to each brand’s specific guidelines, significantly enhancing the operational workflow.

“The Oracle Simphony Cloud Smart Assistant is a game-changer for restaurant operators,” said Etienne Piat, vice president of service excellence and innovations at Oracle Consumer Industries. The AI tool not only reduces the workload for IT teams but also empowers staff to resolve common issues on the spot, ensuring that focus remains on providing exceptional guest service.

One of the most appealing aspects for small business owners is the Smart Assistant’s ability to minimize dependency on external support. By allowing in-house staff to troubleshoot technical issues, restaurants can save both time and money. The contextual nature of the Smart Assistant means that staff can quickly access personalized support just by clicking on on-screen error messages. This feature transforms potentially stressful situations into manageable tasks, ultimately enhancing customer satisfaction.

Early users of the Smart Assistant are already reporting improvements in their operations. The capability to integrate brand-specific standard operating procedures ensures that guidance reflects unique business practices.
This offers a consistent approach across various locations, allowing small restaurant chains or franchises to maintain compliance without added complexity. This cohesive support can significantly improve both efficiency and service quality, essential factors for businesses aiming to stand out in a competitive market.

Real-world applications of the Smart Assistant are vast. With immediate access to insights, staff can troubleshoot common POS issues such as login failures or device connectivity problems, thus reducing downtime. The built-in generative AI, trained on extensive Simphony documentation, enables meaningful answers to arise in real time. This minimizes the need for external support calls and boosts first-time resolution rates, two critical metrics for any business looking to optimize operations.

However, small business owners should also consider potential challenges. Implementing a new technology often requires training staff to effectively utilize new tools, which can initially divert attention from daily operations. Employees will need time to adapt to this system and provide feedback to refine the accuracy of AI responses. Moreover, while self-service technology can reduce support costs, it’s vital that businesses maintain a balance between automation and human assistance.

As this technology becomes more ubiquitous, small restaurants will need to weigh the benefits against the potential disruption that comes with integrating a new system. Nonetheless, the Oracle Simphony Smart Assistant’s advantages appear significant, particularly for businesses striving to maintain peak performance while prioritizing customer experience. The Smart Assistant will be broadly available to Simphony Cloud customers worldwide within the next 12 months, supporting over 100 languages.
As small businesses consider their operational strategies moving forward, it’s clear that advancements like these could redefine how restaurants handle in-house technical support and operational efficiency. For more details on how the Smart Assistant could benefit your restaurant, visit Oracle’s dedicated webpage.

This article, "Oracle Launches AI-Powered Smart Assistant to Revolutionize Restaurant Support," was first published on Small Business Trends.
-
How to Play Retro Games on Your Modern Phone or TV
We may earn a commission from links on this page.

One of my favorite ways to spend my free time is watching old movies. I love catching up on classics on my big OLED screen, and delving into the history of a medium I love. Unfortunately, it’s harder to do that with video games. While pulling up an old movie is usually as easy as finding it on streaming or renting it digitally, old video games are split across a number of different consoles, and you can’t always count on rereleases to make them accessible on modern systems.

Luckily, there are still options for those who go looking for them. You can hunt down a vintage system and hook it up to your modern screen using an adapter, yes, but you can also use the power of modern devices to “emulate” these games in virtual environments, often with improvements. And if you do it right, it's all perfectly legal.

What is video game emulation?

Emulation is a massive rabbit hole, and it can get about as deep as you want it to. I’ve been using it for decades, and I’m still learning new things. But there are some basics you should know that will help you get started, including how it works, its legal status, the drawbacks of not playing on real hardware, and the benefits it offers beyond simple convenience.

[Image credit: Michelle Ehrhardt]

Essentially, emulation uses the power of modern machines to brute-force virtual environments that are close enough to real hardware that files designed for it think they’re running on the real deal, and will boot up like they are. Usually, this means emulators won’t come out until one or two generations after a console's official release, but there are now emulators for everything from the Nintendo Entertainment System to the Nintendo Switch (which runs on older hardware than you might think). As you might expect, Nintendo’s not too happy about that, but the kicker is that there’s not a lot the company can do about it (aside from trying its best to scare emulator developers).
A court case from way back in the day (Sony v. Connectix, 2000) ruled that, so long as emulators don’t distribute copyrighted software, they’re allowed to write their own code that mimics official hardware all they want. That means you’ll need to provide your own games for your emulators, and in some cases BIOS (or operating system) files.

To stay on the right side of the law, most emulator guides won’t tell you how to go about that, but there’s at least one method that’s totally fair game. It turns out that U.S. law allows you to make your own backup copies of games you own, so long as you don’t distribute them. With that, there are plenty of legal devices that will help you rip your game files from your own cartridges and discs, some of which I cover here. Some emulators are even so advanced that they’ll play your real discs if you simply put them in your PC’s disc drive.

Still, even if everything’s above board, there are a few drawbacks to emulating rather than playing on real hardware. The biggest issue you’ll notice is with accuracy, as some games might have graphics or audio bugs. Input lag is also a common complaint, as emulators often need extra time to register your button presses, since they need to both read them and feed them through your software. Finally, some games might not run on emulators at all, especially ones with unusual requirements. The original Xbox, for instance, is notoriously difficult to emulate.

On the flip side, though, there are benefits to emulating that real hardware can’t replicate, and they mostly come from the extra power of your modern device. Emulated games can often run smoother than on real hardware, hitting higher frame rates. You’re also usually able to render your games at higher resolutions than originally intended, basically playing them in HD. And most importantly for difficult games or flexible play sessions, you can use save states, which allow you to quickly save your current place in a game to a file, and reload it on demand.
This, in turn, allows you to save your game whenever, outside of whatever save system is built into it. It’s perfect if you only have a few minutes to play, or if you’re about to fight a difficult boss in a punishing retro game and don’t want to replay the whole level if you mess up (no judgment here). Because save states essentially take your emulator back in time, they can introduce instability, so it’s advised to use them in addition to more traditional saves, rather than as a full-on replacement for them. Emulators for more modern, difficult-to-run HD systems also may not support save states.

Still, those are enough improvements that I often prefer playing retro games through emulation, even if I have real hardware available to me. And while some of those enhancements are available in official emulation (Nintendo Switch Online has save states, for instance), not all of them are. I haven’t even gotten into widescreen hacks, which let you play old 3D games in a more modern aspect ratio without stretching the video, or HD texture packs. Benefits like these are why, if you’re willing to put in a little elbow grease, unofficial emulators are well worth trying out.

What you need to start emulating

The fans who develop emulators are crafty, and they’ve had plenty of time to refine their work, so most modern devices are able to emulate retro games to some degree. It’s become a running joke that Doom will play on just about anything, including a pregnancy test. But from a realistic point of view, there are a few things you’ll probably want on hand before you get started.

[Image credit: Michelle Ehrhardt]

If you’re playing on a laptop or a desktop computer hooked up to a monitor, you’ll probably want a controller. Most emulators will support mouse and keyboard controls if you truly can’t get one, but using a controller will help a lot with the old-school console experience. Aside from that, though, you might be all set.
I have a desktop PC that’s pushing 10 years old at this point, and it’s still able to emulate games through the PS2 and GameCube era at full speed, while upscaling them. Beyond that is when emulation tends to get a bit more demanding, but for retro games, you probably won’t need to upgrade your machine.

If you want to play on a TV, though, you could have a bit more of a shopping list in store. In addition to a controller, you’ll also need some type of computer to emulate your games with, and while you can drag a laptop or desktop PC into your living room, it’s often not the most convenient solution. Instead, I suggest getting either a docked Steam Deck or a Raspberry Pi.

The former’s a bit pricier, and has had stock issues during the RAM crisis, but it’s also compact, plenty powerful when it comes to emulation, easy to output to a TV via a dock, and can play PC games natively, too. With the right Steam Deck emulation setup, you can essentially turn it into your own homebrew Nintendo Switch, but for all your consoles. The latter, meanwhile, is far cheaper (although its price has also been inflated by the RAM crisis) and smaller, but will take a bit of know-how to set up and can struggle when emulating systems released after the PS1. Your best bet if you choose to go this route is probably to buy a Raspberry Pi kit, as these will come with a case, cables, storage, and often a fan to get you started. You can also sometimes find these kits cheaper than a Raspberry Pi motherboard on its own.

But again, the world is your oyster when it comes to which devices you want to emulate with. It’s possible to emulate on a streaming stick or box, too. Generally, if a device has a motherboard and can display a video signal, people will usually find a way to game on it. To wit, you should look into emulating on mobile devices, too.
These days, it’s possible to both emulate on an iPhone and on Android, and there is a whole slew of handheld gaming consoles that essentially build controllers into phone hardware running Android to give you an experience similar to a DS or PSP. These can be a great way to play portably, whether using touch controls, a Bluetooth controller, or built-in controls. And if you get a USB-C dock, you can then connect these devices to the big screen to play on them when you get home. You can even get a cheap handheld that runs Linux, for a similar experience to a Raspberry Pi while on the go. Read on here for more details about portable emulation.

Which emulators to get, and how to set them up

Now, it’s time to actually install your emulators, of which you have many choices. I've compiled a list of the apps you’ll probably be using to emulate your games, depending on the platforms you're interested in, before going into how to get them:

Retroarch: An app with multiple emulator “cores” in it, which can run games from most systems up through the PS1 era, including the Super Nintendo and Sega Genesis.

Duckstation: A standalone app for emulating PS1, with enhanced stability and graphics features compared to Retroarch.

Mupen64Plus: A standalone app for emulating Nintendo 64, with enhanced stability and graphics features compared to Retroarch.

Flycast: A standalone Sega Dreamcast emulator with support for upscaled graphics and widescreen hacks.

MelonDS: A standalone Nintendo DS emulator with community-driven forks that can run on two separate displays for a more authentic experience.

Azahar: A standalone Nintendo 3DS emulator with community-driven forks that can run on two separate displays for a more authentic experience. Supports custom graphics drivers on mobile.

PPSSPP: A standalone PSP emulator with a highly themed user interface reminiscent of the original console.
Dolphin: A standalone GameCube and Wii emulator with high stability, support for custom mobile graphics drivers and upscaled graphics, and the ability to use motion controls. Usually preferable to emulating PS2 or Xbox, if playing a multi-platform game.

PCSX2: A standalone PS2 emulator with support for upscaled graphics. Best used for PS2 exclusives. Not available on mobile.

NetherSX2: A standalone PS2 emulator for mobile. Many of the same features as Dolphin, but lower stability, and no motion control or custom driver support.

Cemu: A standalone Wii U emulator with support for custom mobile graphics drivers and upscaled graphics. No save state support. Requires a high-end machine.

RPCS3: A standalone PS3 emulator with support for upscaled graphics, custom mobile graphics drivers, and save states. Requires a high-end machine.

Eden: A standalone Nintendo Switch emulator with support for upscaled graphics and custom graphics drivers on mobile. No save state support. Requires a high-end machine.

Phew, that’s a lot. On the plus side, most of these emulators are available for Windows, Mac, Linux, and Android, although iOS users have a bit less to pick from, as Apple's restrictions on certain programming techniques mean higher-end systems like the GameCube and beyond are difficult to run on its phones. iOS does have access to some potentially more convenient options for older systems, though, like Delta, which comes with cute touchscreen control overlays built in.

Now, you could install these apps one by one, point them at your game files (which you’ll usually be guided through as part of setup), and play your games by booting up the specific emulator you want and picking the game you want to play from a list. But not only is that slow and inconvenient, it’s not as pretty and is less like using an actual retro console. To solve that problem, we have installers and frontends.
Emulator installers

In this case, installers are programs that will help you set up all your emulators in one fell swoop, sort your games into collections by system or genre for you, and boot you into the appropriate emulator when you select a game.

For installers, you have a few options. My favorite is Emudeck, which, despite being named after the Steam Deck, will run you through a simple setup wizard that will install any emulator you could possibly want, whether you’re on SteamOS, Linux, or Windows. There’s also an Android version in the works, and you can get early access to it if you subscribe to the development team’s Patreon.

Alternatively, there’s Retrodeck. This is a Linux-only tool, but some users prefer it to Emudeck thanks to more fluid hotkey settings and a less bug-prone (but potentially slower) update process. Nicedeck is another alternative that aims to hit a middle ground between Emudeck and Retrodeck, and conveniently is the only one of these options that also works on Mac.

As someone who just manually installed a bunch of Android emulators one by one, I would highly recommend using an installer to automate the process instead, especially because many Android emulators need to be sideloaded, something that is about to get harder starting next year. An installer will also usually help you set up configurations like the desired aspect ratio and upscaling settings for each system you want to play, which will save you some tedious trips to each individual emulator’s settings menu.

But just because your emulators are installed doesn’t mean we’re done yet. Instead of having to bounce from emulator app to emulator app and scroll through what can often be ugly built-in menus, let’s put all your games in one convenient, easy-on-the-eyes place.

Emulator frontends

A frontend is an app that will sort your games by system, or by custom collections you set up, like genre.
You’ll choose a game from one of its many lists, and the frontend will tell the appropriate emulator app to boot up the game. Then, when you’re done gaming, your emulator will take you back to your frontend. It’s a much more intuitive and console-like experience, and people have created plenty of themes to make frontends look just as nice as official console menus. Many frontends even come with “scrapers” built in, so they can fetch and display box art next to your games.

[Image credit: Michelle Ehrhardt]

The most common and robust choice here is ES-DE, or Emulation Station Desktop Edition. It’s what I use personally, and it comes packaged with installers like Emudeck and Retrodeck. It has the most configuration options available, but can be a bit slow to boot. Also, while it’s free on Windows, Mac, and Linux, a small one-time Patreon donation is required to get the app for Android.

ES-DE alternatives on desktop are rare, but options like LaunchBox may be preferable for some users. Other frontend apps are more common on Android, as ES-DE took some time to come to the platform, and some users prefer a more playful interface on mobile. Popular free options include Daijisho and Beacon, although I’m particularly interested in Cocoon, which is modeled after the Nintendo 3DS menu and has built-in dual-screen support.

Another option, if all of this sounds like too much setup, is to use Batocera. This is a Linux install that packages largely pre-configured emulators for a wide variety of systems alongside a customized version of Emulation Station. Basically, you install it on your compatible device and boot into it separately from your main operating system, so everything lives in its own confined home. While that means it’s a bit limited, it’s also mostly plug-and-play. It’s also possible to run Batocera off a USB stick or SD card, if you don’t want to install it onto your device’s internal storage.
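That USB route boils down to writing a disk image and verifying it, which can be sketched in a few commands. This is a hedged illustration rather than Batocera's official instructions: the image filename is hypothetical, and the write target here is a plain file so the sketch is safe to run. On real hardware, you'd point the second `dd` at your USB stick's device node instead (double-check it with `lsblk` first, because `dd` will happily overwrite whatever you aim it at).

```shell
# Stand-in for a real image downloaded from batocera.org (filename is hypothetical).
IMG=batocera-demo.img
# Demo target: a plain file. For a real flash, this would be the USB stick's
# device node, e.g. /dev/sdX on Linux -- verify with `lsblk` before writing!
TARGET=fake-usb.bin

# Create a small dummy image so this sketch runs end to end without downloads.
dd if=/dev/urandom of="$IMG" bs=1M count=4 status=none

# Write the image to the target, then flush buffers to "disk".
dd if="$IMG" of="$TARGET" bs=4M conv=fsync status=none
sync

# Verify the copy matches the source byte for byte before you try booting it.
cmp -s "$IMG" "$TARGET" && echo "flash verified"
```

GUI flashers like balenaEtcher do the same write-and-verify flow with fewer sharp edges, which is the easier option if the command line isn't your thing.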
How to choose and install emulators and frontends for different systems and devices could be a whole series of articles on its own, but the community is welcoming, and is doing its best to make emulation easy and available to as many people as possible. The above programs should be enough to get you started, but if you have additional questions, experts like Retro Game Corps and subreddits like r/emulation are always there to help you out.

How to make your games look old school (or HD)

[Image credit: Michelle Ehrhardt]

When emulating a game, you have three options: You can go with the raw emulation output, which will by default usually mimic a console's native resolution but might not look fully accurate depending on the screen you're playing on; you can upscale the resolution for a more HD image, and even apply fanmade texture packs to make individual games look crisper; or you can turn on a CRT filter to try to get a more retro feel, as well as help pixel art or low-polygon models look a bit smoother.

Frankly, this is another area where it's possible to go on for days. You can mix and match different options from these approaches to your heart's content, and Retroarch alone has hundreds of filter and shader options built in (options do differ from emulator to emulator).

Improving the look of 3D emulated games

For 3D games, the idea is to try to get a more modern experience. Widescreen hacks are a good place to start. These extend the aspect ratio to 16:9, then apply tweaks to the emulation so that the screen renders more of the play environment instead of simply stretching the default 4:3 image. It doesn't work for every game, and can break the design in others (Resident Evil has very purposeful camera angles), but it's often worth trying, especially in games where situational awareness is helpful, like platformers.
HD texture packs, meanwhile, help clear up low-resolution HUDs or 2D assets (which are still quite prevalent in 3D retro games) that won't be covered by upscaling. These need to be developed on a per-game basis, so you'll need to search for them, but a popular example is Henriko Magnifico's 4K Zelda texture packs. Personally, I do think these can sometimes interfere with a developer's intended art style too much, but some people swear by them.

Improving the look of 2D emulated games

For 2D games, I like to try to make my game look like it's playing on an old-school TV, and that's not just for flavor. Pixel art was designed with CRT televisions in mind, which would smooth and blur harsh edges together to make pixels look more hand-drawn (here's a good example). You lose that effect if you just use raw emulation footage on a modern television, but you can mostly get it back with the right filters.

This is far from a solved issue, but so far, my favorite option is the zfast-crt.slangp shader in Retroarch (found in the Quick Menu under Shaders). This is a subtle effect that feels far more accurate to me than the CRT filters often included in official retro game collections, and it'll work on any device that runs Retroarch. What's great is that, because this shader is included with Retroarch, it'll also work for any system that Retroarch supports, which includes most retro consoles you would play 2D games on.

But CRTs provided an additional benefit beyond making pixel art look nice. Because of the way they scan in their images, they're highly resistant to motion blur. If you have a device with a 120Hz screen, you can mimic this using a technique called black frame insertion. This technique inserts a single black frame into every other frame of your video output, breaking up the image and helping your eyes reset. While this will slow down your gameplay on a standard 60Hz screen, a 120Hz screen will let you use black frame insertion while still getting 60 fps gameplay.
This is built into a toggle in Retroarch's default Settings > Video > Synchronization page, but to be honest, I find this implementation comes with some pretty intense flickering. Instead, I prefer the crt-beam-simulator.slangp shader developed by the folks over at Blur Busters, which has a more subtle effect that looks more like the old-school TVs I remember from back in the day. Getting this running in Retroarch takes a few extra steps, but luckily, Retro Game Corps has a great video walking through it, including how to tweak it to your liking and combine it with zfast-crt.slangp if you'd like.

With tools like these, it's clear that the appetite for playing games from older consoles isn't going anywhere anytime soon, even if it's harder than pulling up an old movie on Netflix. Whether you're on PC, Mac, a Steam Deck, or mobile, you've got plenty of options already, even as hardware costs rise. From where I'm sitting, the frontier for retro gaming looks bright.
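One practical footnote on the shader workflow above: when you save a shader choice from Retroarch's Quick Menu (Shaders > Save, as a core or per-game preset), it writes a small text file you can back up or copy between devices. As a rough sketch (the exact directory layout varies by platform and install, so treat the path below as an assumption about where your bundled slang shaders live), a minimal preset file can simply point at the built-in shader using Retroarch's `#reference` directive:

```
#reference "shaders/shaders_slang/crt/zfast-crt.slangp"
```

Saving through the menu generates a file like this for you, so hand-editing is only worth it if you want to sync a favorite CRT look across several machines.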
-
Where paid media optimization should stop in long sales cycles
In long sales cycles, a lot of what happens after lead submission involves people. When you optimize campaigns to final sales, you’re teaching the ad platform to respond to how well the sales team performed that month rather than lead quality, and that’s a problem no amount of campaign changes will fix. The common advice is to “optimize the full funnel” (i.e., track media spend to revenue, optimize campaigns to sales, etc.). But beyond lead capture, most of what drives sales has little to do with your paid media. It’s about who’s on the sales team, how busy they are, and dozens of other factors you can’t influence through targeting or creative.

When your sales team becomes the signal

I’ve spent over 15 years in financial services marketing, but this isn’t unique to mortgages or insurance. If your sales process relies heavily on people, you’ll recognize this immediately. In most businesses, there’s someone like Dave. In my case, he’s a mortgage adviser, but in yours, he might be your top enterprise sales rep, your star business development manager, or your best project estimator. He closes deals at twice the rate of his colleagues, not because he gets better leads, but because he’s naturally gifted at building rapport, asking the right questions, and guiding anxious customers through difficult decisions. However, Dave isn’t always there. Sometimes he’s on vacation, sometimes he might leave the company for a better opportunity, or sometimes your business hires three more Daves. The makeup of your sales team likely changes constantly. You might have more experienced closers one month, fewer the next, a recruitment drive that brought in several new starters, or Dave and two of his colleagues leaving within a month of each other. Sales rates can swing dramatically based purely on who’s in the office, regardless of lead quality. This can lead to targeting problems.
For example, when the conversion rate drops because Dave’s away and a junior team member is covering his accounts, the algorithm sees it as a targeting problem rather than a staffing issue. If you’ve set your campaigns to optimize for sales, it thinks, “Our targeting stopped working. These clicks are lower-quality for this conversion action now. We should shift spend away from these audiences.” This could result in keywords that were previously working well being turned off, audiences that were driving sales volume no longer being bid on, and, eventually, a decline in the entire account’s performance. But the leads haven’t changed, only the team has.

Dig deeper: How to diagnose and fix the biggest blocker to PPC growth

Operational factors that distort your conversion data

It’s not just the sales team makeup either. Let’s say the team gets slammed in Q4 as everyone tries to close before year-end, response times stretch from two days to over a week, and customers get impatient and look elsewhere. Perhaps market conditions shift, and your most competitive product gets pulled. Or summer vacations mean the team is running short-handed, and some leads go cold before anyone contacts them. Then September comes and everything bounces back to normal. It goes beyond the day-to-day. Budget approvals get delayed, product ranges change, and planning delays push projects back. The specific reason varies by business, but the effect on your conversion data is always the same. The algorithm ends up thinking targeting got worse when, in fact, the team was just busy with leads from other sources.

When Dave becomes a superhuman: The Santa Claus Rally

The Santa Claus Rally, also known as the December Effect, is the best example I’ve seen of how human behavior can throw off algorithmic targeting.
Every December in financial services, something strange happens. In the third week of December, conversion rates from lead to sale spike dramatically. We’ve seen increases of up to 150% compared to normal weeks. If campaigns are optimized for sales, the algorithm thinks, “Whatever we’re doing this week is working incredibly well!” Then the holiday week arrives, and everything crashes, with conversion rates plummeting to a fraction of normal levels. None of it has anything to do with paid media. In week three, Dave and his colleagues are in target-hitting panic mode. End-of-year bonuses are on the line, and there’s one final push before the holiday break, so they’re calling leads faster, following up more aggressively, and closing deals they might typically have let simmer. Dave is working like a machine. Then the holiday week arrives, and everyone’s mentally checked out, customers aren’t answering phones, and Dave has finally taken time off. The team that’s still at work is thinking more about family get-togethers and less about targets. The lead quality, targeting, and ads haven’t changed. The team is just working at different levels of intensity due to seasonality. The algorithm overpays for normal performance and underbids for identical audiences, purely based on when Dave and his team take their vacations.

Dig deeper: How to analyze your marketing funnel and fix costly drop-offs

Where optimization should actually stop

So if optimizing for sales is being distorted by things outside your control, how should you draw the line? How can you balance this lead distortion and still drive the right type of leads? The answer is your last point of control, which, for these kinds of sales, means at lead submission. But that doesn’t mean simply counting leads. Instead, value them based on both likelihood to convert and the commercial value of the end sale.
The other issue is that most high-value businesses only generate a handful of sales per month, which isn’t enough data for automated bidding to learn anything useful. Lead valuation also solves this issue by providing the platform with hundreds of conversion events rather than a few sales. This means automated bidding can actually function properly, campaign and audience testing can become meaningful, and the data stays reliable. You’re optimizing to lead quality before Dave and the sales team get involved. To be clear, importing downstream conversion stages or revenue into ad platforms can be extremely powerful. But optimization to those signals only works when volume is sufficient, conversion lag is manageable, and the sales process is stable.

How to build lead valuation

The starting point is your historical data, ideally 12 months of it, though you can work with six. You need to understand which leads actually closed, what they were worth, and what they had in common at the point of inquiry. For financial services, it’s things like loan amount and term. For B2B, it might be company size or sector. For construction, it’s usually project size and urgency. From there, it’s about grouping leads by their likelihood to close to a sale and by what a typical deal size looks like, and then assigning each group an expected revenue value. The check to make sure it’s working as expected is simple. The total estimated value you assign to your leads over a period should roughly match the revenue they actually generated. If not, the model needs work. Ideally, you should revisit it at least quarterly as your campaigns and operational factors change. As an example, you might end up with a high-likelihood lead worth $850, a mid-range lead at $420, and a lower-likelihood lead at $120.
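As a rough sketch of the valuation math, here is a minimal Python example. The segment names and figures are invented for illustration (chosen to land on example values like $850, $420, and $120), not taken from any real account:

```python
# Sketch of lead valuation from historical data. Segments, lead counts,
# sales, and revenue figures below are hypothetical.

historical = {
    # segment: (leads, closed sales, total revenue)
    "high_intent": (200, 34, 170_000),
    "mid_intent":  (500, 42, 210_000),
    "low_intent":  (800, 16,  96_000),
}

def expected_values(data):
    """Expected revenue per lead per segment: close rate x avg deal size."""
    values = {}
    for segment, (leads, sales, revenue) in data.items():
        close_rate = sales / leads
        avg_deal = revenue / sales
        values[segment] = round(close_rate * avg_deal)  # = revenue / leads
    return values

values = expected_values(historical)

# Sanity check: the total estimated value of all leads should roughly
# match the revenue they actually generated.
estimated = sum(values[s] * historical[s][0] for s in historical)
actual = sum(rev for _, _, rev in historical.values())
assert abs(estimated - actual) / actual < 0.01
```

The per-lead values produced here are what you would pass back to the ad platform as conversion values; the closing sanity check is the same one described above, just automated.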
Once you have that, set up your conversion tracking to pass the expected value back to the platform on your conversion action and use value-based bidding (target return on ad spend in Google Ads) to point the algorithm toward the leads that are actually worth chasing.

Dig deeper: How to make automation work for lead gen PPC

Optimize for what you can control

“Optimize the full funnel” sounds sensible until you realize how much of that funnel you don’t actually control. You can influence the targeting, the creative, the landing page, and the experience that gets someone to submit a form. After that, it’s over to Dave and the sales team, and dozens of other factors that have nothing to do with your campaigns. When you expect an algorithm to optimize for things it can’t see, it will start drawing the wrong conclusions, chasing the wrong audiences, and getting worse over time. The answer isn’t to stop measuring what happens after lead submission. You absolutely should continue measuring, as those numbers can tell you a lot about what’s going well and what might need to be corrected for. Remember:

- When lead quality stays steady, but sales drop, that’s an operations issue, not a paid media one.
- When both drop at the same time, look at your campaigns.
- When sales spike, but lead quality is flat, that’s Dave having a great month, not your targeting.

That visibility is genuinely helpful, but it just shouldn’t be what you’re optimizing to. Build lead valuation, feed expected values back to your platform, and let the algorithm do what it’s actually good at: finding people who look like your best leads. Leave the rest to Dave. Know where your control ends, as that’s where optimization should stop. View the full article
-
my boss asked me to mentor my coworker, but it’s really my boss who needs mentoring
A reader writes: About three years ago, we had a new manager start at my job, Fergus. Fergus is a very nice guy, but has never been a manager before. He delegates some of his core tasks to us, and seems to struggle with things like project management, clear and proactive communication, and HR-type stuff. It doesn’t happen all the time, but when he has a tricky situation, he will come to me and ask my opinion on how to handle it, and I coach him on what to say and what actions should come next. (Before I started here eight years ago, I’d been a department head at my previous company. That place was toxic as hell, and I happily took a step down out of management to get out of there.) Two weeks ago, Fergus asked me to be a mentor to one of my colleagues, Chip. Please note that there is no real hierarchy in our department; other than Fergus, we are all peers on the org chart. Chip is older than me, a gem, and also a bit quirky. Most of Fergus’s “what should I do?” questions in the past were in relation to Chip. We do very different jobs within the department, but I agreed to the mentorship as long as it was what Chip wanted. Chip just wants the drama to go away so he can focus on his work. All agreed to the mentorship. For the last week, I’ve been talking with three people who Fergus told me had lodged serious complaints against Chip, so I could get an idea of what goals to work towards. The first person gave me a lot of valuable feedback about how, yes, there were some instances that Chip could have handled better, but a lot of the issues could be solved by having more consistent department procedures and communications tools. The second person had a lackluster interaction with Chip two years ago. They worked through the issues that led to the misunderstanding, and she showed me an email thread that showed the new procedures were working fine and she was satisfied. The third person had no idea why I was asking her questions. 
She had no issues with Chip or anyone in our department. She had never spoken to Fergus. As far as we can tell, a few weeks ago she was raging about a bad experience with an external vendor, and one of her office mates is Fergus’s spouse. (Many yikes happening here, and I had to reassure her that she had done nothing wrong and no one was in trouble.) At this point, I think it’s clear that while Chip could use a bit of mentorship on “reading the room” and working with sensitive customers, most of the work really needs to happen with Fergus. We need better department procedures, and Fergus needs to work on his own leadership skills. He’s a nice guy and I think as much as he wants to do well here, he seems to have some sort of anxiety around HR-type things. These instances that are looming large in his mind are old news or nonexistent issues based on rumors and assumptions. I agreed to the mentorship because, while I do believe that Fergus should be the one doing this in theory, I want Chip to stay and be successful and I don’t think that will happen if Fergus tries to mentor him. So … how do I “manage up” with Fergus? I just got done teaching my whole department about Change and Project Management because too many situations had happened because we lacked those processes. Now I’m doing HR type stuff. I’ve drawn the line that I’ll only interfere in management if it is negatively impacting me. I am looking for a script for how to talk to Fergus about his own leadership journey while also not becoming his mentor. I can’t go to my grandboss, because Fergus and his spouse are very well-connected and I don’t want to spend my political capital there. I just want to be left alone to do my work. Nooo, don’t get involved in this at all. You’re not Chip’s manager, you’re not being paid to do this work, and the fact that Fergus would prefer not to do it and is bad at it doesn’t make it your job. 
If your company wants it to be your job, they need to pay you for it and give you a level of authority that would make this sort of coaching and intervention appropriate. It’s not inherently inappropriate to be asked to coach or mentor a peer, but this is much more than that — it’s not appropriate for you to be digging into other people’s concerns about a peer, even though Fergus asked you to. That’s squarely Chip’s manager’s job. Unfortunately he doesn’t have a manager who’s willing or able to do it, but that doesn’t mean you should step up and do it yourself. Tell Fergus you talked with the three people he suggested and learned that two of them didn’t have issues with Chip at all and the other raised issues that could be solved by more consistent department procedures and communications tools. Then say that in doing this, you realized that it didn’t feel appropriate for you to dig into a peer’s performance in that way and you’re not comfortable staying in that role for a peer, so you’re going to officially hand that responsibility back to Fergus. If you’d like to do this stuff, you could say, “If at some point there’s room to create an additional manager role on the team to work on issues like this, I’d definitely be interested in being considered for that. But otherwise I realize I’ve overstepped and prefer to stick to being Chip’s peer.” If Fergus tries to tell you that you’re not overstepping because he deputized you to act in his place, you can say, “I appreciate you putting that trust in me. I’m really not comfortable doing that without formally having a job that would give me standing to do that kind of management with a peer. But if it’s an option to formalize that kind of arrangement, I’d love to talk about that.” It is similarly not your job to talk to Fergus about his own leadership deficiencies.
You can certainly flag that your team needs better processes for X or that situation Y is a problem, and if he expresses uncertainty about how to handle those things, there may be room to say at some point, “I know there are some great classes on management that HR has sent people to for things like this” (or something similar). But anything beyond that is getting into coaching Fergus on management, and that’s something that needs to come from above him. Not only is it inappropriate for you to try to do it from below, but if you did try, it’s likely to mean (a) tons of unpaid labor from you, (b) very little payoff (because if Fergus hasn’t figured out after three years that he needs to learn to manage, it’s highly unlikely that you’ll be able to cajole him into it from below), and (c) a huge exercise in frustration, because it will allow you to think this stuff might change when in fact very little probably will. (I have been in exactly those shoes before, and it is a fruitless, frustrating path that will suck out all your energy without paying you for it.) You said you just want to be left alone to do your work, and the good news is: you can be. But to do that, you need to decline Fergus’ attempts to delegate his management work to you, and you have to accept that the department is probably going to stay poorly run. The post my boss asked me to mentor my coworker, but it’s really my boss who needs mentoring appeared first on Ask a Manager. View the full article
-
As the U.S. cripples Cuba with a blockade, Trump gives a Russian oil tanker access
President Donald Trump on Sunday night said he has “no problem” with a Russian oil tanker off the coast of Cuba delivering relief to the island, which has been brought to its knees by a U.S. oil blockade. “We have a tanker out there. We don’t mind having somebody get a boatload because they need … they have to survive,” Trump told reporters as he flew back to Washington. When asked if a New York Times report that the tanker would be allowed to reach Cuba was true, Trump said: “I told them, if a country wants to send some oil into Cuba right now, I have no problem whether it’s Russia or not.” On Monday, Russia’s Transport Ministry said the oil tanker Anatoly Kolodkin arrived at the Cuban port of Matanzas carrying “humanitarian supplies” of about 730,000 barrels of oil. The vessel is sanctioned by the United States, the European Union and the United Kingdom following the war in Ukraine. Kremlin spokesman Dmitry Peskov said Monday that Russia had previously discussed its oil shipment to Cuba with the United States. “Russia considers it its duty not to stand aside, but to provide the necessary assistance to our Cuban friends,” he told reporters. Trump, whose government has come at its Caribbean adversary more aggressively than any U.S. government in recent history, has effectively cut Cuba off from key oil shipments in an effort to force regime change. The blockade has had devastating effects on the civilians Trump and Secretary of State Marco Rubio say they want to help, leaving many desperate. Islandwide blackouts have roiled Cubans already grappling with years of crisis, and a lack of gasoline and basic resources has crippled hospitals and slashed public transport. Experts say the anticipated shipment could produce about 180,000 barrels of diesel, enough to meet Cuba’s daily demand for nine or 10 days. Cuba has long been at the heart of a geopolitical tug-of-war between the U.S. and Russia, dating back decades.
Trump on Sunday dismissed the idea that allowing the boat to reach Cuba would help Russian President Vladimir Putin. “It doesn’t help him. He loses one boatload of oil, that’s all it is. If he wants to do that, and if other countries want to do it, it doesn’t bother me much,” Trump said. “It’s not going to have an impact. Cuba’s finished. They have a bad regime. They have very bad and corrupt leadership and whether or not they get a boat of oil, it’s not going to matter.” He added: “I’d prefer letting it in, whether it’s Russia or anybody else because the people need heat and cooling and all of the other things.” Associated Press reporters Megan Janetsky and Andrea Rodríguez contributed to this report. —Darlene Superville, Associated Press View the full article
-
UK government on verge of full nationalisation of British Steel
Talks with Chinese owner Jingye continue while losses mount. View the full article
- Today
-
AI citations explained: how they work and how to get them
AI search is changing how visibility works. Users are getting direct answers instead of clicking links, which means fewer chances to drive traffic. In this shift, AI citations are becoming the new gatekeepers, deciding which sources get featured in answers. Over the past year, search has moved from ranking pages to selecting sources, pushing us from traditional SEO toward AI-driven visibility. In this article, we’ll explain what AI citations are, how they work, and how you can earn them.

Table of contents
- What are AI citations?
- How AI citations impact brand credibility
- How AI citations work: a complete breakdown
- Strategies to get cited by AI models
- Tracking AI brand presence with Yoast
- FAQs on AI citations
- AI citations: The currency of the AI-driven web

Key takeaways
- AI citations are references that search engines include in AI-generated answers, enhancing credibility and visibility
- This shift moves visibility from traditional SEO ranking to AI-driven inclusion as a key factor for brand presence
- AI tools retrieve information from diverse sources, with citations coming from both top-ranking and deeper pages
- To earn AI citations, create valuable, structured content and establish topical authority across your niche
- Tools like Yoast AI Brand Insights help track your AI visibility and citation presence across platforms

What are AI citations?

Citations have always been a way to show where information comes from and why it can be trusted. The same idea now applies to AI-generated answers. AI citations are the references that search engines and AI tools include to support the answers they generate. When a tool like ChatGPT responds to a query, it often points to specific pages or sources that back up the information. These references act as signals of credibility, helping users understand where the answer is coming from and giving them a way to explore the original content.
In simple terms, if your content is cited, it becomes part of the answer itself, and not just another link in the results.

AI citations vs the blue link era

If AI citations determine what gets included in answers, it’s worth asking how this differs from how search used to work. Because this isn’t just a feature update, it’s a shift in how visibility itself is earned. In the traditional model, ranking higher meant getting more clicks. In AI-driven search, being selected as a source matters just as much, if not more.

Aspect | Traditional SEO | AI citations
Visibility | Blue links | AI-generated answers
Traffic | Click-driven | Influence-driven
Authority signal | Backlinks | Credibility and accuracy
User action | Visit website | Consume instant answers

This doesn’t mean traditional SEO is going away. Rankings, indexing, and backlinks still play a critical role. However, how that value gets surfaced is changing. Instead of just competing for position on a results page, you’re now competing to be part of the answer itself. Do check out Alex Moss’s talk at BrightonSEO 2025 on the evolution of search intent and discoverability.

Where do AI citations come from?

Before you try to earn AI citations, it’s important to understand where they actually come from. Because you’re not just competing with other blog posts, you’re competing with an entire information ecosystem. AI models pull their answers from a mix of sources:

- Web content: blog posts, guides, landing pages, and long-form articles
- Structured sources: platforms like Wikipedia, documentation hubs, and product data feeds
- Forums and UGC: discussions from Reddit, Quora, and Stack Overflow
- First-party data: brand websites, help centers, and official resources

How the sources are selected is quite interesting. A recent analysis of Google’s AI Overviews found that citations don’t strictly come from top-ranking pages. In fact, only about 38% of cited sources rank in the top 10 results, meaning a large share comes from deeper pages or alternative formats.
Another key insight, from CXL: AI models tend to prioritize clear, early answers within the content, with a significant portion of citations pulled from the top sections of a page rather than from deeper sections. The takeaway is simple. AI systems are not just ranking content; they are selecting the most useful pieces of information across formats and sources. That means your content is competing not only for rankings but also for clarity, structure, and trustworthiness across this entire ecosystem.

Types of AI citations

Not all AI citations look the same. Depending on the query and intent, AI models pull in different types of sources to support their answers. Broadly, you’ll see three main types:

Informational citations
These are the most common. AI tools refer to blog posts, guides, and educational content to explain concepts or answer questions. If someone asks, “what are AI citations,” the sources cited will typically be long-form, explanatory content.

Product citations
These show up in commercial or comparison queries. For example, “best SEO tools” or “top project management software.” Here, AI models cite product pages, listicles, and review-based content to support recommendations.

Multimedia citations
AI doesn’t rely solely on text. Videos, images, and other visual formats can also be cited, especially when they better explain something than text alone. Think tutorials, walkthroughs, or demonstrations.

How AI citations impact brand credibility

AI citations don’t just drive visibility. They shape how your brand is perceived before a user even visits your website. When your content is cited in an AI-generated answer, some of that trust transfers to your brand. You’re no longer just another result on a page; you’re part of the answer itself. And that changes how users interpret your authority. This also means buyer decisions are starting earlier.
Users may form opinions, shortlist options, or even make decisions directly from AI responses, without ever clicking through. If your brand isn’t cited, you’re not part of that consideration set. There’s also a strong signal of relevance at play. Being included in AI answers suggests that your content is not just optimized, but genuinely useful in context. It tells both users and algorithms that your brand deserves to be surfaced. Over time, this creates a compounding effect. The more your content is cited, the more your brand becomes associated with specific topics. That repeated exposure builds familiarity, authority, and trust.

How AI citations work: a complete breakdown

So far, we’ve talked about what AI citations are and where they come from. But how do AI systems actually decide what to cite? Let’s break it down. At a high level, most AI-powered search systems follow a retrieval-and-synthesis process, often powered by approaches such as Retrieval-Augmented Generation (RAG). In simple terms, they don’t just generate answers; they find, evaluate, and assemble information from multiple sources before deciding what to cite. Here’s what that process looks like in practice:

1. Query understanding
Everything starts with intent. The AI interprets what the user is really asking, whether it’s informational, navigational, or commercial. This step shapes what kind of sources it will look for.

2. Retrieval of sources
Next, the system pulls in potential sources from multiple places:
- Web indexes
- Training data patterns
- Live retrieval systems (depending on the model)
This is where your content first enters the consideration set.

3. Source evaluation
Not all sources are treated equally. AI models evaluate them based on:
- Relevance to the query
- Authority and trust signals
- Clarity and structure of information
- Entity-level trust (how credible the brand or author is)
When you look at these signals closely, they all point in one direction.
Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T) play a central role in determining what gets cited. In other words, AI systems aren’t just looking for answers; they’re looking for reliable sources behind those answers.

4. Answer synthesis
Instead of showing individual links, AI combines insights from multiple sources into a single, cohesive answer. This is where your content may be used, even if it’s not directly cited.

5. Citation selection
Finally, the model decides which sources to:
- Explicitly cite (with links or references)
- Implicitly use (without direct attribution)
This is the step that ultimately determines your visibility.

How this differs across AI systems

While the core process is similar, different AI tools prioritize different parts of this pipeline.

AI system | How it handles citations
ChatGPT | Leans more on third-party sources and consensus, such as directories, reviews, and aggregator sites, rather than relying heavily on brand-owned content.
Perplexity | Focuses on retrieval-first behavior, pulling from a wide range of web sources and surfacing multiple citations to support transparency (strong emphasis on external validation).
Gemini | Prioritizes brand-owned and structured content, especially pages that are clearly organized and easy to interpret.

Must read: Why does having insights across multiple LLMs matter for brand visibility?
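If it helps to picture the retrieval-and-synthesis pipeline, here is a deliberately simplified Python sketch of the retrieve, evaluate, and cite loop. The documents and scoring weights are invented for illustration; production systems use far richer signals than two numbers:

```python
# Toy citation-selection step in the spirit of RAG: rank candidate
# sources by a blended relevance/authority score, then cite the best.
# All documents and weights below are hypothetical.

docs = [
    {"url": "site-a.com/guide", "relevance": 0.9, "authority": 0.5},
    {"url": "site-b.com/study", "relevance": 0.7, "authority": 0.9},
    {"url": "site-c.com/forum", "relevance": 0.4, "authority": 0.3},
]

def select_citations(candidates, top_n=2):
    """Rank candidate sources by a blended trust score and keep the best."""
    scored = sorted(
        candidates,
        key=lambda d: 0.6 * d["relevance"] + 0.4 * d["authority"],
        reverse=True,
    )
    return [d["url"] for d in scored[:top_n]]

citations = select_citations(docs)
assert citations == ["site-b.com/study", "site-a.com/guide"]
```

Note how the most relevant page is not necessarily the one cited first: authority can outweigh raw relevance, which mirrors the finding above that many citations come from pages outside the top 10 results.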
Key signals AI models use for citing content

Even though the process is complex, the signals that increase your chances of being cited are surprisingly consistent:

- Well-organized structure: clear headings, bullet points, and logical flow make it easier for AI to extract information
- Evidence-based reasoning: content that references data, sources, or supporting claims is more likely to be trusted
- Timeliness and relevance: fresh, updated content often gets prioritized, especially for evolving topics
- Authoritative voice and depth: content that demonstrates expertise and covers a topic comprehensively stands out
- Topical consistency: brands that consistently publish around a topic are more likely to be recognized as reliable sources

The key takeaway here is simple: AI citations are not random. They are the result of a structured evaluation process in which clarity, trust, and relevance determine who is included in the final answer.

Must read: How to use headings on your site

Strategies to get cited by AI models

So far, we’ve looked at what AI citations are and how models decide what to cite. The next question is the one that matters most: how do you actually get cited? Because this isn’t just about creating content, it’s about sending the right signals that your content is worth citing. Here are some strategies that can help you do exactly that:

1. Create citation-friendly content

Citation-worthy content goes beyond surface-level answers. It offers original thinking, clear explanations, and real value, helping AI models support their responses with confidence. In other words, it’s not just optimized, it earns references by being genuinely useful.
The following content types consistently get cited by AI models:

Content type | What to write | Why AI loves them
Original research | Studies or data that answer new or unexplored questions | Gives AI concrete evidence to support claims
Case studies | Real-world examples showing how something works in practice | Helps AI justify recommendations with proof
Thought leadership | Opinion-led content with unique insights or perspectives | Adds depth and diversity to AI-generated answers
News content | Timely, accurate coverage of recent developments | Fills gaps where training data falls short

2. Build topical authority (clusters)

AI models don’t just evaluate individual pages; they evaluate how consistently you cover a topic. If you publish multiple pieces on a specific subject, each addressing different aspects, you signal depth, expertise, and reliability. That’s what topical authority is all about. And this is where E-E-A-T naturally comes into play. The more consistently you demonstrate experience and expertise in a niche, the more likely your content is to be trusted and cited. What to do in practice:

- Create clusters around a core topic (pillar page/cornerstone content + supporting content)
- Cover both broad and specific questions in your niche
- Go beyond basic answers; add expert insights, examples, or real-world context
- Keep your messaging and terminology consistent across content

3. Strengthen entity signals (brand, authorship, schema)

AI systems evaluate content, but they also evaluate who is behind it. Strong entity signals help models understand your brand, your authors, and your credibility within a topic. The clearer these signals are, the easier it is for AI to trust and cite your content. What to do in practice:

- Build clear author profiles with expertise and credentials
- Maintain consistent brand mentions across your site and the web
- Use structured data (schema) to define authors, organizations, and content relationships
- Ensure your “About” and author pages clearly establish credibility
Earn external validation signals across the web AI models don’t rely on a single source of truth. They validate information by cross-referencing multiple sources across the web. That means your credibility isn’t built only on your website. It’s shaped by how consistently your brand shows up across trusted platforms. The more aligned and authoritative those signals are, the easier it is for AI systems to trust and cite your content. Think of this as building a web-wide validation layer that reinforces your brand through multiple independent sources. This is also where traditional SEO practices like link building evolve. It’s no longer just about backlinks, but about earning consistent, high-quality mentions that strengthen your entity across the web. What to do in practice: Contribute insights to reputable publications in your niche Earn consistent mentions across industry blogs, directories, and review platforms Build high-quality backlinks through a strategic link-building approach Be active in communities like Reddit, Quora, or niche forums Run digital PR campaigns that reinforce your brand narrative across sources 5. Keep content fresh and updated AI models prefer content that reflects current information. Outdated content is less likely to be trusted, especially for topics that evolve quickly. Regular updates signal that your content is still relevant and reliable. What to do in practice: Refresh key articles with updated data, examples, and insights Add new sections instead of rewriting from scratch where possible Clearly indicate updates (timestamps, revised sections) Prioritize high-performing or high-potential pages for updates Must read: How to optimize content for AI LLM comprehension using Yoast’s tools 6. Structure content for answer extraction AI models don’t read content the way humans do. They extract answers. Most AI-generated responses are built by identifying clear, concise answer blocks within content. And increasingly, users prefer this format. 
In fact, according to a poll by IWAI, 67% of users find AI tools more efficient than traditional search for getting answers. That shift makes one thing clear: if your content doesn’t directly answer questions, it’s less likely to be surfaced or cited. This means it’s not enough to include answers. You need to structure your content so those answers are easy to find, interpret, and reuse.

What to do in practice:
- Lead sections with direct, concise answers before expanding
- Use headings that mirror real user queries and intent
- Break down complex topics into scannable, extractable sections
- Add summaries, definitions, or key takeaways at the start of sections
- Anticipate follow-up questions and answer them within the same content

Tracking AI brand presence with Yoast

By now, we know what AI citations are, how they work, and how to earn them. But here’s the real question: how do you know if you’re already being cited? And if not, how do you understand where your competitors are showing up and where you’re missing out? That’s the gap Yoast AI Brand Insights is built to solve. As AI-generated answers become a key discovery layer, most traditional analytics tools fall short. They can tell you about traffic, but not whether your brand is being mentioned, how it’s being perceived, or which sources AI systems trust when referencing you. That’s a critical blind spot, especially as AI answers increasingly shape user decisions before a click even occurs. Yoast AI Brand Insights helps you track and understand your AI visibility, citations, and brand mentions across platforms like ChatGPT, Gemini, and Perplexity, so you can move from guesswork to informed action. Here’s what it enables you to do:

Sentiment tracking
Understand how your brand is being perceived in AI-generated answers. The tool analyzes keywords associated with your brand and shows whether the overall sentiment is positive or negative, helping you spot tone issues and shifts over time.
Citation analysis (brand mentions)
See when and where your brand is being cited. More importantly, understand which sources AI platforms reference alongside your brand, so you can identify citation gaps and opportunities to improve your presence.

Competitor benchmarking
AI visibility is relative. This feature lets you compare your brand’s citations, mentions, and sentiment against competitors, helping you understand who is being surfaced more often and why.

Question monitoring
AI search is driven by queries. With question monitoring, you can track specific brand-related or industry questions and see whether your brand appears in the answers, giving you direct insight into where you’re visible and where you’re missing.

AI visibility index
Instead of looking at isolated metrics, Yoast combines signals like citations, mentions, sentiment, and rankings into a single visibility score. This gives you a clearer picture of how your brand performs across AI systems over time.

The bigger picture here is simple: Yoast AI Brand Insights helps you understand your position in this new ecosystem, so you can strengthen your presence, close gaps, and ensure your brand is part of the answers your audience is already consuming.

FAQs on AI citations

AI citations can feel complex at first, especially as search continues to evolve. Here are answers to some of the most common questions to help you navigate them better.

Are backlinks different from AI citations?
Yes, they serve different purposes. Backlinks help your pages rank in traditional search, while AI citations determine whether your content gets included in AI-generated answers. In short, backlinks drive visibility on SERPs, while citations drive visibility within answers. If you want a deeper breakdown, check out this guide on AI citations vs backlinks.

Do AI systems always provide citations?
No, AI systems don’t always include citations.
When responses are generated purely from pre-trained knowledge rather than retrieved sources, citations may not appear. To test this, I tried a range of prompts on ChatGPT. Citations appeared in about half of the responses, and a clear pattern emerged:

- Queries involving products, recommendations, statistics, or recent events were more likely to trigger citations
- Queries focused on definitions or general knowledge often did not include citations

This shows that citation behavior depends heavily on the query type, intent, and context. Not every answer requires a source, but the more specific or evidence-driven the query, the more likely citations are to appear.

How do I direct AI models to the most important content on my website?
You can’t directly control what AI models choose to cite, but you can make it easier for them to understand and prioritize your content. One effective way to do this is by using llms.txt, a feature in Yoast SEO. It creates a structured, LLM-friendly markdown file that highlights your most important pages, helping LLMs better understand your site when generating answers. Think of it as a way to clearly communicate which content matters most, so when AI systems look for reliable sources, your key pages are easier to interpret and surface.

AI citations: The currency of the AI-driven web

AI citations are changing how users discover and trust information. They don’t just complement rankings; they reshape them by deciding which sources become part of the answer itself. In many cases, users no longer need to click to explore. If your content is cited, you’re visible. If not, you’re invisible. This shift also changes what we optimize for. It’s no longer just about traffic; it’s about trust, relevance, and inclusion in the answer layer.
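To make the llms.txt idea above concrete, here is a minimal hand-written sketch of the format described by the llms.txt proposal: an H1 site name, a blockquote summary, and H2 sections listing key pages. The site name, URLs, and descriptions below are hypothetical, and in practice Yoast SEO generates the file for you.

```markdown
# Example Bakery Supplies

> Guides and product pages for home and professional bakers.

## Guides

- [How to proof sourdough](https://example.com/guides/proof-sourdough): proofing times and temperatures, step by step
- [Stand mixer maintenance](https://example.com/guides/mixer-maintenance): cleaning schedule and common fixes

## Products

- [Mixing bowls](https://example.com/products/bowls): sizes, materials, and compatibility notes
```

Because the file is plain markdown served at a predictable path (/llms.txt), an LLM crawler can fetch one small document and immediately see which pages you consider most important.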
As we explored in our recent read, Rethinking SEO in the age of AI, the central question for SEO is evolving. It’s no longer just, “Can Google find my website?” It’s now, “Does the AI have a reason to remember my brand?” The post AI citations explained: how they work and how to get them appeared first on Yoast. View the full article
-
Embrace Filming on Analog Video in 2026
We may earn a commission from links on this page. Cameras are more than just tools for taking photos and videos—they shape the way we think of the past. The distinct look of each major camera format defines its era: The murky colors of a vintage Polaroid picture are as era-defining as the stiffly posed portraiture of the late 1800s. And few things will bring you right back to the 1980s and '90s quite like the fuzzy, low-resolution warmth of something filmed on video tape. I think that's why some people are so drawn to picking up 30- or 40-year-old camcorders, even in 2026: You can record a modern scene using one of them today—perhaps someone using a smartphone—and when playing it back, it will look like they're straight out of 1994. I grew up with such a camcorder, a Canon ES50, which my family used for filming home movies, and which I used for making my own short "films." My childhood was immortalized in this format, and in a way, its look defines how I picture the '90s. While that camcorder was sold or given away sometime over the years, my brother recently gifted me another one, and in puttering around with it, I feel like I've been traveling through time. Whether you share a similar experience, or you're curious about older video recording formats and want to dive into actually putting images down on tape, it's surprisingly easy to pick up one of these old camcorders and get recording, even in 2026.

There are a lot of camcorder formats to explore

While you can certainly pick up a working camcorder off eBay easily enough, you'll need to know what format that camcorder records to before getting started. If you don't buy the proper medium (i.e. tape), you won't get very far. There are a ton of camcorder formats in this world, but here are the major ones you'll likely encounter:

VHS: If you lived through the VHS era, you're very familiar with this format.
These camcorders use the same-sized tapes you used to rent from Blockbuster, which makes them pretty convenient (by 1990s standards, anyway): If you have a VHS camcorder, a blank VHS tape, and a VCR, you can easily shoot and watch back your footage right away. These camcorders, however, are huge. Anyone looking for a more compact camera should consider the other formats on this list.

VHS-C: This is the same principle as VHS, only in a compact form (hence the C). In fact, because this is still VHS, you can play VHS-C tapes in a VCR, but you need an adapter to do so. (The tech behind this is actually very cool.)

S-VHS (and S-VHS-C): These are the successors to VHS and VHS-C, offering higher-quality video in the same form factor.

8mm: These tapes are suspiciously similar in size to standard VHS-C, but with a key difference. As you might guess from the name, these are not VHS, which means you can't play them back on a VCR. You won't notice too much of a difference in quality though.

Hi8: This is the "upgraded" 8mm format. They're the same size, but record in higher-quality video.

Mini DV: Again, these tapes are similar in size to other mini formats, but with one huge distinction: they're actually digital, not analog. Where all the previous formats record an analog signal to tape, Mini DV records digital data, the same as a hard drive would. That means you can copy Mini DV tapes without losing any detail, but you also lose some of the analog charm.

Where to buy camcorders, tapes, and accessories in 2026

Camcorders are not hard to buy in 2026. In fact, you can buy new 4K models—but that's not why we're here. If you're looking for a camcorder from the '80s or '90s, eBay is your friend. There are tons of listings for different brands and formats, many for reasonable prices. The camcorder I grew up with is easy enough to find: As of this writing, there's one listed for $78.50 without a battery or charger, and one with the battery for $99.99.
If you have a specific camcorder in mind, you might need to make some concessions: say, buying the camcorder by itself, then hunting down specific accessories that go with it. On the other hand, you might have better luck searching by format: When I searched eBay for VHS-C camcorders, for example, I found this JVC option with all the essential parts for $49.99. Finding accessories shouldn't be a problem either, depending on the part and the camera model. I recently needed a proprietary cable to connect a family member's camcorder to a TV, and by searching the camcorder model number and the part, I was able to find one for sale online fairly quickly. If you have a VHS or VHS-C camcorder, you're going to need a VCR and an adapter. The adapters are easy and cheap to pick up, while VCRs range in price. Take some time looking for a VCR: You'll want one that seems functional and clean, so as not to damage your tapes. In short, if it was popular, someone is likely selling it somewhere, which is good, because a camera without all the right accessories is just a museum piece. When it comes to the tapes themselves, it's the same story. I would have assumed buying formats from more than 30 years ago would be a challenge, but even Walmart still sells 8mm tapes. My recommendation is to search Google Shopping for the tape format you need, and compare prices and listings across reputable outlets. (Though, again, eBay might be your best bet.) You don't actually need tapes to use retro-style camcorders, since you can buy add-on devices that let you capture footage directly to an SD card—but given I'm writing about the appeal of retro tech, I'm going to assume you want to record on tape. And as with all retro tech, you might find what you're looking for at secondhand stores and garage sales. People might have camcorders, accessories, and tapes in their attics ready to be sold on their front lawns; similarly, Goodwill might have camcorders in the mix with their other used technology.
If you are looking for something specific, stick to the internet, but if you're just looking to get started with any old equipment, the hunt can be part of the fun.

How to actually watch your camcorder recordings in 2026

The next two sections are perhaps the trickiest parts of camcorder ownership in 2026. It's not hard to find a camcorder, and it's surprisingly uncomplicated to pick up tapes. Even recording isn't all that challenging. But actually watching this footage, and, god forbid, saving it somewhere else? These analog formats were not designed for our digital future. That said, it's perfectly possible, so long as you're prepared. As I mentioned above, VHS shooters will have the easiest time here, assuming all parts are in order: You can pop your VHS tapes into a VCR connected to a TV, and you're good to go. VHS-C shooters will have a similarly simple experience, but you'll need that adapter. But other formats are not compatible with a VCR. In fact, for many camcorder options, your only choice is to play back your footage on the camcorder itself. That doesn't mean you need to watch on its tiny screen or viewfinder (though you can, of course); instead, you can simply connect the camcorder directly to your TV. This will vary based on the camcorder model, but most will have some type of cable designed to plug into an output port on the camcorder and connect to the composite ports on your TV. No matter how you connect your tapes to the TV, you may have another hurdle here. If you have an older TV, there's no problem, but newer TVs don't always have the red, white, and yellow composite ports you need. My 2017 LG TV, for example, does not. As such, I had to buy this adapter in order to connect any type of composite input to my TV—including my 8mm camcorder. As with many of the suggestions here, you'll need to find your TV's make and model to find the right adapter, but once everything's hooked up, you're good to go.
Preserving your tapes (and creating digital backups)

We live in a digital world, and vintage cameras (apart from DV models) are analog. As much fun as it is to watch retro footage directly on a TV, you'll likely want to convert your tapes to something you can watch on your phone or computer. Perhaps the simplest—if the priciest—option is to send your tapes out to a service to convert them for you. These services take recordings of virtually any format—from 8mm tape to 8mm film—and send them back to you as digital files. That way, you don't have to worry about doing any of the work yourself. While that might work for some, there are two reasons I'm not a huge fan of this option. First, it isn't really an option if you frequently shoot new footage, since you'll need to continuously send these companies your tapes and wait for them to get back to you. They're really set up for families sending in all their old home videos at once, which is my second gripe: I really don't trust another company (or the mail) to handle my family's irreplaceable home videos. In any case, it isn't quite as challenging as it sounds to get footage from your tapes onto your computer to do whatever you want with it, but it will take some elbow grease—and what's involved may vary based on your camcorder and setup. YouTuber Audrie Storme has a great walkthrough for the process, which should apply to most situations you'll run into. In summary, you'll need:

- A capture card (something like this) that connects your camcorder directly to your computer.
- Capture software—Storme recommends OBS, as it's free—which will be doing the converting for you.

The settings you'll use will be case-dependent—Storme uses Mini DV, but you'll use different settings if you're converting, say, 8mm—so I'd look into OBS forums for your specific setup. Once you're connected, you're ready to record for posterity. Since these are tapes, that means letting the footage roll in real time.
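Because capture happens in real time at whatever bitrate you configure in your capture software, it's worth estimating disk space before a long session. A quick back-of-the-envelope sketch (the bitrates below are example settings I chose for illustration, not recommendations from Storme's walkthrough):

```python
def capture_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Estimate recording size in gigabytes for a real-time capture.

    bitrate_mbps: the encoder's output bitrate in megabits per second
    minutes: how long the tape runs
    """
    total_bits = bitrate_mbps * 1_000_000 * minutes * 60
    return total_bits / 8 / 1_000_000_000  # bits -> bytes -> GB

# A 60-minute tape captured at 8 Mbit/s comes out around 3.6 GB;
# doubling the bitrate doubles the space.
print(round(capture_size_gb(8, 60), 1))
```

Mini DV transferred digitally is a different story: its fixed 25 Mbit/s video stream works out to roughly 11 GB per hour before audio and overhead, so budget your drive space accordingly.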
So sit back, relax, and watch your "modern" '90s-style clips unspool. Oh, and one final piece of advice: If your tapes have the option, make sure to flip the switch from "Rec" to "Save." That way, you won't accidentally record over any of your footage. View the full article
-
Iranian strike on Kuwaiti power and water plant stokes infrastructure fears
Attack at electricity and desalination facility marks latest escalation in Gulf war launched by US and Israel
View the full article
-
How KitchenAid updated its legendary kitchen mixer without sacrificing its design
The silhouette could not change. This was the main parameter guiding the designers and engineers at KitchenAid as they set out to upgrade one of the brand’s hero products, the stand mixer. Used by amateur and professional bakers for more than 70 years, the classic stand mixer is a staple of the kitchen appliance world, and much of its staying power has to do with the consistency of the product, which has changed remarkably little in all that time. Most notably, the mixer’s bowl-hugging form factor has defined it since the start. So when the company decided to integrate some new features and functions into an updated version of the mixer—the Artisan Plus Stand Mixer, now on sale from $599—the design team knew that any change must not affect that signature look. “We’re in a unique spot here because we are really attached to the silhouette. We really don’t want to change the outside, which is a challenging engineering function when you say no, the package is fixed,” says Joseph Snyder, a system architect at KitchenAid.

What’s new about KitchenAid’s stand mixer

The changes included in the new Artisan Plus are the biggest improvements to KitchenAid’s stand mixer in its 70-year history, according to the company. Features include a new, slower ingredient folding speed, continuous speed control for smooth transitions between speeds, an automatic light focused into the mixing area, and a special mixing wand that simultaneously stirs and wipes the inside of the bowl. “Now you’ve got to put all this functionality in there. So we had to do a lot of work to fit this new control in here because we really couldn’t make any changes to the outside,” Snyder says. “We had to do this all inside.” Adding these new features required the equivalent of industrial surgery.
Snyder says the designers and engineers realized they had to make a major change to the internal workings of the mixer, replacing what Snyder calls “tried-and-true” analog electromechanical controls with a computer-like microcontroller. But those analog controls also defined the mixer, in a way, with its tuned-weight system dictating a range of 10 distinct speeds. For this new version, KitchenAid was careful not to totally redefine the speed controls. “Some users are very attached to the way they do it today,” Snyder says. “I do this thing at speed two, I do this thing at speed four.” Recognizing that other users wanted more nuance in their mixing speeds, they introduced continuous speed control, which allows users to dial up and down the speed seamlessly, rather than speeding up or down in a jarring (and potentially messy) step change. Adding the microcontroller inside the mixer allows it to have both the preset speeds and the seamless transition. According to KitchenAid’s market research, another feature users have been asking for is the now 11th speed on the mixer, the extra slow “folding” speed, for gently mixing delicate ingredients into the bowl. Folding speed already exists on other KitchenAid mixers, but was not possible in the stand mixer before this upgrade. “We saw people taking our mixer and turning it on and turning it back off, turning it on and back off,” Snyder says. “They were sort of making a pulse mode on their own. We said, ‘we can help these people.'” A better beater Another new feature that has been used in other KitchenAid appliances is the new “double flex edge beater,” the silicone-edged wand that continuously scrapes the inside of the bowl, eliminating the need for users to manually scrape ingredients back into the center with a spatula. Snyder explains that the beater has a unique twist to its design that helps push the ingredients down into the center of the bowl during the mixing process. 
The silicone edges help prevent the ingredients from pushing out and climbing up. But making this all possible generates extra load on the machine—load the old analog controls couldn’t handle. The new microcontroller is able to generate the power needed to use this special wand, and the interior of the mixer’s body was tweaked to keep the system cool while doing so. “This is something we wanted to do for quite a while, but without making other changes, we really couldn’t. We had to be able to change the air flow and make some structural changes,” Snyder says. This new wand, he notes, can’t be used on older models of the stand mixer. Some of these design features have been in the works for years; folding speed, for example, has been a company goal for more than a decade. But Snyder says that even though some of the features aren’t exactly new, they’ve found the right moment to be combined into this new version of the stand mixer without completely reinventing the wheel. “It’s a very mature platform,” he says. “You’re always trying to thread that needle of adding new features without stepping away from your brand and what people recognize when they see that across the room.” View the full article
-
Top Scheduling Trends Reveal Reluctance to Embrace Efficiency—and it’s Time for a Change
Small businesses are losing a lot of time scheduling meetings, and the problem is of their own making. Zoho Bookings partnered with Cint, the world’s largest global research marketplace, to launch a survey in March of 2025 asking how teams across departments and industries schedule meetings, both internally and with clients. Of the more than 1,700 respondents, 254 were from US-based small businesses, defined as having fewer than 250 employees. The results exposed a surprising amount of inefficiency still prevalent among small businesses. Despite acknowledging that AI can be a helpful scheduling tool and expressing comfort in the technology taking over this task, the vast majority of SMB respondents avoided dedicated scheduling software altogether, instead favoring old methods that are less efficient and more prone to error. Small businesses should take this survey data as a wake-up call to the value of technology, particularly for mission-critical tasks like scheduling. Here’s more on what the survey found:

Manual labor

Despite advances in technology over the last few years, small businesses are sticking to tried-and-true methods for scheduling meetings, even if those ways have proven highly inefficient. According to the survey, 72.8% of SMB respondents primarily use email to schedule meetings and appointments, and 57.1% use phone calls. The process starts with gathering availability—another task that has yet to be optimized by the majority of small businesses surveyed. Zoho and Cint discovered that 51.6% of small businesses use email and messaging apps to check availability for meetings. And though 48% use a shared calendar dashboard, 43.3% still check colleagues’ time manually. These trends feel more appropriate for a small business in its most nascent stage, before too many employees join the fray. But as companies grow, they would be wise to shift away from these time-consuming methods, and that is especially true for small businesses.
Resources are severely limited and profit margins are razor-thin. The less time employees spend navigating cumbersome systems, searching through email threads, or tracking down responses, the better.

Time to waste

The survey further highlighted the toll that meeting scheduling takes on everyone’s time. According to its data, 46.5% of respondents were able to schedule meetings in less than one hour, but the majority were not, with 28.4% saying it took between 1-6 hours, 13.8% claiming to take 6-12 hours, and 8.3% taking up to 24 hours to get something on the books. Problems with scheduling are only half the battle. Almost half of respondents, 46.1%, claimed that back-and-forth scheduling hassles constitute the single biggest drain on their time. Once meetings are booked, 34.7% said no-shows were a significant problem, 31.9% viewed double bookings as a top challenge, and 23.6% had to deal with uneven team schedules. It’s not like meetings are only a once-a-week occurrence for small businesses. The vast majority of manual schedulers, 68.7%, said they maintain 1-5 appointments a day, with other respondents claiming more. Plus, as companies grow in size and complexity, it’s likely more meetings will need to occur throughout an average day.

Benefits barely outweigh fears

A third of small businesses surveyed, 34.7%, said they employ dedicated scheduling software, and these companies claim to have seen numerous benefits. The top five were scheduling automation, meeting management, employee productivity, team scheduling and coordination, and customer experience. All of these contribute not only to saving time but to growing a small business at its core. That’s not to say all small businesses that use scheduling software are content. Only 18.2% of those said their tool was meeting their needs. Of those who were unable to make the same claim, 40.9% want more AI capabilities added, 39.8% want more integrations, and 37.5% want better customer support.
Available software options don’t capture all that small businesses would hope to achieve, but the demand still exists. The survey found that 59% of small businesses believed AI would be quite helpful for scheduling meetings, and 28.4% were neutral on the concept, perhaps not having seen it in action. Additionally, more than half of SMB respondents, around 53%, were comfortable with AI taking the scheduling reins, even if 51.2% of SMBs feared that AI scheduling would cause their companies to lose a human touch and 41.3% worried they’d lose control over decision-making.

Scheduling success

The barrier to AI adoption in scheduling software among SMBs isn’t technical skepticism; it’s a fear that automation will make a small business feel less like one—losing the personalization and camaraderie they’ve grown to value. As AI technology develops and folds into software of all stripes, the fear of lost autonomy should subside. Vendors are increasing the amount of context their AI agents receive, ensuring decisions are rooted in an employee’s reality within guardrails set by managers. Centralized dashboards unlock added visibility, allowing for changes to be made so the system can iterate on best practices. The survey demonstrates that small businesses can’t let themselves get in their own way. They believe scheduling software can save time and ensure meetings are as effective as they can be, but many have yet to take the leap, instead relying on outdated methods like email, spreadsheets, and endless phone tag. Small business growth relies heavily on teamwork, and there’s no better way to collaborate than having everyone in the same room, virtually or otherwise. AI-powered meeting software makes that happen. Image via Envato This article, "Top Scheduling Trends Reveal Reluctance to Embrace Efficiency—and it’s Time for a Change" was first published on Small Business Trends View the full article
-
A blueprint for Chinese global leadership
With the US destroying its own credibility, the opportunity is Beijing’s for the taking
View the full article
-
How to Build a Music Library That You Actually Own
We may earn a commission from links on this page. It's ridiculous that the pace of my day can be ruined by the realization that a beloved song is no longer on the streaming platform I pay for. As someone who vividly remembers cassette tapes, CDs, and hoarding hard drives full of MP3s, it's a reminder of what we traded in when we said "yes" to streaming services years ago. I've been working on my Warped Tour playlist for the summer, but I keep running into a ton of licensing holes and grayed-out deep cuts. The gaps are everywhere, and it's harshing the vibe. It's another harrowing reminder that I merely rent whatever Spotify offers in its music library. That's why I started building my music library from scratch again. Here's how to find, buy, and own music in a world that has gone purely streaming.

The best places to buy MP3s

Buying and owning MP3s depends entirely on the type of music you like. If you're into mainstream mega hits or your favorite grungy band is on a big label, you can look to the major players to buy music: Apple and Amazon. Apple's iTunes Store (it's still called that!) remains one of the most robust among the "available everywhere" set. It's also easy to buy high-quality 256 kbps AAC files, which are technically more efficient than regular MP3s and offer CD-like sound quality without being massive files. Just make sure you're purchasing the music through the purple iTunes icon and not the red one. Amazon's MP3s are DRM-free, usually 256 kbps, and you can usually find what you need there, even some more obscure offerings. If you're hoping to own digital and physical media, Amazon offers AutoRip. When you buy a physical CD or vinyl marked with the AutoRip logo, Amazon adds the digital version to your cloud library for free. You can stream these purchases through the Amazon Music app without needing a monthly subscription. Apple also lets you stream your iTunes purchases directly within the Apple Music app.
Even if you don't pay for the monthly streaming service, you can dig into your legacy library and stream what you own within the platform. 7digital is widely heralded as a long-standing hidden gem of the MP3-buying world. The service itself is a backend provider for other global music services, which is why they have so much access to music licenses. It's a British company, so if you listen to more international artists, this is a viable solution, as Amazon and Apple's offerings can be region-locked. 7digital also allows you to choose between MP3 and high-resolution FLAC files at checkout. All three of these services have been around long enough to have proven their longevity. If you lose your digital files, Amazon, Apple, and 7digital will let you re-download them through your account.

How to buy music directly from artists

Bandcamp remains the de facto standard for buying music directly from an artist. Credit: Screenshot by Florence Ion/Lifehacker

If you're buying digital music to support the band or artist you love directly, try Bandcamp before you go anywhere else. It's one of the few music marketplaces where you can choose your flavor of file—MP3, FLAC, or WAV, in some cases—within a single transaction. Sometimes there is even merch and physical media, like LPs and CDs, that you can add to the cart. Bands like the site because it doesn't take a huge cut of what you're paying the artist for an album. The Bandcamp app also acts like a cloud locker for everything you've bought on the platform, so if you get a hankering for music you don't have on you, you can access it through the mobile app. Some bands and artists get around their labels by using platforms like Patreon or Substack to distribute music files directly to a monthly "fan club." Others lean into their labels by using direct-to-consumer online shops, like Polyvinyl, which famously prioritizes a 50/50 profit-sharing model with their artists.
The digital store also features high-res, DRM-free files, ensuring the artist gets a fair share while you get a file you actually own.

Where to buy higher-quality digital music

Even if you aren't trading your Apple AirPods for a dedicated DAC (digital-to-analog converter) and a pair of serious headphones, you may want to own even higher-quality digital music, beyond 256 kbps. You'll want to seek out 24-bit/192kHz FLAC files, which generally aren't available in mainstream digital music stores. Qobuz is a global favorite for buying high-res tracks, and it's one of the easiest services to use. You can buy what you want in the browser and then use one of the companion desktop apps to batch-download files. Qobuz even offers a streaming subscription that gives you sizable discounts on future high-res albums and EPs you purchase. It's a good deal if you're actually planning to build a collection, and the streaming ability lets you preview an album before you commit. Other popular services include HDtracks, which has been around since 2008. Founded by a pair of audio engineers, the site specializes in definitive masters. It's a good choice if you have a more classical or legacy-rock taste in music, as 24-bit remasters are usually available to own. Bleep is another service geared toward electronic and alternative music lovers, while Boomkat serves the more experimental and underground niche.

How to rip your own CDs

Part of the fun of getting back into digital music is remembering that another way to collect it—and really solidify your ownership—is to go to your local record store or second-hand market for CDs. Yes, CDs, the compact discs that somehow endure even though the players aren't even standard in cars anymore. 
Since most modern laptops and desktops have dropped optical drives in favor of slim, svelte designs, you will need a simple external CD/DVD drive that connects via USB-C or another method. Brands like Asus and LG still make these combination drives. They're easy to buy and aren't too pricey. Next, you'll need ripping software. For Mac users, the Apple Music app features the same built-in ripper that shipped with iTunes all those years ago, and it's still a viable option. If you want something more robust, there's X Lossless Decoder. It's simple, handles album art and track names quite nicely, and is a good choice if you're a stickler for metadata. If you're willing to spend a little money, dBpoweramp costs about $40 and works on both Mac and PC. Users like it because it pulls track information from multiple databases simultaneously to ensure the metadata is solid. Windows users also have a plethora of options. Exact Audio Copy is one of the more popular free ripping suites. It cross-references rips against online databases to verify the audio matches the original disc, though it does have a learning curve. I use Express RIP CD Ripper because I paid for a license long ago (around $40). It's fairly plain and works fine on most of the albums I recover from Goodwill. However, it doesn't handle compilations very well. I'm still working through a couple of albums in Mp3tag to get them to display the correct song titles. One major thing to keep in mind if you decide to become CD-obsessed: Always rip to a lossless format like FLAC (Windows) or ALAC (Mac). You can convert a high-quality file into a smaller MP3 later. But you can never expand an MP3 back into the full-fidelity sound of the original CD. 
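That rip-lossless-first habit can be scripted. Here's a minimal sketch (my own illustration, not a tool from the article) that makes MP3 listening copies of your FLAC rips; it assumes ffmpeg is installed and on your PATH, and the "rips" folder name is hypothetical:

```python
import shutil
import subprocess
from pathlib import Path

def flac_to_mp3(folder: str = ".", bitrate: str = "320k") -> list[Path]:
    """Make an MP3 copy of every FLAC rip in `folder`, keeping the
    FLAC as the archival master (never the other way around)."""
    src = Path(folder)
    if not src.is_dir() or shutil.which("ffmpeg") is None:
        return []  # nothing to do without a folder and ffmpeg
    made = []
    for flac in sorted(src.glob("*.flac")):
        mp3 = flac.with_suffix(".mp3")
        # libmp3lame is ffmpeg's MP3 encoder; the FLAC is left untouched
        subprocess.run(
            ["ffmpeg", "-y", "-i", str(flac),
             "-codec:a", "libmp3lame", "-b:a", bitrate, str(mp3)],
            check=True,
        )
        made.append(mp3)
    return made

flac_to_mp3("rips")  # hypothetical folder of FLAC rips
```

The reverse direction is the one to avoid: no script can put back the detail the MP3 encoder threw away.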
How to stream your musical library now that you own it

Now that you own your digital music, you'll likely want to play back those files away from the tethers of your home computer. If you've got a centralized storage solution at home and you want to connect your library to the internet, Plexamp lets you create your own private streaming service. You can log in to it from anywhere in the world with the mobile app and stream the files you own in high resolution. If you don't want to deal with home servers and want to carry your music with you, apps like Prism on iOS and Symfonium on Android are primed for power users. Prism connects to cloud services like Plex and iCloud and features an Apple-worthy interface, while Symfonium supports almost every music format and even lets you cast to other devices. You could also invest in a dedicated MP3 player, either an off-brand one or a revived old favorite, to carry your music with you. Finally, keep the 3-2-1 rule of file ownership in mind when managing your music collection: If the song doesn't exist in three places, it doesn't exist at all. There should be one copy on your main computer or home server, one on an external backup drive, and one in the cloud or physically stored on a bookshelf. Yes, it's maintenance work, but the trade-off is that you own what you listen to. View the full article
-
Daily Search Forum Recap: March 30, 2026
Here is a recap of what happened in the search forums today, through the eyes of the Search Engine Roundtable and other search forums on the web. Google Search Console performance reports may be off again. A teen SEO wants to save his family business...View the full article
-
How to build a custom GPT for business (that your team actually uses)
The OpenAI GPT Store launched in January 2024 with more than 3 million custom GPTs. Ask any team how many they still use, and the answer is usually zero or one. Most business GPTs fail because they’re built like novelties rather than tools. They’re too broad, under-tested, and launched without a strategy, so they never become part of a team’s workflow. I’ve built and audited 12+ custom GPTs across marketing, SEO, and sales teams. The pattern is consistent: a small number get used daily, while most collect dust. Here’s how to build GPTs that do get used — from validating the right use case to structuring, testing, and launching in a way that drives real adoption.

At a glance: The 15-minute version

If you’re ready to jump in, you can start with these steps:

1. Pick one task your team does 3x+ per week that takes 15+ minutes.
2. Complete this sentence: “This GPT helps [role] do [task] by [method].”
3. Write instructions in the Configure tab, not the Create tab.
4. Upload a curated one- to two-page .md knowledge file, not a raw document dump.
5. Add four specific conversation starters. Users who see specific options are significantly more likely to engage than those facing a blank input field. If they can’t immediately see what to do, they leave.
6. Test with five questions before anyone else sees it.
7. Share with three teammates. Watch them use it. Iterate within 48 hours.

Want to see what a well-built business GPT looks like before building your own? Try Marketing Research & Competitive Analysis or MARKETING, both ranked in the GPT Store’s Research & Analysis category. I helped build these at Semrush and will reference them throughout; they demonstrate the build patterns covered below. Need the full framework? Keep reading. 
What a business GPT actually is (and what it isn’t)

A business GPT is a custom version of ChatGPT configured to do one specific, recurring job for a defined role on your team. Not “an AI assistant.” Not “a helpful tool.” One job. Think of it like hiring. A generalist can help with anything. A specialist who does one thing incredibly well is worth 10 times more for that specific task, because they’ve already internalized the context, the standards, and the constraints you’d otherwise have to explain every single time. That’s what a well-built business GPT does. It already knows your brand voice, output format, and when to stop and escalate instead of guessing. Across the GPTs I’ve built and audited, the ones that get used daily are tightly scoped and predictable. The ones that aren’t collect dust. The one-sentence test: If your GPT needs more than one sentence to explain what it does, the use case is still too broad. Narrow it until the answer is obvious. “A GPT that drafts on-brand responses to negative customer reviews using our escalation framework” passes. “A general customer support assistant” doesn’t. The same pattern shows up across the best GPTs in the store. Most are novelties. These aren’t. Each demonstrates a build pattern you can apply.

Marketing Research & Competitive Analysis

Ranked No. 2 in Research & Analysis. Drop in a competitor, an industry, or a business challenge, and you’ll get structured frameworks, SWOT analyses, positioning gaps, and audience breakdowns backed by cited sources. The build pattern worth noting: breadth within a defined domain. Most research GPTs do one thing. 
This one covers the full strategic stack, from competitive analysis to market research to strategic planning, without losing focus because the scope is bounded by “research and analysis” rather than “marketing” broadly.

MARKETING

Ranked No. 4 in Research & Analysis. Covers 14+ disciplines, including paid search, programmatic, out-of-home, influencer, and retail media. The build spans the full media mix rather than specializing in one channel. It’s useful at the planning stage, where most marketing GPTs fall short. It also shows how conversation starters can guide users to high-value use cases immediately, rather than leaving them staring at a blank input field.

Write For Me

Consistently top five globally across all GPT Store categories. This is strongest for blog posts, articles, and long-form content. The build uses front-loaded conversation starters to narrow scope at the session level rather than baking rigid constraints into the instructions. That makes it flexible enough to serve thousands of different users without losing focus.

Data Analyst (by OpenAI)

Upload a CSV and receive charts, summaries, and insights without writing a single line of code. This is the clearest live demonstration of Code Interpreter used well, and it shows what the capabilities toggle actually unlocks in practice. Open it first if you want to convince a skeptical stakeholder.

Automation Consultant by Zapier

Describe a workflow problem in plain English and receive specific Zapier automation recommendations. The business model pattern here is as instructive as the build pattern: a tool-native GPT that generates qualified leads by solving the exact problem its parent product addresses. This is worth studying if you’re thinking about GPTs as a distribution channel, not just a productivity tool.

Canva

Create and edit designs, presentations, and social graphics through conversation. 
Beyond the practical utility, Canva’s GPT is worth studying as a forward-looking example of where the category is heading. It has evolved from a simple GPT integration to a full native ChatGPT app integration, showing what a mature tool-native deployment looks like when a brand commits to the channel properly.

Validate before you build

The biggest waste in GPT development is building something nobody needed badly enough to actually use. Before writing a single line of instructions, score your idea across four dimensions, from low (1 point) to medium (3 points) to high (5 points):

- Frequency: monthly or less (1), a few times a week (3), multiple times daily (5).
- Time cost: under 15 minutes (1), 15-45 minutes (3), 1+ hours each time (5).
- Consistency: not critical (1), moderate (3), mission-critical (5).
- Context required: generic info works (1), some internal data (3), deep internal knowledge (5).

Score interpretation: 16-20 points: Build it this week. 10-15 points: Worth a prototype. Below 10: Skip it. The ROI math won’t justify adoption.

The math is simple. A 45-minute task done five times per week is 16 hours per month. Anthropic’s November 2025 productivity research found that the median AI-assisted task delivered an estimated 84% time savings, with most tasks falling somewhere in the 50-95% range. Even at the conservative end of that range, a well-scoped GPT returns eight to 12 hours per person per month on that one task alone. The St. Louis Fed’s October 2025 survey research backs this up: One-third of workers who use AI tools daily report saving at least four hours every single week. Multiply either number across a team, and the ROI case writes itself.

Tip: Audit your team’s weekly standup notes or Slack threads from the last 30 days. Tasks mentioned repeatedly (especially ones people complain about) are your best GPT candidates. They’re already annoying enough to surface unprompted, which means adoption motivation already exists.

Build it right with the 6-layer framework

Every effective business GPT is built on six layers. 
Skip one, and the output feels half-baked. Add unnecessary complexity to one, and adoption drops.

Layer 1: Use case (one job. Full stop.)

This is the filter every other decision runs through. ❌ A general coding assistant. ✅ A code reviewer that checks React components against our team's style guide. ❌ A marketing helper. ✅ A campaign brief generator that outputs our standard five-section brief format from a single one-line input. If you find yourself adding “and also it should…” more than twice during the build, you need two GPTs, not one bigger one. This is why Marketing Research & Competitive Analysis works. It could easily have tried to write copy, plan campaigns, and do SEO analysis. Instead, it stays in its lane: research and competitive intelligence. That constraint is what makes the output reliable enough to use in real strategy meetings.

Layer 2: Instructions (your most important investment)

Most people underinvest here by an order of magnitude. Your system prompt isn’t a description of what the GPT does. It’s the operating system that controls how it thinks, behaves, and responds. A weak system prompt produces generic, unreliable output. A strong one turns a blank ChatGPT into a domain expert. Go straight to the Configure tab. ChatGPT’s conversational builder (the “Create” tab) is fine for quick setup but gives you almost no control over formatting, behavior rules, or conditional logic. The Configure tab is where you actually build the thing. If you’re already using ChatGPT for SEO workflows, you know how much the quality of your prompts determines the quality of the output. The same principle applies tenfold with system instructions. For a deeper dive on prompt construction for SEO specifically, check out our guide to ChatGPT for SEO.

Structure your instructions in this order:

- Role definition: Who is this GPT? What’s its point of view? What does it know deeply?
- Behavioral guidelines: What should it always do? What should it never do? 
- Output format: How should responses be structured? What’s the ideal length? Tables, bullets, prose?
- Brand voice: What language does your brand use? What language is off-limits?
- Escalation paths: When should it recommend a resource, a tool, or a human instead of answering?

One formatting trick that actually works: For rules that are truly non-negotiable, write them in ALL CAPS. It sounds aggressive in isolation, but it works. The model reads formatting signals. “NEVER recommend a competitor product” lands harder than “try not to mention competitors.” Use it for your three to five most critical behavioral guardrails. Examples: ❌ Write professional emails to clients. ✅ You are a B2B sales rep at a SaaS company. Tone: confident, concise, no buzzwords. NEVER use the word "synergy." Format: Subject line, three short paragraphs, clear single CTA. ALWAYS end with a specific next step, not a vague "let me know." Budget 10-15 hours of system prompt iteration before you call a GPT production-ready. That’s not a typo. Test against normal cases, edge cases, and adversarial inputs — the kinds of things a skeptical user or an off-script question will throw at it.

Layer 3: Knowledge files (what makes it yours)

Without knowledge files, you’ve built a custom-named version of standard ChatGPT. The knowledge layer is what gives your GPT institutional memory: the brand voice, the internal frameworks, the context that doesn’t exist anywhere on the public internet. What to upload:

- Brand voice guides and style examples.
- Internal process docs and frameworks.
- Competitor positioning notes.
- Product one-pagers and FAQs.
- Past high-performing examples of the output you want.

File format matters. Plain text (.txt) and Markdown (.md) outperform PDFs for retrieval accuracy. Never dump a raw 500-page document. The model can’t efficiently parse messy formatting or irrelevant context. 
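As a rough sketch of that curation step (my own illustration; the function and sample text are assumptions, not a tool named here), you can first cut an already-exported Markdown document down to headings plus leading paragraphs, then edit the result by hand into a proper cheat sheet:

```python
def skeleton(markdown_text: str, paras_per_section: int = 1) -> str:
    """Keep each heading plus the first paragraph(s) under it --
    a draft to curate by hand, not a finished knowledge file."""
    out, kept = [], 0
    for block in markdown_text.split("\n\n"):
        block = block.strip()
        if not block:
            continue
        if block.startswith("#"):        # a heading starts a new section
            out.append(block)
            kept = 0
        elif kept < paras_per_section:   # keep only the leading paragraph(s)
            out.append(block)
            kept += 1
    return "\n\n".join(out)

# Demo on an inline snippet; in practice you'd read your exported .md file.
sample = (
    "# Voice\n\nShort and direct.\n\nA second paragraph of detail.\n\n"
    "## Words to avoid\n\nNo buzzwords."
)
print(skeleton(sample))
```

The point isn't automation for its own sake; it's getting a short, clean starting draft so the hand-curation happens on pages, not hundreds of pages.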
The cheat sheet rule: If a source document is longer than 20 pages, use AI to distill it into a focused, five- to 10-page summary specifically for the GPT to reference. Shorter, curated context outperforms raw data dumps every time. The transcript trick most teams miss: If your company has recorded webinars, training videos, or internal demos, those transcripts are ready-made knowledge files. Open the video on YouTube, click “Show transcript,” toggle off timestamps, copy the full text, paste into a Google Doc, and download as .txt. A 45-minute video becomes a high-quality knowledge source in about 10 minutes.

Layer 4: Capabilities (enable what you need. Nothing else.)

There are three built-in toggles: Web Browsing, Code Interpreter, and DALL-E. Don’t enable them all “just in case.” Each one adds surface area for the model to go off-script.

- Web Browsing: enable when the GPT needs live data (prices, news, current URLs); skip when it should draw only from your uploaded knowledge files.
- Code Interpreter: enable when users will upload CSVs, run analysis, or generate charts; skip when the GPT is purely text-based.
- DALL-E: enable when the GPT creates visual assets as part of the workflow; skip when it is analytical or copy-focused.

Code Interpreter is the most underrated of the three. A GPT with it enabled can accept CSV uploads, run analysis, generate charts, and return downloadable files, replacing hours of manual reporting. If any part of your workflow involves structured data, this is worth experimenting with. A note on web browsing: Web-enabled GPTs will confidently pull and present outdated or wrong information. If accuracy is important, disable web browsing entirely and rely only on your curated knowledge files. You control what’s in them. You can’t control what the web returns.

Layer 5: Actions (one integration for V1)

API connections to external systems — CRMs, project management tools, databases, calendars — are where GPTs start to feel like real automation infrastructure rather than fancy chat interfaces. 
For V1, connect exactly one integration. Not five. Scope creep at the actions layer is where GPT projects stall before launch. Pick the single integration that would deliver the most immediate value, typically where the GPT’s output currently has to be manually copied somewhere else.

Layer 6: Evaluation (test before anyone else sees it)

Write five to 10 test questions before you share the link with anyone. Include normal cases, edge cases, and at least two adversarial inputs, the kinds of questions a frustrated user or an off-topic request would generate. ❌ Hello, what can you do? ✅ Here is a furious customer email accusing us of fraud. Draft a response using our de-escalation framework without admitting liability. Test cases should reflect the hardest version of the job, not the easiest. If the GPT can handle the edge cases, the normal cases will be fine.

The most common GPT mistakes (and exactly how to fix them)

1. Scope too broad: it tries to do everything and does nothing well. The fix: one GPT = one job. No exceptions.
2. No example outputs in instructions: the GPT guesses your preferred format. The fix: include one to two “golden” examples of ideal output directly in your system prompt.
3. Raw document dumps: the model can’t parse 500-page PDFs reliably. The fix: curate five- to 10-page Markdown cheat sheets instead.
4. No conversation starters: users stare at a blank prompt field and close the tab. The fix: add four specific starters that showcase different use cases immediately.
5. No evaluation before launch: edge cases surface publicly and erode trust. The fix: write five to 10 test cases before sharing, including adversarial ones.
6. Wrong capabilities enabled: Web Browsing introduces hallucination risk. The fix: enable only what the workflow actually requires.
7. Build and forget: instructions go stale as your business evolves. The fix: revisit instructions monthly and update knowledge files quarterly.

The department playbook: Highest-ROI opportunities by team

Start with the department that complains most about repetitive work. Their pain is your adoption fuel. A GPT that eliminates a universally hated task markets itself through word-of-mouth faster than anything you could announce in a Slack channel.

Marketing

Campaign copy assistant: Input one brief. Receive ad copy, email subjects, and social captions formatted by channel. Upload your brand guidelines as the knowledge file. This replaces 30-45 minutes of copy concepting per campaign. Semrush integration opportunity: Feed in keyword data from Keyword Magic Tool to ensure copy is aligned with how your audience searches.

Competitor messaging analyzer: Paste competitor copy or a landing page URL. Get a structured summary of their positioning, the gaps they’re ignoring, and angles your brand can own. Semrush integration opportunity: Pair with Traffic Analytics data to qualify which competitors are worth analyzing by actual share of voice. If you want to skip the build and get competitive intelligence right now, Marketing Research & Competitive Analysis handles exactly this workflow out of the box. 
Drop in a competitor and get a structured SWOT, positioning gaps, and audience breakdown in a single conversation.

SEO

Content brief generator: This turns a keyword into a structured brief covering audience, search intent, recommended outline, and competitor content gaps. It replaces 30-45 minutes of manual brief writing per piece. At 20 briefs per month, that’s 10 to 15 hours returned to your team. Semrush integration opportunity: Build the brief template around Semrush’s SEO Content Template output. The GPT populates the strategic rationale, Semrush provides the keyword and competitive data.

Technical SEO audit assistant: Paste a page’s content and meta information. Receive a prioritized fix list with title tag rewrites, internal link suggestions, and schema recommendations formatted exactly the way your team tracks them. Semrush integration opportunity: Pull the audit inputs directly from Semrush’s Site Audit exports. If you’re already using ChatGPT for SEO work, our collection of SEO prompts for ChatGPT is a good starting point for building the system instructions for either of these GPTs.

Sales

Prospect research brief: Input a company name. Receive a pre-call brief with recent company news, likely buying signals based on firmographic patterns, and tailored talk tracks for the likely objections. A sales rep I worked with spent 20 minutes per prospect doing this manually before every cold call. The GPT produces the equivalent brief in 90 seconds. That means he spends his actual working hours on the only part that earns commission: the call itself.

Win/loss analyzer: Upload anonymized CRM deal notes. Surface patterns in why deals close or fall apart: which objection categories are fatal, which talk tracks correlate with wins, where in the funnel deals die.

Customer support

Ticket response drafter: Paste a customer ticket. Receive an on-brand draft response using your de-escalation framework. Rep reviews and sends in three minutes instead of 12. 
At 30 tickets per day, that’s 2.5 hours returned to a support rep’s day.

Policy Q&A bot: Upload your HR handbook or policy documentation. This will answer common employee questions instantly, reducing the repetitive Slack messages that eat 30-60 minutes from HR and ops leads per week.

Operations

OKR reviewer: Paste a team’s OKRs and get scores and rewrites. Are the objectives inspiring? Are key results actually measurable? This enforces rigor at scale without requiring a senior leader to manually review every team’s draft.

Meeting structurer: Input a topic and attendee list. Output a tight agenda with pre-reads, decision points, and follow-up templates. For organizations where meeting bloat is a recognized problem, this one tends to spread fast.

How to prevent your GPT from making things up

Hallucination (the model generating confident-sounding incorrect information) is the single most-cited concern from teams considering custom GPTs. It’s a manageable risk if you build correctly. Add an explicit guardrail sentence in your instructions. Something like: “If you do not know the answer from the provided knowledge files, say so directly. Do not invent information. Direct the user to [specific resource] instead.” Simple. Effective. It dramatically reduces the instinct to fill gaps with plausible-sounding fabrication. Disable Web Browsing when accuracy matters. A web-enabled GPT will pull and confidently present outdated, incorrect, or hallucinated source material. If your GPT’s value depends on accuracy, including policy Q&A, compliance guidance, and product specs, turn off Web Browsing entirely and rely only on the knowledge files you’ve curated and can verify. Test for it systematically before launch. Ask your GPT questions you already know the answers to. Ask it something outside its defined scope. Ask an edge-case question that isn’t covered by your knowledge files. 
If it confidently fabricates rather than saying “I don’t know,” fix the instructions before anyone else encounters it. The tighter the scope, the lower the hallucination risk. This is another reason the one-job rule isn’t just about UX. It’s about accuracy. A GPT that knows it’s only supposed to answer questions about your return policy has far less surface area to go off-script than one configured as a general business assistant.

How to launch so your team actually adopts it

Building the GPT is half the job. The failure mode most teams hit isn’t a bad build. It’s a bad launch. A GPT nobody can find is a GPT nobody uses.

Phase 1: Build. Define your one-sentence purpose. Write layered instructions with examples. Upload focused knowledge files. Configure one API action maximum for V1. Resist the urge to expand scope.

Phase 2: Test. Create five to 10 golden test questions. Run a pilot with three to five real users. Don’t send them a link and walk away. Watch them use it, note where they stall, and iterate two to three rounds before wider release. The feedback from watching someone use your GPT for the first time is worth more than any amount of solo testing.

Phase 3: Launch. Write your GPT store or sharing copy around the outcome, not the technology. “Save 45 minutes on every content brief” outperforms “an AI-powered SEO assistant.” Add four conversation starters that showcase different use cases immediately. Users who see specific options to click engage at a significantly higher rate than those staring at a blank input field with no idea where to start.

Phase 4: Promote. Record a two-minute Loom showing a before/after on the specific task the GPT replaces. Share through your team Slack with that before/after story, not a feature list. Create a one-page “prompt pack” with the 10 highest-value starting prompts for your GPT.

The discoverability principle: Pin your GPT in the team Slack channel. Add it to onboarding docs. Demo it at the next all-hands. 
If someone can’t find it and understand what it does in five seconds, they won’t come back after the first session.

Measuring what actually matters

Tracking total conversations is the floor, not the ceiling. Here’s what actually tells you whether your GPT is working:

- Return rate: once is curiosity, twice is value, weekly is a habit. Target: 50%+ returning after first use.
- Conversation depth: turns per session; longer means higher utility. Target: 4+ turns on average for complex tasks.
- Time saved per use: survey users or compare task completion times. Target: a 30-70% reduction vs. manual.
- Team adoption rate: the percentage of target users engaging weekly. Target: 60%+ within 30 days for internal GPTs.
- Downstream action rate: are users taking the next step you wanted? Target: defined per use case.

The ROI one-pager: Hours saved per use × frequency per week × team size × average hourly cost = monthly dollar value. Build this at the 30-day mark. It’s the most powerful artifact you have for justifying continued investment, or making the case for the next GPT.

Where most B2B teams are right now

Organizations fall into one of five stages:

1. Exploring: Team members use ChatGPT ad hoc. No shared GPTs exist.
2. Experimenting: One or two people have built a custom GPT. Usage is informal and person-dependent.
3. Standardizing: Three to five GPTs are deployed with proper instructions, knowledge files, and evaluation criteria. This is where shared value starts to compound.
4. Scaling: GPTs are integrated into defined workflows across departments. Usage is tracked. Iteration is systematic.
5. GPT-Native: GPTs are the default starting point for designing new workflows, not an afterthought.

Most B2B teams are at Level 1 or 2. The biggest ROI jump happens between Level 2 and Level 3. That’s the moment GPTs stop being personal productivity experiments and start becoming team infrastructure. 
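The ROI one-pager formula above is simple enough to sanity-check in a few lines of code (the sample figures are illustrative assumptions, not numbers from this article):

```python
def monthly_roi_dollars(hours_saved_per_use: float,
                        uses_per_week: float,
                        team_size: int,
                        avg_hourly_cost: float,
                        weeks_per_month: float = 4.33) -> float:
    """Hours saved per use x frequency per week x team size x hourly cost,
    scaled from weeks to a month (4.33 weeks/month is an assumption)."""
    return (hours_saved_per_use * uses_per_week * weeks_per_month
            * team_size * avg_hourly_cost)

# Illustrative: a brief-writing GPT saving 30 minutes per use,
# used 5x per week by a team of 6 at a $60/hour loaded cost.
value = monthly_roi_dollars(0.5, 5, 6, 60)
print(f"~${value:,.0f} per month")
```

Drop in your own measured numbers at the 30-day mark; the point is to make the investment case in dollars, not vibes.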
What separates useful GPTs from the rest

A custom GPT is a workflow infrastructure decision. It compounds over time when scoped correctly, and quietly disappears when it isn’t. The teams getting real ROI from them aren’t building the most technically sophisticated versions. They’re building focused ones: scoped to one job, launched with enough intentionality that their team can actually find and use them, and iterated based on real usage data, not assumptions. Start with the task your team complains about most. Score it against the framework. If it hits 12 or above, you have your answer. Build it this week. Run it for 30 days. That’s when it gets interesting.

Ready to build your GPT? Start with a blueprint

The GPT Blueprint Generator on Thinklet walks you through the validation framework above, generates a custom system prompt for your specific use case, and outputs a ready-to-paste knowledge file, all in one session. It’s built specifically as the hands-on companion to this guide. Or, if you want to see what a well-built GPT feels like before you commit to building one, start here: Try Marketing Research & Competitive Analysis. Competitive intelligence, market research, and strategic planning in one conversation. Try MARKETING. Full-stack marketing strategy across 14+ channels. Describe a campaign goal and get a channel plan, KPIs, and creative direction. View the full article
-
Trump delivers a new threat to Iran’s vital infrastructure if a ceasefire isn’t reached ‘shortly’
U.S. President Donald Trump on Monday threatened widespread destruction of Iran’s energy resources and other vital infrastructure, including desalination plants, if a deal to end the war with Tehran is not reached “shortly.” On the ground, the conflict showed no sign of letting up: Tehran struck a key water and electrical plant in Kuwait, and an oil refinery in Israel came under attack. Israel and the U.S. launched a new wave of strikes on Iran. Trump’s new threat in a social media post and earlier comments in an interview with the Financial Times that suggested American troops could seize the country’s Kharg Island oil export hub highlight a frequent tactic. On the one hand, he has repeatedly said that talks with Iran are going well — though Tehran denies negotiating directly. On the other hand, he has continually ramped up his threats, as thousands more Marines and other U.S. troops pour into the Middle East. It remains unclear where the diplomatic effort facilitated by Pakistan stands. Iran’s attacks on its Gulf neighbors could add another element of uncertainty to any talks. The United Arab Emirates — which has long billed itself as a beacon of safety and stability in a volatile region — has been hard hit in the war, and increasingly is signaling it wants Iran disarmed in any ceasefire. Iran’s theocracy likely won’t accept that.

Trump says diplomacy is going well but suggests military expansion is possible

In a social media post, Trump said “great progress is being made” in talks with Iran to end military operations. But he bristled that if a deal is not reached “shortly” and if the Strait of Hormuz is not immediately reopened, the U.S. would broaden its offensive by “completely obliterating” power plants, oil wells, Kharg Island and possibly even targeting desalination plants that supply drinking water. The strait is a crucial waterway through which a fifth of the world’s oil is shipped in peacetime. 
In the interview with the FT, Trump said his preference would be to “take the oil in Iran” — a move that would require seizing Kharg Island, the terminal through which nearly all of Iran’s oil exports pass. “Maybe we take Kharg Island, maybe we don’t,” he continued. “We have a lot of options.” The U.S. already has targeted military positions on Kharg once. Iran has threatened to launch its own ground invasion of Gulf Arab countries and mine the Persian Gulf if U.S. troops land on its territory. Iranian Foreign Ministry spokesman Esmail Baghaei on Monday acknowledged Tehran had been given a 15-point proposal from the Trump administration, but said there had been no direct negotiations with Washington so far. Earlier, Iran’s parliament speaker, Mohammad Bagher Qalibaf, dismissed the talks in Pakistan as a cover to get more U.S. troops into the region. He said Iranian forces were “waiting for the arrival of American troops on the ground to set them on fire and punish their regional partners forever,” according to state media. Twice during Trump’s second term, the U.S. has attacked Iran during high-level diplomatic talks, including with the Feb. 28 strikes that started the current war.
Iran launches attacks on Israel and hits more infrastructure in Gulf states
Sirens sounded at dawn near Israel’s main nuclear research center, a part of the country that has been targeted repeatedly in recent days. Israel’s military also said it had taken out two drones launched from Yemen, where the Iranian-backed Houthi rebels entered the war on Saturday with their first missile attack. Later, a fire broke out at an oil refinery in the northern city of Haifa, either because of a missile strike or debris falling from an interception. The blaze was quickly extinguished.
Iran kept up the pressure on its Gulf Arab neighbors, as Saudi Arabia intercepted five missiles targeting its oil-rich Eastern province, Bahrain sounded a missile alert, and a fireball erupted over Dubai, United Arab Emirates, as a missile was intercepted by defenses. In Kuwait, an Iranian attack hit a power and desalination plant, killing one worker and wounding 10 soldiers, the state-run KUNA news agency reported. Amid Iran’s continuing campaign against its Gulf neighbors, an Emirati government official offered another signal Monday that the United Arab Emirates wants more than just a ceasefire. “We want a normal neighbor,” Noura Al Kaabi, a minister of state at the UAE’s Foreign Ministry, wrote in a column published by the state-linked, English-language newspaper The National. “An Iranian regime that launches ballistic missiles at homes, weaponizes global trade and supports proxies is no longer an acceptable feature of the regional landscape.” She added: “We want a guarantee that this will never happen again.” Israel’s military, meanwhile, launched a new wave of attacks on Iran, saying it was striking “military infrastructure” across Tehran. Explosions were heard in the Iranian capital and Iranian state media reported a petrochemicals plant in Tabriz, in the north, sustained damage after an airstrike. Iran confirmed on Monday that the head of the Revolutionary Guard’s navy, Rear Adm. Alireza Tangsiri, had been killed in an Israeli airstrike, as Israel claimed last week. In Lebanon, which Israel has invaded, an Indonesian peacekeeper was killed and three others were wounded when a projectile exploded near a village in the south. Over the weekend, Israeli Prime Minister Benjamin Netanyahu said the military would widen its invasion, expanding the “existing security strip” in that country’s south as it targets the Iran-linked Hezbollah militant group. In Iran, authorities say more than 1,900 people have been killed, while 19 have been reported dead in Israel. 
Two dozen people have been killed in Gulf states and the occupied West Bank. In Lebanon, officials said more than 1,200 people have been killed, and more than 1 million have been displaced. Six Israeli soldiers have died in Lebanon, while 13 U.S. service members have been killed in the war.
Oil prices rise again as concerns of global energy crisis grow
Iran’s attacks on the energy infrastructure of the region and its stranglehold on the Strait of Hormuz have threatened global supplies of oil, natural gas and fertilizer. They have sent fuel prices skyrocketing and given rise to growing concerns about an energy crisis. Trump has said that Iran had agreed to allow 20 oil tankers through the Strait of Hormuz starting Monday as “a sign of respect.” There wasn’t any information on whether those ships were actually moving Monday. Brent crude oil, the international standard, was trading around $115 Monday, up nearly 60% from when the war started.
Rising reported from Bangkok, Corder from The Hague, Netherlands. Associated Press writers Darlene Superville aboard Air Force One, Melanie Lidman in Tel Aviv, Israel, Giovanna Dell’Orto in Miami and Munir Ahmed in Islamabad contributed to this story. —Jon Gambrell, David Rising and Mike Corder, Associated Press
-
BP loses head of EV charging as it accelerates pivot back to oil and gas
Martin Thomsen’s departure comes ahead of the arrival of the oil major’s new chief executive this week.
-
Treasury curve steepens after weeks of flattening
Yields rebounded after the 30-year held near 5%, but the yield curve's flattening trend likely isn't over, according to the head of correspondent business development at AD Mortgage.
-
10 Key Elements of an Effective Customer Care Business Plan
When developing a customer care business plan, it’s vital to focus on ten key elements that can greatly impact your success. By setting SMART goals and segmenting your customers, you can tailor your strategies effectively. Enhancing communication methods and integrating technology can streamline operations. Furthermore, collaboration across departments and measuring customer success metrics are important for continuous improvement. Understanding these components will lay the groundwork for a robust customer care approach, ensuring you meet evolving customer needs. What comes next?
Key Takeaways
- Establish SMART goals centered around customer needs to align with business objectives and improve customer satisfaction metrics.
- Utilize customer segmentation to develop targeted strategies addressing unique pain points and preferences for tailored support.
- Implement technology solutions like CRM software to enhance efficiency and track customer interactions effectively.
- Foster cross-department collaboration to ensure consistent messaging and a seamless customer experience across all touchpoints.
- Regularly measure customer success metrics, such as NPS and CSAT, to drive continuous improvement in service quality.
Build the Foundation
To build a strong foundation for your customer care business plan, you need to establish clear and measurable goals that align with your overall business objectives. Utilize the SMART criteria to define these goals, ensuring they’re specific, measurable, achievable, relevant, and time-bound. This approach helps create a solid foundation for your customer success strategy, guiding your team’s efforts effectively. Focus on customer-centric metrics, like increasing your Net Promoter Score (NPS) by a certain percentage within a specified period. This drives customer loyalty and satisfaction. Moreover, encourage a shared understanding among your team members about the importance of these goals, which promotes a unified approach to achieving success.
Regularly review and adjust your goals to respond to changing customer needs and market conditions, keeping your retail shop business plan template pertinent and effective. This adaptability is essential for maintaining high standards in your customer care business plan and overall retail business plan. Understand and Segment Customers Grasping and segmenting your customers is essential for tailoring your support and services to meet their specific needs. Recognizing customer demographics and behaviors allows you to create targeted strategies that address unique pain points for each segment. For instance, using a retail store business plan template can help you outline how to gather and analyze customer data effectively. Regularly collecting feedback through surveys and reviews provides insights that inform your segmentation, ensuring your strategies remain relevant. In your retail business plan example, emphasize the importance of utilizing data analytics to identify customer preferences and trends. This enables you to anticipate the specific needs of different groups, driving customer satisfaction and loyalty. Effective segmentation can lead to increased retention rates, eventually contributing to your business’s overall success. By focusing on these elements, you set the foundation for a more engaged and loyal customer base. Enhance Customer Interactions Improving customer interactions is crucial for cultivating loyalty and boosting overall satisfaction, especially when you actively anticipate and meet customer needs. Proactive engagement can increase retention by 23%, underscoring the importance of addressing issues before they arise. Customized interactions, like personalized emails and product recommendations, make customers feel valued and understood, raising their satisfaction. 
By utilizing data analytics, you can identify customer behavior patterns, allowing you to create targeted strategies that effectively address unique pain points for different segments. Implementing a CRM system streamlines these interactions, ensuring all team members have access to pertinent customer data to support personalized service. Additionally, regularly measuring engagement indicators helps you identify at-risk customers, enabling timely outreach to strengthen relationships and mitigate churn. Speed Things up With Technology and Tools As businesses endeavor to improve their customer service, leveraging technology and tools becomes essential for boosting efficiency and responsiveness. Implementing Customer Relationship Management (CRM) software allows you to track customer interactions and manage data efficiently, improving team collaboration and customer insight. In addition, utilizing customer success platforms like Custify can automate workflows and monitor customer health metrics, streamlining your support process and enhancing response times. Data analytics tools, such as Tableau, provide valuable insights into customer behavior and trends, enabling you to tailor your strategies effectively. Feedback collection tools, like Canny, facilitate gathering customer opinions and suggestions, driving product improvements and service upgrades. Moreover, communication tools such as Intercom guarantee timely and effective customer support, addressing inquiries quickly, which can increase customer satisfaction by up to 23%. By integrating these technologies, you can greatly improve your customer care operations and build lasting relationships. Collaborate With Other Departments Collaborating with other departments is crucial for creating a seamless customer experience. By aligning messaging across teams, you guarantee that customers receive consistent information, which builds trust and reliability. 
Furthermore, sharing insights and data helps identify customer needs and opportunities, allowing your organization to respond effectively and improve overall satisfaction. Align Messaging Across Teams Effective communication across teams is crucial for delivering a consistent customer experience. By collaborating with sales teams, you can gain valuable insights into customer needs and preferences, leading to better-targeted customer care strategies and upselling opportunities. Aligning messaging between marketing and customer support guarantees a unified brand voice, which helps maintain customer trust and consistency in communication. Engaging with the product team allows you to prioritize customer feedback on feature requests and bug fixes, aligning product offerings with customer expectations. Regular coordination between customer service and support teams streamlines processes, creating an all-encompassing knowledge base. This alignment across departments promotes a cohesive approach to customer success, eventually improving customer satisfaction and loyalty metrics. Share Insights and Data Aligning messaging across teams not merely improves communication but furthermore sets the stage for sharing insights and data effectively. Collaborating with departments like sales and marketing gives you valuable insights into customer needs, leading to improved customer care strategies. Engaging the product team allows prioritization of feature requests based on real feedback, whereas coordination with the support team creates an all-encompassing knowledge base. Regular interdepartmental meetings encourage alignment on objectives, ensuring all teams work in the direction of improving customer satisfaction. 
Insights provided by each department:
- Sales: Trends in customer purchasing behavior
- Marketing: Customer engagement data and campaign feedback
- Product: Feature requests based on customer feedback
- Support: Common issues faced by customers during interactions
Document Your Processes
Documenting your processes is vital for maintaining consistency in service delivery, which directly impacts the customer experience. By having clear procedures in place, you not just streamline training and onboarding for new team members but furthermore improve overall efficiency within your organization. Moreover, keeping this documentation updated allows for ongoing process improvements, ensuring your strategies adapt to evolving customer needs and market demands.
Consistency in Service Delivery
Creating a consistent customer service experience hinges on clear and thorough documentation of your processes. Documenting these processes guarantees your team provides a uniform experience, reducing variability and enhancing customer satisfaction at every touchpoint. With clear documentation, new team members can quickly grasp their roles and responsibilities, allowing for efficient onboarding. Establishing a set of customer service workflows promotes accountability, clarifying who’s responsible for each aspect of customer interactions. Standardized processes not only streamline operations but additionally enable your team to respond to inquiries more quickly and accurately. By regularly updating documentation based on feedback and customer interactions, you guarantee your processes remain relevant, adapting to evolving expectations and service standards.
Training and Onboarding Efficiency
Efficient training and onboarding are crucial for a successful customer care team, as they directly impact service quality and employee performance. Documenting your processes improves onboarding efficiency, offering new team members a consistent reference point for handling inquiries.
Well-documented workflows can cut training time by up to 25%, allowing quicker adaptation to roles. Standardized procedures in a customer care playbook guarantee uniform experiences for customers. Regular updates prevent knowledge gaps, equipping new hires with the latest best practices. A centralized knowledge base makes information easily accessible, boosting response times and service quality. In summary:
- Clear documentation: reduces training time by up to 25%, leading to faster onboarding.
- Standardized protocols: guarantee consistency, producing a uniform customer experience.
- Regular updates: keep information current, resulting in informed employees.
Process Improvement Documentation
When processes are clearly documented, it guarantees consistency in customer experiences and minimizes errors, which in the end improves the overall quality of service provided by your customer care team. Documenting workflows not only assists in onboarding new team members effectively but also boosts overall efficiency. Here are some key benefits of process improvement documentation:
- It provides a clear reference point for team members to understand their roles and responsibilities.
- Regular updates facilitate easier adjustments to strategies, helping you remain agile in response to changing customer expectations.
- Accountability is improved, ensuring all team members are aligned in their customer care efforts.
Measure Customer Success Metrics
Measuring customer success metrics is essential for grasping how well your business meets customer needs and expectations. Key metrics like Net Promoter Score (NPS), Customer Satisfaction Score (CSAT), and Customer Lifetime Value (CLV) provide quantitative insights into customer perceptions and engagement. Regularly tracking NPS, which ranges from -100 to 100, allows you to identify areas for improvement and assess overall sentiment regarding your service.
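As a concrete illustration of how these metrics are typically computed, here is a minimal Python sketch. The function names and example data are my own, not from the article: NPS is conventionally the percentage of promoters (ratings of 9-10 on a 0-10 "how likely to recommend" scale) minus the percentage of detractors (0-6), CSAT is commonly reported as the share of top-two-box responses on a five-point scale, and CLV is shown in one simple formulation.

```python
def nps(scores):
    """Net Promoter Score from 0-10 recommendation ratings.

    Promoters score 9-10, detractors 0-6; NPS is the percentage of
    promoters minus the percentage of detractors (range -100 to 100).
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)


def csat(ratings, scale_max=5):
    """Customer Satisfaction Score: share of 'satisfied' responses
    (top two boxes on the scale), expressed as a percentage."""
    if not ratings:
        raise ValueError("no responses")
    satisfied = sum(1 for r in ratings if r >= scale_max - 1)
    return 100.0 * satisfied / len(ratings)


def clv(avg_order_value, orders_per_year, years_retained):
    """Simple Customer Lifetime Value estimate (one common formulation:
    average order value x purchase frequency x retention period)."""
    return avg_order_value * orders_per_year * years_retained


print(nps([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 detractors of 6 -> 0.0
print(csat([5, 4, 3, 5]))        # 3 of 4 satisfied -> 75.0
print(clv(50.0, 4, 3))           # -> 600.0
```

Tracking these three numbers over time, rather than as one-off snapshots, is what makes them useful for the benchmarking the article describes.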
CSAT, often gathered through post-interaction surveys, helps you evaluate satisfaction immediately after a service experience, enabling timely feedback for adjustments. Analyzing CLV gives you insight into the long-term value of acquiring and retaining customers, informing your marketing and customer service strategies. Establishing a routine for measuring these metrics encourages a culture of continuous improvement, ensuring your business remains responsive to evolving customer needs and expectations.
Continuously Improve and Adapt
To continuously improve customer care, you need to focus on ongoing data analysis and agile process adjustments. Regularly evaluating metrics like customer satisfaction helps identify areas for improvement, while adapting quickly to feedback ensures your services meet evolving customer needs.
Ongoing Data Analysis
Ongoing data analysis serves as a crucial element in refining customer care strategies, helping businesses stay responsive to their customers’ needs. By regularly tracking key performance indicators (KPIs) like customer satisfaction scores (CSAT), Net Promoter Score (NPS), and customer lifetime value (CLV), you can effectively assess your initiatives. Utilizing data analytics tools allows you to identify trends, enabling proactive adjustments. Key benefits include:
- Improved understanding of customer behavior and pain points
- Creation of feedback loops for real-time service adaptations
- Enhanced customer retention rates through proactive engagement
Regular reviews empower your team to make informed decisions, ensuring alignment with customer expectations and facilitating timely adjustments to service delivery methods. This ongoing analysis cultivates a culture of continuous improvement.
Agile Process Adjustments
Agile process adjustments are essential for businesses aiming to improve customer care by adapting swiftly to shifts in customer preferences and market dynamics.
Implementing regular feedback loops lets you adapt your strategies based on real-time insights, nurturing a culture of continuous improvement. Analyzing customer success metrics, like Net Promoter Score (NPS) and Customer Satisfaction Score (CSAT), provides actionable data to pinpoint areas needing improvement. Agile methodologies promote quick iterations, enabling your team to effectively respond to changing expectations. Regularly reviewing customer experience maps guarantees alignment with customer needs at every touchpoint. Moreover, utilizing data analytics tools like Tableau helps you uncover trends in customer behavior, supporting informed decisions for necessary process improvements.
Effective Communication Strategies
Effective communication strategies play a crucial role in enhancing customer care and satisfaction. By focusing on clear and empathetic interactions, you can greatly improve customer experiences and nurture loyalty. Consider these key elements:
- Active Listening: Truly understand customer concerns, resolving up to 90% of issues simply by acknowledging their emotions.
- Clear Language: Use concise communication to minimize misunderstandings, potentially boosting satisfaction scores by 25%.
- Proactive Outreach: Contact customers before they reach out for help, reducing the likelihood of minor issues escalating, which can improve retention rates by 23%.
Additionally, maintaining consistent communication across all channels, such as phone, email, and chat, guarantees uniform service quality. This consistency is crucial for upholding your brand’s reputation and enhancing customer loyalty. Implementing these strategies can lead to a more effective customer care approach, eventually benefiting your business in the long run.
Training and Empowerment of Customer Service Representatives
Training customer service representatives is vital for guaranteeing they possess the skills needed to handle a variety of customer inquiries effectively.
Your training programs should focus on fundamental skills, including product knowledge, communication techniques, and empathy. This equips you to address diverse customer needs. Empowering you to make decisions during interactions can greatly improve customer satisfaction, as trust leads to better service and quicker issue resolution. Prioritizing ongoing training guarantees you remain adaptable to changing customer expectations and industry trends. Incorporating role-playing scenarios during training improves your problem-solving abilities, preparing you for real-life situations and boosting your confidence. Moreover, providing access to updated resources is important, as it allows you to deliver accurate responses and maintain consistency across all customer interactions. Frequently Asked Questions What Are the 7 Key Elements of Customer Service? The seven key elements of customer service include clear communication, responsiveness, empathy, knowledge, consistency, personalization, and feedback. Clear communication guarantees customers understand your offerings. Responsiveness means addressing inquiries swiftly. Empathy helps in connecting with customers emotionally. Knowledge guarantees your team can provide accurate information. Consistency builds trust, whereas personalization tailors experiences to individual needs. Finally, seeking feedback helps you identify areas for improvement, guaranteeing your service evolves to better meet customer expectations. What Are the 5 Pillars of Customer Success? The five pillars of customer success are vital for nurturing long-term relationships and satisfaction. First, you need to understand customer needs by researching their preferences and pain points. Next, set clear, SMART goals that align with your business objectives. Effective communication is critical, so practice active listening and empathy. 
Leverage technology through CRM systems to improve engagement, and finally, promote continuous improvement to adapt your strategies and maintain customer loyalty. What Are the 7 Key Components of a Business Plan? To create a thorough business plan, you need seven key components. Start with an executive summary that captures your mission and vision. Next, conduct a market analysis to identify your target audience and competition. Outline your organizational structure to clarify roles. Develop a marketing strategy to attract customers, and include financial projections like income statements. Finally, assess risks and define your funding requirements to guarantee a clear path to success. What Are the 4 C’s of Customer Care? The 4 C’s of customer care are Care, Communication, Consistency, and Convenience. Care involves showing empathy and comprehension regarding customers, which builds trust. Communication guarantees you engage in clear, transparent dialogue, addressing customer needs effectively. Consistency assures that customers receive the same high-quality service across all channels, reinforcing reliability. Finally, Convenience focuses on making interactions easy and accessible, utilizing technology to streamline processes and improve the overall customer experience. Conclusion In summary, implementing these ten key elements in your customer care business plan can greatly improve customer satisfaction and drive growth. By establishing SMART goals, segmenting your customer base, and utilizing technology, you streamline operations and improve interactions. Collaboration across departments, effective communication strategies, and thorough training for your team guarantee consistency and efficiency. Regularly measuring customer success metrics and adapting your approach will keep you responsive to market changes, ultimately nurturing loyalty and long-term success. 
Image via Google Gemini and ArtSmart. This article, "10 Key Elements of an Effective Customer Care Business Plan," was first published on Small Business Trends.
-
Why You Need an MP3 Player in 2026 (and What to Buy)
We may earn a commission from links on this page. One of the easiest ways to turn off the noise of the world is to outsource what the phone does to other devices—ones that aren't perpetually online and don't require a subscription to function. This led me to rediscover an old favorite: the MP3 player. MP3 players are now, not surprisingly, far more capable than they were two decades ago. They come in a variety of shapes and sizes, many of which emulate the design of favorite legacy MP3 players, and are manufactured by companies you've likely never heard of, like FiiO and AGPTEK. They come with support for the latest music codecs, Bluetooth, FM radio, built-in speakers, and even features that smartphone manufacturers have long since dropped, like expansion slots and the standard headphone jack. But if you're really nostalgic for the old days, you can recommission a vintage Apple iPod or a Microsoft Zune. There are even creative modders who will customize a vintage one for you. Whatever way you decide to participate, embracing the standalone MP3 player means joining a movement in defiance of the current trend of streaming everything. Here's how I got started with an MP3 player, and where to start if you're interested in reviving your own for modern-day use.
These MP3 players are a turnkey solution
Let's be realistic: MP3 players are no longer the de facto gadget category they once were. You can't find them at brick-and-mortar stores or online the way you used to. But you can find some reasonably priced ones on Amazon and other global gadget markets. The most common and well-rated brand you'll see pop up on Amazon is AGPTEK. The company is based in Brooklyn and sells nearly everything in the gadget spectrum. It's one of those "parts bin" brands, which have access to reliable, off-the-shelf components to make reasonably priced gadgets without the high cost.
AGPTEK A02 MP3 Player with Bluetooth 5.3, 1.8-inch screen, built-in speaker, FM radio, voice recorder, and storage expandable up to 128GB (64GB model, $36.99 at Amazon). AGPTEK offers several music players, but its most popular is the A02. It became a legend for its 70-hour battery life, and it's the one I brought home because of its impressive feature set. The AGPTEK A02 supports the latest Bluetooth 5.3 standard. I can move around the house with the player in my pocket or leave it on my desk, and I still have clear sound on the other side of the house. It uses USB-C for charging and data transfer, and there's a microSD slot that supports up to 128GB of storage if you need more than its 64GB internal storage. There's a physical lock switch to keep you from accidentally skipping a track, too. The Innioasis Y1. Credit: Innioasis/Amazon. Innioasis is another brand you'll see floating around—they're known for their iPod knockoffs. They sell touchscreen MP3 players, too, but it's the "iPod tribute" ones that get the most traction. The most popular model is the Y1, which looks like an iPod Classic. The $50 device comes with USB-C and Bluetooth, rather than relying on a 30-pin proprietary charging cable like back in the day. You don't need special software to drag and drop music onto the device. Most models come with 128GB of internal storage. Innioasis's touch-screen models are hybrid digital music players: They have wifi access so that you can download Spotify playlists and Audible audiobooks. They can't access the internet away from wifi, however, so if you want new music and you're out on a run, you'll have to wait until you're back in range.
These high-end options are great for audiophiles
The Sony Walkman NW-A306 is a premium digital music player.
It has high-end internals like Sony's proprietary S-Master HX digital amplifier, which reduces distortion and noise across different frequencies, and high-quality film capacitors that make audio sound clearer than a phone's standard DAC (digital-to-analog converter). These are usually parts reserved for Sony's high-end home stereo systems, but here they're shrunk into a smaller footprint. For those on a tighter budget who still want a full-featured MP3 player, the brand FiiO is the current darling of the audiophile community. The JM21 player is slim and runs a Qualcomm Snapdragon 680 chip, which you'd typically find in an Android phone. It has dual Cirrus Logic CS43198 DAC chips, which means it delivers music with fully balanced audio, the same way Sony's music players do. It also features both a standard 3.5mm and a 4.4mm headphone jack if you have high-end studio cans you'd rather listen through. JadeAudio/FiiO JM21 Android 13 Hi-Res Music Player with Snapdragon 680, Bluetooth, wifi, USB DAC, 12.5-hour playtime, and support for up to 2TB of storage ($179.99 at Amazon, down from $189.99). Deciding between these two depends on whether you prioritize usability or power. Sony's players, like the NW-A306 and the slightly larger NW-ZX707, are small and offer up to 36 hours of battery life. They also feature a Sony-developed AI-powered tool called DSEE Ultimate, which upscales low-quality audio to sound more high-resolution. That said, Sony users have complained that the device can feel sluggish compared to even a modern midrange Android device. It's not exactly high-powered. The FiiO is an easier sell because it just got refreshed. The JM21 has a 700 mW output, so it won't start crackling as your headphones get louder. Users prefer the interface to that of the Sony MP3 players, too. They've also mentioned that the soundstage, or the perceived "distance" between instruments, feels much wider.
However, the MP3 player is a bit of a "brick" compared to Sony's lighter builds. The battery life is only 12-15 hours, and it doesn't have Sony's AI-assisted audio processing, so if your audio files are bad, you'll hear it all. Should you resurrect your iPod? If you aren't interested in buying something new, you might find yourself drawn to the do-it-yourself world of MP3 player revival. You take a twenty-year-old device, gut it, replace its parts with modern components, and then you have yourself Frankenstein's digital music player for the new age. Right now, iPods are the ones getting the most modding. The fragile, spinning mechanical hard drive of the original iPod Classic is typically the first thing to fail, so modders use iFlash adapters to load music from microSD cards instead. By removing that hard drive, you also make space for a battery, which can add more life to the iPod—some modders figured out how to run it for over 100 hours of continuous play. If you do decide to go this route, there are kits you can buy, like the Classic Connect 2, which lets you add Bluetooth, USB-C charging, and wireless charging to the classic iPod shell. Even if you have an older iPod lying around that's still working, you can break free from the ghost of iTunes by installing Rockbox. This open-source firmware, which finally landed a stable release in 2025, enables vintage iPods to play lossless FLAC files, add custom themes, and other shenanigans that you wouldn't consider doing to an mp3 player, but you can. People are also reviving Microsoft's original attempt at the "iPod Killer": the Zune. The Zune 30 is particularly popular for these modding projects because its internal layout is spacious. Modders can swap the original hard drive for a larger-capacity solid-state drive and drop in a bigger battery. And yes, they're bringing wireless charging to the Zune through aftermarket hacks, too. I am personally on a journey to attempt to revive a Sony Walkman NW-HD5. 
It has its own community of modders who have figured out how to replace the 1.8-inch drive with flash storage. That model also had a user-replaceable battery from the get-go, and you can find higher-capacity batteries on places like AliExpress to fully revive it. Rockbox also works with the Sony NW-HD5. It's a tinkering hobby that will keep me busy for a while. View the full article
-
Creating an Effective Software Satisfaction Survey
Creating an effective software satisfaction survey is vital for understanding user experiences and improving your product. By crafting clear, targeted questions, you can gather valuable insights on key features. It's also important to distribute the survey promptly after user interactions and to personalize invitations to boost responses. Anonymity can encourage honest feedback, which leads to actionable results. As you explore the components of a successful survey, consider how each element contributes to a thorough understanding of user satisfaction.

Key Takeaways

Incorporate diverse question types, such as rating scales and open-ended questions, to capture comprehensive user feedback.
Ensure questions are clear and concise to minimize confusion and encourage honest responses.
Distribute surveys soon after key user interactions to enhance the accuracy of feedback.
Utilize multiple channels for distribution, including emails and in-app notifications, to reach a wider audience.
Analyze results collaboratively, prioritizing impactful suggestions and measuring the effectiveness of any changes implemented.

Understanding the Importance of Software Satisfaction Surveys

Understanding software satisfaction surveys is fundamental for any organization looking to improve its products and services. These surveys provide insights into user perceptions, allowing you to measure happiness and loyalty toward your software. By incorporating well-structured survey questions, you can capture direct feedback on usability, functionality, and overall user experience. A well-designed software survey not only highlights areas needing improvement but also tracks changes in user sentiment over time. This ongoing assessment helps benchmark progress and evaluate the impact of software updates. Furthermore, collecting user feedback demonstrates your commitment to valuing and responding to users' needs, which can strengthen customer relationships.
Implementing changes based on insights from your software satisfaction survey can greatly boost user satisfaction, leading to higher customer retention rates and reduced churn. Incorporating these surveys into your strategy is therefore vital for long-term success.

Key Components of an Effective Survey

To gather meaningful insights from your software satisfaction survey, include several key components that improve the effectiveness of your questions. Here are five essential elements to consider:

Variety of Question Types: Mix rating scales, multiple-choice, and open-ended questions to capture diverse user insights.
Clarity and Conciseness: Ensure each question is straightforward, allowing respondents to easily understand it and provide accurate feedback.
Targeted Relevance: Focus on specific aspects of the software experience to gather actionable insights that align with user needs.
Timing of Distribution: Send the survey shortly after significant software interactions to yield more accurate responses.
Anonymity Assurance: Communicate the survey's purpose and make sure users feel safe providing honest feedback without fear of repercussions.

Crafting Thoughtful Questions for User Feedback

How can you ensure your software satisfaction survey effectively captures user feedback? Start by crafting clear and concise questions that respondents can easily understand, reducing any ambiguity in their answers. Incorporate a mix of question types—rating scales, multiple-choice, and open-ended questions—so you can gather thorough insights into user experiences and preferences. Focus your questions on specific software features and user interactions, targeting key areas like usability, functionality, and support to ensure you collect actionable feedback. Use unbiased, neutral phrasing to avoid leading respondents in specific directions, so the data accurately reflects genuine user sentiment.
Finally, keep your survey length between 8 and 12 questions, making it quick and manageable for users. This approach improves response rates and encourages more participants to complete the survey, ultimately providing you with valuable insights for enhancing your software.

Best Practices for Survey Distribution and Engagement

Gathering user feedback through surveys is only as effective as the methods you use to distribute them. To maximize your reach and engagement, consider these best practices when distributing your software satisfaction survey:

Use a mix of distribution channels, like in-app notifications, emails, and social media.
Time your surveys right; send them shortly after significant user interactions or milestones.
Personalize survey invitations based on user behavior and demographics to increase relevance.
Offer incentives, such as discounts or prize draws, to motivate participation.
Monitor response rates and adjust your approach based on user feedback patterns.

Analyzing and Acting on Survey Results

Analyzing survey results reveals crucial insights into user satisfaction and potential areas for improvement within your software. Aggregate the data to identify trends and patterns that pinpoint critical feedback. Sharing these findings with relevant stakeholders is essential, as it encourages accountability and collaborative action on identified issues.

Identify Trends: Look for recurring themes in user feedback.
Share Findings: Communicate results with teams to promote action.
Prioritize Feedback: Focus on the most impactful suggestions first.
Measure Impact: Assess the effects of implemented changes.

Frequently Asked Questions

How to Create a Satisfaction Survey?

To create a satisfaction survey, first define your main objectives, like gauging overall satisfaction or pinpointing improvement areas. Next, choose a variety of question types, such as rating scales and open-ended queries, to gather thorough feedback.
Make sure your questions are clear and concise, avoiding jargon. Finally, distribute the survey at opportune moments, like after purchases, and analyze the results to identify patterns that can inform actionable changes for enhancing user satisfaction.

What Is the 5-Point Scale for a Customer Satisfaction Survey?

The 5-point scale for customer satisfaction surveys ranges from 1 (Very Dissatisfied) to 5 (Very Satisfied). This scale lets you quantify customer sentiment effectively, making it easier to analyze overall satisfaction trends. A score of 4 or 5 typically indicates a positive experience, suggesting potential customer loyalty.

What Are 5 Good Survey Questions?

To gather meaningful insights, consider these five survey questions. First: "On a scale of 1 to 10, how satisfied are you with the overall experience?" Next: "How likely are you to recommend our service to others?" Then ask about valuable features: "Which features do you find most useful?" Follow this with a value assessment: "Do you believe the service is worth its price?" Finally, ask for suggestions: "What improvements would you recommend?"

What Are the 7 Steps to Creating a Good Survey?

To create a good survey, start by defining your objectives to guide your questions. Next, craft clear and concise questions using straightforward language. Incorporate a variety of question types—like multiple-choice, rating scales, and open-ended—to gather diverse insights. Test your survey with a small group to identify any issues and make necessary adjustments. Finally, analyze the data thoroughly, focusing on actionable insights that can lead to meaningful improvements based on user feedback.

Conclusion

In summary, an effective software satisfaction survey is essential for understanding user experiences and improving your product. By focusing on key components, crafting thoughtful questions, and employing best practices for distribution, you can gather valuable feedback.
Analyzing the results allows you to identify areas for improvement, ultimately enhancing user satisfaction. Remember, engaging users through personalized invitations and maintaining anonymity can greatly impact response rates and the quality of the insights you receive. Image via Google Gemini This article, "Creating an Effective Software Satisfaction Survey" was first published on Small Business Trends View the full article
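The FAQ above describes the 5-point scale and the convention that scores of 4 or 5 count as a positive experience. A minimal sketch of aggregating such responses to identify trends (function and field names here are illustrative, not taken from any particular survey tool):

```python
# Aggregate 1-5 satisfaction ratings: distribution, mean, and the share of
# "positive" responses (4 or 5, per the convention described in the article).
from collections import Counter

def summarize_ratings(ratings):
    """Summarize a list of 1-5 ratings for trend analysis."""
    dist = Counter(ratings)
    mean = sum(ratings) / len(ratings)
    positive = sum(1 for r in ratings if r >= 4) / len(ratings)
    return {
        "distribution": dict(dist),           # how many 5s, 4s, and so on
        "mean": round(mean, 2),               # overall satisfaction score
        "percent_positive": round(positive * 100, 1),
    }

responses = [5, 4, 3, 5, 2, 4, 4, 1, 5, 4]
summary = summarize_ratings(responses)
```

Tracking a figure like `percent_positive` across survey waves is one simple way to measure the impact of changes you implement between surveys.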
-
Adobe Illustrator just got the holy grail of illustration tools
At last, after seven months of public beta testing, Turntable is available today in the latest release of Adobe Illustrator. Presented as a sneak preview at the 2024 edition of the Adobe Max conference, the tool uses generative AI to transform any 2D vector illustration into a 3D object that you can turn around its vertical axis, as if it were on a clay modeling turntable. When it debuted, its magic left Illustrator users cheering. If you have ever used Illustrator to craft a vector illustration—from a logo design to an animation character—it's understandable why people were so excited. "The idea for Turntable originated from a consistent theme we heard directly from customers around the time and effort required to manually redraw characters and illustrations from multiple angles, often taking hours," says Deepa Subramaniam, Adobe's Vice President of Product Marketing for Creative Professionals. It's a perfect use case of generative AI actually helping human creators save an incredible amount of time without sacrificing their artistic vision. To understand Turntable's wow factor, you need to understand why it is so hard to modify a 2D illustration. Drawing something complex with a stylus and a tablet in Photoshop takes very little time; it is as easy as drawing with pencils and a notepad. But with Photoshop you can't zoom in or out of a drawing indefinitely without seeing pixels and jagged curves. With Illustrator, you can zoom in infinitely and it will always look perfectly smooth, but that benefit comes at a very high time cost. Unlike raster hand drawing, it's very hard to create an image using mathematical lines and fills, which requires you to endlessly pull on the little handles that shape curves, or to add and subtract basic geometry like rectangles and ovals to form a new, more complex shape.
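For context on those curve handles: vector editors like Illustrator typically store shapes as cubic Bezier segments, defined by two anchor points and two control points and evaluated on demand, which is why they stay smooth at any zoom. A generic sketch of the underlying math, not Adobe's actual implementation:

```python
# Evaluate a cubic Bezier segment: p0 and p3 are anchor points, p1 and p2 are
# the control points (the "handles" you drag in a vector editor).
def cubic_bezier(p0, p1, p2, p3, t):
    """Return the (x, y) point on the curve at parameter t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# A simple arch: moving either handle reshapes the entire curve, which is why
# redrawing a figure from a new angle by hand is so laborious.
start, handle1, handle2, end = (0, 0), (0, 1), (1, 1), (1, 0)
top = cubic_bezier(start, handle1, handle2, end, 0.5)  # point at mid-curve
```

Because the curve is a formula rather than stored pixels, it can be re-evaluated at any resolution without jagged edges.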
Doing that again and again, to change the perspective of a faux 3D object or to create 40 different versions of a 2D character from different angles, is truly maddening. That is precisely why people love Turntable, a generative AI technology that runs in Adobe's Creative Cloud to interpret what your mess of lines and shapes is—let's say, an astronaut—and rotate it a full 360 degrees to show its sides or back without you ever having to redraw the entire thing, all while keeping its infinitely zoomable vector nature.

Vital time saver

When you feed a flat vector into this engine, it does not just blindly stretch the image. It reads your mathematical anchors and curves like a constellation of stars mapped on a grid, intelligently recalculating their coordinates to reveal the hidden sides of the world you built. Seeing an algorithm correctly guess the unseen geometry of a flat drawing stunned the industry. As the editorial team at CreativePro Network noted when it came out, the collective reaction "was a fascinating mix of pure shock, genuine excitement, and a healthy dose of professional skepticism." Unlike prompt-driven image generators that spray out random pixels, this utility has a laser-focused mandate. That singular focus turns a grueling marathon into a sprint, establishing Turntable as a weapon against tight production deadlines. "Animation teams can quickly create character turnarounds for pitch decks, game designers can generate 360-degree assets for concept art and social media teams can create GIFs and micro-animations in seconds, all within Illustrator," Subramaniam tells me. The Adobe ecosystem integration makes the workflow even more powerful, directly linking your rotating vector assets to animation pipelines.
As Subramaniam tells me, "with seamless handoff to tools like After Effects, teams can move from design to motion without breaking their flow." This means a character spun around in Illustrator can be instantly dropped into a motion graphics sequence. The tool does its heavy lifting entirely in Adobe's cloud, drawing on the monthly generative credits bundled into paid Creative Cloud subscriptions at a cost of 20 credits per generation. But while this all sounds like a bulletproof technological leap, the initial reality was much rockier, and early adopters quickly hit a wall of workflow-breaking limitations and strict structural demands during the public beta.

Is Adobe Turntable ready for prime time?

As usually happens, after the stunning Adobe Max demos ended, the tool showed its limitations in real-world beta testing. It initially demanded absolute perfection from the human operator. To prevent the algorithm from ripping the geometry apart during a rotation, users had to meticulously bind their layers together using Illustrator's grouping functions. It was the digital equivalent of lashing pieces of a ship together to survive a hurricane; if your anchor points were loose, the AI would scatter them into chaos. When it became available, Redditor LukeChoice warned early testers about "adhering to established best practices to achieve optimal outcomes." Even if the artwork survived the spin, getting the results out of the software was a chore. Early testers ran into severe export friction, requiring convoluted workarounds just to save individual frames as usable standalone files. Subramaniam says the team has worked out these problems, which is why the tool is now leaving beta. According to the Adobe Community Release Notes, you "no longer need to group objects" before applying the effect. Furthermore, the development team removed the artificial boundaries that previously capped rotations between -120 and +120 degrees.
Now the tool pushes vectors through a full 360-degree orbit in crisp 15-degree steps, as noted by contributors on the Illustrator subreddit. The excruciating export roadblocks were fixed too: users can now run a single command to place every generated angle directly onto the canvas, forming instant character reference sheets. Adobe also added native GIF exporting to Illustrator's Contextual Taskbar. Utilities like these represent a massive evolutionary leap for commercial artists, functioning as a hyper-efficient co-pilot that vaporizes hours of tedious labor. It is adequately fast too, since it runs on Adobe's servers. Each Turntable run costs 20 AI-processing credits; once you exhaust the credits bundled with your Creative Cloud subscription, a run will cost you roughly $1.00 (on a $4.99 tier yielding 100 credits) down to just $0.10 (on a $9.99 tier providing 2,000 credits). Peanuts, considering the hours and days of work this tool may save you. Yet, for all these advantages, the underlying trajectory of this technology points toward a chilling final destination. In Spain we have a saying: "Bread for today, hunger for tomorrow." I have no doubt that generative AI will inevitably render Turntable, and Adobe Illustrator itself, completely obsolete. And you can do your own math when it comes to the consequences for human illustrators. In the not-so-distant future, the act of visual creation will not require mapping mathematical vectors or coaxing an algorithm to rotate an astronaut. You will simply command a machine to manifest any concept, in any conceivable style, from any angle, with absolute precision and infinite resolution—bypassing the human hand, and any human know-how, entirely. This existential reality is exactly why some creatives fundamentally reject this tool, no matter how much time it will save them today.
That’s not the time that matters to them; it’s the little time they have left as creators after so much time spent becoming expert illustrators and designers. View the full article
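As a back-of-envelope check on the credit pricing quoted above, assuming the article's figures of 20 credits per generation and the two credit packs it mentions (pack prices and sizes are taken from the article, not verified against Adobe's current tiers):

```python
# Prorated dollar cost of one Turntable generation under a given credit pack,
# using the article's figure of 20 credits per generation.
CREDITS_PER_GENERATION = 20

def cost_per_generation(pack_price_usd, pack_credits):
    """Cost of a single 20-credit generation, rounded to the cent."""
    return round(pack_price_usd / pack_credits * CREDITS_PER_GENERATION, 2)

small_pack = cost_per_generation(4.99, 100)    # the article's ~$1.00 figure
large_pack = cost_per_generation(9.99, 2000)   # the article's ~$0.10 figure
```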
-
This Wearable Action Camera Is $70 Off for Amazon's Big Spring Sale
We may earn a commission from links on this page. Deal pricing and availability subject to change after time of publication. Action cameras usually ask you to choose between size and capability—smaller ones are easier to wear but often compromise video quality or controls. The Insta360 Go Ultra tries to bridge that gap. It's a tiny, wearable camera that can also behave like a more traditional action cam when you need it to. Right now, it's down to $379.99 (originally $449.99) during Amazon's Big Spring Sale (ending March 31), which is its lowest price yet, according to price trackers. It also helps that it earned an "excellent" rating from PCMag, which gives some context to what you're getting here. Insta360 Go Ultra $379.99 at Amazon $449.99 Save $70.00 What makes this camera interesting in daily use is how flexible it feels. The main camera unit is small enough to clip to your shirt or wear on a magnetic pendant, so you can record hands-free without thinking about framing. When you want more control, you snap it into the Action Pod, which adds a 2.5-inch flip-up touchscreen and turns it into something closer to a GoPro-style setup. Video quality is a clear step up from older Go models—it shoots stabilized 4K at 60fps, and the larger 1/1.28-inch sensor helps in low light, so evening clips or indoor footage don't fall apart as quickly. Stabilization works well for walking or biking, too, and the wide field of view keeps things immersive. The Go Ultra also supports HDR, multiple shooting modes, and tutorial-led, app-based editing, so putting together a usable clip doesn't take much effort, even if it takes a bit of getting used to. Video is limited to 8-bit color, so there's not much room for heavy editing or color grading. You'll also need a microSD card, since there's no built-in storage.
Audio is decent for casual use, but wind and distance can affect clarity unless you connect an external mic or supported earbuds. Also, its battery life depends heavily on how you use it—the standalone camera lasts around 30 to 36 minutes at 4K60 before heating becomes an issue, while the Action Pod stretches that to about two hours, according to the PCMag review. Compared to something like the DJI Osmo Action 5 Pro, what you’re paying for here is the compact, wearable design rather than raw video flexibility. Our Best Editor-Vetted Amazon Big Spring Sale Deals Right Now Apple AirPods Pro 3 Noise Cancelling Heart Rate Wireless Earbuds — $199.00 (List Price $249.00) Apple iPad 11" 128GB A16 WiFi Tablet (Blue, 2025) — $299.00 (List Price $349.00) Samsung Galaxy Tab A11+ 128GB Wi-Fi 11" Tablet (Gray) — $202.00 (List Price $249.99) Sony WH1000XM6- Best Wireless Noise Canceling Headphones — $398.00 (List Price $459.99) Apple Watch Series 11 (GPS, 42mm, S/M Black Sport Band) — $299.00 (List Price $399.00) Blink Video Doorbell Wireless (Newest Model) + Sync Module Core — $35.99 (List Price $69.99) Fire TV Stick 4K Max Streaming Player With Remote — $34.99 (List Price $59.99) Amazon Kindle Colorsoft 16GB 7" eReader (Black) — $169.99 (List Price $249.99) Deals are selected by our commerce team View the full article
-
JPMorgan, Pimco say bond market is misjudging slowdown risk
Some of Wall Street's biggest bond-fund managers say financial markets are underestimating the risk that the US war in Iran will cause a sharp slowdown in an already sputtering economy. View the full article