Google today announced Search Live, a new feature that lets users have real-time conversations with Search using their smartphone camera. The tool builds on Google’s work in visual search and AI, combining the capabilities of Google Lens and Project Astra.

How it works. Users can tap a new Live icon in either Lens or AI Mode. After pointing their camera at an object, they can ask questions about it. Search Live responds in real time with explanations, suggestions, and links to relevant websites, videos, and forums.
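To make that flow concrete, here is a minimal Python sketch of one conversational turn: grab a camera frame, take a spoken question, and return a real-time answer with links. The helpers capture_camera_frame, transcribe_voice_question, and multimodal_search are hypothetical stand-ins for illustration only; Google has not published a Search Live API, and this is not its implementation.

```python
# Illustrative sketch only: these helpers are hypothetical stand-ins,
# not Google APIs; the logic mirrors the user flow described above.
from dataclasses import dataclass, field
from typing import List


@dataclass
class LiveAnswer:
    spoken_reply: str  # real-time explanation read back to the user
    links: List[str] = field(default_factory=list)  # relevant sites, videos, forums


def capture_camera_frame() -> bytes:
    """Hypothetical: grab the current frame from the phone camera."""
    return b"<jpeg bytes>"


def transcribe_voice_question() -> str:
    """Hypothetical: convert the user's spoken question to text."""
    return "What kind of plant is this, and how often should I water it?"


def multimodal_search(frame: bytes, question: str) -> LiveAnswer:
    """Hypothetical: the Lens / AI Mode backend that answers about what the camera sees."""
    return LiveAnswer(
        spoken_reply="That looks like a snake plant; it usually needs water every 2-3 weeks.",
        links=["https://example.com/snake-plant-care"],
    )


def search_live_turn() -> None:
    # One conversational turn: point the camera, ask a question,
    # hear an explanation, and get links to dig deeper.
    frame = capture_camera_frame()
    question = transcribe_voice_question()
    answer = multimodal_search(frame, question)
    print(answer.spoken_reply)
    for url in answer.links:
        print("See also:", url)


if __name__ == "__main__":
    search_live_turn()
```

In the actual feature this turn repeats continuously while the camera stays open, so users can ask follow-up questions about what they are looking at.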

What it looks like. Here’s a video showing how Search Live works:

Why we care. Search Live highlights Google’s ongoing push toward multimodal search, where text, images, and now live video with voice are all combined. (Google says more than 1.5 billion people use Lens to search visually each month.) Search Live adds a new conversational layer to Google, letting searchers get answers not just about what they can type, but about what they see.

Dig deeper. More Google I/O news from today:
