Google is rolling out a powerful new feature in its AI‑powered Search: Search Live, a voice‑activated, real‑time conversation mode that blends chat, audio, and eventually visual input. The feature is now being tested in AI Mode via Google Labs for Android and iOS users in the U.S., marking another step in the company’s push to make search more interactive and conversational.
To activate Search Live, enrolled users simply tap the new “Live” icon in the Google app. From there, users can ask questions aloud, such as “How do I reduce wrinkles in a linen dress when packing?”, and receive immediate spoken responses. You can follow up naturally (“And what if it still wrinkles?”) without restarting the interaction. The feature runs in the background, so conversations continue seamlessly even if you switch to another app.
Google is leveraging a custom-tuned version of Gemini (reportedly Gemini 2.5) that employs a “query fan-out” approach: breaking your question into multiple sub-queries and pulling together diverse sources quickly. As you converse, Search Live also displays clickable web links and maintains a transcript, so you can browse results or switch to typing at any point.
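Google hasn’t published implementation details, but the general “query fan-out” pattern is straightforward to sketch: decompose a question into sub-queries, issue them concurrently, and merge the results. The sketch below is purely illustrative; the `fan_out` facets and the stubbed `fetch` function are hypothetical stand-ins, not Google’s actual pipeline (which reportedly uses a model to generate the sub-queries and a real retrieval backend).

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(question: str) -> list[str]:
    # Hypothetical decomposition: in a real system, a language model
    # would generate sub-queries tailored to the question.
    facets = ("overview", "step-by-step tips", "expert sources")
    return [f"{question} ({facet})" for facet in facets]

def fetch(sub_query: str) -> dict:
    # Stand-in for a real retrieval call (search index, web fetch, etc.).
    return {"query": sub_query, "snippet": f"result for: {sub_query}"}

def answer(question: str) -> list[dict]:
    sub_queries = fan_out(question)
    # Issue all sub-queries concurrently, then collect the results
    # for downstream ranking and synthesis into one spoken answer.
    with ThreadPoolExecutor(max_workers=len(sub_queries)) as pool:
        return list(pool.map(fetch, sub_queries))

if __name__ == "__main__":
    for result in answer("How do I reduce wrinkles in a linen dress when packing?"):
        print(result["query"])
```

The key property is the concurrency: because the sub-queries are independent, total latency is roughly that of the slowest single fetch rather than the sum of all of them.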
While the camera isn’t live yet, Google promises visual input integration “in the coming months,” enabling users to point their camera at objects during voice chats. This echoes earlier demos from Google I/O, including real-time visual AI interaction under “Project Astra.”
Search Live is part of Google’s grander AI Mode vision: a conversational experience that goes beyond AI Overviews to support Deep Search, agent-like task execution, and personalized context based on your past queries or app data. Alongside new voice features, Google is experimenting with embedded ads, charts, and custom agents, signaling a shift from link-based results to dynamic, AI-driven assistance.
- Google Tests Search Live: Real-Time AI Chat in Search - June 24, 2025