Google has taken a major leap forward in AI-powered search by introducing natural voice conversations to its experimental AI Mode. This feature transforms how users interact with search, allowing for fluid, back-and-forth dialogue much like talking to a human assistant.
The new Search Live integration lets users ask complex questions verbally and receive spoken responses while simultaneously viewing relevant web links. To start a conversation, users simply tap the “Live” icon in the Google app and speak their question. The AI responds aloud and handles follow-up questions intelligently, creating a genuinely interactive experience.
The move is a direct response to competing AI services such as Perplexity and ChatGPT, and it significantly enhances Google’s voice search capabilities. The feature shines in real-world scenarios: imagine packing for vacation and asking “How do I keep clothes from wrinkling in my suitcase?” followed by “What if they still get wrinkled?” without having to repeat your context.
What makes this feature particularly useful is its seamless integration into daily life. The conversation keeps going even when you switch to other apps, and you can always view a text transcript or return to previous discussions in your AI Mode history. According to Google’s Liza Ma, the capability is powered by a custom version of Gemini optimized for voice interactions while maintaining Google’s search quality standards.
Looking ahead, Google plans to expand these live capabilities to include visual search, allowing users to ask questions about what their phone camera sees in real time. This upcoming feature, previewed at Google I/O, promises to make AI-assisted search even more intuitive and context-aware.