Google’s Gemini Live AI assistant will show you what it’s talking about
Summary
Google is introducing several new features for its AI assistant, Gemini Live, aimed at improving the user experience.
One new feature will let the assistant point out specific items on a user’s screen, using the smartphone camera to help them make selections.
For instance, if a user is trying to choose the right tool for a project, they can show the options to the camera and Gemini Live will highlight the recommended one on screen.
Google is also launching new integrations for Gemini Live, allowing it to interact with more apps, including Messages, Phone, and Clock.
The company is also releasing a new audio model that will enable the chatbot to reproduce key elements of human speech, adapting its tone and speech patterns to the user and context.
This includes adopting an accent for a “rich, engaging narrative” if the user asks a question that would suit this type of response.