On Thursday, Google announced new search updates. In addition to the wider rollout of its AI Overviews feature, which now links to sources more prominently and (whomp, whomp) includes ads, Google showed off entirely new features.
Here's what's happening with Lens, Circle to Search, and Google's all-new recipe search experience.
1. Video and voice search with Lens
Lens, Google's handy visual search tool, can now process video and voice. It works in the Google app by pointing the camera at something that's best understood with a moving image (like a school of fish swimming in a circle, as Google showed in its demo). Hold down the shutter button to record a video and ask Lens a question, like "why are they swimming together?" Lens will process the video and your question together and give you an answer in the form of an AI Overview, along with relevant resources to click on.
Lens for video is currently only available to those with access to Google’s testing ground, Search Labs. Lens for photos with voice prompts is available for Android and iOS users through the Google app.
Lens now allows users to ask questions on videos with voice prompts.
Credit: Google
2. Circle to Search for music
Circle to Search is another visual search tool, which debuted earlier this year. Android users can circle or scribble over an object on their screen and search for it without having to switch apps. Now they can also search for a song playing in a video on social media or a streaming app in the same way. Since you can't physically circle a song, you instead tap the musical note icon next to the search bar at the bottom of the page. Google's AI model will then process the audio and pull up information about the song. Circle to Search for music is available to Android users with compatible Google Pixel and Samsung Galaxy devices.
3. AI-curated search results
Google search results will look a little different. Starting with recipes and meal ideas, open-ended queries will display search results curated by Google's AI models. Beginning this week in the United States on mobile, users will see a full page of results broken down into subcategories of the original query. The feature combines Google's Gemini AI models with its core search algorithm to surface relevant topics based on a request for, say, vegetarian appetizer ideas. AI-curated search results are available in the Google mobile app.
Google will now use its AI models to display the best results related to your search.
Credit: Google