Thanks to a recent update, Google Lens now lets you attach a voice query to an image search, adding spoken context to what you're looking up. The feature was spotted by Android Police, who noticed a new glowing animation in the app. According to the report, the glow appeared above the magnifying glass icon, and tapping it brought up a “Search by voice” tooltip announcing the change.
Hold down the shutter button and it will expand, with a message appearing that says, “Speak now to ask about this image.” You can then say a command or ask a question, and your words appear on screen as a floating text transcript. Google Lens starts searching as soon as you release the button, factoring in both the captured image and what you said.
Android expert Mishaal Rahman posted a demo of the feature in action, asking Google Lens to count how many blueberries were on his plate. Rahman says that questions will appear in Google Search, where “Gemini will try to provide an answer” in an AI summary.
You can now use your voice to add context to searches in Google Lens! Hold down the shutter button in Lens and you’ll see “speak now to ask a question about this image.” After you ask your question, release the button and Google Gemini will attempt to provide an answer. pic.twitter.com/uHkgjNQOog (August 5, 2024)
Performance
The key word in that last sentence is “attempt,” because Google Lens doesn’t always get it right. We got the update on our Android phone and immediately started testing it, and our results were pretty mixed.
In one test, we asked the app where we could buy a particular brand of sparkling water, and Google Lens, with a little help from Gemini, pointed us to nearby stores that stocked it. We then asked it to identify the food we were eating, and it correctly answered pico de gallo. When asked to count the buttons on a PS5 controller, however, it either failed outright or gave the wrong answer.
The tool works reasonably well overall, then, though it occasionally fumbles a response. That's a solid showing for a feature that shipped so quickly: hints of it were first “spotted in development” about a month ago by industry deep diver AssembleDebug on X (the platform formerly known as Twitter).
Now it’s a fully fledged feature. Android Police says the update is a server-side push, so it should be available on all Android smartphones right now. If you don’t see it, make sure you’ve installed the latest versions of the Google Search and Lens apps on your device.
Check out Ny Breaking’s roundup of the best Android phones for 2024.