New Google Lens feature will let Android users (literally) search their screen


Google announced a series of AI upgrades to its search, maps, and Lens services at its recent presentation in Paris, with Lens taking advantage of a particularly useful new feature over the coming months.

Soon, Google Lens users on Android will be able to search what they see in photos and videos through the Google Assistant. The integration works across countless websites and apps and allows people to learn more about the information contained in images — think building names, food recipes, or car models — without having to navigate away from those images. As Google explained in its presentation in Paris, “if you can see it, you can search it”.

Confused? See the latest Google Lens update in action via the tweet below, which shows a user identifying Luxembourg Palace in a video sent by a friend.


Google hasn’t yet offered a date for the new feature’s arrival, though the company has promised to roll out the upgrade “in the coming months” (which, for our money, probably means February or March 2023).

Significant improvements are also on the way to Google’s Multisearch feature. The ability to add a text query to Lens searches is now available globally in all supported languages and countries, and Google is also introducing the ability to find different variations (for example, shape and color) of objects captured with Lens.

As Google explained in Paris, “For example, you search for ‘modern living room ideas’ and see a coffee table you love, but you prefer a different shape, say a rectangle instead of a circle. You can use Multisearch to add the text ‘rectangle’ to find the style you’re looking for.” See the feature in action below:



A new search era?

Elsewhere during Google’s recent showcase, the company announced a host of AI-powered updates to Google Search and Google Maps.

For example, Google will soon integrate its “experimental conversational AI service,” Bard, into Search to provide users with more accurate and convenient search results. As Google explained in Paris, you’ll soon be able to ask questions like, “what are the best constellations to look for when stargazing?”, and then dig deeper into what time of year is best to view them through helpful AI suggestions.

The move follows Microsoft’s announcement of a redesigned, AI-powered Bing search engine that uses the same technology as ChatGPT.

As for Google Maps, the service’s Immersive View feature — which lets you explore landmarks virtually — is getting a significant upgrade in five major cities around the world, while the Live View feature — which uses your phone’s camera to help you explore a city through a neat AR overlay — is set for a similar expansion.

We’ll be testing all of the above features ourselves in the coming months, but for a quick look at everything else announced at the Google Paris showcase, check out our Google ‘Live from Paris’ live blog.
