New Google Lens feature will let Android users (literally) search their screen

Google announced a series of AI upgrades to its Search, Maps, and Lens services at its recent presentation in Paris, with Lens gaining a particularly useful new feature over the coming months.

Soon, Google Lens users on Android will be able to search what they see in photos and videos through the Google Assistant. The integration works across countless websites and apps and lets people learn more about the information contained in images — think building names, food recipes, or car models — without having to navigate away from those images. As Google explained in its presentation in Paris, “if you can see it, you can search it”.