There's no doubt that artificial intelligence (AI) is turning society upside down, with ChatGPT and its rivals already changing the way we live our lives. But a new AI project just surfaced that can pinpoint the location where virtually any photo was taken – and it has the potential to be a privacy nightmare.
The project, called Predicting Image Geolocations (PIGEON for short), was created by three students at Stanford University and is designed to work out where Google Street View images were taken. But when it was given personal photos it had never seen before, it was even able to pinpoint their locations, usually with a high degree of accuracy.
Speaking to NPR, Jay Stanley of the American Civil Liberties Union says this has serious privacy implications, including government surveillance, corporate tracking and stalking. For example, a government could use PIGEON to find dissidents, or to check whether you have visited places it disapproves of. And a stalker could use it to find out where a potential victim lives. In the wrong hands, this kind of technology could wreak havoc.
Motivated by these concerns, the student developers decided not to release the technology publicly. But as Stanley notes, that may not be the end of the matter: “The fact that this was done as a student project makes you wonder what, say, Google could do.”
A double-edged sword
Before we start readying the pitchforks, it's worth remembering that this technology, if deployed responsibly, can also have a number of positive applications. For example, it can be used to identify places where roadworks or other maintenance is required. Or it can help you plan a vacation: where in the world could you see landscapes like those in your photos? There are also other applications, from education to monitoring biodiversity.
Like many recent developments in AI, it is a double-edged sword. Generative AI can help a programmer debug code, but it can also help a hacker refine malware. It can spark ideas for a novel, but it can equally help a student cheat on coursework.
But anything that can identify a person's location in this way could be deeply problematic for personal privacy – and it has major implications for social media. As Stanley points out, it has long been possible to strip geolocation data from photos before uploading them. Now, that may no longer matter.
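Stripping that metadata is still worth doing, though. As a minimal illustration – nothing to do with PIGEON itself, and with placeholder filenames – here is a short Python sketch using the Pillow imaging library that copies a photo's pixels into a fresh file, discarding all embedded EXIF data, including the GPS tags:

```python
from PIL import Image  # Pillow: pip install Pillow

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Re-save a photo without any EXIF metadata (including GPS tags)."""
    with Image.open(src_path) as img:
        # Copy only the raw pixel data into a brand-new image object;
        # the new image carries none of the original's metadata.
        stripped = Image.new(img.mode, img.size)
        stripped.putdata(list(img.getdata()))
        stripped.save(dst_path)

# Example usage (placeholder filenames):
strip_metadata("vacation.jpg", "vacation_no_gps.jpg")
```

Of course, PIGEON's whole point is that it infers location from the image content itself, so removing metadata protects against casual snooping, not against this kind of model.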
What is clear is that some form of regulation is urgently needed to prevent wider abuse, and that the companies building AI technology must work to prevent harm caused by their products. Until that happens, we are likely to keep seeing concerns about AI and its capabilities.