ChatGPT could be worse than cryptocurrency when it comes to scams
Meta, the company behind Facebook, is the latest firm to publicly warn about a growing wave of ChatGPT scams.
As Reuters reports, the company has found over 1,000 malicious links that fraudulently claim to be associated with the popular AI (Artificial Intelligence) chatbot.
These discoveries led Guy Rosen, Meta's Chief Information Security Officer, to claim that “ChatGPT is the new crypto,” in reference to the spate of scams that quickly arrived during the cryptocurrency boom.
A growing concern
Sadly, as with other heavily hyped tech advancements that attract a lot of media coverage (cryptocurrency among them), scammers are exploiting the growing popularity of ChatGPT and other AI chatbots like Bing Chat and Google Bard, and Meta isn’t the only company to warn of this growing trend.
Alex Kleber, a researcher for the Privacy 1st blog, wrote up an extensive report on the sheer number and nature of faux ChatGPT clones in the Mac App Store.
He claims that specific developers are making apps with limited functionality, dressing them up with OpenAI and ChatGPT imagery to look official, using multiple developer accounts, and spamming the App Store with these clones. They then quickly prompt users for ratings to inflate their rankings. Kleber suggests that this makes it harder for legitimate developers to publish, list, and sell apps that might actually improve users’ ChatGPT experience.
This is part of a wider trend of fraudulent ChatGPT apps in app stores and online. According to Bleeping Computer, there are full-on malware-laden apps and web pages that target Windows and Android devices, and are designed to deceive users into installing malware on their devices or handing over personal information, by pretending to be legitimate ChatGPT-powered apps.
Dominic Alvieri, a security researcher, outlined one such instance in a Twitter thread: a website that mimics the official OpenAI ChatGPT domain lures visitors in (a tactic known as phishing) and infects their devices with malware that steals sensitive personal information.
[Embedded tweet from Dominic Alvieri, February 13, 2023: “Google first page ChatGPT Google Play Store fake apps. Google search item apps 3 & 4 removed from the Google Play Store, including fake ChatGPT Smart AI Chatbot… @Google @OpenAI @Microsoft”]
Alvieri also highlighted Google ads that advertise other fake ChatGPT apps on the Google Play Store, similar to the above-mentioned Mac App Store scams. The fact that these fake apps are being advertised, and therefore being given the air of legitimacy, is incredibly concerning.
Stay vigilant
Cyble, a research and intelligence lab, published a report not long after Alvieri’s discoveries that further exposes how widespread these phishing sites and apps are, finding more examples of fake websites that closely resemble the official ChatGPT site but instead distribute various malware. In line with Alvieri’s Google Play Store claims and Meta’s findings, Cyble discovered over 50 malicious fake ChatGPT apps that attempt to harm devices once downloaded.
More worryingly, some of these sites will ask for your payment information, claiming to offer a subscription to ChatGPT Plus, a genuine service OpenAI offers for $20 a month that removes usage restrictions and adds other features. You should only purchase this from the official OpenAI ChatGPT website. At the time of writing, OpenAI has not released any official mobile or desktop apps for ChatGPT, so any app presenting itself as such is fraudulent.
While third-party developers are doing interesting things to modify and personalize your ChatGPT experience, it’s worth staying vigilant: double-check what the app, extension, or site you’re using claims to do, see what other users and security professionals are saying about it, and confirm that it comes from a legitimate and/or verified developer.
It’s worth doing a couple of extra checks to keep your information safe while you’re out exploring the Wild-West-like frontier of AI and AI-assisted tools, and to avoid falling victim to the multitude of crafty phishermen out there.
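If you’re comfortable with a little code, one simple habit is to check that a link’s hostname exactly matches the official domain before entering logins or payment details. The sketch below is purely illustrative and not from Meta, Cyble, or OpenAI; the set of “official” hosts and the lookalike example URLs are assumptions for demonstration only.

```python
# Minimal, illustrative sketch: exact-match a URL's hostname against
# assumed official OpenAI domains before trusting it with credentials.
from urllib.parse import urlparse

OFFICIAL_HOSTS = {"openai.com", "chat.openai.com"}  # assumed for this example

def looks_official(url: str) -> bool:
    """Return True only if the URL's hostname is exactly an official host."""
    host = (urlparse(url).hostname or "").lower()
    return host in OFFICIAL_HOSTS

print(looks_official("https://chat.openai.com/"))         # True
print(looks_official("https://chat-gpt-app.example/"))    # False: hypothetical lookalike
print(looks_official("https://openai.com.example.com/"))  # False: subdomain trick
```

The point of the exact match (rather than just checking whether a URL “contains openai”) is that lookalike and subdomain-trick addresses often pass a casual glance, which is exactly how many of these phishing pages catch people out.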