Are YOUR conversations safe? ChatGPT creator confirms it has fixed a bug that allowed some users to snoop on other people’s chat history
- Sam Altman, CEO of OpenAI, confirmed that ChatGPT had a “significant” problem
- A “small percentage” of users were able to view other people’s chat history
- This follows previous privacy concerns raised about the company’s data usage
The creator of ChatGPT has confirmed that a bug in the system allowed some users to snoop on other people’s chat history.
Sam Altman, CEO of OpenAI, confirmed last night that the company was experiencing a “significant issue” that threatened the privacy of conversations on its platform.
The revelations came after several social media users shared images online of ChatGPT chat histories containing conversations they had not participated in.
As a result, users were unable to view chat history between 8am and 5pm (GMT) yesterday.
Mr Altman said: ‘We had a significant issue in ChatGPT due to a bug in an open-source library, for which a fix has now been released and we have just finished validating. A small percentage of users were able to see the titles of other users’ conversation history.’
On Monday, it was confirmed that a “small percentage” of ChatGPT users could view other people’s chat history
ChatGPT’s creator, OpenAI, was founded in 2015 in Silicon Valley by a group of entrepreneurs and investors, including current CEO Sam Altman.
ChatGPT is a large language model trained on a huge amount of text data, which allows it to generate responses to a given prompt.
People all over the world have used the platform to write human-like poems, lyrics and various other written works.
However, a “small percentage” of users this week were able to see chat titles in their own conversation history that didn’t belong to them.
On Monday, a person on Twitter warned others to “be careful” of the chatbot that had shown them other people’s conversation topics.
An image of their listing showed a number of titles, including “Girl Chases Butterflies,” “Books on human behavior,” and “Boy Survives Solo Adventure,” but it was unclear which of those weren’t theirs.
They said: ‘If you’re using #ChatGPT, be careful! There is a risk that your chats will be shared with other users!
‘Today I was shown another user’s chat history. I couldn’t see the content, but I could see the titles of their recent chats.’
Sam Altman, CEO of OpenAI, confirmed that ChatGPT had a “significant” problem yesterday
Users were unable to view chat history between 8am and 5pm (GMT) yesterday
A person on Twitter warned others to ‘be careful’ with the chatbot that had shown them other people’s conversation topics
The user added that during the incident they also encountered numerous network connectivity errors, as well as ‘failed to load history’ errors.
According to the BBC, another user also claimed they could see conversation titles written in Mandarin, as well as one called ‘Chinese Socialism Development’.
Some ChatGPT features were subsequently disabled while the company worked to resolve the issue.
But this privacy concern is not the first to be raised around the online language model.
Last month, JP Morgan Chase joined companies like Amazon and Accenture in restricting the use of the AI chatbot ChatGPT among the company’s approximately 250,000 employees over data privacy concerns.
One of the biggest shared concerns was that data entered into the chatbot might be used by ChatGPT’s developers to improve its algorithms, or that sensitive information could become accessible to its engineers.
ChatGPT’s privacy policy states that it may use personal data related to ‘use of the services’ to ‘develop new programs and services’.
However, it also states that this personal information may be anonymized or aggregated before the services are analysed.