ChatGPT is getting human-like memory and this could be the first big step towards general AI

ChatGPT is increasingly becoming your most trusted assistant, not only remembering what you've told it about yourself, your interests, and your preferences, but also applying those memories in future chats. It's a seemingly small change that could make generative AI feel more human, and perhaps pave the way for general AI, where an AI brain can function more like the gray matter in your head.

OpenAI announced the limited test on Tuesday in a blog post, which explains that it is testing ChatGPT's ability (in both the free version and ChatGPT Plus) to remember what you say across all your chats.

With this update, ChatGPT can casually pick up interesting details along the way, like my preference for peanut butter on cinnamon raisin bagels, or remember whatever you explicitly tell it to.

The advantage of a ChatGPT with memory is that new conversations no longer start from scratch; a new prompt already carries context for the AI. A ChatGPT with memory becomes more of a handy assistant that knows how you take your coffee in the morning or that you never want to schedule meetings before 10 a.m.

In practice, OpenAI says the memory will be applied to future prompts. If you tell ChatGPT that you have a three-year-old who loves giraffes, subsequent chats about birthday cards can result in card ideas featuring a giraffe.

ChatGPT won't simply recite its memories of your preferences and interests; it will use that information to work for you more efficiently.

It can remember

Some might find an AI that remembers across multiple conversations and uses that information to help you out a bit obnoxious. That's probably why OpenAI lets people easily opt out of memory by using the "Temporary Chat" mode, which is a bit like introducing amnesia into ChatGPT.

Just as you can delete your internet history from your browser, ChatGPT lets you go into the settings to delete individual memories (I like to think of this as targeted brain surgery), or you can simply tell ChatGPT in conversation to forget something.

For now, this is a test among some free and ChatGPT Plus users, and OpenAI didn't offer a timeline for when it will roll out ChatGPT memory to all users. I didn't find the feature live on either my free ChatGPT account or my ChatGPT Plus account.

OpenAI is also adding memory capabilities to its new app-like GPTs, meaning developers can build the feature into their custom chatbots. Those developers, however, do not have access to the memories stored in their GPTs.

Too human?

An AI with long-term memory is a trickier proposition than an AI with, at best, a fleeting recollection of past conversations. There are, of course, privacy implications. If ChatGPT randomly remembers things about you that it finds interesting or relevant, should you worry about your data showing up in someone else's ChatGPT conversations? Probably not. OpenAI promises that memories will be excluded from ChatGPT's training data.

OpenAI adds in its blog: “We take steps to assess and reduce bias, and prevent ChatGPT from proactively remembering sensitive information, such as your health data – unless you explicitly request it.” That might help, but ChatGPT needs to understand the difference between useful and sensitive information, a line that may not always be clear.

This update could ultimately have significant consequences. ChatGPT may already seem somewhat human in prompt-driven conversations, but its hallucinations and hazy recollection of, at times, even how the conversation started make it clear that more than a few billion neurons still separate us.

Memory, especially of the details passed along casually during ChatGPT conversations, could change that perception. Our relationships with other people are largely shaped by our shared experiences and our memories of them. We use them to guide our interactions and discussions. It's how we connect. In the end, we'll surely feel more connected to a ChatGPT that can remember our aversion to spicy food and our love for all things Rocky Balboa.
