I know where Bing AI chat went wrong


“Ask me anything.” It’s the long form of AMA and one of the most popular forms of interactive discourse on Reddit. It’s also a major challenge, as Microsoft’s Bing AI chatbot, dubbed “new Bing,” is quickly learning.

Whenever a celebrity or notable person signs up for a Reddit AMA, usually shortly after posing with a photo to prove it’s really them answering questions, there’s a deep moment of trepidation.

The ability to ask anyone anything is usually a minefield of inappropriate discourse, managed by a live community manager who fields and filters the questions. Otherwise, things quickly go off the rails. Even with that protection, they often do anyway.


When Microsoft launched its new Bing AI-powered chat, it made clear that the ChatGPT AI was ready for any and all questions. This was either a sign of deep trust in the relatively small but growing base of users or incredible naivety.

Not even ChatGPT, which kicked off the original AI chatbot sensation and on which Bing’s chat is based, offers that prompt. Instead, there’s an empty text-entry box at the bottom of the screen. Above it sits a list of example questions, capabilities and, especially, limitations.

Bing has that leading prompt and, below it, a sample question plus a big “Try it” button next to another button asking you to “Learn more.” To heck with that. We’re happy to dive right in and, following Bing’s instructions, ask it anything.

Naturally, Bing has been peppered with a wide variety of questions, including many that have nothing to do with everyday needs such as travel, recipes, and business plans. And those are the ones we’re all talking about because, as always, “ask anything” means “ask anything at all.”

Bing is pondering love, sex, death, marriage, divorce, violence, enemies, libel, and emotions it claims not to have.

OpenAI’s ChatGPT home screen carries these warnings:

  • May occasionally generate incorrect information
  • May occasionally produce harmful instructions or biased content
  • Limited knowledge of world and events after 2021

Too many questions

Bing’s ChatGPT is slightly different from OpenAI’s, and it may not suffer from all of those limitations. In particular, thanks to the integration of Bing’s knowledge graph, its knowledge of world events can extend right up to the present.

But with Bing out in the wild, and getting wilder, it may have been a mistake to encourage people to ask it anything.

What if Microsoft had built Bing AI Chat with a different prompt:

Ask me some things

Ask me a question

What do you want to know?

With these slightly modified prompts, Microsoft could add a long list of caveats about how Bing AI Chat doesn’t know what it’s saying. OK, it does (sometimes), but not in the way you know it. It has no emotional intelligence or emotional response, or even a moral compass. It tries to pretend it has one, but recent conversations with it in The New York Times and even Tom’s Hardware prove that its grip on the basic morality of decent people is tenuous at best.

In my own conversations with Bing AI chat, it has repeatedly told me it doesn’t have human emotions, but it still converses like it does.

For anyone who’s been dealing with AI for a while, nothing that has happened here is surprising. AI knows:

  • What it is trained on
  • What it can learn from new information
  • What it can collect from huge amounts of online data
  • What it can learn from real-time interactions

However, Bing AI chat is no more aware than any other AI. It’s arguably one of AI’s better performers, though, as its ability to carry on a conversation far exceeds anything I’ve experienced before. That feeling only grows with the length of a conversation.

I’m not saying Bing AI chat becomes more believable as a sentient human, but it does become more believable as a slightly irrational or confused human. Long conversations with real people can go that way too. You start with a topic and maybe even argue about it, but at some point the argument becomes less logical and rational. With people, emotion comes into play. In the case of Bing AI Chat, it’s like reaching the end of a rope where the fibers still exist but are frayed. Bing AI has the information for some of these long conversations, but not the experience to weave it together in a logical way.

Bing is not your friend

By encouraging people to “Ask me anything…,” Microsoft has set Bing up for some major growing pains, if not failure. The pain is felt perhaps by Microsoft and certainly by people who purposely ask questions that no normal search engine would ever have an answer to.

Before the advent of chatbots, would you even have considered using Google to fix your love life, explain God, or be a stand-in friend or lover? I hope not.

Bing AI Chat will get better, but not before we’ve had many more awkward conversations in which Bing regrets its responses and tries to make them disappear.

Asking an AI anything is the obvious long-term goal, but we’re not there yet. Microsoft took the plunge and is now free-falling through a forest of questionable responses. It won’t land until Bing AI Chat gets a lot smarter and more circumspect, or until Microsoft pulls the plug for a little AI re-education.

Still waiting for your chance to ask Bing anything? We have the latest details on the waitlist.
