
Chatbots know nothing but sound like experts on everything


By Lethbridge Herald on October 23, 2024.

DAN O’DONNELL

For the last year, we’ve been playing a game in the Humanities Innovation Lab at the University of Lethbridge: can we get ChatGPT to tell us we’re wrong about something?

If you aren’t familiar with ChatGPT, it’s one of a new set of websites (known as “chatbots”) that use a technology called “Large Language Models” to answer questions. ChatGPT is the most famous because it was the first to be released to the public, in November 2022, almost exactly two years ago.

When these bots first came out, it felt like something out of a sci-fi film. For the first time, people could talk to computers using normal language. Unlike other kinds of computer programs that need specific commands, chatbots can talk about almost anything — from Danielle Smith’s leadership review to the best recipes for apple pie.

Of course, in movies, computers that sound like people are rarely up to much good. Whether it’s HAL from 2001: A Space Odyssey or Skynet from The Terminator, computers go bad when they become too human-like.

Real-life bots have caused problems too. In one case, a chatbot told a reporter he should leave his wife. Another bot suggested that people should eat glass.

The biggest issue, however, is “hallucinations.” This is when a chatbot just makes something up.

A lawyer used ChatGPT to write a legal argument, only to discover that it cited cases that don’t exist. Another bot invented a new disease when reviewing X-rays. Even when asked an impossible question like “Who holds the record for walking across the English Channel?”, chatbots answer as if they didn’t realize the English Channel is a waterway.

This may seem less serious than the computer villains in the movies. No astronaut has been killed because a chatbot refused to open a door on a spaceship, and there don’t seem to be any killer robots chasing us. Unlike HAL or the computers in The Matrix, chatbots haven’t developed any self-awareness — yet!

The problem is that while AI is not actually alive, it sounds so human. It’s easy to forget that chatbots don’t think like us. In the past, when computers got something wrong, it was obvious — they produced error messages or gibberish. But chatbots speak with complete confidence. Ask a silly question, and you’ll get a silly answer — in exactly the same tone as the correct answer to a reasonable question.

This is not how most people behave. If someone asks a confusing question, we usually try to figure out the problem. What we don’t do is confidently give a fake answer just to keep the other person happy.

Of course, sometimes we do say things just because they sound good. Maybe we need to explain why a project is late to our boss. Or perhaps we are a politician trying to convince voters we’re on their side. But this kind of language use is rare enough that we have a special name for it: “bullsh*t” or BS.

Chatbots are the kings of BS. Most people can fake it for a little while on a couple of topics: hockey, politics, music. But chatbots can go on all day long on any topic you want. They don’t know anything, but they sound like experts on everything. That’s because they don’t actually look for the right answer; they sift through billions of words in their models to give the response that sounds best. Sometimes they’re right. Sometimes they’re wrong. But they don’t care — and they sound just as confident either way.
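For readers who like to look under the hood, here is a toy sketch in Python of that one move, repeated over and over: pick the next word that sounds most plausible. The prompt and the probabilities below are invented purely for illustration (a real model learns billions of them from its training text), and note that nothing in the sketch ever checks whether the answer is true.

import random

# Toy "model": for a given start of a sentence, how plausible each
# possible next word sounds. These numbers are made up for illustration;
# a real model learns its probabilities from billions of words of text.
next_word_probabilities = {
    "The record for walking across the English Channel is held by": {
        "nobody": 0.05,   # the true answer rarely "sounds best"
        "British": 0.40,  # leads toward a confident, made-up name
        "a": 0.35,
        "the": 0.20,
    }
}

def pick_next_word(prompt):
    # Choose the next word at random, weighted by how plausible it sounds.
    # There is no step here that checks whether anything is true.
    options = next_word_probabilities[prompt]
    words = list(options.keys())
    weights = list(options.values())
    return random.choices(words, weights=weights, k=1)[0]

print(pick_next_word("The record for walking across the English Channel is held by"))

Run it a few times and it will cheerfully start an answer either way, in exactly the same confident voice. That, in miniature, is the problem.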

That’s why we play our little game in the lab. One way to spot a BS artist is that they never tell you you’re wrong. We’re still waiting for ChatGPT.

Dan O’Donnell is professor and Chair in the Department of English at the University of Lethbridge. Opinions are his own.

2 Comments
Guy Lethbridge

So you’re telling me that the UCP are all chatbots?

Actually explains a lot. Thanks

buckwheat

Typical. Indicative of the U of L. Can’t leave politics out of a rather interesting article. Wonder who is blowing smoke, the prof or the chatbots. He should stick to English Literature.


