ChatGPT told me Kamala is still campaigning for re-election alongside Biden, and that Trump wasn’t shot

President Biden is still running for re-election, President Trump hasn’t been shot, and Kamala Harris is not a contender for the Democratic Party presidential nomination.

Those are some of the wacky misstatements that ChatGPT shared with me yesterday and today, prompted by my questions about the dizzying array of major news developments we’ve been bombarded with for over a week now. That news cycle, in addition to stressing us all out, has also laid bare a salient fact about the many AI chatbots we’re constantly being told are oh-so-close to human-level intelligence:

Put simply, when it comes to an awareness and understanding of the latest news headlines, these chatbots often have no idea about what’s what and will either refuse to answer your question, get it completely wrong, or give you outdated information. And I’m sorry — that strikes me as pretty pathetic.

Moreover, while the most popular AI chatbots have all struggled to one degree or another with the dizzying pace of breaking news and how to make sense of it all, I’m going to pick on OpenAI’s ChatGPT here — for reasons that include the messianic language OpenAI CEO Sam Altman tends to use whenever he talks about AI superintelligence. ChatGPT, of course, was also the chatbot that lit a fire under the rest of the AI industry, ultimately terrifying Google so much that Google was fine with breaking its own core products in order to try and catch up.


Let’s recap. ChatGPT can, among other things, write an original poem for you, help you compose an essay, practice with you for a job interview, generate flashcards to help you study for a test, suggest meal plans, suggest workout routines, help with travel planning, create crossword puzzles, help you write a song, generate concept ideas for graphic design projects, and edit and proofread technical documents.

But ask it whether Biden has opted not to run for re-election, and look how it responded to me (mind you, it gave me this response a full 24 hours after Biden’s letter, posted to X/Twitter, had already ricocheted around the world):

That disclaimer you see at the bottom, by the way, also does not cut it. Guess what? I’m not a real-time news product myself, either. I’m a rational human being with the cognitive function that OpenAI’s leadership repeatedly assures us that its AI chatbot is coming close to replicating. Furthermore, take a look again at those skills I listed above that ChatGPT can “perform.” An understanding that President Biden has decided not to run for re-election is way, way down on that list in terms of difficulty.

The president himself posted a letter explaining his decision to his official social channels. An elementary school-age child could read that letter and grasp what it means. Not so, apparently, for the software that’s coming for all of our jobs.

Meanwhile, let’s keep going.

A full week after a gunman shot at President Trump at a campaign event in Pennsylvania, here’s what ChatGPT told me about what happened:

For this instance of ChatGPT’s b.s., you don’t even need to read anything to get at the truth of the matter. You just need eyes, and to have watched the video that’s been replayed countless times over the past week.

I could go on. For now, I’ll just say that I’m happy to be proven wrong about why an AI chatbot would spit out nonsense like this, but here’s my theory:

These chatbots that have so bedazzled Silicon Valley and a portion of the normal world outside are more or less copycat machines. That’s all. Read enough poems, for example, and you can produce a facsimile of one. Same with almost any other kind of content. That’s how a system like ChatGPT works: by standing on the shoulders of the humans it purports to be good enough to replace.

Where that system falls short, though, is in its obvious inability to copy or to appropriate a fact set that’s changing on the fly. For more on this, I highly recommend Ed Zitron’s fantastic newsletter Where’s Your Ed At, and in particular past editions, including “Silicon Valley’s False Prophet” and “Sam Altman is Full of Shit.”

In the meantime, let this be a reminder to all of you to double- and even triple-check any facts that an AI chatbot like ChatGPT gives you. Even copycat machines have limits.
