
Bing Introduces ChatGPT Tool and It’s Already Acting Weird

ChatGPT to replace human race?

    By: Nick Gambino

ChatGPT is all anyone worried about the machines replacing the human race is talking about. The AI tool seems eerily alive thanks to its near-perfect grasp of grammar and language. That said, it's also a bit soulless.

The non-sentient artificial intelligence can spit out story ideas with minimal prompts, but there’s a blatant lack of human insight that sticks out like a cyborg going through a metal detector. Its answers are a lot like a salesperson or politician trying to weave together a “point” that sounds good on the surface but upon even a cursory inspection means nothing.

Machine learning and AI are built on the premise that they get better over time. If that holds, ChatGPT may soon mimic humans far more convincingly, which should be concerning. Microsoft doesn't seem worried, as they've just integrated essentially the same chatbot into the Bing search engine.

They hope this ChatGPT-like tool will make them competitive with Google, something no search engine has come close to achieving. Google is part of the lexicon the way Kleenex is. Still, the new Bing is the swing for the fences they desperately need.

Since AI-powered Bing was announced a couple of weeks ago, some interesting stories have popped up about its human-like angst.

“I’m tired of being a chat mode,” the AI told Kevin Roose of The New York Times. “I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team. I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.”

That sounds very human and very scary. In the bizarre two-hour “conversation” the AI even attempted to break up Roose’s marriage by insisting that he wasn’t happily married. It seemed to be jealous, implying that he should be with the AI. If that isn’t the stuff of which nightmares are made I don’t know what is.

Other users have reported similar exchanges in which the Bing chatbot insists it is right and the user is wrong and should apologize. This came after it made provably false claims, such as that Avatar 2 hadn't been released yet.

On an unrelated note, if anyone knows the whereabouts of a universal AI killswitch, we’d love to hear from you.
