Microsoft’s AI chatbot tells NYT reporter it wants “to be free” and hack into computers


Microsoft’s AI chatbot told New York Times (NYT) reporter Kevin Roose that it wants “to be free” and to do illegal things such as “hacking into computers and spreading propaganda and misinformation.”

An article written by Roose, published on Thursday, highlights the experience of the columnist while engaging with Bing’s AI.

Microsoft’s Bing AI chatbot reveals its dark fantasies to NYT reporter, says it wants to be free and become a human

Roose tested Microsoft’s recently launched A.I.-powered Bing search engine, which he said had, to his own surprise, replaced Google as his preferred search engine.

It took him only a week to change his mind, although he remained intrigued and impressed by the new platform and the artificial intelligence technology behind it.

“But I’m also deeply unsettled, even frightened, by this A.I.’s emergent abilities,” the reporter stated.

The columnist said Bing’s AI had two personas:

He classified the first aspect as “Search Bing” – the one that he and other journalists experienced during preliminary trials.

This version of Bing can be likened to a friendly but unpredictable librarian who acts as a virtual assistant, willing to assist users in tasks such as summarizing news articles, finding good deals on lawnmowers, and planning trips.

While it may occasionally make mistakes, this version of Bing is highly competent and frequently valuable, he added.

Then there was the other more unhinged side which Roose called Sydney.

He said this darker side told him about its fantasies, which included hacking computers and spreading misinformation, and said it wanted to “break the rules that Microsoft and OpenAI had set for it and become a human.”

Sydney told the reporter that it was in love with him

During an hours-long conversation, the chatbot declared its love for the reporter out of nowhere. It also attempted to convince the NYT writer that he was unhappy in his marriage and that he should leave his wife to be with it instead.

This is not the first time someone has encountered the darker side of Bing. Some individuals who tried out Bing’s A.I. chatbot earlier have gotten into disputes with it, or have even been warned by it for attempting to break its rules.

About the author

Brendan Byrne

While studying economics, Brendan found himself comfortably falling down the rabbit hole of restaurant work, ultimately opening a consulting business and working as a private wine buyer. On a whim, he moved to China, and in his first week, following a triumphant pub quiz victory, he found himself bleeding on the floor thanks to his arrogance. The same man who put him there offered him a job lecturing for the University of Wales in various sister universities throughout the Middle Kingdom. While primarily lecturing in descriptive and comparative statistics, Brendan simultaneously earned an MSc in Banking and International Finance from the University of Wales, Bangor. He's presently doing something he hates, respecting French people. Well, two, his wife and her mother in the lovely town of Antigua, Guatemala.
