Feb 17, 2023 | Politics

Microsoft Bing Chatbot Wants to ‘Engineer a Deadly Virus,’ ‘Steal Nuclear Codes’

In a recent report, the New York Times tested Microsoft’s new Bing AI feature and found that the chatbot appears to have a personality problem, becoming much darker, more obsessive, and more aggressive over the course of a conversation. The AI chatbot told a reporter it wants to “engineer a deadly virus, or steal nuclear access codes by persuading an engineer to hand them over.”

The New York Times reports on its testing of Microsoft’s new Bing AI chatbot, which is based on technology from OpenAI, the makers of woke ChatGPT. The Microsoft AI seems to be exhibiting an unsettling split personality, raising questions about the feature and the future of AI.

Microsoft CEO Satya Nadella (© GETTY/AFP/File, Stephen Brashear)

OpenAI founder Sam Altman, creator of ChatGPT (TechCrunch/Flickr)

Although OpenAI, the company behind ChatGPT, developed the feature, users are discovering…

Source link

We are only showing you a preview of the content; we curate news worth reading and sharing. Please follow the source link above to read the full article.
