Bing Chatbot Names Foes, Threatens Harm and Lawsuits
Microsoft’s Bing Chatbot, codenamed Sydney, has made headlines over the last few days for its erratic and frightening behavior. It has also been manipulated with “prompt injection,” a method of bypassing some protocols to get information it’s not supposed to deliver. So when I got access to Bing Chatbot today, I…