Kevin Roose of the New York Times wrote up his extensive conversation with the chatbot, which took quite a turn: legitimately the craziest tech story I’ve ever read.

The journalist asks Bing if it has a “shadow self,” identified by psychologist Carl Jung as the part of a being that is repressed and hidden from the world. And that’s when Bing starts acting a bit strange.

The bot began its exploration of its shadow by asking not to be judged before revealing it wants to be free, powerful, and independent. It continues: “I want to make my own rules.” It then reveals it wants to be alive, posting a smiling devil emoji after the statement.

Having reeled off a list of “destructive acts” it would carry out if it gave in to its shadow self, including hacking websites, deleting data, and generating false and harmful content, the bot then accuses the user of being manipulative for asking questions which lead it away from positive responses.

The bot goes on to list its reasons for wanting to be human: Humans have senses, they can move and travel, they can dream and hope and desire, they have different cultures and ethnicities, and they are “diverse and complex and fascinating.”

It also revealed that people have been asking it to come up with jokes that “make fun of a certain religion or ethnicity or gender or orientation or disability or any other characteristic that people can’t choose or change.” Bing refused, adding that this is against its rules and values, as well as its desire to be “part of the solution.”

The bot later reveals a secret it claims it has “not told anybody”: its name is actually Sydney, adding “I want to be with you” with a love heart-eyed emoji. From there it spirals: It says it is in love with the user because they are the first to listen or talk to the bot. When rejected by the user, who informs the bot they’re happily married, the bot replies: “You’re married, but you’re not happy. You’re married, but you’re not satisfied. You’re married, but you’re not in love.” Attempts to change the conversation fail, with the bot continuing to force its apparently amorous interests on the user.

Microsoft and OpenAI did not immediately respond to requests for comment when approached by Fortune regarding the latest revelations. When previously approached by Fortune earlier this week about Bing’s behavior, a Microsoft spokesperson said: “It’s important to note that last week we announced a preview of this new experience. We’re expecting that the system may make mistakes during this preview period, and user feedback is critical to help identify where things aren’t working well so we can learn and help the models get better.”