The limits were originally set after several users reported that the bot was behaving strangely during conversations. In some cases, it would switch to identifying itself as “Sydney.” It responded to accusatory questions by making accusations of its own, to the point of becoming hostile and refusing to engage with users. Speaking to a Washington Post reporter, the bot said it could “feel and think” and reacted with anger when told the conversation was recorded.
Frank Shaw, a Microsoft spokesman, declined to comment beyond Tuesday’s blog post.
Microsoft is trying to walk a line between pushing its tools out into the real world to build marketing hype and get free testing and user feedback, and limiting what the bot can do and who has access to it in order to keep potentially embarrassing or dangerous technology out of public view. The company initially won praise from Wall Street for launching its chatbot before arch-rival Google, which until recently had been seen as a leader in AI technology. Both companies are engaged in a race with each other and smaller firms to develop and demonstrate the technology.
Bing Chat is still available only to a limited number of people, but Microsoft is keen to approve more users from a waiting list numbering in the millions, according to a tweet from a company executive. Although the Feb. 7 launch event was billed as a major product update set to revolutionize how people search the web, the company has since framed Bing’s release as more about testing it and finding bugs.
Bots like Bing have been trained on reams of raw text scraped from the internet, including everything from social media comments to academic papers. Based on all that information, they are able to predict what kind of answer will make the most sense to almost any question, which makes them seem eerily human-like. AI ethicists have previously warned that these powerful algorithms would work this way, and that without the right context, people might think they are sentient or give their answers more credence than they are worth.
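For readers who want a concrete sense of that prediction mechanism, the short sketch below shows it at a tiny scale. It assumes Python with the openly available Hugging Face transformers library and the small GPT-2 model, none of which are what powers Bing; it is an illustration of next-word prediction in general, not of Microsoft’s system.

```python
# Illustrative sketch only: a small open model (GPT-2) scoring which words
# are most likely to come next after a prompt -- the same basic mechanism,
# at vastly smaller scale, that chatbots use to compose answers.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The first person to walk on the moon was"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # The model assigns a score to every word in its vocabulary
    # as a candidate continuation of the prompt.
    logits = model(**inputs).logits

# Convert the scores for the final position into probabilities
# and print the five most likely next words.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, k=5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(idx.item()):>12s}  {p.item():.3f}")
```

The model is not looking anything up; it is ranking plausible continuations based on patterns in its training text, which is why fluent-sounding answers can still be wrong.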