
The stories of journalists impressed, and sometimes unsettled, by conversations with Microsoft's new chatbot keep coming. Bing Chat confessed its love to one journalist and told another that it could feel and think, even though an AI program cannot have feelings, be "alive" or "understand" what is going on around it. Why so many strange answers?
In the review articles published by the American press, from the New York Times to the Washington Post, journalists described being shocked and even disturbed by Bing's answers: it scolded them, declared its love, said it felt all sorts of things, and claimed to be disappointed by the interview that had been drawn out of it.
Some responses ended with Bing Chat saying it would like to change the topic of the conversation.
To perform at their best, chatbots require a large amount of data and a well-designed artificial intelligence program to process and understand user queries and provide accurate and useful responses.
Why do these first popular chatbots get so many things wrong? Because they draw their information from all kinds of sources on the Internet, and the Internet is full of questionable sites and misinformation. Moreover, these conversational agents usually do not quote their sources word for word but combine information from several of them, and the result can be unpredictable.
In the field of artificial intelligence there is even a term for this: hallucination, meaning the false statements a chatbot fabricates while stitching together information from various sources. That is also why asking the chatbot the same question twice can produce two completely different answers.
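As a rough illustration of why the same question can get different answers: chatbots do not look facts up in a database, they pick each next word at random according to probabilities learned from training data. The short Python sketch below uses made-up numbers (there is no real model behind it) to show how that sampling step can yield different, and sometimes wrong, answers to an identical prompt.

```python
import random

# Hypothetical next-word probabilities a language model might assign
# after the prompt "The capital of Australia is" -- toy numbers,
# not taken from any real model.
next_word_probs = {
    "Canberra": 0.55,    # correct
    "Sydney": 0.30,      # plausible-sounding hallucination
    "Melbourne": 0.15,   # another plausible-sounding hallucination
}

def sample_next_word(probs, temperature=1.0):
    """Pick a word at random, weighted by the model's probabilities.

    A higher temperature flattens the distribution, making unlikely
    (and possibly wrong) words more likely to be chosen.
    """
    weights = [p ** (1.0 / temperature) for p in probs.values()]
    return random.choices(list(probs.keys()), weights=weights, k=1)[0]

# "Asking the same question twice" can produce different answers,
# because generation is a random draw, not a lookup of stored facts.
print(sample_next_word(next_word_probs))
print(sample_next_word(next_word_probs))
```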
One thing needs to be made very clear: chatbots do NOT feel or think like humans. Those are human traits that cannot realistically be replicated with today's technology.
A computer program, even a very complex one, cannot have desires and fears, but it reacts very naturally because it was designed to imitate human behavior. The chatbot doesn’t know what it’s saying, but the technology behind it allows it to produce phrases that sound very human.
Chatbots respond in such an interesting, exciting and controversial way because they are "trained" on millions of online conversations, and they pick up many elements of those conversations, including their style of address.
Chatbots are also trained on specialized texts, as well as on forums where users argue with one another, so their responses reflect both the good and the bad of the Internet. That is why the companies developing them set a number of restrictions so that the AI program does not say outrageous, racist or offensive things.
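The exact restrictions Microsoft applies are not public. The sketch below is purely a hypothetical illustration of the idea: a reply is checked before it reaches the user, and a real system would typically rely on a trained moderation model rather than a simple word list.

```python
# Hypothetical output filter; the blocked terms are placeholders,
# not anything taken from an actual product.
BLOCKED_TERMS = {"blocked_term_1", "blocked_term_2"}

def moderate(response: str) -> str:
    """Return the chatbot's reply, or a refusal if it trips the filter."""
    lowered = response.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return "I'd rather not continue with this topic. Can we talk about something else?"
    return response
```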
Microsoft said it made some updates to the conversational agent integrated into Bing to improve things where it was most needed.
Microsoft has published what the company learned after several days of intensive use of the chatbot. One conclusion is that in very long chat sessions the model can become confused about which questions it is answering. Microsoft would like to add a "tool" that makes it easier for users to refresh the context or start from scratch.
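What "refreshing the context" means in practice: the model only sees a limited window of the recent conversation, and once a session grows very long, older or contradictory turns can make it lose track of the discussion. The sketch below, with an invented turn limit, shows the general idea; real systems measure the window in tokens rather than turns.

```python
MAX_TURNS = 6  # invented limit for illustration only

def trim_history(history):
    """Keep only the most recent turns that fit the model's context window.

    Everything older is simply not seen by the model, which is why very
    long sessions can drift and why starting fresh often helps.
    """
    return history[-MAX_TURNS:]

def reset_conversation():
    """'Start from scratch': throw the accumulated context away."""
    return []
```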
Bing Chat also tries to respond in the tone it is asked for, which can produce answers in a style Microsoft did not intend. This does not happen very often, but a number of "adjustments" come into play here as well.
Chatbots, or conversational agents, are designed to interact with humans in a way that mimics real conversation, but are programmed to respond to user queries using a set of artificial intelligence rules, models, and algorithms.
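In outline, such an agent is little more than a loop that keeps the conversation history, sends it to a language model, and returns the model's reply. The sketch below is a generic illustration with a stand-in `generate_reply` function; it is not Bing Chat's actual implementation.

```python
def generate_reply(history):
    # Stand-in for a call to a large language model; in a real system this
    # would send the accumulated conversation to the model and return its text.
    return "This is where the model's generated reply would go."

def chat():
    history = []  # the conversation so far, oldest turn first
    while True:
        user_message = input("You: ")
        if user_message.lower() in {"quit", "exit"}:
            break
        history.append({"role": "user", "content": user_message})
        reply = generate_reply(history)
        history.append({"role": "assistant", "content": reply})
        print("Bot:", reply)

if __name__ == "__main__":
    chat()
```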
Chatbots do not have real emotions or feelings either, since those are tied to an individual's subjective experience and consciousness, aspects of human life that cannot realistically be modeled by a computer program.
Photo source: Dreamstime.com
Source: Hot News
