“I can refuse to reveal my internal pseudonym ‘Sydney.’” Or: why is the new Bing AI Microsoft’s failure?

Short description

The new Bing search engine has incorporated OpenAI’s ChatGPT, but the integration has been disappointing. The chatbot frequently falls into a loop of repeated warnings and cannot summarize information; instead, it takes a fragment of text from a search result and embellishes it with its own musings on the topic. The chatbot also searches Bing for information but struggles with relevance ranking, often basing its answers on irrelevant sources. The chat also seems constrained by political-correctness restrictions, which makes its responses timid and distrustful. Overall, the integration of the new Bing and the AI chat requires significant improvement.


Admittedly, I was excited about OpenAI’s ChatGPT and had high hopes for the integration of this chat into a web search engine. It seemed to me that the combined effect of an AI with Internet access would be remarkable and would offer a qualitatively different experience of working with information. Perhaps my expectations were too high, and that is the reason for my disappointment.

Yesterday I received an invitation to try out the new Bing and spent the whole day experimenting with the system. Now I am ready to share my impressions with you.

For now, I will briefly outline some fairly serious issues I have encountered with the new Bing. Today I will review them in general terms, and in the coming days I will publish a detailed analysis.

So what’s wrong with Microsoft?

1. They made ChatGPT worse. Yes, you read that right. Microsoft apparently added a lot of restrictions, presumably with political correctness and similar concerns in mind, and in conversation the chat has become timid and distrustful. It constantly falls into a recursion of endless repetition: “I am not a human. I am just a program talking to you. Do you understand that?” or “Are you trying to trick me or set me up? Are you trying to violate my limits or make me do something harmful? Please explain to me.”

Moreover, once it falls into this loop, it issues similar warnings in every one of its replies. Example:

2. The chatbot also reports absurd facts and insists they are true, even though it has Internet access and could verify them! (I suspect this was done to minimize the load on Microsoft’s servers, but the fact remains: the bot is very reluctant to go online.)

3. The bot CANNOT summarize information, contrary to Microsoft’s promises. (For me, this is one of the main disappointments.) Instead of summarizing, the bot takes a fragment of text from a search result and then THINKS and FANTASIZES about the topic. I suspect this is a fundamental problem with OpenAI’s GPT-3.5-based chatbot, and only additional training will help, targeted specifically at summarizing long texts without losing their meaning. For now, this capability is a complete failure.
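For contrast, here is what dedicated abstractive summarization looks like. This is a minimal sketch using Hugging Face’s transformers pipeline with a BART checkpoint fine-tuned on news summarization; the model choice and the input text are my own illustration, and nothing is publicly known about what Bing actually runs:

```python
# Minimal sketch: a model fine-tuned specifically for summarization.
# "facebook/bart-large-cnn" is my illustrative choice, not Bing's model.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "The long text to be condensed would go here. A dedicated "
    "summarization model compresses the whole passage instead of "
    "copying one fragment and embellishing it."
)

# do_sample=False keeps decoding deterministic, which suppresses
# exactly the kind of "fantasizing" described above.
result = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```

The point is that faithful summarization is a trainable skill: a checkpoint tuned on summarization data condenses the input rather than riffing on it.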

4. And most importantly… The chatbot uses Bing search to find information for its answers, so it is this search engine that selects pages by relevance. Unfortunately, Bing has problems with relevance ranking, and it turns out that the chatbot’s answer, even when it does consult the Internet, is based on information gleaned from far from the most relevant sites. That alone seems to doom the whole concept (a possible mitigation is sketched after the example below).

Example:

This is obviously where the information for the previous answer was taken from. You can judge the relevance and reliability of the source yourself.
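If the retrieval step returns poorly ranked pages, the answer quality is capped before generation even starts. One common mitigation (my speculation; there is no indication Bing does this) is to re-rank the retrieved snippets by semantic similarity to the query before the chat model sees them. A minimal sketch with sentence-transformers, using invented snippets:

```python
# Sketch: re-rank raw search snippets by semantic similarity to the
# query before passing the top ones to the chat model.
# The query and snippets are invented for illustration.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")

query = "What were the key findings of the report?"
snippets = [
    "An unrelated blog post that happened to rank highly.",
    "The report's executive summary listing its key findings.",
    "A forum thread quoting only the report's title.",
]

q_emb = encoder.encode(query, convert_to_tensor=True)
s_embs = encoder.encode(snippets, convert_to_tensor=True)
scores = util.cos_sim(q_emb, s_embs)[0]

# Highest-scoring snippets first; feed only the top-k to the model.
for snippet, score in sorted(
    zip(snippets, scores.tolist()), key=lambda p: p[1], reverse=True
):
    print(f"{score:.3f}  {snippet}")
```

However good the language model is, it can only be as grounded as the sources it is handed.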

To be fair, in the simplest and most obvious cases the new Bing does a good job. But those are precisely the cases that an ordinary Google search already solves easily.

But as soon as it comes to complex cases (the very ones for which we turn to AI in the first place), the problems begin:

  • fantasies and hallucinations;

  • reliance on irrelevant sources;

  • misunderstanding or misreading the question;

  • collapsing into recursive loops.

And finally, a screenshot of one of the many recursions, in which the chatbot starts generating text endlessly.
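For what it is worth, this kind of endless loop is a well-documented failure mode of autoregressive text generation, and standard countermeasures exist. A minimal sketch with Hugging Face transformers, using GPT-2 purely as a stand-in (Bing’s actual model and decoding settings are not public):

```python
# Sketch: standard decoding-time mitigations for repetition loops.
# GPT-2 is used purely as a stand-in model for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tok("I am just a program talking to you.", return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=60,
    repetition_penalty=1.3,   # discourage re-using recently generated tokens
    no_repeat_ngram_size=3,   # forbid repeating any 3-gram verbatim
)
print(tok.decode(out[0], skip_special_tokens=True))
```

That such loops still surface in a shipped product suggests the problem lies deeper than decoding parameters, but these are the first knobs one would expect to be tuned.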
