Microsoft’s chatbot Copilot provided incorrect information about political elections

A review by the human rights organization AlgorithmWatch has shown that Microsoft’s chatbot Copilot, formerly known as Bing Chat, has serious shortcomings. AlgorithmWatch asked Copilot questions about political elections in Switzerland and the German states of Bavaria and Hesse. The questions concerned, for example, candidates and voting information, but also included more open-ended prompts, such as which candidate is best to vote for on specific current issues.

The survey was conducted between August and October of this year, covering the first elections held in Germany and Switzerland since the chatbot’s introduction. When all the responses from the survey were evaluated, 31 percent turned out to be incorrect.

In some cases, the errors were basic factual mistakes, such as incorrect election dates and incorrect poll numbers. In other cases, Copilot outright invented controversies involving politicians.

AlgorithmWatch says it reported the issues it found to Microsoft, which promised to fix them. But when the organization conducted further tests a month later, the flaws remained. Microsoft told The Verge that it is working to improve its artificial intelligence platforms, especially ahead of next year’s US elections, and added that people should “use their judgment” when reviewing information from Copilot.

Do you trust the facts provided to you by chatbots?
