
Artificial intelligence has created a mind trap for humans: scientists warn of the danger

Dmytro Ivancheskul
AI algorithms limit people's access to opposing points of view

The widespread use of artificial intelligence algorithms that recommend content and products to users based on their previous online activity has led to new phenomena on social media, such as echo chambers and information cocoons. AI algorithms create mind traps for netizens, encouraging them to read only content that resonates with their existing views on life, politics, and other topics.

This is stated in a study by a group of scientists led by Professor Yong Li from Tsinghua University (China). Their work is published in the journal Nature Machine Intelligence.

The researchers studied how so-called information cocoons form: environments in which users encounter only opinions, or other users, that echo their own views. This can pose a serious danger to the development of critical thinking.

The first author of the paper, Jinghua Piao, noted that the widespread adoption of AI-driven algorithms poses new challenges, such as reducing users' exposure to ideologically diverse news, opinions, political views, and friends. He added that such technology isolates people from diverse information and eventually "locks them into one topic or point of view."

This, according to the researchers, can have far-reaching negative consequences: such cocoons can reinforce prejudice and polarization in society, hinder personal growth, creativity, and innovation, amplify disinformation, and impede efforts to build a more inclusive world.

"The concept of information cocoons is used to describe a widely observed phenomenon, in which people are isolated from a variety of information and eventually become trapped in one topic or point of view," the scientist emphasized.

At the same time, the researchers argue that it would not be fair to blame either AI algorithms or people specifically, as such information cocoons arise from complex interactions and information exchange between different actors.

Such interactions are divided into four components:

  • similarity-based selection: the process by which AI recommends content, products, or other users that most closely resemble what the user has already interacted with. This is the key force driving the creation of cocoons;
  • positive feedback: serving new content that matches the user's previous preferences. This process reduces information diversity;
  • negative feedback: counteracts the formation of cocoons and the narrowing of information toward monotony;
  • random self-exploration: the user finding content on their own, which disrupts the algorithmic patterns that lead to cocoon formation.

Scientists believe that persistent information cocoons form when there is an imbalance between positive and negative feedback, as well as when similarity-based selection is continuously reinforced.
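The interplay of these four forces can be illustrated with a toy simulation. Everything below is a hypothetical sketch, not the model from the paper: the `simulate` function, its parameters, and the 80% "like" rate are all illustrative assumptions. It measures the diversity (Shannon entropy) of topics a simulated user ends up consuming under different feedback and exploration settings.

```python
import random
import math

def simulate(steps=2000, n_topics=10, explore_rate=0.05,
             pos_feedback=0.2, neg_feedback=0.05, seed=0):
    """Toy model of the four forces behind information cocoons.

    Hypothetical parameters (not from the paper):
      explore_rate - chance the user self-explores a random topic
      pos_feedback - how much a liked item boosts its topic's weight
      neg_feedback - how much a disliked item dampens its topic's weight
    Returns the Shannon entropy (bits) of the consumed-topic distribution:
    lower entropy means a tighter information cocoon.
    """
    rng = random.Random(seed)
    weights = [1.0] * n_topics       # recommender's per-topic interaction history
    consumed = [0] * n_topics
    for _ in range(steps):
        if rng.random() < explore_rate:
            # random self-exploration: user picks any topic themselves
            topic = rng.randrange(n_topics)
        else:
            # similarity-based selection: recommend in proportion
            # to how much the user has interacted with each topic
            r = rng.uniform(0, sum(weights))
            acc = 0.0
            for topic, w in enumerate(weights):
                acc += w
                if acc >= r:
                    break
        consumed[topic] += 1
        if rng.random() < 0.8:       # user likes it: positive feedback
            weights[topic] += pos_feedback
        else:                        # user dislikes it: negative feedback
            weights[topic] = max(0.1, weights[topic] - neg_feedback)
    total = sum(consumed)
    probs = [c / total for c in consumed if c]
    return -sum(p * math.log2(p) for p in probs)
```

With little exploration and weak negative feedback, the self-reinforcing loop concentrates consumption on a few topics (low entropy, a cocoon); raising `explore_rate` and `neg_feedback` keeps the consumed-topic distribution closer to uniform, mirroring the imbalance the researchers describe.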

To avoid this mind trap, the scientists advise not only liking content but also reacting to content you dislike, so that the algorithms can form a more balanced picture. They also urge netizens to engage in self-exploration more often, as viewing new information influences the algorithm's subsequent recommendations.

Earlier, OBOZ.UA reported that scientists found the "kryptonite" of artificial intelligence that drives it crazy.

Subscribe to OBOZ.UA on Telegram and Viber to keep up with the latest events.
