
Scientists make people remember things they've never seen: no artificial intelligence required

Dmytro Ivancheskul
Even short but high-quality deepfakes are enough to make people believe they've seen a non-existent movie

A study has unexpectedly found that if people are shown convincing fake video footage from remakes of movies that never actually existed, they begin to remember having seen those movies and even compare them to the originals.

This is according to a study published in the journal PLOS One. The scientists believe it demonstrates how easily false memories of events that never happened can be "planted" in a person's mind.

As part of the study, people were shown fragments of movies created with deepfake technology: artificial intelligence replaced the real actor's face with the face of another actor chosen by the study's authors.

Thus, 436 subjects were shown a non-existent reboot of "The Matrix" in which the role of Neo was played not by Keanu Reeves but by Will Smith, who, incidentally, really was a contender for the part. They were also shown excerpts from "The Shining" with Brad Pitt instead of Jack Nicholson, "Captain Marvel" with Charlize Theron instead of Brie Larson, and "Indiana Jones: Raiders of the Lost Ark" with Chris Pratt instead of Harrison Ford.

It turned out that even short but high-quality video fragments were enough for the subjects to start believing they had seen the movie. Some participants even began to recall the emotions the film had evoked in them and whether it was better than the original.

However, the authors of the experiment urge readers not to dramatize the results. In their view, deepfake technology is not the only way to distort a person's memories.

"Deepfakes turned out to be no more effective in distorting memories than simple text descriptions," the article says.

Therefore, as the researchers point out, deepfakes are not absolutely necessary to trick someone into accepting fake memories.

"We shouldn't jump to conclusions about a dystopian future based on our fears of new technologies," said the paper's lead author, Gillian Murphy, a disinformation researcher at University College Cork in Ireland.

In an interview with The Daily Beast, she acknowledged that there is very real harm from deepfakes, "but we should always gather evidence of that harm before rushing to solve problems we've just guessed exist."

On average, 49% of participants were fooled by the fake videos, and 41% of that group claimed the "Captain Marvel" remake was better than the original.

"Our findings are not of particular concern because they do not indicate any unique threat posed by deepfakes compared to existing forms of misinformation," Murphy concluded.


Earlier, OBOZREVATEL reported that the TV channel STB became embroiled in a scandal after "pasting" the face of a Ukrainian actor onto his Russian colleague.

