Artificial intelligence and the generation of false information
AI-generated videos and news are flooding the Internet. For the uncritical consumer, the consequences go beyond misinformation
Cybersecurity specialists have been warning about the rapid development of AI and its implications for the manipulation of information on the Internet, particularly on social networks.
The use of deepfakes, videos that simulate a real situation, is not new. They have been used in political contexts in particular. Since the start of the Russia-Ukraine conflict, dozens of videos made with machine learning have flooded the Internet.
By superimposing faces and voices, these videos make a fabricated situation look like genuine footage, when in reality it is an AI manipulation.
Specialists advise focusing on key details, such as how the person in the video blinks. In a deepfake, the subject tends to blink less often and less naturally than in real life.
They also recommend paying attention to areas of the body beyond the face and neck, since it is harder for fake videos to convincingly alter the entire body. Any detail that does not fit is grounds for doubt.
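The blink heuristic can even be automated. The sketch below, a minimal illustration rather than a production detector, assumes per-frame eye landmarks are already available (for example from a library such as dlib or MediaPipe, which the code does not include) and counts blinks using the widely used eye aspect ratio: the eye's vertical-to-horizontal extent, which drops sharply when the eye closes. The 0.2 threshold and 2-frame minimum are common rule-of-thumb values, not universal constants.

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, in the usual
    dlib ordering. A low ratio suggests the eye is closed."""
    dist = math.dist
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count dips of the eye aspect ratio below `threshold`
    lasting at least `min_frames` consecutive frames."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # dip still open at the end of the clip
        blinks += 1
    return blinks
```

Comparing the resulting blinks-per-minute figure against the typical human rate of roughly 15 to 20 blinks per minute is one way to flag a suspiciously static face, although a low count alone proves nothing.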
Another clue is the movement of the mouth. It has been established that AI algorithms still struggle to convincingly reproduce the inside of the mouth, the teeth and the tongue, and the voice often falls out of sync with the lip movement.
Regarding the overall quality of a possible deepfake, it is important to watch for details such as jumps in playback, blurred edges, poor lighting, unnaturally smooth skin and changes in the background of the video.
(Reference image source: Unsplash, in collaboration with Getty Images)