Generative AI has blurred the line between what is real and what is not. The old retort "it's Photoshopped," once used to question an image's authenticity, has been replaced by "it's AI."
And as the technology improves, it is getting harder to tell whether something is AI-generated. Worse still, there is often no definitive way to determine whether a piece of content is synthetic; media-literate viewers can sometimes tell, but the smoking guns are growing scarcer.
Take deepfakes, for example. One early, reliable indicator of authenticity was whether a video showed physiological signals such as a visible pulse. But a new study from Humboldt University of Berlin suggests that even this tell can now be faked.
The research, published in Frontiers in Imaging, found that some modern deepfakes can produce videos that exhibit human-like heart rate signals. Detection tools that rely on identifying these subtle cues, such as a pulse, misclassified fake videos as real.
"Here we show for the first time that deepfake videos can feature a realistic heartbeat and minute changes in facial color," says Peter Eisert, a professor at Humboldt and the study's lead author, per Popular Science.
The study highlights a new challenge in the ongoing battle to detect synthetic media. Deepfakes are AI-manipulated images, videos, or audio files that can appear deceptively real. While some uses are benign, the technology has been criticized for enabling the spread of nonconsensual explicit material. According to a 2023 report in Wired, more than 244,000 deepfake porn videos were uploaded to the top 35 such websites in a single week. Tools that make it easy to insert someone's face into explicit material have only widened the problem.
Deepfakes have also raised concerns around misinformation and fraud. Fabricated videos of public figures, both famous and not, have been widely circulated. To address the growing problem, the US Congress recently passed the Take It Down Act, which criminalizes the sharing of nonconsensual sexual imagery, including imagery generated by AI.
Efforts to detect deepfakes have traditionally relied on spotting visual anomalies such as unnatural blinking or facial features. More recent systems use remote photoplethysmography (rPPG), a method originally developed for telehealth, which detects signs of a heartbeat by analyzing subtle changes in the color of facial skin.
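The core idea behind rPPG can be sketched in a few lines: average the green channel over the face region in each frame, then look for a dominant frequency in the plausible human heart-rate band. The sketch below is a minimal illustration, not the study's actual pipeline; the function name `estimate_bpm` and the synthetic test signal are my own, and extracting the face region with a face detector is assumed to happen upstream.

```python
import numpy as np

def estimate_bpm(green_means, fps, lo_bpm=40, hi_bpm=180):
    """Estimate heart rate (beats per minute) from a 1-D rPPG signal.

    green_means: per-frame mean green-channel intensity over the face
    region (face detection/cropping is assumed to happen upstream).
    """
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                      # remove the DC component
    window = np.hanning(len(x))           # taper to reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(x * window))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    # Restrict the search to the plausible human heart-rate band.
    band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0               # Hz -> beats per minute

# Synthetic example: a faint 1.2 Hz (72 BPM) pulse buried in noise,
# sampled at 30 fps for 10 seconds (the window length the study used).
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
signal = (0.05 * np.sin(2 * np.pi * 1.2 * t)
          + np.random.default_rng(0).normal(0, 0.02, t.size))
print(round(estimate_bpm(signal, fps)))   # prints 72
```

The study's finding, in these terms, is that running this kind of analysis on a well-made deepfake can still yield a plausible BPM, because the color fluctuations survive the generation process.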
To check whether this method still works, the Humboldt researchers trained a detection model on real videos of participants performing various tasks. From only 10 seconds of footage, the system could identify each person's heart rate. But when the same method was applied to deepfake versions of those participants, the results were surprising: the detector found a heartbeat in the manipulated videos and flagged them as authentic.
"Our experiments showed that deepfakes can exhibit realistic heart rates, contrary to previous findings," the researchers write.
The deepfakes in the study were not programmed to mimic a heartbeat. Instead, the researchers believe the synthetic clips inadvertently "inherited" the pulse from the original footage. Visual data from both the real and fake videos showed nearly identical light-transmission patterns, suggesting these minuscule signals were carried over during the video generation process.
"Small variations in the skin tone of the real person get transferred to the deepfake together with facial motion, so that the original pulse is replicated in the fake video," Eisert explains.
Although the study points to a gap in current detection systems, the researchers say the situation is not hopeless. Today's deepfakes still fall short of mimicking the more complex patterns of blood flow across a person's face over time. Other detection methods, such as tracking changes in pixel brightness or using digital watermarks, are being explored by companies like Adobe and Google to complement traditional approaches.
Still, the findings underscore the need for continuous updates to detection technology. As Eisert and his team suggest, no single indicator is likely to be sufficient on its own for long.
Image credit: header photo licensed through Depositphotos.