In previous conflicts, authoritarian regimes have attempted to exploit American prisoners of war for propaganda gain. These efforts often took the form of video and audio recordings as well as pictures of the POWs, despite such activities being in clear violation of the Geneva Conventions. Advanced digital capabilities such as deepfakes hand potential adversaries a significant new tool. The American military must prepare for the prospect of these technologies being used against its POWs in future conflicts.
A deepfake is video or audio manipulated so that a person appears to say or do something they never said or did. The technique uses preexisting audio and video of the target to generate new footage (potentially even a real-time, live video feed) in which another person controls what the deepfake subject says, replicating the targeted individual's face, features, speech patterns, and distinctive voice. The end product is often not merely plausible; it can be highly convincing.
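To make the mechanics concrete, the sketch below shows the shared-encoder, two-decoder autoencoder design behind many early face-swap deepfakes: faces of both identities are compressed by a single encoder, and at swap time a face of person A is decoded with person B's decoder, so A's expressions drive B's likeness. This is an illustrative simplification under assumed parameters, not any specific tool's implementation; the image size, layer widths, and the random tensors standing in for aligned face crops are all placeholders.

```python
import torch
import torch.nn as nn

IMG = 64  # placeholder: aligned 64x64 RGB face crops

class Encoder(nn.Module):
    """Shared encoder: compresses a face from either identity into a latent code."""
    def __init__(self, latent=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs a face in one person's likeness."""
    def __init__(self, latent=256):
        super().__init__()
        self.fc = nn.Linear(latent, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # learns to reconstruct the driving actor's face
decoder_b = Decoder()  # learns to reconstruct the target's face

# Training (placeholder data): each decoder reconstructs its own identity
# through the shared encoder, forcing a common representation of expression.
faces_a = torch.rand(8, 3, IMG, IMG)  # stand-in for crops of person A
faces_b = torch.rand(8, 3, IMG, IMG)  # stand-in for crops of person B
loss = nn.functional.mse_loss(decoder_a(encoder(faces_a)), faces_a) \
     + nn.functional.mse_loss(decoder_b(encoder(faces_b)), faces_b)
loss.backward()  # one illustrative step; a real model trains for many epochs

# The swap: encode person A's expression, decode it in person B's likeness.
with torch.no_grad():
    fake_b = decoder_b(encoder(faces_a))
```

The design explains why captor access matters: the quality of the result depends chiefly on how much imagery of the target is available for training, and a captor can record the captive at will.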
Several deepfakes have gone viral on social media. The Belgian visual artist Chris Ume gained international attention with compelling manipulated videos that appeared to feature Tom Cruise. The person on screen is actually the actor Miles Fisher, whom Ume made look and sound like Cruise. Ume needed two months to create the Tom Cruise deepfakes, working without access to Cruise himself to capture the voice or facial data that could have sped up the process. Today, a deepfake can be created in as little as five minutes. In a POW or captivity scenario, the captor's direct access to the captive makes creating a deepfake of that captive very simple.
From a POW and captive recovery perspective, this technology creates two distinct concerns.
The first concern is the public release of a POW deepfake. Though a violation of the Geneva Conventions, such a deepfake could be used to fabricate narratives of war crimes, atrocities, rejection of the U.S. war effort, pleas to end the war, and other propaganda. The video and audio could be distributed broadly to the American home front to undermine the war effort and the will to fight, stress families, influence politicians, and divide communities in order to weaken support for the war.
The second concern is that the captor could show deepfakes to POWs in order to manipulate them while in captivity. The captor could use deepfakes to indoctrinate the captive and to psychologically destabilize and manipulate the captive's mental state. This danger grows in a protracted conflict where captivity might continue for several years. Even if any single deepfake could be brushed off as probably fake, the isolation and pressure of captivity could, over time, lead a POW to accept the deepfakes as real.
In our view, these POW deepfake concerns must be addressed before the conflicts in which such tactics may be used. Planning and research should begin now. Preliminary efforts should include: (1) establishing ways to identify deepfakes shortly after their dissemination; (2) exploring the possibility of recording a validated, genuine video of every servicemember as an aid to identifying deepfakes, with the data deposited before deployment, much as is currently done for ISOPREP (see the sketch below); (3) preparing both the military and the public at large in advance for the possibility of deepfakes; and (4) including deepfake information in POW training so that servicemembers are prepared for deepfakes being used against them in captivity.
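As one hedged illustration of item (2), a validated reference recording could be integrity-protected at deposit time with an ordinary digital signature, so that any later comparison against a suspected deepfake starts from provably untampered ground truth. The sketch below uses Python's cryptography library and Ed25519 keys; the file name and the idea of storing this alongside ISOPREP data are assumptions for illustration, not an existing program of record.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical reference recording deposited before deployment.
REFERENCE_VIDEO = "smith_j_reference.mp4"  # assumed file name for illustration

def file_digest(path: str) -> bytes:
    """SHA-256 of the reference recording, hashed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.digest()

# At deposit time: the issuing authority signs the digest.
signing_key = Ed25519PrivateKey.generate()  # in practice, a protected service key
verify_key = signing_key.public_key()
signature = signing_key.sign(file_digest(REFERENCE_VIDEO))

# Later: anyone holding the public key can confirm the deposited
# recording is still the untampered ground truth for comparison.
try:
    verify_key.verify(signature, file_digest(REFERENCE_VIDEO))
    print("Reference recording verified; safe to use as ground truth.")
except InvalidSignature:
    print("Reference recording has been altered; do not use.")
```

Signing only guarantees the integrity of the deposited footage; deciding whether a newly released clip is synthetic remains the separate detection problem named in item (1).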
Jan Kallberg is a research scientist at the Army Cyber Institute at West Point and an assistant professor at the U.S. Military Academy. Col. Stephen Hamilton is the chief of staff and technical director at the Army Cyber Institute at West Point and an associate professor at the U.S. Military Academy. The views expressed are those of the authors and do not reflect the official policy or position of the Army Cyber Institute at West Point, the U.S. Military Academy or the Defense Department.