“Fake news” has become a near-banality. At every opportunity, President Donald Trump has used the term as a cudgel against critical news media outlets. Yet misinformation remains a vital concern for anyone who gives a damn about the truth.
And we haven’t even seen the worst of the fake news trend yet.
Here’s a look at the potential future of online misinformation: a video from researchers at Stanford University, the Max Planck Institute for Informatics, and the University of Erlangen-Nuremberg. Don’t skip watching it; it must be seen to be believed.
With “Face2Face” video technology, people may be able to inhabit the body—or at least the face—of another person in real time. The target actor (the person in the video) is manipulated by the movements of the source actor (the manipulator) with stunning accuracy.
As the video runs, the manipulator scrunches his face, moves his mouth, and raises his eyebrows. The manipulator’s facial movements control the target actor in the video, creating a “real-time reenactment.”
The real-time reenactment is not perfect quality, as the mouth occasionally blurs and glitches appear as facial tics, but this technology has great potential to wreak havoc on the world of truth and information.
From the Face2Face paper abstract: “Reenactment is … achieved by fast and efficient deformation transfer between source and target. The mouth interior that best matches the re-targeted expression is retrieved from the target sequence and warped to produce an accurate fit.”
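The quoted abstract describes a two-part loop: transfer the source actor’s expression onto the target’s face model, then retrieve the target-sequence frame whose mouth interior best matches that re-targeted expression. Here is a heavily simplified conceptual sketch of that loop, assuming faces are reduced to small expression-coefficient vectors; all names and data are illustrative, not from the actual Face2Face implementation.

```python
# Toy sketch of the Face2Face reenactment loop (assumption: a face is an
# identity plus a short vector of expression coefficients; the real system
# works on dense 3D face models and warps actual pixels).

def transfer_expression(source_expr, target_identity):
    """Deformation transfer: drive the target's face model with the
    source actor's expression coefficients."""
    return {"identity": target_identity, "expression": list(source_expr)}

def retrieve_mouth(target_frames, desired_expr):
    """Mouth-interior retrieval: pick the stored target frame whose
    expression best matches the re-targeted expression (nearest neighbor);
    its mouth region would then be warped to fit the output frame."""
    def dist(frame):
        return sum((a - b) ** 2
                   for a, b in zip(frame["expression"], desired_expr))
    return min(target_frames, key=dist)

# Toy usage: three stored target-sequence frames tagged with expressions.
target_frames = [
    {"id": 0, "expression": [0.0, 0.0]},  # neutral
    {"id": 1, "expression": [1.0, 0.2]},  # open mouth
    {"id": 2, "expression": [0.1, 0.9]},  # raised brows
]
source_expr = [0.9, 0.3]  # the manipulator opens his mouth
rendered = transfer_expression(source_expr, target_identity="target-actor")
best = retrieve_mouth(target_frames, rendered["expression"])
print(best["id"])  # the open-mouth frame (id 1) is the closest match
```

The design point the abstract emphasizes is that the mouth interior is *retrieved* from real footage of the target rather than synthesized from scratch, which is part of why the results look so plausible.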
What will happen when this technology improves and is nefariously used by politicians, propagandists, and fake newsmen? I can only imagine its usefulness in the hands of a party like Kim Jong-un’s Workers’ Party of Korea in the quest to fire up citizens against the U.S., or as a tool of Chechen leader Ramzan Kadyrov to galvanize his country in favor of his quest to seek and kill homosexuals. These are already successful ventures in both countries, of course, but tipping points of anger are important, especially in crowds and mass movements.
Of course, I shudder to think about how it can be used here in the U.S.
Many U.S. citizens are already easily convinced by grotesquely fake-looking news sources. Most are pacified by the lies of politicians and business leaders with relative ease. What hope do we have of improving as a country if the conversation turns to “was that video real, or was it a fake?” Many are likely to instantly believe a manipulated video to be the truth.
It’s a dark time for anyone who cares about objective truth—or coming as close to objective truth as we can get. I still have hope that the internet can one day foster a worldwide conversation that will bring people together to discuss tough issues in an honest way. However, I worry about the potential the medium continues to have for lies and obfuscation.
The researchers who created Face2Face aren’t to blame, of course. If they didn’t create this technology, someone else would have—much like the researchers who recently created AI software that could tell if someone was gay or straight via a picture of their face. Their findings “expose a threat to the privacy and safety of gay men and women,” according to the researchers, which WIRED’s Sophia Chen said was akin to building a “bomb so they could alert the public about its dangers.”
In the case of Face2Face, researchers have built a bomb that has nuclear potential. I believe it’s important to be skeptical of any news source now, no matter what’s being reported. Face2Face adds a new level to the importance of that skepticism.
Remember: this isn’t the future we’re facing; it’s our new present.