The internet has been abuzz recently with a swath of videos showing celebrities saying things ranging from funny to downright odd or alarming. This is the world of deepfakes, and MIT decided to demonstrate the power of new deepfake video capabilities by making one of its own.
As you can see above, that is a small clip from the https://moondisaster.org project, which is designed to simulate an alternate reality in which the Apollo 11 astronauts did not land safely on the moon but were instead stricken by tragedy, with the entire crew perishing on Earth's only natural satellite.
The video clip, while brief, was created by the MIT Center for Advanced Virtuality. While it may be cool to see how computer technology can be used for lifelike recreations or even film production, it is disturbing to think that accounts of world events from trusted individuals, reporters, or world leaders could be skewed by those wishing to cause disorder or chaos.
Deepfakes are nothing new to the world, and some of the best examples come from the team over at Corridor, as you can see from their work above. While faking human faces is more complicated, what they can accomplish is still impressive enough, and potentially alarming enough, that we have to ask the question: how do we control it?
Imagine something like this showing up in social media feeds across the world, where those looking to cause political or social harm could make someone say literally anything. One of the most alarming parts of this new digital media trend is the question of how we tell the difference between real and fake. Quality will only get better as the technology improves, and we have to wonder not if, but when, it will be used nefariously. Deepfakes could be used for anything from stirring political controversy to targeting high-level figures such as CEOs or elected representatives, putting words in their mouths that they never said.
In this current age of quick judgment, these new deepfakes, without some reliable way to identify them, paint a pretty bleak picture of the future for society.