In case you haven’t already seen what Deepfakes are all about, here’s a relatively harmless and entertaining demonstration of what our dark future holds:
Deepfake technology first surfaced in 2017, and even at its first appearance nearly every pundit paying even minuscule attention predicted it would have significant political ramifications. Late last year, sophisticated deepfake videos made enough of an impact that legislators and business leaders alike called for regulation of the technology.
Deepfake Videos Deployed in Indian Election Campaigns
Though it wasn’t the first politically motivated deepfake video, India has the dubious distinction of being one of the first countries to see a series of deepfake videos distributed by a political party as part of its official campaign. The videos, which feature the president of the opposition party BJP fluently criticizing the incumbent government in multiple languages he does not speak, went viral on WhatsApp, reaching as many as 15 million people. While party officials and the communications firm behind the videos describe them as “positive campaigns,” watchdogs and fact-checkers are alarmed enough to call it a growing crisis.
As we approach our own 2020 elections, with the battles over “fake news” and “alternate facts” pivotal to voters, it has become painfully obvious why everyone is raising red flags on this issue. Skillful, almost imperceptible image and audio manipulation has been around for decades. Coupled with the lightning-fast spread of information the internet provides, fakes have become so commonplace that every picture and recording is doubted as a matter of course, leaving the average person on very unsure footing. Once video is undermined as a reliable record, we are left trusting only what we see and experience in person, making our worldview tragically smaller and more provincial, the exact opposite of what technology was supposed to deliver in the first place.
Image by Gerd Altmann from Pixabay