In 2019 I wrote about the arrival of deepfakes and posited that it might take a stolen election before anyone in this country took them seriously. Welcome to 2024, where someone engineered a robocall in New Hampshire designed to suppress the vote in that state’s January 23rd primary elections. The call featured what appears to be an AI-generated clone of President Biden’s voice telling recipients that their votes mattered more in November than in today’s primary. To put an ironic cherry on top, the robocallers appear to have spoofed the phone number of a Democratic PAC that supports Biden’s efforts in New Hampshire. Here is the actual release from the NH Department of Justice website announcing the official investigation, in case you are skeptical of the above website’s veracity.
What this means for you
I imagine that regardless of where you sit on the political spectrum, this presents a very scary future, one where we cannot trust our eyes or ears or practically anything on the internet, at a time when truth and objective reasoning are crucial. The technology to do the above is readily available and accessible, and it seems a small but influential number of us cannot be trusted to act responsibly with powerful technology. If you are thinking, “Well, let them duke it out in their political battles over there; I don’t need to worry about AI fakes affecting me,” let me spin a “fanciful” scenario for you to consider. Say you have a disgruntled ex-employee looking to strike back at you or your company, and they decide to use the tool above to fake a harassing phone call from someone in company leadership to someone else in your organization. Do I even have to tell you that this service is likely already on offer in questionable corners of the internet?
What can you do?
Make your voice heard in the upcoming elections by voting for leaders who represent your values (which are hopefully based on lifting people up instead of pushing them down). How do you know who that might be? It’s time to step up and ask them directly. Don’t rely on third parties to put words in their mouths. It’s time for direct accountability, for you, for me, and for them.
Register to vote. Get out and vote.
If you’ve been reading my blog for any length of time, you’ve seen me describe the current state of security in a variety of colorful ways, but my favorite analogy is the one where I liken us to jugglers with many objects in the air and more being tossed in every minute by hackers and criminals. We lose if we drop a single item, but there is no “win” condition for juggling. If anyone has enough hands and arms to keep a lot of things in the air, it should be Facebook, and they certainly have a lot going on, but in the end they have come up short on another promise: transparency in sponsored advertising. Facebook’s never-ending torrent of fake news was supposed to be somewhat dampened by a tool rolled out in May of this year called “Paid for by,” built to bring some accountability to the Facebook publishing tools heavily abused by political trolls in the run-up to the 2016 US elections and around numerous other political events since then.
Transparency or Lip Service?
Just ahead of the 2018 midterm elections, Vice.com investigators used Facebook’s “Paid for by” tool to apply to purchase ads on behalf of all 100 US Senators. All 100 applications were approved, despite the ads coming from fake political groups built specifically to test Facebook’s transparency tool, and despite the obvious fact that Vice investigators are not actual spokespeople for any sitting US Senator. The same tool also allowed the Vice team to buy ads on behalf of Vice President Mike Pence and the Islamic State, but, curiously enough, not Hillary Clinton. Judging by how little effort the Vice team needed to circumvent the “Paid for by” verification process, it’s clear Facebook put an equal amount of effort into building the tool, i.e., virtually none. It’s unclear whether “Paid for by” was a token effort meant to appease shareholders and lawmakers, or whether the problem of fake news on Facebook is truly unsolvable, but if an organization as big and as powerful as Facebook can’t (or won’t) solve it, the only solution left is to ignore it entirely as a source of news.
And that’s the other problem with elephants on the internet: because of their size, they are hard to ignore.