I know some of you are Trekkies, and even if you aren’t a fan, you’ve more than likely heard the phrase, “You will be assimilated. Resistance is futile,” chanted by Star Trek’s hive-mind aliens, the Borg. Though they pale in comparison to some of the franchise’s most iconic nemeses, like Khan and the omnipotent Q, their constant drive to absorb beings and technology to improve the collective is proving hauntingly prescient when compared to certain modern-day companies seemingly bent on assimilating the internet to feed the AI beast.
“I am the beginning, the end, the one who is many. I am the Borg.”
When the Borg first appeared on Star Trek in 1989, our revulsion at their “otherness” stemmed from our culture’s deep-seated aversion to individuality and freedom being made subservient to a collective will. While AI was not new to science fiction at the time – it had already become infamous decades earlier in the sci-fi classic 2001: A Space Odyssey – it was viewed as something perhaps possible in the distant future. Luckily, we got Y2K instead of HAL when the new millennium rolled around, but now, just 20-ish years later, we are faced with the reality of web-crawling bots hoovering up everything on the internet to fuel “large language model” AI platforms. It’s hard not to draw comparisons to the Borg. Human content creators are already resorting to legal measures against various companies for “assimilating” their original work into AI-generated copycat products, which are being sold on platforms like Amazon (a company often compared to the Borg), appearing in videos on YouTube (another very Borg-like company), or turning up as sound-alike songs on Spotify.
“We will add your biological and technological distinctiveness to our own. Your culture will adapt to service us.”
Star Trek: First Contact (1996)
The FBI held a press conference last week to confirm what we figured was already a thing the moment open-source AI projects started surfacing: threat actors are using artificial intelligence to write malware, build ransomware websites, and put more teeth in their phishing campaigns. And as if we needed more nightmare fuel, the FBI also shared this little nugget: terrorist groups are using AI to research deadlier “projects,” like more potent chemical attacks.
If you can dream it, you can build it.
Unfortunately for us, dreams aren’t limited to those of us who are just trying to make our way through life without hurting anyone while having some fun along the way. Criminals aren’t hampered by ethics or compassion, and neither are AIs, even when their programmers try to build in safeguards. As I’ve always maintained, anything built by humans will be subject to our flaws, and given the amount of garbage we have piled onto the internet, I’m not willing to trust that any AI that becomes self-aware will be able to differentiate between good and evil. At this point, unless you happen to be a multi-billionaire with ethics and a hotline to folks in power, the best you can do is let your congress-critter know that we should be pumping the brakes on this runaway AI truck. While the established technology titans have made some relatively feeble attempts at something akin to a digital watermark to help the rest of the world identify AI-created content, there are probably hundreds of throne-contenders willing to ignore the rules for a chance at the top, humanity be damned. And you can bet that many of them already have their hands in the pockets of any government powerful enough to even try to regulate this technology.
Am I saying it’s time to start looking for bunker-friendly real estate in an underdeveloped country with robot-unfriendly terrain? Not yet. But would we confidently know when that moment had arrived? Maybe we’ve already crossed that threshold. Most of us can only cross our fingers and hope the future looks more like Star Trek and nothing like Terminator.