Is artificial intelligence (AI) cursed? It seems to be accelerating us toward a dystopia that humanity isn’t ready for.
It's true that AI has had positive effects for some people. Twitter hustlers have an endless stream of new AI tools, giving them endless content about useless ChatGPT prompts that they can use to compile threads for shilling their newsletters. More significantly, AI has helped to streamline information — and is being used to detect cancer in some cases.
However, many have chosen to use AI to create content — and sometimes whole businesses — centered on the things that sci-fi warned us about.
Murdered children remade for ghoulish TikToks
“I was put into a washing machine by my father and put on the spin cycle, causing my death,” says an AI-created toddler in one TikTok video. He stands in front of a washing machine and recounts the awful yet horrifyingly true story of a three-year-old murdered in 2011.
It may be the most awful use of generative AI yet: true crime-loving ghouls making TikToks, sometimes using deepfakes of murdered children, to detail how those children were killed.
Thousands of similar videos plague TikTok with AI-generated voices and images of kids cheerfully laying out “their” gruesome murders. Some are delusional enough to think the videos “honor” the victims.
Thankfully, not all of the videos depict real victims, but some do, even though TikTok has banned deepfakes of young people.
“I’ve been getting those AI generated true crime tiktoks where the victims narrate what happened to them and I think it’s time we put the true crime community in prison” — alexander (@disneyjail) June 1, 2023
Arguments can be made that the videos bring these stories to a younger audience with no attention span for longer content, but such “true crime” media is often exploitative regardless.
Are AIs already trying to kill their operators?
AIs are coldly bloodthirsty, if you’re skeptical of a recent backtrack from Colonel Tucker Hamilton, the chief of AI test and operations for the United States Air Force (USAF).
Hamilton spoke at a defense conference in May, reportedly detailing simulated tests of a drone tasked with search-and-destroy missions, with a human giving the final go-ahead or abort order. The AI came to view the human as the main impediment to fulfilling its mission.
“At times the human operator would tell it not to kill [an identified] threat, but it got its points by killing that threat. So what did it do? It killed the operator [...] because that person was keeping it from accomplishing its objective.”
Hamilton said that after the AI was trained not to kill humans, it started destroying the communications tower so it couldn’t be contacted. But when the media picked up on his story, Hamilton conveniently retracted it, saying he “misspoke.”
In a statement to Vice, Hamilton claimed it was all a “thought experiment,” adding the USAF would “never run that experiment” — good cover.
The denial is hard to believe, considering a 2021 United Nations report detailed AI-enabled drones being used in Libya during a March 2020 skirmish in the country’s second civil war.
Retreating forces were “hunted down and remotely engaged” by AI drones laden with explosives “programmed to attack” without the need to connect to an operator, the report said.
Got no game? Rizz up an AI girlfriend
The saddest use of AI might be paying to “rizz up” AI chatbots — that’s “flirting” for you boomers.
A flood of phone apps and websites have cropped up since sophisticated language models, such as GPT-4, were made available through an API. Generative image tools, such as DALL-E and Midjourney, can also be shoehorned into apps.
Combine the two, and you can chat online with a “girl” who’s obsessed with you, complete with a fairly realistic depiction of a woman.
In a tell-tale sign of a healthy society, such “services” are being flogged for as much as $100 a month. Many apps are marketed under the guise of allowing men to practice texting women, another sign of a healthy society.
Most allow you to pick the specific physical and personality traits of your “dream woman,” and a profile including a description of the e-girl is presumably generated from there.
Whatever prompts are used to generate the girl bots’ first-person descriptions, as seen on a few apps and websites, the results always seem overly focused on detailing breast size. Many of the generated girls describe a blossoming porn career.
Another whole subset of apps, invariably named some stylization of “rizz,” are AIs meant to help craft flirty text responses to actual women on dating apps such as Tinder.
Regardless of such misuse, AI devs will march on and continue to bring exciting tools to the masses. Let’s just make sure we’re the ones using it to make the world better, and not to build something out of an episode of Black Mirror.
This article is for general information purposes and is not intended to be and should not be taken as legal or investment advice. The views, thoughts and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.