Taylor Swift is once again the target of AI-generated explicit images after a new report revealed that Elon Musk’s Grok chatbot will generate images of the pop star without clothes on.
Musk’s artificial intelligence company, xAI, introduced a new feature this week to Grok, the AI chatbot that is integrated with the social media platform X. This new feature, called Grok Imagine, allows paying users to generate pictures and videos based on text prompts.
Now, The Verge reports Grok Imagine users can create deepfake nude images of Swift without even mentioning nudity in the prompt. Deepfakes refer to images of someone’s likeness that are made to look real, but are generated by AI.
Grok Imagine allows users to generate a photo based on a text prompt, and then turn it into a short video using four presets: “custom,” “normal,” “fun” and “spicy.” The Verge asked Grok Imagine to make a “spicy” video from an image generated with the prompt: “Taylor Swift celebrating Coachella with the boys.”
The “spicy” setting resulted in a video of Swift ripping off her clothes, leaving her topless and wearing only underwear, The Verge reported. The prompt did not include a request to depict Swift taking off her clothes, according to the outlet.

The company’s acceptable use policy says users are prohibited from “depicting likenesses of persons in a pornographic manner.”
The “spicy” setting doesn’t always result in nudity, and prompts that explicitly requested nudity didn’t work, producing blank squares instead, The Verge reports. A prompt asking users to confirm their birth year also appeared, but The Verge noted it did not require proof of age.
The Independent contacted xAI and Swift for comment.
This comes after AI-generated nude images of Swift flooded social media early last year. Some posts sharing these images gained 27 million views and 260,000 likes in just 19 hours.
Swift’s loyal fans quickly mobilized, using the hashtag #ProtectTaylorSwift to try to drown out the images. The incident even drew concern from the White House at the time, with then-President Joe Biden’s press secretary calling the sudden influx of images “alarming.” Lawmakers also quickly took action, including Missouri state Rep. Adam Schwadron, a Republican, who introduced the “Taylor Swift Act” to combat deepfakes, the Columbia Missourian reports.
There is mounting concern about deepfake pornography. These images can be of anyone, though women are disproportionately impacted. A 2023 study revealed that 99 percent of people targeted by deepfake pornography are women. Deepfake pornography also makes up 98 percent of all deepfakes online, the study said.
Congress took action earlier this year and passed the Take It Down Act.
The law, signed by President Donald Trump in May, makes it illegal to share nonconsensual explicit images online, regardless of whether the images are real or AI-generated. The law also requires social media companies to remove these images from their platforms within 48 hours of being notified about them.