In a disconcerting turn of events, sexually explicit AI-generated images of Taylor Swift have circulated on X (formerly Twitter) over the past day, underscoring the rampant proliferation of AI-generated fake pornography and the significant challenges in preventing its dissemination.
One particularly prominent instance on X garnered over 45 million views, 24,000 reposts, and hundreds of thousands of likes and bookmarks before the verified user responsible had their account suspended for violating platform policies. The post lingered on the platform for approximately 17 hours before being taken down.
As discussions about the viral post ensued, the explicit images began to spread across other accounts, and many remain visible. The incident pushed the term “Taylor Swift AI” into trending topics in some regions, amplifying the visibility of the images to wider audiences.
A report from 404 Media suggests that the origin of these images may be traced back to a group on Telegram, where users share explicit AI-generated content, often created with tools like Microsoft Designer. Members of the group allegedly joked about the images going viral on X.
X’s policies explicitly prohibit synthetic and manipulated media as well as nonconsensual nudity, making such content a violation of the platform’s guidelines. While X, Taylor Swift, and the NFL have not responded to requests for comment, X issued a public statement nearly a day after the incident began. The statement, while not explicitly mentioning the Swift images, emphasized the platform’s commitment to removing non-consensual nudity and taking action against the accounts responsible.
Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We're closely…
— Safety (@Safety) January 26, 2024
Swift’s fan base has criticized X for allowing many of the explicit posts to persist for an extended duration. In response, fans flooded hashtags associated with circulating the images with messages promoting authentic clips of Swift performing, aiming to overshadow the dissemination of the explicit fakes.
This disturbing incident highlights the real challenges in combating deepfake porn and AI-generated images of real individuals. Some AI image generators have restrictions that prevent the creation of nude, pornographic, and photorealistic celebrity images, but not all services offer such safeguards. The responsibility for curbing the spread of fake images often falls on social platforms, a task made harder for X, which has struggled to maintain effective moderation capabilities.
Notably, X is currently under investigation by the EU for allegations of being used to disseminate illegal content and disinformation. The platform is also reportedly being questioned regarding its crisis protocols following instances of misinformation about the Israel-Hamas conflict being promoted on the platform.