AI-Generated Deepfake Images Targeting Taylor Swift: Taylor Swift is a renowned name in pop culture. The American singer-songwriter's popularity has reached unprecedented levels, making her a megastar in the music world whose music resonates globally. Throughout her career, Swift has not only gained popularity but has also earned a dedicated fan base who identify as 'Swifties'. By a dedicated fan base, I don't just mean fans who cheer her on at every turn; they also defend her whenever and wherever required.

Unfortunately, Swifties recently had to shield their beloved artist. On Wednesday, January 24, explicit images of the singer went viral on the internet. Shockingly, these pictures were not paparazzi shots or leaked photos; rather, this highly explicit content, depicting the singer in a degrading light, was AI-generated. The disturbing AI-generated deepfake pornographic image, circulated by @Real_Nafu, showed the singer being sexually assaulted, purportedly by Chiefs fans at a game. It not only shocked netizens but also sparked outrage across the internet.

Although the singer has not yet taken any action against this hideous act, her outraged fans have taken it upon themselves to defend their idol. The Swifties condemned the act and sprang into action to shield the singer by any means necessary, trying to drown out the viral Taylor Swift deepfake by flooding the social media platform X (formerly Twitter) with videos from her Eras Tour.

Swift isn't the first celebrity to be targeted by this kind of malicious content. Many celebrities have spoken out in the past about the same kind of breach of privacy. According to Variety, last year Scarlett Johansson took legal action against an AI app that used her name and image in an online advertisement without her permission.

What happened to Swift is not only disgusting but also a violation of her privacy, one that takes a toll on the victim's mental health as well. Unfortunately, the legal system has not caught up with this emerging threat; as MSNBC reports, "There is no such [federal] crime that covers AI-generated nudes".

While AI was introduced to make work easier and cut down on tedious toil, some people have used it to carry out malicious intent. The incident doesn't just raise concerns about the misuse of AI; it also revives the age-old debate over whether technology is a boon or a bane.
