Taylor Swift Searchable Again on X After Flood of AI Pornographic Content

2024-01-30 13:40:57

CNN Indonesia

Tuesday, 30 Jan 2024, 20:40 WIB

Searches for Taylor Swift’s name have been restored in X’s search bar following a flood of AI-generated pornographic content. (AP/Natacha Pisarenko)

Jakarta, CNN Indonesia

Taylor Swift’s name can once again be searched in the search bar of X (formerly Twitter) after being blocked for some time.

Searches for Taylor Swift’s name had been disabled amid a flood of fake pornographic content generated with artificial intelligence (AI).

The news was confirmed by X’s Head of Business Operations, Joe Benarroch.

“Taylor Swift search access on the platform has been reactivated, and we will remain vigilant against any attempts to spread this content and will remove it wherever we find it,” Joe Benarroch said in an official statement, as reported by Variety on Monday (29/1).

Since last weekend, Saturday (27/1), searching for the keyword “Taylor Swift” on X had returned an error message reading “There was an error. Please reload.”

However, many users noted that X appeared to be blocking only specific text strings: at the time, queries such as “Taylor AI Swift” could still be searched in X’s search bar.

At the time, Benarroch stated that the action was “temporary and carried out with great care, given the priority of resolving this problem”.

X took this step a few days after sexually explicit AI-generated content featuring Swift’s face went viral on the platform.

The fake pornographic content, known as deepfakes, featuring Taylor Swift’s face began spreading online on Wednesday (24/1) US time. NBC News reported that the deepfake content drew 27 million views on the uploader’s account before the account was suspended.

The episode also drew comment from the actors’ union SAG-AFTRA, which released an official statement on Friday (26/1) US time condemning the images as “disturbing, dangerous and deeply concerning” content.

“The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,” the statement read, as reported by Variety.

The US government also weighed in on the issue. White House Press Secretary Karine Jean-Pierre urged lawmakers to pass legislation banning fake AI-generated pornographic images.

“We are shocked by the reports of the distribution of the images you just mentioned. Of course there should be legislation to deal with this problem,” said Jean-Pierre.

(far/to)
