Taylor Swift deepfakes on X falsely depict her supporting Trump

One manipulated video appeared to originate from a pro-Trump X account that has over 1 million followers


Taylor Swift is being targeted again by deepfakes, with supporters of Donald Trump posting manipulated media falsely showing her supporting Trump and engaging in election denialism.

The viral images and clips on X, which have been viewed millions of times, come just over a week after fake nude images of Swift went viral on the platform, putting a spotlight on X’s inability to control the spread of malicious inauthentic media. X briefly blocked searches for “Taylor Swift” after that incident and has continued to struggle to contain Swift deepfakes since.

Some of the deepfakes showing Swift supporting Trump carry content labels warning that the media is inauthentic, but many reposts and re-shares of the same media did not initially carry those labels. Deepfakes are pieces of real media that have been manipulated, often with the help of artificial intelligence; they tend to target celebrities, and high-profile women in particular.

The most prominent pro-Trump deepfake video of Swift uses a recent video of her posing for cameras on the Grammys red carpet. But the video is edited to show her holding a sign that says “Trump won” and “Democrats cheated!” One post containing the video on X (formerly Twitter) has over 10.3 million views, according to X’s metrics. It carries a community note saying that the video is edited and that Swift was not really holding the sign.

NBC News found 13 other posts containing the fake video. Eight of them did not have community notes or manipulated media labels. One of the unlabeled posts had over 72,000 views as of publication, according to X’s metrics. None of those videos had been removed as of Thursday evening despite appearing to violate X’s policies against manipulated media, though some later received labels noting that they contained manipulated media.

A representative for X wrote: “The team was made aware of this AI-generated video and we took action on almost 100 posts on February 4, 2024, under our Synthetic and Manipulated Media policy. We’re actively monitoring and when posts are found they will also be labeled as manipulated media.”

The manipulated media appeared to originate from a pro-Trump X account with over 1 million followers that is enrolled in X Premium, which gives the account a blue verification check mark and the ability to make money from ads. Many of the video posts viewed by NBC News link back to that account, which is one way for users to share videos on X. But while the original post has a manipulated media label, the reposts do not carry the label with them.

The account from which the manipulated media appeared to originate has also posted numerous other deepfakes, some also involving Swift. A post on Monday contained an edited video of Swift’s album of the year acceptance speech at the Grammys, which appeared to use voice-cloning technology to make it sound as though she was saying “Trump won,” “F--- Joe Biden” and “Trump 2024 bit----, let’s go.” That post has over 750,000 views, according to X’s metrics, and it did not carry a manipulated media label or a community note indicating it was fake until after NBC News reached out to X for comment on Wednesday.

X appears to be the primary mainstream social media platform where manipulated media showing Swift supporting election denialism is circulating. A search for “Taylor Swift Trump” on Instagram surfaced one post containing the video, with no content labels or other indication that it was fake. According to Instagram’s metrics, the post was viewed more than 6,700 times.

On Facebook, a search for “Taylor Swift Trump” did not return the fake video, but it did return an AI-generated fake image, posted in October, showing Swift and Trump on a “date.” Meta, which owns both Instagram and Facebook, placed a filter reading “Altered photo/video,” with a link to a fact-checking article, over the Instagram post after NBC News reached out for comment.

A search for “Taylor Swift Trump” on YouTube showed that the fake video of Swift holding the Trump sign had been posted 10 times in the past two days and that the video using the voice clone of Swift had been uploaded once. Five of those 11 videos had zero views as of publication. The most-viewed upload, showing Swift holding the fake sign, had 6,800 views as of publication. None of the videos were labeled as manipulated or false media. YouTube did not immediately respond to a request for comment.

A search for “Taylor Swift Trump” on TikTok brought up two videos containing the fake Swift edit. One had fewer than 1,000 views. The other had 14,800 views, and its creator responded to comments asking about its authenticity by saying “It’s real” and calling it a “publicity stunt.” Both TikToks were removed after NBC News reached out for comment.

The social media platforms that NBC News found hosting the fake Swift videos without content warnings have all struggled to moderate disinformation — AI-generated and otherwise. For example, a January NBC News investigation found that a dozen YouTube channels had posted fake news about Black celebrities, some of which used AI-altered images in thumbnails and AI text-to-speech technology.

This story first appeared on NBCNews.com.
