Column

AI deepfakes affect us all – but not in the same ways


Our columnist argues that Taylor Swift’s exploitation via AI deepfakes is horrific, but it shouldn’t have taken this long for the conversation about deepfake pornography to enter the spotlight.


I’ve never cared much for new, trendy technology – Bitcoin, virtual reality, non-fungible tokens (NFTs). Say any of these names in conversation and my eyes immediately start to glaze over. But one technology has been making my blood boil since it debuted in our digital culture in 2017: deepfakes, and deepfake pornography in particular.

Admittedly, deepfakes aren’t as new as we’d like to think they are. The deliberate manipulation of photos has been around since at least the 19th century, not long after the invention of the camera. (Perhaps an early warning sign that we don’t always deserve the technology we invent.) Yet as photography has advanced with the rise of digital cameras and handheld phones, so have the ways to manipulate photos and videos, spread misinformation and strip consent from those being depicted.

Deepfake technology is the latest tool to do just that: digitally manipulating people’s likenesses with AI. Almost every celebrity and politician you know has been the victim of deepfakes, including Barack Obama, Tom Hanks, Jennifer Aniston, Ellie Goulding, Scarlett Johansson and Ariana Grande – with women disproportionately affected.

At the end of January, we got to see the worst of it in action when a series of AI-generated pornographic photos of Taylor Swift at a football game went viral on 4chan and X, formerly known as Twitter. One photo was viewed over 47 million times, according to X, and many spread to other platforms like Telegram and Facebook. Later discovered to have been created by a 4chan community using text-to-image AI software by Microsoft, the photos sparked backlash from fans, SAG-AFTRA and even White House Press Secretary Karine Jean-Pierre, who called the images “alarming.”



It quickly became clear that Swift’s influence was no longer confined to the realm of the Kansas City Chiefs or the global economy. In direct response to the images, U.S. Sens. Dick Durbin, Lindsey Graham, Amy Klobuchar and Josh Hawley introduced a bill that would allow victims to sue the creators and distributors of deepfake pornography, paving the way for them to seek justice, protection and safety.


I’m glad to see that legislation is slowly catching up to the fast-moving changes of modern technology. But one of my biggest frustrations in this case is that, as usual, it takes a wealthy white woman like Swift being targeted for people to care about something that was an issue long before it affected her.

I think about Xochitl Gomez, a 17-year-old Mexican American actress – a literal minor, a child – who, just two weeks before the photos of Swift went viral, spoke about deepfake pornographic videos of her being shared on X without her knowledge or consent. While Swift’s photos were taken down within a matter of hours, Gomez’s team has been trying to have the material removed since the beginning of January, without success as of Feb. 12. It’s bound to be an uphill battle, since there are no federal laws protecting victims of nonconsensual pornographic material.

To think that one’s ability to be protected lies at the intersection of class, wealth, privilege and power is deeply infuriating to me; it leaves victims severely traumatized without any real way of pursuing legal action or getting justice, let alone getting the content taken down.

“This is violating. This is dehumanizing. And the reality is that we know, this could impact a person’s employability. This could impact a person’s interpersonal relationships and mental health,” activist and researcher Noelle Martin told Euronews in 2023 about how deepfakes particularly target girls and women as young as 11.

And yet, this is the reality for so many young female celebrities today. From Olivia Rodrigo to Billie Eilish to Rachel Zegler, countless deepfake porn sites advertise this kind of content featuring girls and women who are barely of age, with no legal consequences for the creators, distributors or viewers.

Some might argue that this is the price of being in the public eye; that at a certain point, celebrities don’t have a right to privacy when they’re willingly putting themselves and their images out there to be consumed. Not only does this set a dangerous precedent by blurring the line between consensual and nonconsensual content, but it also perpetuates the idea that not everyone deserves complete bodily autonomy, whether physical or digital.

The reality is that no one deserves that kind of exploitation, but it’s clear that only certain people’s plights have the ability to garner outrage, encourage people to become advocates and get politicians to pay attention.

And let’s not forget that deepfake technology also affects those outside the public sphere. Forget fake Instagram accounts where scammers steal your profile picture, bio and photos to harass people and ask them for money. Just last year, a group of boys at a school in Spain were accused of running images of their female classmates through AI software to make it look like they were topless. They posted the images all over social media.

As it turns out, though, there is no legislation in Spain to bring justice to the victims because of the suspects’ ages. In the months since, the Spanish government has introduced legislation to regulate the use of AI, but only time will tell how effective it will be in practice.

Unfortunately, cases like this are becoming less and less unique. It’s happening all over the world, to girls and women, celebrities and regular people alike, in what’s being characterized as a new form of misogyny (https://www.wsj.com/articles/can-the-government-regulate-deepfakes-11610038590).

So while Swift shouldn’t have been exploited as she was, it shouldn’t have taken her victimization for the country to care about this ongoing issue. If we’re going to make a difference in how we punish creators and distributors of deepfake pornography, we have to start by asking ourselves who we consider worthy of protection, who we want to fight for and what value we place on our own image.

Sofia Aguilar is a first-year grad student in the Library and Information Science program. Her column appears weekly. She can be reached at saguilar07@syr.edu.
