Being a woman online is scary. In the digital world, as in the real world, harassment and abuse affect women and people of marginalised genders every day.
What happened to Taylor Swift this week is a prime — and terrifying — example. AI-generated, pornographic images of Swift went viral on X (formerly Twitter), with one such post garnering more than 45 million views. Outraged fans quickly intervened, flooding the platform with positive posts and images of the singer attached to search terms like "Taylor Swift AI" and "Taylor Swift deepfake". Swift's dedicated fanbase also got the hashtag #ProtectTaylorSwift trending, with thousands of posts condemning the nonconsensual images.
The incident certainly isn't an isolated one. Swift, one of the most recognizable figures on the planet, is the latest to have deepfakes weaponized against her. This has happened to K-pop stars, TikTok creators, journalists, and high school girls.
What is deepfake porn?
Deepfake porn consists of artificially created images or videos generated by a specific kind of machine learning. These synthetic visuals superimpose people's likenesses into sexual acts without their consent. Sharing it is a form of image-based sexual abuse, and has been criminalized in some countries. Much like revenge porn, deepfake porn causes harm to those it depicts, whether psychologically, personally, or professionally.
A 2023 study from Home Security Heroes, a research firm focusing on identity theft and digital harm, found that deepfake porn makes up 98 percent of all deepfake videos online. Further, 99 percent of deepfake targets are women.
Where does the law stand on deepfake porn?
When it comes to the nonconsensual sharing of explicit images, some countries have implemented, or are implementing, laws to protect survivors.
In England and Wales, sharing deepfake porn has been criminalized since June 2023, with the UK government announcing a crackdown on "abusers, predators and bitter ex-partners who share intimate images online without consent of those depicted."
In the U.S., 48 states and the District of Columbia currently have anti-revenge porn laws. Some states have been working to update their language to include deepfake porn under this umbrella, including Illinois, Virginia, New York, and California. However, regulations vary by state and, as some have pointed out, certain laws fail to address the pressing issue of technology's role in the creation and proliferation of such images and videos.
However, many governments are failing to tackle the problem. "Most governments aren’t taking action. Most don’t have laws, or their laws are full of loopholes," according to #MyImageMyChoice, a campaign dedicated to amplifying the voices of those who have encountered image-based sexual abuse. "Most countries don’t have a framework around who is responsible for policing online spaces."
How has Big Tech approached this problem?
X, where the pictures of Swift have been distributed widely, explicitly forbids sharing "synthetic, manipulated, or out-of-context media." This includes content that deliberately intends to "deceive people" or that falsely claims to depict reality. The company says it has "a zero-tolerance policy towards such content", according to a post from its official safety account.
Other platforms, like Reddit, also have policies preventing the sharing of intimate or sexually explicit media without a person's consent.
Despite these policies, social media platforms and Big Tech have been put to the test when it comes to detecting and preventing deepfakes. In 2021, Meta implemented a new tool to do so, partnering with the UK Revenge Porn Helpline's platform StopNCII.org. More recently, the parent company of Facebook and Instagram announced that any digitally altered images pertaining to social, electoral, and political issues must be labelled — a policy intended to protect the upcoming elections that some of the world's biggest democracies are gearing up for.
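StopNCII.org is generally described as working from perceptual hashes computed on the survivor's own device, so a platform can recognize re-uploads of an image without ever receiving the image itself. As a rough illustration only — this is a toy "average hash" sketch with made-up pixel data, not the production-grade hashing such services actually use — the idea looks something like this:

```python
def average_hash(pixels):
    # Toy perceptual "average hash": one bit per pixel, set when that
    # pixel is brighter than the image's mean brightness. Near-duplicate
    # images (recompressed or lightly edited) flip very few bits.
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(a, b):
    # Number of differing bits between two hashes; a small distance
    # suggests the underlying images are near-duplicates.
    return bin(a ^ b).count("1")

# Hypothetical 4x4 grayscale images, flattened to lists of 0-255 values:
# an original, a slightly re-encoded copy, and an unrelated image.
original  = [200, 10, 220, 15, 30, 240, 20, 210,
             190, 25, 230, 5, 40, 250, 10, 200]
reupload  = [198, 12, 218, 17, 32, 238, 22, 208,
             192, 27, 228, 7, 42, 248, 12, 198]
unrelated = [128] * 8 + [60, 200] * 4

d_same = hamming_distance(average_hash(original), average_hash(reupload))
d_diff = hamming_distance(average_hash(original), average_hash(unrelated))
print(d_same, d_diff)  # → 0 8: the re-upload matches; the unrelated image does not
```

Only the hash ever needs to leave the device, which is why this approach is attractive for intimate-image abuse: matching can happen server-side against a database of hashes submitted by survivors, with no copy of the image in the platform's hands.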
Many platforms struggle to contain such content. The images of Swift, notably, were created and spread in a Telegram group chat, as discovered by 404 Media. Telegram has failed to prevent this kind of content in the past. Others have too: just this month, NBC News found that nonconsensual deepfake porn featuring the likenesses of female celebrities appears at the top of search results on engines like Google and Microsoft's Bing.
Here's the real question: how does this keep happening?
The alarming reality is that AI-generated images are becoming more pervasive, and presenting new dangers to those they depict. Exacerbating this issue is murky legal ground, social media platforms that have failed to foster effective safeguards, and the ongoing rise of artificial intelligence. International women's rights organization Equality Now detailed these factors in a January 2024 report, calling for "urgent and comprehensive responses from technological innovation, legal reform, and societal awareness" to tackle the undeniable rise of deepfake porn.
The circumstances surrounding Swift's case highlight the reality that this can happen to just about anyone, and before this kind of digital crisis worsens, it's time for a change.
If you have experienced sexual abuse and are based in the U.S., call the free, confidential National Sexual Assault hotline at 1-800-656-HOPE (4673), or access 24/7 help online by visiting online.rainn.org. If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative's 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.
If you are based in the UK and have experienced intimate image abuse (aka revenge porn), you can contact the Revenge Porn Helpline on 0345 6000 459. If you have experienced sexual violence and are based in the UK, call the Rape Crisis helpline on 0808 802 9999.