
Your nudes might be on Pornhub without you even realising – here's how to protect your private pictures

In the age of ‘send nudes’, it’s time we took our cyber safety seriously.

Aysha* was watching a film with her flatmate on a Saturday night last year when she received an Instagram message from a man she didn’t know.

It said: ‘I believe explicit photos and videos of you and your boyfriend have been leaked online.’

“I get a few silly messages on social media, as everyone does, so I thought he just wanted my attention,” Aysha recalls. “Plus I didn’t have a boyfriend, so I thought it couldn’t be about me. I just ignored it.”

The man messaged again: ‘Genuinely, I think you need to see this.’

Attached to the message was a link, but assuming it was spam or a virus, Aysha continued to ignore it. Then he messaged again.

“He sent a screenshot which showed about 30 explicit photos and videos – all of me. My body, my face. I went into a state of shock; I really don't know how to explain that feeling. It was like I wasn’t in control of my body, I was just shaking uncontrollably and couldn’t catch my breath.”

The private content – which Aysha had taken five years prior when she was 21 with a guy she was seeing – had been accessed via an iCloud hack and shared on various porn sites, with her full name attached. When she checked online, she was trending on Pornhub.

Sending nudes is an increasingly common part of modern dating. In fact, 43% of young women have sent intimate or sexual images (for men, that’s 27%), according to a report published last month by Refuge. But there’s also been an unmistakable rise in these intimate images ending up online against the sender’s wishes. Figures obtained by the BBC from police forces across England and Wales show that the number of cases soared by 117% between 2015/16 and 2018/19, from 852 to 1,853.

You may know this sharing of private sexual imagery without consent as ‘revenge porn’, but there’s a growing demand to replace this terminology with language that is more accurate and inclusive, such as ‘non-consensual pornography’ or ‘image-based sexual abuse’. And since the coronavirus outbreak, the frequency of such abuse has risen even higher.

The Revenge Porn Helpline – launched in 2015 following the introduction of section 33 of the Criminal Justice and Courts Act, which made intimate image abuse a criminal offence for the first time – opened more than 200 cases in the first four weeks of lockdown, more than in any similar period in its five-year history. By early August, they’d received a staggering total of 1,700 cases, already surpassing the number for the whole of 2019.

“We’ve seen a huge increase in image-based sexual abuse against women during lockdown,” explains Dr Kelly Johnson, Assistant Professor at Durham University’s Department of Sociology, who specialises in research on the subject. “More communication is taking place digitally, and that has likely increased the exchanging of sexual images.”

But Dr Johnson is quick to assert that we mustn’t victim-blame, something too often seen in these cases with retorts like, ‘Just don’t take nudes in the first place’. “The problem here isn’t that people are choosing to send these images; it’s that someone has chosen to share them against their will, breaching their consent and trust,” she says.

Last month, former congresswoman Katie Hill spoke about how, in October 2019, her affair with a campaign staff member was exposed after nude photos of her began circulating online. She resigned two weeks later, less than a year into her term. Hill was California’s first openly bisexual woman to be elected to Congress and, in her first ever race, flipped one of the last Republican footholds in the county to the Democrats, all at the age of just 31. But sadly, it is the leaking of her nudes that she is remembered for. And as she revealed last month, it is what caused her to contemplate suicide.

“As a society, we need a whole shift to change the way we view sexual violence now, because it does involve technology and digital imagery,” says Dr Johnson. “Too often, victim-survivors are met with dismissals like, ‘It’s just a photo’ or ‘Just turn your phone off and move on’, but image-based sexual abuse can devastate lives. We talk about the idea of ‘social rupture’; when this massive breach of trust causes a loss of faith in others, it can be incredibly distressing and isolating. We shouldn’t underestimate the impact that this violation of consent and sexual autonomy can have on people.”

This is certainly true of Aysha’s experience. “For the two weeks after my images were leaked, it was just panic attack after panic attack,” she says. “I couldn’t sleep alone, and had to keep a bucket next to my bed. The police came round and took notes but didn’t have any advice for me. I kept messaging Pornhub and just wasn’t getting anywhere, until I lied that I was underage in the content, then it finally got removed.”

Yet to Aysha’s horror, the photos and videos were reuploaded within a matter of days.

“Pornhub have a ‘download’ button right next to the content so as soon as it hits the website, anyone can download it then reupload it,” says activist Kate Isaacs, who launched the #NotYourPorn campaign in May 2019 to raise awareness and hold the porn industry to account. As far as Aysha is aware, her nudes and videos are still being circulated online.

A spokesperson for Pornhub tells GLAMOUR that they have an “extensive team of human moderators that work around the clock to review and remove illegal content”, and that they use Vobile, a piece of software which victims can use to ‘fingerprint’ the non-consensual content, and which then “scans new uploads for potential matches to unauthorised materials”. But according to Kate, this isn’t good enough. “Vobile fundamentally doesn’t work because 1) you need the original footage to be able to fingerprint it, which most victims don’t have access to, and 2) it is so user-unfriendly that many can’t even work out how to fingerprint. Pornhub entirely shift the responsibility onto victims.”
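For readers wondering what ‘fingerprinting’ actually involves, the general technique is perceptual hashing: an image or video frame is reduced to a short signature that survives resizing and re-encoding, and new uploads are compared against the signatures of known non-consensual material. The sketch below is not Vobile’s system – it’s a minimal illustration of the general idea, using the open-source imagehash library, with hypothetical file names and a match threshold chosen purely for demonstration.

```python
# pip install pillow imagehash
# A minimal sketch of perceptual fingerprinting – an illustration of the general
# idea behind content-matching tools, not Vobile's actual implementation.
import imagehash
from PIL import Image

# Fingerprint a frame of the reported content once...
original_fingerprint = imagehash.phash(Image.open("reported_content_frame.jpg"))  # hypothetical file

# ...then compare a frame from each new upload against it.
upload_fingerprint = imagehash.phash(Image.open("new_upload_frame.jpg"))  # hypothetical file

# A small Hamming distance means the two images are near-identical,
# even after resizing or re-encoding.
if original_fingerprint - upload_fingerprint <= 8:  # threshold is illustrative only
    print("Possible match – hold the upload for review before it goes live")
```

As Kate points out, the catch is the very first step: a system like this only works if the victim can supply the original material to fingerprint in the first place.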

As for the human moderators, Pornhub reportedly told #NotYourPorn last summer that they had 12 people checking content. “That is a ridiculously low number given the amount of content that is uploaded every minute,” says Kate, “especially considering that non-consensual content would need to be removed before anyone can view it and press that ‘download’ button.”

If this is the level of safeguarding that a massive company like Pornhub have in place, it must be even more difficult to get non-consensual content removed from smaller porn sites, of which millions exist. There are also countless reports of women struggling to get their nudes removed from social media sites like Facebook and Instagram. But given that the 2015 legislation made image-based sexual abuse a crime, how are websites still getting away with it?

“The problem is that the law only applies to individuals, not commercialised companies, so it’s actually completely legal for porn sites to be hosting and profiting from non-consensual content,” explains Kate. “This is proven in the categories that are openly listed on porn sites; Pornhub, for example, have categories like ‘revenge porn’, ‘leaked sex tapes’, ‘secret camera filming’, ‘stolen Snapchat teens’ – and it’s dressed up as ‘fantasy’. In reality, there’s no way for them to tell if that content is consensual or not. That’s why we’re fighting for regulation of the commercialised porn industry, as there’s currently no regulatory body that holds them accountable."

Even when it comes to charging an individual with sharing non-consensual nudes, there’s another legal loophole making it almost impossible for victims to get justice. “The legislation [‘Disclosing private sexual photographs and films with intent to cause distress’] means that you have to prove that someone shared it with the intent to cause distress to the victim,” says Dr Johnson. “So if you report someone for sharing your nudes against your will, they could just say to the police, ‘I didn’t think she’d ever find out’ or ‘I was just doing it for a laugh’. This problematic evidential threshold means that perpetrators can come up with a half-plausible defence, and avoid being charged.”

Many victims also decide not to go ahead with charges, whether through lack of police support or fears regarding anonymity. Like Aysha, Isabel* found out that her nudes had been shared to an online forum when someone messaged her on social media. “I knew my ex had posted the images and was sharing the link around because a mutual friend told me on Facebook,” she says. “I felt so degraded; I can’t put into words what that sort of violation does to your self-worth and your mental health. My sister told me it was illegal and that I needed to go straight to the police.”

But when Isabel looked into it, she discovered that the law classifies the non-consensual sharing of nudes as a communications offence, not a sexual offence, which means victims aren’t granted anonymity. “I was lucky in a sense because my images weren’t, as far as I’m aware, posted to a big, famous site like Pornhub,” explains Isabel, “but it meant that I didn’t end up going to the police because of the risk of my identity being made public and even going viral. I just couldn’t face it, and I was already so exhausted mentally.”

In June 2019, the government announced a review into the law, which could mean victims are granted automatic anonymity in British courts, like other sexual abuse victims. But the Ministry of Justice said the review won’t report back until summer 2021.

Though as is clear from Aysha’s case – and the cases of so many other women – it isn’t always a vengeful ex who posts nudes non-consensually; it can be someone you don’t even know. Chances are, you remember the celebrity photo leaks of 2014, when almost 500 private images – including those of Jennifer Lawrence, Kaley Cuoco and Kirsten Dunst – were posted online following what is believed to be an iCloud hack.

“Hackers were able to access iCloud accounts using the victims' emails and passwords,” says Caleb Chen, an internet privacy advocate at Private Internet Access. “Where they got this information from is anyone's guess but most people think it's due to phishing attacks.”

So are everyone’s nudes automatically in the cloud, and how do we stop hackers from getting at them?

“When you take a photo on an iPhone, it encourages you to back it up on iCloud – a bunch of servers run by Apple – and many users have accepted having all their photos backed up onto the cloud, whether during their phone set-up or later, and then forgotten about it,” explains Chen. “When the photo is sent to the cloud, it is generally encrypted in some way so the cloud provider can't see what the contents are. The issue is that cloud back-ups can be accessed with an email and password, and those are often not as secure as people think.”

Aside from making sure your password is strong and unique (many of us reuse the same password across multiple logins, meaning a single data breach can expose every account that shares it), Chen’s advice is clear. “Definitely don’t store nudes in the cloud,” he says. “Controlling what goes into the cloud is generally as easy as making sure that your phone isn’t automatically backing up into the cloud.”

“If you do store your nudes in the cloud, it’s possible to upload them in an encrypted file format that requires a password to unlock, so even if hackers make it into your cloud account, as happened to the celebrities in 2014, they’re unable to see anything without the password (here’s how to do it with Dropbox). Plus, do always back up your phone yourself rather than going to a shop – even the non-dodgy corner shops are risky! – because any time you hand your device to somebody else, there’s a risk of a data breach.”
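The password-protected encryption Chen describes can happen before anything ever leaves your device. Below is a minimal sketch in Python using the widely used cryptography library: it derives a key from a passphrase and writes an encrypted copy of a file, which is what you would upload instead of the original. The file names and iteration count are illustrative; built-in options such as an encrypted archive, or the Dropbox guidance Chen mentions, do the same job without any code.

```python
# pip install cryptography
# A minimal sketch: encrypt a file with a passphrase before uploading it to the cloud.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def encrypt_file(path: str, passphrase: str) -> str:
    """Write an encrypted copy of `path` and return the new file's location."""
    salt = os.urandom(16)  # random salt, stored with the ciphertext so the file can be decrypted later
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))  # Fernet expects a base64-encoded key
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    encrypted_path = path + ".enc"
    with open(encrypted_path, "wb") as f:
        f.write(salt + ciphertext)
    return encrypted_path


# Hypothetical usage – upload the ".enc" copy and keep the original off the cloud entirely.
encrypt_file("photo.jpg", "a-long-unique-passphrase")
```

Without the passphrase, anyone who breaks into the cloud account sees only scrambled data – which is exactly the protection Chen is describing.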

This advice is something Aysha wishes she’d known at the time. “I didn’t understand at first because the images had been deleted from my iPhone – I was like, ‘How can they steal images I don’t have?’ – but a hacker had got hold of my iCloud details, and the images were still on the cloud. That’s something you’re never told. We need to talk about that more.”

Opening up the conversation around both private intimate imagery and the public porn industry is something that urgently needs to happen. “Until we start talking about porn and accept it as an industry, the law won’t change and there will be no government body regulating it,” says Kate. “The lack of open conversation also means many victims don’t want to come forward and admit they’ve had their nudes leaked, because they feel so ashamed and alone. That has got to change.”

Aside from making sure our intimate content is safe, that is Aysha and Isabel’s main message for other women – if this type of abuse happens to you, you are not alone. “This is happening to women everywhere,” says Aysha. “But we have to remember there is nothing to be ashamed of – we are the victims. Image-based sexual abuse is a massive breach of consent, and it’s time it was taken seriously.”

If you have had your intimate images shared without your consent, remember that you are not alone and there is help available. Get in touch with the Revenge Porn Helpline on [email protected]. There is also a step-by-step guide on notyourporn.com which should be followed before taking any action.

Due to the huge rise in cases mentioned in this article, the Revenge Porn Helpline (who are funded by the Home Office) urgently need more resources. If you would like to show your support, visit revengepornhelpline.org.uk to donate.

Written by Ali Pantony.

This article originally appeared on Glamour UK.
