Survivors of online sexual abuse are fighting for justice
A new UK bill to tackle digital abuse won’t protect women or sex workers from harm, say both survivors and rights activists
Madison Elliott had been doing online sex work for a few months when she googled her ‘cam girl’ alias on a whim in 2019. A sea of videos popped up – private sessions that had been secretly recorded and uploaded to underground video-sharing porn sites (‘tube sites’) without her consent. “My stomach sank, I thought ‘oh my god’,” recalled Madison. “I panicked, naturally.”
Madison is one of many image-based sexual abuse (IBSA) survivors who told openDemocracy they feel let down by current controls. Her attempts to remove the videos failed, the police were unable to help, and her physical and mental health suffered.
The UK government is attempting to tackle digital abuse with its new landmark Online Safety Bill. This introduces a “duty of care” for tech platforms that allow users to post content, including social networks such as Facebook, Instagram and Twitter and video-sharing sites such as YouTube, OnlyFans and Pornhub. They would have to prevent the distribution of material promoting racism, terrorism and child abuse, while also protecting children from viewing “harmful” material and adults from “legal but harmful” content.
However, a coalition of women’s and equal rights groups, who are fighting alongside survivors of violence against women and girls (VAWG), says it won’t do much to protect women. Women are 27 times more likely than men to be abused online. One in five women polled by Amnesty International experienced online harassment or abuse, with Black women targeted most.
The latest draft of the bill makes no mention of VAWG. And a recent report on the bill by a joint committee of MPs and peers failed to make specific recommendations to ban online abuse against women and girls, despite recognising how extensive it is.
Madison: ‘It shattered me’
When Madison discovered that her videos had been shared without her consent, she reported them to the cam sites she used, and tried to move on. One site deleted them, another removed them from search results, and the rest ignored her requests.
But a year later, in 2020, things got worse when the clips surfaced on more than 70 other sites, going viral on WhatsApp in her tiny Welsh hometown. “It was out of hand,” she says. “With these WhatsApp chats, it’s just instant.” Some of the videos racked up more than 8,000 views and Madison became consumed with panic, developing severe anxiety and losing weight. “It shattered me, I was shaking.”
She reported the case to the police but says they were unable to help.
“I felt powerless,” she said over a video call to openDemocracy. “I’m under no illusion. I’m an online sex worker and I’m doing these things for money. But if you do something [and it’s used without your consent] you’re still being violated of your consent.”
‘Revenge porn’ was criminalised in England and Wales in 2015, but campaigners say the law is not fit for purpose. Flaws include the requirement to prove that the perpetrator intended to cause distress, and the lack of guaranteed anonymity for victims.
Non-consensual sexual images are proliferating across social platforms and tube sites, with 4,100 cases of IBSA reported to the Revenge Porn Helpline by December 2021.
The proposed bill could be instrumental in helping people like Madison. But Elena Michael, co-founder of anti-IBSA campaign #NotYourPorn, says that it won’t tackle intimate image abuse unless it specifically identifies online VAWG. “If you don’t understand the specific ways women and girls experience harm online, you can’t tackle the problem effectively,” she told openDemocracy.
Campaigners are calling on the government to outlaw VAWG as a specific harm. They also recommend that the bill list all forms of IBSA as harmful, with commercial tube sites specifically addressed, motivation requirements removed and automatic anonymity for victims.
The current law is “inadequate” in tackling online VAWG, explained Rebecca Hitchen, head of policy and campaigns at End Violence Against Women (EVAW). She is “very concerned” about online VAWG not being specifically targeted in the new bill, which she sees as a “hugely missed opportunity”.
“It feels empty because this is a prime opportunity for the government to put their money where their mouth is, to show it cares about women and girls,” she told openDemocracy, noting government ministers’ pledges to tackle VAWG after the recent high-profile murders of Sabina Nessa and Sarah Everard.
The new bill should force platforms to be more responsive to complaints and remove non-consensual content immediately, adds Hitchen. Currently, “it’s up to survivors to find where images are and where they’re hosted. When there are takedowns, they often crop up elsewhere.”
Parliament has to respond to the committee’s report on the draft bill by mid-February.
Ruby: ‘It’s such a beast’
When Ruby (who does not want to use her full name) found out that intimate photos of her had been posted without her consent to an anonymous ‘imageboard’ network in 2020, she immediately reported it to the police.
But the investigation was “problematic from the start”, she says, with officers unsure how to categorise the crime. “We don’t think they did a great job, but looking at the gaps in the law we understand why it wasn’t taken further,” she said.
The site remains online; it changes its URL every few weeks and is difficult to track. “It’s such a beast,” the teacher from the West Midlands told openDemocracy.
Ruby soon realised that dozens of other women in her area were also victims. She started a WhatsApp group with 18 other survivors to push for changes to the law. She is keen for the Online Safety Bill to pre-empt changes in technology, with “watertight” wording and no room for interpretation: “We want whatever goes in to be really explicit and say ‘this is abuse and it doesn’t matter what the intention was’.”
However, free speech advocates and tech companies including Google warn that fear of criminal sanctions will risk service providers “over-removing content at scale”.
Sex workers’ rights
Adult content creators, who face the non-consensual reuploading of their material as well as adult content bans at the whim of tech platforms, are also concerned.
Venus, a London-based sex worker who co-runs activist organisation Sex and Rage UK, fears that the vague language in the draft bill invites censorship. “They’re basically saying anything that can be deemed harmful or psychologically damaging will be removed,” she says. “But that could be anything.”
The potential reporting of adult content as “harmful” could prevent sex workers from doing their jobs, risking psychological harm and livelihoods, she told openDemocracy. “[Removals are] really harmful for the public perception of sex work. It continues to push sex work underground and validates stigma against sex workers and sexual content.”
#NotYourPorn’s recommendations consider the experiences of sex workers, who “don’t even get a glance in this legislation”, according to Michael. “But the more people are willingly ignorant about the industry, the more exposure to exploitation and harm you put [sex workers] in.”
“It shouldn’t have to be a trade-off for one person’s rights against another,” Michael added. “We should be setting the foundations to serve everyone […] the bill does not understand [that].”
Tackling online abuse while balancing multiple interests is a long and complicated process. Despite feeling let down and traumatised, IBSA survivors are determined to speak out, and feel that people are finally starting to listen.
Still, their fight isn’t over. “My videos are constantly being reuploaded,” said Madison. “I’m all over the internet.” But she can’t bring herself to keep requesting takedowns. “It’s so retriggering. Something needs to be done.”