Holding tech companies accountable for tech-facilitated sexual violence

On the 20th of January 2026, School of Sexuality Education officially deactivated our account on X, formerly known as Twitter. This decision was made in response to personal accounts of X’s new AI bot ‘Grok’ being used to forcibly and non-consensually remove the clothes of women and girls in photographs. As an organisation that works to prevent gender-based sexual violence, we no longer want to be part of a platform that has enabled exactly this kind of abuse.

The harms enacted on X reflect a concerning trend in the use of AI for online sexual abuse. It is worth remembering that previous developments in technology have also brought new ways to perpetrate harm. Tracking apps can be used for safety, but also for manipulation. Social media can be used for genuine connection, but also to spread hate and leak personal information. In our collaborative research project with Prof. Jessica Ringrose, Prof. Kaitlyn Regehr and Betsy Milne, we found repeated patterns of technology facilitating harms such as image-based sexual harassment and abuse among young people. Maximising visibility and spreadability are the driving forces of social media, and this comes with a great risk of reproducing and spreading harm. It is something that leaders in social media have a responsibility to actively regulate as technology develops, to protect the rights of the people who use their services.

Action was not taken soon enough to stop the sexual abuse enabled by X’s AI bot. It took pressure from our government to bar this feature, and even then the move came nearly half a year after the first reports emerged of ‘Grok’ being used to create deepfakes of women. On the part of both the UK government and those working at X, this was not fast enough: the delay allowed the online sexual abuse to continue and to affect many more people. It is clear that those with decision-making power at X do not have the best interests of its users at heart.

Image by Evie K

We did consider what it would mean for us to stay on a platform like X and act as a counter-voice in that space. But our focus is on the work we do with young people and educators, delivering in-person inclusive, anti-racist, feminist relationships, health, and sex education (RSHE). We use social media as a space of solidarity, to provide resources and to raise awareness of key issues surrounding RSHE. This does not work in an unregulated space full of hate, misinformation, and mischaracterisation, especially when this very abuse goes unaddressed. A big part of our work is teaching empathy, equality, and accountability — values that a platform like X has repeatedly shown it does not share.

Many educators and caregivers find the news about platforms like X and the use of AI particularly alarming, and we are right to be concerned. That concern is what drives us to hold these tech companies accountable and to demand action from our elected representatives. But in our personal lives, there are also many ways we can reclaim our autonomy from these technological spectres.

Firstly, we must recognise that young people have a right to be online. Like any space, the online world carries potential for harm. But our study with young people on their experiences online showed that it can also offer opportunities to access helpful resources, deepen relationships, and reach supportive communities. Keeping young people safe does not mean banning phones and social media. Bans may make life easier for school management, but they erode students’ trust and their ability to access support.

Secondly, we must teach young people how to be active bystanders. This means building their understanding of how they can look out for each other as well as themselves. Teaching proper media literacy assists this, helping them navigate online spaces safely and consensually. It’s crucial in preventing harm, teaching young people to be responsible members of their communities.

Finally, we must invest in inclusive, comprehensive RSHE. A key finding of our study was that RSHE is essential for equipping young people to navigate online sexual harm and abuse. Most importantly, better digital sex education is something young people are calling for themselves. They understand that knowing their rights and how harm is perpetrated allows them to become another voice that can identify when spaces, offline and online, are becoming abusive. When big social media giants refuse to regulate the spaces they control, our young people want the tools to self-regulate and keep each other safe.

Here are some things you can do to help equip your child to navigate online spaces: 

  • Engage in judgement-free, continuous conversation with them about online spaces, their interests and concerns as well as harms these spaces may pose.

  • Help them to understand these online spaces not just as social media but also as businesses with interests other than your child’s wellbeing.

  • Teach the importance of consent broadly to include uses of technology.

  • Be open to considering your own uses of online spaces—we are all learning constantly as technology develops!

Speak to one of our unembarrassable team members to find out how our work can support you.