In the past fortnight, the generation of a deluge of sexually explicit, non-consensual intimate images of women and children by Grok – the chatbot embedded in Elon Musk’s platform, X – has pushed the government into uncharacteristically robust action. The technology secretary has brought into force the new offence of creating sexually explicit deepfakes, passed via the Data (Use and Access) Act last summer, and reiterated her commitment to banning nudification apps via the crime and policing bill, currently in the Lords. The prime minister has given Ofcom, which opened a formal investigation into the platform last Monday (12 January), his full backing to use all its enforcement powers if necessary and promised at PMQs to “strengthen existing laws and prepare for legislation if it needs to go further”. Serious discussions are now underway as to how and when to strengthen the Online Safety Act and, particularly, to fill gaps relating to AI chatbots.
But, despite increasing numbers of its own backbenchers exiting the platform, the government is resisting calls to stop using it for its own departmental communications. The argument given by ministers is that it’s important for the government to “keep a voice on a platform that’s used by so many people”. Even before Musk crossed a very large red line in recent weeks – then called the UK government fascist for complaining about it – the number of UK adults on X had been falling significantly since he bought Twitter in October 2022 and rebranded it as X. There were 13% – or 5 million – fewer adults on the platform in June 2025, compared with two years earlier. With 19 million monthly UK users still on the site (in June last year at least), Ofcom notes that it “remains the largest microblogging service”; its rivals Threads (4.3 million UK users) and Bluesky (2.7 million) lag far behind.
But even if there is an argument that X is still important in terms of reach, what does the government’s ongoing presence there mean in practice? If the government is there, then other organisations have to be there too: charities, NGOs and campaigners who want to engage with, influence or hold the government to account have no choice but to stay. If the government posts there first, before posting on other social media platforms, then that’s where journalists and Westminster watchers have to stay. (It’s notable that – at the time of writing – no ministerial departments have accounts on Bluesky, despite many of the politicos and commentators who thrived on the Twitter of old having migrated there in recent years.) If the government is there, it drives more engagement (and money) to the platform while putting its ministers at risk of their content appearing alongside illegal content or non-consensual imagery: researchers have found Grok-generated explicit content next to posts from, among others, the health secretary and the foreign secretary.
If the government is there, it implicitly accepts that the degradation and dehumanisation of women in public life – including many of its own MPs, such as Jess Asato – is a small price to pay for “keeping a voice” on the platform. If the government is there, it makes a mockery – as the Online Safety Act Network and others argued in an open letter to the prime minister last week – of the commitment the PM made, less than a month ago, that the target to halve violence against women and girls (VAWG) in a decade (a key manifesto commitment) would take “a whole-of-government, whole-of-society effort. It is the first step in a truly national endeavour that prioritises prevention, tackling the root causes of this violence, while relentlessly pursuing its perpetrators and supporting its victims and survivors.”
If the government is there – and, at the very least, if it is not also posting syndicated content on similar, alternative platforms before it goes onto X – it is propping it up as a legitimate, neutral social media platform in the UK, when it is anything but. Musk may now have taken steps, in the face of the threats from the government and Ofcom, to stop Grok from continuing to facilitate industrial-scale abuse of women and girls. But the line has been crossed. If the prime minister and his political colleagues aren’t prepared to vote with their feet now, then departmental comms directors might pause to consider what even baser level of abhorrent abuse or violence has to occur on that platform before that decision is made.
Maeve Walsh is the director of the Online Safety Act Network and a former senior civil servant