Ofcom has set out the next steps in its investigation into X, and the limitations of the UK's Online Safety Act in relation to AI chatbots.
Ofcom was one of the first regulators in the world to act
on concerning reports of the Grok AI chatbot account on X being used to create and share demeaning sexual deepfakes of real people, including children, which may amount to criminal offences.
After contacting X on 5 January,
giving it a chance to explain how these images had been shared at such scale, we moved quickly to launch a formal investigation on 12 January into whether the company had done enough to assess and mitigate the risk of this imagery spreading on its social
media platform, and to take it down quickly where it was identified.
Since then, X has said it has implemented measures to try to address the issue. We have been in close contact with the Information Commissioner's Office (ICO), which
is launching its own investigation. Other jurisdictions have also launched investigations in the weeks since we opened ours, including the European Commission on 26 January.
Our investigation remains ongoing and we continue to
work closely with the ICO and others to ensure tech firms keep users safe and protect their privacy.
Not all AI chatbots are regulated
Broadly, the Online Safety Act regulates user-to-user services,
search services and services that publish pornographic content.
Chatbots are not subject to regulation at all if they:
only allow people to interact with the chatbot itself and no other users (i.e. they are not user-to-user services);
do not search multiple websites or databases when giving responses to users (i.e. are
not search services); and
cannot generate pornographic content.
We are not investigating xAI at this time.
When we opened our investigation into X, we said we were assessing whether we should also investigate xAI, as the provider of the standalone Grok service. We
continue to demand answers from xAI about the risks it poses. We are examining whether to launch an investigation into its compliance with the rules requiring services that publish pornographic material to use highly effective age checks to prevent
children from accessing that content.
Because of the way the Act relates to chatbots, as explained above, we are currently unable to investigate the creation of illegal images by the standalone Grok service in this case.
Where we are in our X investigation
In our investigation into X, we are currently gathering and analysing evidence to determine whether X has broken the law, including using our formal
information-gathering powers. The week after we launched our investigation, we sent legally binding information requests to X, to make sure we have the information we need from the company, and further requests continue to be sent.
Firms are required, by law, to respond to all such requests from Ofcom in an accurate, complete and timely way, and they can expect to face fines if they fail to do so.
We must give any company we investigate a
full opportunity to make representations on our case. If, based on the evidence, we consider that the company has failed to comply with its legal duties, we will issue a provisional decision setting out our views and the evidence upon which we are
relying. The company will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.
We know there is significant public interest in our investigation into X. We are
progressing the investigation as a matter of urgency. We will provide updates and will be as open as possible during this process. It is important to note that enforcement investigations such as these take time, typically months.
We must follow strict rules about how and when we can share information publicly, as is the case for any enforcement agency, and it would not be appropriate to provide a running commentary about the substantive details of a live
investigation. Running a fair process is essential to ensuring that any final decisions are robust, effective and able to withstand challenge.
While in the most serious cases of ongoing non-compliance we can apply for a court order requiring
broadband providers to block access to a site in the UK, the law sets a high bar for such applications, and a specific process must be followed before we can do this. It would be a significant regulatory intervention and is not one we are likely to make
routinely, given the impact it could have on freedom of expression in the UK.