The UK government announced a new law on Tuesday aimed at combating the creation of AI-generated sexually explicit deepfake images. Under the proposed legislation, anyone who creates such images could be prosecuted and face an unlimited fine, even if the images are never widely shared, so long as they were made with the purpose of causing distress to the person depicted. The move is part of a broader government effort to strengthen legal protections for women.

Over the past decade, advances in deep learning image-generation technology have made it increasingly easy for people with ordinary personal computers to fabricate deceptive pornographic material by swapping in the faces of people who have not consented. The practice gave rise to the term “deepfake” around 2017, named after a Reddit user known as “deepfakes” who shared AI-manipulated pornographic content on the platform. Since then, the term has broadened to cover entirely new images and videos synthesized from scratch by neural networks trained on photos of their subjects.

The problem is not confined to the UK. In March, deepfake nude images of female students in Florida led to charges against two boys, aged 13 and 14. The public availability of image-synthesis models such as Stable Diffusion since 2022 has heightened the urgency among US regulators to deter (or at least penalize) the creation of non-consensual deepfakes. The UK government is now pursuing a similar course.

“Under the proposed legislation, individuals who produce these repugnant images without consent could face criminal charges and potentially an unlimited fine. If such images are subsequently shared widely, the offenders may even face imprisonment,” according to a statement from the UK Ministry of Justice. “This new law will establish that creating sexually explicit deepfakes, even with no intention of sharing them but solely to cause alarm, embarrassment, or distress to the victim, constitutes a criminal offense.”

The contentious Online Safety Act recently criminalized the sharing of non-consensual deepfake images. The new bill, which must still pass Parliament before taking effect, would for the first time make it illegal in the UK to create sexually explicit deepfakes of non-consenting adults (the distinction here being between sharing and creating). The government says that existing law already covers the creation of sexual deepfakes involving minors.

The government is also considering strengthening existing laws so that offenders can be charged with both creating and sharing deepfake content, which could lead the Crown Prosecution Service (CPS) to pursue more severe penalties.

In a statement, Minister for Safeguarding Laura Farris MP said: “The creation of deepfake sexual images is contemptible and utterly unacceptable, regardless of whether the images are disseminated. This new offense sends a clear message that fabricating such material is unethical, often misogynistic, and illegal.”