Brace for yet another expansion to the UK’s Online Safety Bill: The Ministry of Justice has announced changes to the law which are aimed at protecting victims of revenge porn, pornographic deepfakes and other abuses related to the taking and sharing of intimate imagery without consent — in a crackdown on a type of abuse that disproportionately affects women and girls.
The government says the latest amendment to the Bill will broaden the scope of current intimate image offences — “so that more perpetrators will face prosecution and potentially time in jail”.
Other abusive behaviors that will become explicitly illegal include “downblousing” (where photographs are taken down a woman’s top without consent), and the installation of equipment, such as hidden cameras, to take or record images of someone without their consent.
The government describes the planned changes as a comprehensive package of measures to modernize laws in this area.
The amendment is also notable as the first time the UK has criminalized the sharing of deepfakes.
Increasingly accessible and powerful image- and video-generating AIs have led to a rise in deepfake porn generation and abuse, driving concern about harms linked to this type of AI-enabled technology.
Just this week, the Verge reported that the maker of the open source AI text-to-image generator Stable Diffusion had tweaked the software to make it harder for users to generate nude and pornographic imagery — apparently responding to the risk of the generative AI tech being used to create pornographic imagery, including child sexual abuse material.
But that’s just one example. Many more tools for generating pornographic deepfakes remain available.
From revenge porn to deepfakes
Intimate image abuse has led to targeted changes to UK law over the years. For example, the government made ‘upskirting’ illegal via a change to the law that came into force back in 2019. And in March, it said ‘cyberflashing’ would be added as an offence to the incoming online safety legislation.
However, the government has now decided further amendments are needed to expand and clarify offences related to intimate images — both to make it easier for police and prosecutors to pursue cases, and to ensure legislation keeps pace with technology.
It’s acting on several recommendations from the Law Commission’s 2021 review of intimate image abuse.
This includes repealing and replacing current legislation with new offences the government believes will lower the bar for successful prosecutions, including a new base offence of sharing an intimate image without consent (meaning there will no longer be a requirement to prove intent to cause distress), along with two more serious offences based on intent to cause humiliation, alarm or distress, or to obtain sexual gratification.
The planned changes will also create two specific offences — threatening to share intimate images, and installing equipment to enable such images to be taken — and criminalize the non-consensual sharing of manufactured intimate images (aka deepfakes).
The government says around 1 in 14 adults in England and Wales have experienced a threat to share intimate images, with more than 28,000 reports of disclosing private sexual images without consent recorded by police between April 2015 and December 2021.
It also points to the rise in abusive deepfake porn — noting one example of a website that virtually strips women naked receiving 38 million hits in the first eight months of 2021.
A growing number of UK lawmakers and campaign groups have been calling for a ban on the use of AI to ‘nudify’ women since abusive use of the tech emerged — as a BBC investigation into one such site, called DeepSukebe, highlighted last year.
Commenting on the planned changes in a statement, deputy prime minister and justice secretary, Dominic Raab, said:
We must do more to protect women and girls from people who take or manipulate intimate photos in order to hound or humiliate them.
Our changes will give police and prosecutors the powers they need to bring these cowards to justice and safeguard women and girls from such vile abuse.
Under the government’s plan, the new deepfake porn offences will put a legal duty on platforms and services that fall under incoming online safety legislation to remove this type of material if it’s been shared on their platforms without consent — with the risk of serious penalties, under the Online Safety Bill, if they fail to remove illegal content.
Victims of revenge porn and other intimate imagery abuse have complained for years over the difficulty and disproportionate effort required on their part to track down and report images that have been shared online without their consent.
Ministers argue the proposed changes to UK law will improve protections for victims in this area.
Commenting in another supporting statement, DCMS secretary of state, Michelle Donelan, said:
Through the Online Safety Bill, I am ensuring that tech firms will have to stop illegal content and protect children on their platforms but we will also upgrade criminal law to prevent appalling offences like cyberflashing.
With these latest additions to the Bill, our laws will go even further to shield women and children, who are disproportionately affected, from this horrendous abuse once and for all.
One point to note is that the Online Safety Bill remains on pause while the government works on drafting amendments related to another aspect of the legislation.
The government has denied this delay will derail the bill’s passage through parliament — but there’s no doubt parliamentary time is tight. So it’s unclear when (or even whether) the bill will actually become UK law, given there are only around two years left before a General Election must be called.
Additionally, parliamentary time must also be found to make the necessary changes to UK law on intimate imagery abuse.
The government has offered no timetable for that component as yet — saying only that it will bring forward this package of changes “as soon as parliamentary time allows”, and adding that it will announce further details “in due course”.
UK to criminalize deepfake porn sharing without consent by Natasha Lomas originally published on TechCrunch