UK regulator wants to ban apps that can make deepfake nude images of children


The UK's Children's Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. It states that such "nudification" apps have become so prevalent that many girls have stopped posting photos on social media. And though creating or uploading CSAM is illegal, the apps used to create deepfake nude images remain legal.

 "Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone — a stranger, a classmate, or even a friend — could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps." said Children’s Commissioner Dame Rachel de Souza. "There is no positive reason for these [apps] to exist."

Source: https://www.engadget.com/cybersecurity/uk-regulator-wants-to-ban-apps-that-can-make-deepfake-nude-images-of-children-110924095.html?src=rss
