UNICEF: More and more children are becoming victims of sexual deepfakes

Tatyana Arestova | World
According to researchers, one in every 25 children falls victim to criminals who use sexually explicit deepfakes. This conclusion was reached by experts from UNICEF, the international organization ECPAT, and Interpol in a study covering 11 countries.

UNICEF notes that the problem is so widespread that, on average, every classroom may contain at least one child who has fallen victim to such crimes.

Deepfakes are images, videos, or audio files created with artificial intelligence (AI) that look or sound real. They are increasingly used to produce sexualized content depicting minors.

"Children understand the risks associated with AI. In some of the countries covered by the study, about 66% of children expressed concern that fake sexual images or videos could be created featuring them," the researchers reported.

The creation of sexually explicit deepfakes constitutes a form of violence against children, UNICEF specialists emphasized.

"The Fund supports initiatives by AI developers that implement safety measures by default and robust protection mechanisms to prevent the misuse of their technologies," the organization noted.

However, many AI models are developed without adequate safeguards. The problem is compounded when generative AI tools are integrated into social networks, where manipulated images can spread rapidly.

"AI developers must build in safety at the design stage. All digital platforms are obliged to prevent the dissemination of AI-generated child sexual abuse material, rather than simply removing it after the abuse has already occurred," UNICEF emphasizes.

The photo on the main page is illustrative: pngtree.com.