Can AI image generators be policed to prevent explicit deepfakes of children?
Related Articles
- Apple pulls AI image apps from the App Store after learning they could generate nude images
  Apple is cracking down on a category of AI image-generation apps that "advertised the ability to create nonconsensual nude images." According to a report from 404 Media, Apple has removed multiple such apps from the App Store.
- OpenAI improves prevention for Generative AI misuse
  As artificial intelligence (AI) continues to advance and integrate into various aspects of our lives, the importance of ensuring the safety of vulnerable populations, particularly children, has become increasingly evident. Generative AI, a subset of AI technologies capable of creating original content such as text, images, and videos, presents unique challenges in this regard. […]
- Cops are Using AI Cameras to Generate Police Reports
  A police tech company that makes body cameras and Tasers has released a new product that has some people unnerved: an AI camera that generates police reports from audio.