An AI Image Generator’s Exposed Database Reveals What People Really Used It For
Summary
Security researcher Jeremiah Fowler discovered an unprotected database belonging to GenNomis, a South Korean firm that offered AI image-generation and chatbot tools to users.
More than 95,000 records were exposed, including prompt data and images of celebrities de-aged to appear as children.
The exposed material also included AI-generated child sexual abuse material, pornographic images of adults, face-swap images, and what appeared to be photographs of real people used to create explicit images.
The database has since been secured, but not before more than 45 GB of data was left exposed.
Professor Clare McGlynn, a law expert at Durham University, said the case demonstrated the scale of the market for AI tools capable of creating such abusive images.
The exposure underscores the lack of controls or guidelines governing the use of such AI technology.