TikTok’s parent company ByteDance has disabled the AI image-generation feature of one of its AI apps after 404 Media found that 4chan users were using it to generate porn, despite policies and guardrails in the app designed to prevent the creation of that type of content.
So far, users don’t appear to have figured out how to easily create nonconsensual AI-generated images of celebrities with Cici AI, as they often do with similar apps. I have seen a series of pornographic images of two different celebrities that users claim were made with Cici AI, but I was unable to reproduce those images in my testing. Still, the news shows how easily users can bypass the guardrails in generative AI tools designed to prevent certain types of content, even in tools produced by the biggest technology companies in the world.
In January, Forbes first reported that ByteDance had quietly launched the app, called Cici AI, “in the past three months,” along with two other generative AI apps, all of them at least partially powered by OpenAI’s ChatGPT, accessed through a Microsoft Azure license. Cici AI, developed by a ByteDance subsidiary named Spring (SG) Pte. Ltd, is an “AI assistant” that, like many other chatbots, can answer questions and generate images from written prompts.
Earlier this month, I started seeing AI-generated pornographic images in the Telegram and 4chan communities that had previously spread the nonconsensual AI-generated sexual images of Taylor Swift, which went viral on Twitter in January and were made with Microsoft’s Designer tool.
The Cici AI-generated images are far more graphic than anything users could produce with Microsoft’s Designer before Microsoft closed that loophole, but so far I have not seen Cici AI generate the likeness of a real person.
https://archive.vn/B1sr0