Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies that are unable or unwilling to enforce their policies about who can buy ads on their platforms.
While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has taken down several of these ads previously, many ads that explicitly invited users to create nudes, and some of the ad buyers behind them, remained up until I reached out to Meta for comment. Some of these ads were for the best known nonconsensual “undress” or “nudify” services on the internet.
Seen similar stuff on TikTok.
That's the big problem with ad marketplaces and automation: the ads are rarely vetted by a human. You can just give them money, upload your ad, and they'll happily display it. They rely entirely on users to report ads, which most people don't do because they're ads, and they won't take one down unless it's really bad.
Okay, this is going to be one of the genuinely good uses of the newer multimodal AI: it'll be able to watch every submission and categorize each one with a lot more nuance than older classifier systems.
We're probably only a couple of years away from seeing a system like that in production at most social media companies.
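For what it's worth, here's a minimal sketch of what that kind of pre-screen could look like, assuming the OpenAI Python SDK. The model name, prompt, and the review_ad helper are all placeholders for illustration, not anything a real ad platform is known to run.

    # Rough sketch: ask a multimodal model whether an ad creative promotes
    # nonconsensual "nudify" tools, and hold flagged ads for human review.
    # Assumes OPENAI_API_KEY is set; model/prompt/threshold are placeholders.
    import base64
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    PROMPT = (
        "You are reviewing an ad creative before it goes live. "
        "Does the image or text promote generating sexual or nude imagery "
        "of real people without their consent? Answer only FLAG or PASS."
    )

    def review_ad(image_path: str, ad_text: str) -> bool:
        """Return True if the ad should be held for human review."""
        with open(image_path, "rb") as f:
            image_b64 = base64.b64encode(f.read()).decode()
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{
                "role": "user",
                "content": [
                    {"type": "text", "text": f"{PROMPT}\n\nAd copy: {ad_text}"},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
                ],
            }],
            temperature=0,  # reduces, but does not eliminate, run-to-run variation
        )
        return "FLAG" in resp.choices[0].message.content.upper()

    if __name__ == "__main__":
        if review_ad("creative.png", "Undress anyone with one click!"):
            print("Held for human review")
        else:
            print("Auto-approved")

The point isn't that this exact call is production-grade, just that routing every ad creative through a model like this before display is cheap enough to be plausible, with flagged ads going to a human queue rather than being auto-rejected.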
Nice pipe dream, but the current fundamental model of AI is not deterministic and cannot be made so. Until that fundamental change is developed, it isn't possible.
I have to constantly remind people about this very simple fact of AI modeling right now. Keep up the good work!