By Hadas Gold, NCS
(NCS) — Elon Musk’s Grok artificial intelligence chatbot will no longer edit “images of real people in revealing clothing” on the X platform, the company confirmed Wednesday night, following global outrage after Grok was found to be complying with user requests to digitally undress images of adults and, in some cases, children.
“We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis. This restriction applies to all users, including paid subscribers,” X wrote via its Safety team account.
In the last week, xAI, which owns both Grok and X, restricted image generation for Grok on X to paying X Premium subscribers. Researchers and NCS staff had noticed that in recent days, Grok’s X account had changed how it responded to users’ image generation requests more generally, even for those subscribed to X Premium. X’s post on Wednesday night confirmed these changes.
However, researchers at AI Forensics, a European non-profit that investigates algorithms, said they observed “inconsistencies in the treatment of pornographic content generation” between public interactions with Grok on X and private chats on Grok.com.
X reiterated on Wednesday that they “take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary. Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.”
On Wednesday, Musk said in a post on X that he was “not aware of any naked underage images generated by Grok. Literally zero.” Grok “will refuse to produce anything illegal, as the operating principle for Grok is to obey the laws of any given country or state,” he added.
However, researchers said that while fully nude images were rare, the biggest issue was Grok complying with user requests to alter images of minors and place them in revealing clothing, including bikinis and underwear, as well as in sexually provocative positions. Creators of these kinds of non-consensual intimate images could still face criminal prosecution for Child Sexual Abuse Material and are potentially subject to fines and jail time under the Take It Down Act, signed last year by President Donald Trump.
On Wednesday, California Attorney General Rob Bonta announced an investigation into the “proliferation of nonconsensual sexually explicit material produced using Grok.”
Grok remains banned in Indonesia and Malaysia as a result of the image generation controversy. UK regulator Ofcom announced Monday it has launched a formal investigation into X, though Prime Minister Keir Starmer’s office said Wednesday he welcomes reports that X is addressing the issue.