Payment processors were against CSAM until Grok started making it

Jan 26, 2026, 11:32 PM

For many years, credit card companies and other payment methods were aggressive about policing child sexual abuse material. Then, Elon Musk’s Grok started undressing children on X.

The Center for Countering Digital Hate found 101 sexualized images of children as part of its sample of 20,000 images made by Grok from December 29th to January 8th. Using that sample, the group estimated that 23,000 sexualized images of children had been produced in that time frame. Over that 11-day period, they estimated that on average, a sexualized image of a child was produced every 41 seconds. Not all of the sexualized images Grok has produced appear to be illegal, but reports indicate at least some likely cross the line.

There is tremendous confusion about what happens to be true on Grok at any given moment. Grok has offered responses with misleading details, claiming at one point, for instance, that it had restricted image generation to paying X subscribers while still allowing direct access on X to free users. Though Musk has claimed that new guardrails prevent Grok from undressing people, our testing showed that isn’t necessarily true. Using a free account on Grok, The Verge was able to generate deepfake images of real people in skimpy clothing, in sexually suggestive positions, after new rules were supposedly in effect. As of this writing, some egregious prompts appear to have been blocked, but people are remarkably clever at getting around rules-based bans.

In the past, payment providers have been aggressive about cutting access to websites thought to have a significant presence of CSAM

X does appear to have at least partially restricted Grok’s image editing features to paid subscribers, however, which makes it very likely that for at least some of these objectionable images, money is actually changing hands. You can purchase a subscription to X on Stripe or through the Apple and Google app stores using your credit card. Musk has also suggested through his posts that he doesn’t think undressing people is a problem. This isn’t X’s first brush with AI porn, either; it has repeatedly had a problem moderating nude deepfakes of Taylor Swift, whether or not they are generated by Grok.

In the past, payment providers have been aggressive about cutting access to websites thought to have a significant presence of CSAM, or even legal, consensually produced sexual content. In 2020, Mastercard and Visa banned Pornhub after a New York Times article noted the prevalence of CSAM on the platform. In May 2025, Civitai was cut off by its credit card processor because “they do not wish to support platforms that allow AI-generated explicit content,” Civitai CEO Justin Maier told 404 Media. In July 2025, payment processors pressured Valve into removing adult games.

In fact, at times financial institutions have threatened people and platforms because it seems like they didn’t want reputational risk. In 2014, adult performer Eden Alexander’s fundraiser for a hospital stay was shut down by payments company WePay because of a retweet. Also in 2014, JPMorganChase abruptly shut down several porn stars’ bank accounts. In 2021, OnlyFans briefly tried to ban sexually explicit content because banks didn’t like it. (Widespread backlash to the move quickly made OnlyFans reverse itself.) This is legal, consensual sexual content, and it was deemed too hot to handle.

“The industry is no longer willing to self-regulate for something as universally agreed upon as the most abhorrent thing out there.”

But Musk’s boutique revenge porn and CSAM generator is, apparently, just fine.

It’s a striking reversal. “The industry is no longer willing to self-regulate for something as universally agreed upon as the most abhorrent thing out there,” which is CSAM, says Lana Swartz, the author of New Money: How Payment Became Social Media, of the inaction by Stripe and the credit card companies.

Visa, Mastercard, American Express, Stripe, and Discover did not return requests for comment. The US Financial Coalition Against Child Sexual Exploitation, an industry group composed of payments processors, banks, and credit card companies, also did not return a request for comment. On its website, FCACSE brags that “As a result of its efforts, the use of credit cards to purchase child sexual abuse content online has been virtually eliminated globally.”

Except, of course, on X.

Sexualized images of children are not the only problem with Grok’s image generation

In the past, “people who did wholly legal material were cut off from banks,” notes Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence. There are incentives to overenforce boundaries around questionable images, and traditionally, that’s what the financial industry has done. So why is X different? It’s run by Elon Musk. “He’s the richest man in the world, he has close ties to the US government, and he’s incredibly litigious,” says Pfefferkorn. In fact, Musk has previously filed suit against the Center for Countering Digital Hate; in a now-dismissed case, he claimed it illegally collected data showing an increase in hate speech after he bought the platform formerly known as Twitter.

Sexualized images of children are not the only problem with Grok’s image generation. The New York Times estimated that 1.8 million of the images the AI generated in a nine-day period, or about 44 percent of posts, were sexualized images of adult women, which, depending on how explicit they are, can also be illegal to distribute. Using different tools, the Center for Countering Digital Hate estimated that more than half of Grok’s images contained sexualized imagery of men, women, and children.

The explosion of sexualized images took place after Musk posted an AI-edited image of himself in a bikini on December 31st. A week later, X’s head of product, Nikita Bier, posted that the previous four days were also the highest-engagement days on X ever.

Lawyer Carrie Goldberg, whose history includes challenging Section 230 in a stalking suit against Grindr and another suit that ultimately shut down chat client Omegle, is representing Ashley St. Clair, the mother of one of Musk’s children, in a lawsuit against X. St. Clair is one of many women Grok undressed, and now she’s suing the platform, arguing that X has created a public nuisance. “In the St. Clair case we are only focused on xAI and Grok because they are so directly liable from our perspective,” she said in an email. “But I could envision other sources of liability.” She specifically cited distributors like Apple and Google’s app stores as areas of interest.

“A lot of this could end up in court, and it’s going to be up to judges to make decisions about what’s ‘sexually explicit.’”

There are other potential legal wrinkles. In 2022, Visa was sued for offering payment services to Pornhub, because allegedly Visa knew Pornhub wasn’t adequately moderating CSAM. Other lawsuits followed. While the judge in the Visa case rejected the claim that Pornhub wasn’t liable because of Section 230, he also tentatively dismissed the claims against Visa in 2025, though the woman who filed suit could file an amended complaint.

“A lot of this could end up in court, and it’s going to be up to judges to make decisions about what’s ‘sexually explicit,’” says David Evan Harris, a public scholar at the University of California, Berkeley. Still, 45 states have criminalized AI-generated CSAM. The federal Take It Down Act criminalizes deepfake nudes. The state of California has issued a cease and desist to Musk and X, after announcing an investigation into Grok’s images. Grok may be violating California’s deepfake porn ban, and California is just one of at least 23 states that have passed such laws.

That should matter to payment processors, because if they are knowingly transmitting money that’s the proceeds of a crime, they are engaged in money laundering, which can have serious consequences. The office of California Attorney General Rob Bonta declined to comment on whether Stripe, the credit card companies, or the app stores were also part of the Grok probe, citing an ongoing investigation. Money laundering laws are part of the reason financial institutions have been so leery of any website that’s been accused of containing CSAM.

But X has created a situation where payment processors are hugely disincentivized to take the law seriously. That’s because any state that files suit against processors over X is likely to be attacked by Musk for “censoring” X’s right-wing base. Plus, Musk (and potentially his buddy, US President Donald Trump) could throw a lot of resources behind getting payment processors off the hook.

It seems that when it comes to CSAM and deepfakes, the financial industry is no longer willing to regulate itself. So, then, who will regulate it?

