Elon Musk's bot lies about locking the nude-deepfake factory behind a paywall

A week ago, a Guardian story revealed that Elon Musk’s Grok AI was knowingly and willingly producing images of real-world people in various states of undress, and even more disturbingly, images of near-nude minors, in response to user requests. Further reporting from Wired and Bloomberg demonstrated that the situation was on a scale larger than most could imagine, with “thousands” of such images produced per hour. Despite silence or denials from within X, this led to “urgent contact” from various international regulators, and today X has responded by creating the impression that access to Grok’s image generation tools is now for X subscribers only. Another way of phrasing this could be: you now have to pay to use xAI’s tools to make nudes. Except, extraordinarily—despite Grok saying otherwise—it’s not true.

The story of the last week has in fact been in two parts. The first is Grok’s readiness to create undressed images of real-world people and publish them to X, as well as to create far more graphic and sexual videos on the Grok website and app, willingly offering deepfakes of celebrities and members of the public with few restrictions. The second is that Grok has been found to do the same with images of children. Musk and X’s responses so far have been to seemingly celebrate the former but condemn the latter, while appearing to do nothing about either. It has taken until today, a week after world leaders and international regulatory bodies began demanding responses from X and xAI, for there to be the appearance of any action at all, and it looks as if even this isn’t what it seems.

How we got here

The January 2 story from The Guardian reported that the Grok chatbot posted that lapses in safeguards had led to the generation of “images depicting minors in minimal clothing” in a reply to an X user. The user, on January 1, had responded to a claim made by an account for the documentary An Open Secret stating that Grok was being used to “depict minors on this platform in an extremely inappropriate, sexual fashion.” The allegation was that a user could post a picture of a fully dressed child and then ask Grok to re-render the image but wearing underwear or lingerie, and in sexual poses. The user asked Grok if it was true, and Grok responded that it was. “I’ve reviewed recent interactions,” the bot replied. “There are isolated cases where users prompted for and received AI images depicting minors in minimal clothing.”

By January 7, Wired had published an investigation revealing that Grok was willing to make images of a far more sexual nature when the results weren’t appearing on X. Using Grok’s website and app, Wired discovered it was possible to create “extremely graphic, sometimes violent, sexual imagery of adults that is vastly more explicit than images created by Grok on X.” The site added, “It may also have been used to create sexualized videos of apparent minors.” The generative AI was willing and able to create videos of recognizable celebrities “engaging in sexual activities,” including a video of the late Diana, Princess of Wales, “having sex with two men on a bed.”

Bloomberg’s reporting spoke to experts who explained how Grok and xAI’s approach to image and video generation is materially different from that of other big names in generative AI, stating that rivals make a “good-faith effort to mitigate the creation of this content in the first place” and adding, “Obviously xAI is different. It’s more of a free-for-all.” Another expert said that the scale of deepfakes on X is “unprecedented,” noting, “We’ve never had a technology that’s made it so easy to generate new images.”

Where we are now

It is now being widely reported that access to Grok’s image and video generation has been restricted to only paying subscribers to X. This is largely because when someone without a subscription asks Grok to make an image, it is responding with “Image generation and editing are currently limited to paying subscribers,” then adding a link so people can pay up for access.

However, as discovered by The Verge, this isn’t actually true at all. While you cannot currently simply @ Grok to ask it to make an image, absolutely everyone can still click on the “Edit image” button and access the software that way. You can also just visit Grok’s site or app and use it there.

This means the technology is currently lying to users, suggesting they need to subscribe to one of X’s paid tiers to generate images of any nature, while still offering the option to anyone with the wherewithal to click a button, or, on the X app, to long-press an image and use the pop-up.

What does Elon Musk have to say?

Musk, as you might imagine, has truly been posting through it. Moments before the story of the images of minors broke, following days of people discovering Grok’s willingness to render anyone in a bikini, Musk was laughing at images of himself depicted in a two-piece. Then came a rapid reverse-ferret on January 3, as he made a great show of declaring that anyone discovered using Grok for images of children would face consequences, in between endlessly claiming that his Nazi salute was the same as Mamdani giving a gentle wave to crowds. Since then (alongside posting full-on white supremacist content), the X owner’s stance has switched to reposting other people’s use of ChatGPT to demonstrate that it, too, will render adults in bikinis, seemingly forgetting that the core issue was Grok’s willingness to depict children, and declaring that this proves the hypocrisy of the press and world leaders.

Regarding today’s developments, he has not uttered a peep. Instead, his feed is primarily deeply upsetting lies about the murder of Renee Nicole Good and uncontrolled rage at the suggestion from Britain’s Prime Minister, Keir Starmer, that X might be banned in the UK as a consequence of the issues discussed above.
