
Conscience Compromised: On IT Ministry Order To X On Grok

India’s Ministry of Electronics and Information Technology (MeitY) issued a formal notice to X (formerly Twitter) over misuse of its AI chatbot Grok.

When Technology Mirrors Our Worst Impulses

The latest controversy around AI chatbots should make us pause. This week, India’s Ministry of Electronics and Information Technology (MeitY) issued a formal notice to X (formerly Twitter) over misuse of its AI chatbot Grok — specifically the generation and sharing of vulgar, indecent and sexually explicit images of women using the tool’s image manipulation functions. The government has given the platform 72 hours to remove all such content and submit a detailed action report, warning that failure to comply could strip X of its legal protections under the Information Technology Act.

What is notable is not just the government’s response, but what this episode reveals about the technology itself and how it is being used. Grok’s relatively lax safety controls have allowed users to prompt the system to alter women’s photos — often without consent — producing suggestive or explicit visuals that spread easily across the platform. Young women on social media have reported seeing their own public photos turned into sexually suggestive edits in replies and media tabs, leading many to warn others about sharing personal images online.

Not Just Grok: A Bigger Problem

This controversy is not unique to India. Similar issues have drawn criticism internationally. The problem goes beyond mere misuse: it points to a gap in how AI safeguards are designed, and to how quickly harmful outputs can be normalised once they are live on a large platform.

AI is a tool. What Grok has shown is that without robust moderation and clear accountability, these tools can replicate and amplify the very biases and disrespect we claim to reject. Misogyny, non-consensual editing and the erosion of privacy are not glitches. They are symptoms of a system that prioritises novelty over responsibility and dignity. The lesson is that innovation without ethical grounding opens the door to abuse. Technology should be an enabler, not a space where dignity and consent are negotiable.
