The National Human Rights Commission (NHRC) has issued notices to multiple central ministries seeking an Action Taken Report within 15 days over alleged large-scale violations of children’s data protection norms by artificial intelligence, social media and ed-tech platforms in India.
In a notice dated March 24, 2026, the Commission said it had taken cognisance of a complaint based on a research report by the Advanced Study Institute of Asia (ASIA), which flagged “serious, large-scale, and systemic violations” by digital, social media, ed-tech and AI platforms widely accessed by children in India.
The notice has been sent to the ministry of electronics and information technology (MeitY), the ministry of home affairs, the ministry of women and child development, the ministry of education, and the department of telecommunications, among others.
The ASIA report, titled ‘DPDP Compliance in Respect to Children’s Data’, evaluates 14 widely used platforms against the provisions of the Digital Personal Data Protection (DPDP) Act, 2023. It finds that across 196 compliance checks, 71% were non-compliant, 16% partially non-compliant, and only 13% relatively compliant.
The report classifies platforms into four risk tiers. Instagram, xAI Grok, Canva, ChatGPT, Perplexity and WhatsApp fall in the very high risk category, with scores ranging from 89% to 100%. Gemini, Notebook LM, Microsoft Math Solver and Claude are categorised as high risk, while Photomath, Khan Academy and SATHEE fall in the medium risk category.
DIKSHA, a government-run platform, is the only one placed in the low-risk category with a score of 46%.
To be sure, the DPDP framework is being implemented in phases, beginning with the notification of the Rules in November 2025, which brought the basic legal framework into force along with the operationalisation of institutions such as the Data Protection Board. This is followed by a second phase after 12 months, around November 2026, when key systems such as consent managers and related obligations are expected to become operational. Full compliance is required after 18 months, around May 2027, when all provisions including notice and consent requirements, protections for children’s data, breach reporting obligations, and broader responsibilities of data fiduciaries are expected to be fully enforced. The report situates its findings against this ongoing rollout, as platforms transition toward compliance.
It is worth noting, however, that the IT ministry has been holding consultations with industry on fast-tracking this timeline and shortening the compliance periods, in some cases from 18 months to immediate adherence.
The report identifies recurring gaps across platforms, including the absence of verifiable parental consent mechanisms, reliance on self-declared age verification, behavioural tracking and profiling of minors, and sharing of children’s data with third parties without adequate safeguards. It also highlights a mismatch between Indian law and platform policies, noting that while the DPDP Act defines a child as anyone under 18, most platforms use a minimum age threshold of 13, creating a “five-year regulatory gap.”
In its notice, the NHRC said the findings prima facie indicate that these platforms, acting as data fiduciaries, are “failing to discharge statutory obligations,” thereby exposing children to risks such as unlawful data processing, behavioural surveillance, profiling, and algorithmic manipulation.
The Commission added that such entities are allegedly enabling “continuous tracking, behavioural profiling, and algorithm-driven manipulation of minors,” along with large-scale sharing of children’s data with third parties without informed authorisation. It described these as pointing to “grave data fiduciary misconduct” and violations of principles such as purpose limitation and data minimisation.
Responses from the ministries are awaited, and this copy will be updated when they are received.