Empowering Young Minds: Rethinking Regulation of Behavioural Monitoring and Targeted Ads Under India’s DPDP Act

MediaNama, March 03, 2025

By Krishaank Jugiani

In today’s digital world, children grow up immersed in internet services, each encountering different streams of information and experiences. A 7-year-old might watch animated shows, an 11-year-old tween may look for educational resources, and a 16-year-old teenager may actively participate in content creation. Although all three fall under the definition of children under the Digital Personal Data Protection (DPDP) Act, their digital skills and need for online safety differ widely.

To protect young users, Section 9 of the Act prohibits platforms from tracking or behavioural monitoring of children, and from targeted advertising directed at children, unless exempted. This approach assumes that children, due to their evolving cognitive and decision-making capacities, are uniquely vulnerable online. They may not always fully grasp how their data is collected and used.

However, does a blanket ban account for children’s diverse needs, awareness levels, and the benefits of digital engagement? Should a teenager who understands digital trade-offs be treated the same as a child still learning to navigate the digital world?

At the core of this debate is personalisation, enabled by behavioural monitoring, which tailors content to children’s interests. When leveraged responsibly, personalisation can enhance children’s digital experience, inclusivity, and access. A 10-year-old with a disability can access relevant study material more easily, while a 17-year-old in distress might find supportive online spaces.

Behavioural monitoring can also flag real-time threats such as cyberbullying and exposure to harmful content, detect predatory behaviour, and enable timely interventions. Platforms can use behavioural data to provide culturally relevant and regional content, and can empower young content creators by helping their work reach relevant audiences.

However, when done poorly, personalisation poses risks: it can lock children into echo chambers, limiting their exposure to diverse viewpoints and stunting critical thinking. Without proper safeguards, risks such as exposure to inappropriate content and privacy violations can emerge. It can also promote excessive screen time and digital addiction, with adverse effects on physical and mental health.

Further, age-appropriate contextual advertising, enabled by behavioural monitoring, helps sustain free access to online content. However, without safeguards, children may face manipulative ads that influence their spending habits. An 8-year-old might unknowingly click on a repeatedly pushed ad, while a digitally aware 17-year-old may actively dismiss irrelevant content. Such unfair and unethical practices not only affect emotional and financial well-being but also erode trust in digital platforms.

Still, an outright ban on behavioural monitoring risks throwing the baby out with the bathwater. A depersonalised internet experience would resemble traditional media, lacking tailored content; children may encounter irrelevant, inappropriate, or even harmful material.

Without ad-driven monetisation, platforms may also shift to subscription-based models, placing high-quality, age-appropriate content behind paywalls. This could deepen digital inequities by limiting access for financially disadvantaged users, restricting their learning and entertainment opportunities. Children might then seek out alternative, unregulated platforms that pose higher cybersecurity and other safety risks.

The draft Rules propose exemptions to the ban for specific sectors and certain activities. However, these exemptions are too narrow and may hinder beneficial uses of personalisation. This is particularly important given the differing needs of children at various stages of development. A 7-year-old may require strict protective barriers, while a 16-year-old may benefit from guided exposure to diverse viewpoints and responsible digital engagement.

A one-size-fits-all approach risks limiting the digital ecosystem’s ability to serve children effectively. A nuanced regulatory approach could address these challenges while retaining the benefits of personalisation. This could involve exploring more exemptions under the Rules while applying certain principles.

An effective age-appropriate design code must be developed that reflects children’s varying digital maturity levels and needs. Children under eight require stringent protections, including parental oversight. Tweens (ages 9–12) should receive simplified explanations of data use, with privacy settings that allow parental guidance. Teenagers (ages 13–17) should have greater control over their privacy settings, with clear disclosures enabling informed digital engagement, discretion, and agency.
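As an illustration, the sketch below encodes such tiered defaults in Python. The band boundaries follow the tiers above (8-year-olds are grouped conservatively with younger children, since the bands as stated leave them unassigned), and the setting names are hypothetical assumptions, not terms from the Act or the draft Rules.

```python
from dataclasses import dataclass

@dataclass
class PrivacyDefaults:
    """Hypothetical default settings for one age band (illustrative only)."""
    parental_oversight: bool       # parent can view and guide activity
    simplified_data_notices: bool  # plain-language explanations of data use
    self_managed_settings: bool    # child may adjust their own privacy controls
    behavioural_ads: bool          # profiling-based advertising

def defaults_for_age(age: int) -> PrivacyDefaults:
    """Map an age to the tiered defaults discussed above."""
    if age <= 8:    # "under eight" tier; 8-year-olds grouped conservatively here
        return PrivacyDefaults(True, False, False, False)
    if age <= 12:   # tweens: simplified notices, parent-guided settings
        return PrivacyDefaults(True, True, False, False)
    if age <= 17:   # teenagers: clear disclosures, greater self-control
        return PrivacyDefaults(False, True, True, False)
    # adults fall outside the child tiers entirely
    return PrivacyDefaults(False, False, True, True)

print(defaults_for_age(11))  # tween defaults: parental guidance, no profiling ads
```

Note that behavioural advertising stays off for every child tier, consistent with the risk-based distinction discussed next.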

A risk-based framework can help balance safety and privacy by distinguishing between uses of behavioural monitoring. Monitoring for child safety, such as detecting cyberbullying or preventing exposure to harmful content, should be permissible, whereas profiling for advertising should be prohibited. This can be done by adopting metrics such as: privacy and security (ensuring children’s data is protected from breaches); psychological impact (assessing risks associated with exposure to certain content or tracking); parental satisfaction (understanding and integrating parents’ perspectives); and data minimisation (restricting data collection and storage to essential information only).
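A minimal sketch of how such purpose-based gating could work appears below. The purpose labels and field lists are illustrative assumptions, not terms from the Act or the draft Rules, and a real system would need audited classifiers and legal review behind each permitted purpose.

```python
# Illustrative purpose-based gate: safety monitoring allowed, ad profiling not.
SAFETY_PURPOSES = {"cyberbullying_detection", "harmful_content_filtering",
                   "predator_detection"}                  # assumed labels
PROHIBITED_PURPOSES = {"ad_profiling", "targeted_advertising"}

# Data minimisation: only fields essential to each safety purpose may be read.
ESSENTIAL_FIELDS = {
    "cyberbullying_detection": {"message_text", "sender_id"},
    "harmful_content_filtering": {"content_id", "content_category"},
    "predator_detection": {"contact_pattern", "sender_id"},
}

def may_process(purpose: str, requested_fields: set[str]) -> bool:
    """Return True only for safety purposes using the minimum necessary data."""
    if purpose in PROHIBITED_PURPOSES:
        return False
    if purpose not in SAFETY_PURPOSES:
        return False  # default-deny anything not explicitly permitted
    return requested_fields <= ESSENTIAL_FIELDS[purpose]

print(may_process("cyberbullying_detection", {"message_text"}))  # True
print(may_process("ad_profiling", {"browsing_history"}))         # False
```

The default-deny design choice matters here: any monitoring purpose not explicitly whitelisted for child safety is refused, rather than relying on an ever-growing blocklist.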

Implementing these principles requires a regulatory framework that is flexible while leaving space for innovation. The DPDP Act lays the foundation for digital privacy, but its effectiveness depends on a nuanced, risk-based approach. A balanced ecosystem, where personalisation delivers its benefits within safe online spaces, can create a digital environment that protects, empowers, and enriches young minds.

Krishaank Jugiani is a technology policy researcher at CUTS International.

This article can also be viewed at:
https://www.medianama.com/