Systemic exploitation: How Big Tech leverages developmental neuroscience to monetise adolescent attention

Mainstream discourse frames teen vulnerability to social media as a biological inevitability, obscuring how tech corporations deliberately engineer addictive design to exploit cognitive developmental windows. This narrative absolves regulatory failures by framing harm as an unintended consequence of innovation, rather than a predictable outcome of profit-driven platform architectures. The focus on neuroscience as a standalone explanation distracts from the structural power asymmetries between adolescents, platform algorithms, and corporate entities that prioritise engagement over well-being.

⚡ Power-Knowledge Audit

The narrative is produced by academic institutions and media outlets funded by tech-adjacent philanthropies, with neuroscience framing serving to legitimise regulatory inaction by positioning harm as an inevitable byproduct of adolescent development. The discourse obscures the role of Big Tech lobbyists in shaping policy (e.g., delaying the UK’s Online Safety Bill) and frames regulation as a paternalistic intervention rather than a structural correction. By centring Western developmental psychology, it erases Indigenous and Global South critiques of digital colonisation, where platform addiction is weaponised to disrupt cultural transmission.

📐 Analysis Dimensions

Eight knowledge lenses applied to this story by the Cogniosynthetic Corrective Engine.

🔍 What's Missing

The original framing omits the historical precedents of corporate exploitation of vulnerable populations (e.g., Big Tobacco targeting youth, pharmaceutical companies marketing to children), the role of racialised and classed marketing strategies in platform design, and the absence of Indigenous epistemologies that view adolescence as a period of communal guidance rather than individual risk. It also ignores the complicity of educational systems in normalising surveillance capitalism through mandatory platform adoption (e.g., Google Classroom) and the erasure of Global South youth perspectives, where platform addiction is often framed as a neocolonial imposition.

🛠️ Solution Pathways

  1. Ban targeted advertising to minors

    Legislate outright bans on behavioural advertising to users under 18, mirroring the EU Digital Services Act’s prohibition on profiling-based advertising to minors. This would force platforms to shift from engagement-maximisation to utility-based models, reducing the incentive to exploit developmental vulnerabilities. Historical precedent exists in the 1998 Tobacco Master Settlement Agreement, which banned youth-targeted marketing and was followed by a roughly 70% drop in teen smoking.

  2. Mandate 'attention budgets' for minors

    Require platforms to implement default time limits (e.g., 2 hours/day) for users under 18, with parental override options, enforced through age-verification systems. Pilot programs in South Korea and France show a 30% reduction in platform use among teens without significant user pushback. This approach treats attention as a finite resource, not an infinite commodity, aligning with Indigenous concepts of communal time management.

  3. Decolonise digital education

    Replace platform-based learning tools (e.g., Google Classroom) with open-source, community-controlled alternatives that prioritise critical digital literacy over surveillance. Indigenous-led initiatives like the Māori-developed *Te Reo Māori* digital curriculum demonstrate how education can resist platform colonisation. This requires redirecting public education funds from Big Tech partnerships to grassroots digital sovereignty projects.

  4. Establish a 'Digital Harm Ombudsman'

    Create an independent body with subpoena power to investigate platform algorithms for harm to minors, similar to the UK’s Gambling Commission. This ombudsman would have the authority to mandate algorithmic audits and impose fines for violations, shifting the burden of proof from regulators to corporations. The model draws from South Africa’s Truth and Reconciliation Commission, where structural harm was addressed through public accountability rather than individual blame.

🧬 Integrated Synthesis

The vulnerability of teens to Big Tech platforms is not a biological accident but a designed outcome of a profit-driven attention economy that exploits developmental neuroscience while obscuring its complicity in structural harm. This system mirrors historical patterns of corporate exploitation, from Big Tobacco to predatory lending, where vulnerable populations are treated as extractable resources rather than rights-bearing individuals.

The erasure of Indigenous and Global South perspectives, where adolescence is framed as a communal, spiritually guided phase, reveals how Western neuroscience narratives serve to legitimise Silicon Valley’s neocolonial expansion. Meanwhile, marginalised youth (Black, Indigenous, disabled, migrant) bear the brunt of algorithmic harm, their experiences sidelined in policy debates that favour corporate-friendly 'solutions' like parental controls over structural bans on targeted advertising.

The path forward requires dismantling the epistemic violence of platform colonisation by centring decolonial education, mandatory attention budgets, and independent oversight: measures that treat harm not as an unintended consequence but as the predictable outcome of unchecked corporate power.
