
Algorithmic design fuels relapse in eating disorders by prioritizing engagement over health

Mainstream coverage often overlooks how social media platforms are structurally incentivized to maximize user engagement, a design goal that can end up promoting harmful content. Their ranking algorithms prioritize emotionally charged material, including diet and eating disorder content, because it increases time-on-site and ad revenue. This systemic problem is compounded by a lack of regulation and accountability for the platform designers who profit from these dynamics.
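The dynamic described above can be made concrete with a toy ranking model. Everything here is hypothetical — real platform ranking systems are proprietary and far more complex — but the sketch shows how an objective built purely from engagement signals surfaces emotionally charged content without ever consulting a well-being signal:

```python
# Toy illustration of engagement-optimized feed ranking.
# All field names and weights are hypothetical, not any platform's actual code.

def engagement_score(post):
    """Score a post purely on predicted engagement signals."""
    return (
        2.0 * post["predicted_comments"]      # heated threads drive replies
        + 1.5 * post["predicted_shares"]
        + 1.0 * post["predicted_likes"]
        + 0.5 * post["predicted_watch_seconds"]
    )

def rank_feed(posts):
    """Order a feed by engagement alone; no health signal is consulted."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "recipe", "predicted_comments": 2, "predicted_shares": 1,
     "predicted_likes": 30, "predicted_watch_seconds": 10},
    {"id": "extreme-diet", "predicted_comments": 40, "predicted_shares": 12,
     "predicted_likes": 25, "predicted_watch_seconds": 45},
]

# The emotionally charged post outranks the neutral one because the
# objective rewards engagement, not user well-being.
print([p["id"] for p in rank_feed(posts)])  # ['extreme-diet', 'recipe']
```

The point of the sketch is that no individual weight is malicious; the harm emerges from what the objective function omits.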

⚡ Power-Knowledge Audit

This narrative is produced by researchers and journalists who highlight the effects of social media but often lack the power to influence platform design. It serves the interests of public health advocates and users harmed by algorithmic manipulation, while obscuring the role of tech companies that profit from addictive design and resist regulatory scrutiny.

📐 Analysis Dimensions

Eight knowledge lenses applied to this story by the Cogniosynthetic Corrective Engine.

🔍 What's Missing

The original framing omits the role of platform ownership structures, the lack of transparency in algorithmic curation, and the absence of Indigenous and holistic health perspectives in digital wellness discourse. It also fails to address the historical roots of body image issues in Western media and the global spread of those norms.

An ACST audit of what the original framing omits. Eligible for cross-reference under the ACST vocabulary.

🛠️ Solution Pathways

  1. Implement ethical AI design standards

     Platforms should adopt design principles that prioritize user well-being over engagement metrics. This includes auditing algorithms for harmful content amplification and incorporating health literacy into content moderation policies.

  2. Expand regulatory oversight of social media

     Governments should enforce transparency and accountability in platform operations, including mandating that companies disclose how algorithms shape user experiences and how they respond to harmful content.

  3. Integrate holistic health education into digital literacy programs

     Schools and community organizations should teach digital literacy that includes critical media analysis and body positivity, helping users recognize and resist harmful content patterns online.

  4. Amplify marginalized voices in platform governance

     Platform decision-making should include representatives from affected communities, such as eating disorder survivors and mental health advocates, to ensure that policies reflect diverse needs and experiences.
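The algorithmic audit proposed in pathway 1 can be sketched as a simple over-exposure check. The data, labels, and the uniform baseline below are illustrative assumptions; a real audit would use actual impression logs and clinically informed content labels:

```python
# Hypothetical sketch of an amplification audit (pathway 1).
# Numbers and the uniform-exposure baseline are illustrative only.

def amplification_ratio(impressions, flagged_ids):
    """Ratio of flagged content's share of impressions to its share of items.

    A ratio above 1.0 means the ranking system exposes flagged content
    more than a uniform baseline would, i.e. it amplifies that content.
    """
    total = sum(impressions.values())
    flagged_impressions = sum(v for k, v in impressions.items() if k in flagged_ids)
    impression_share = flagged_impressions / total
    item_share = len(flagged_ids & impressions.keys()) / len(impressions)
    return impression_share / item_share

# post_id -> impressions served; "a" is labeled as harmful diet content
impressions = {"a": 900, "b": 50, "c": 30, "d": 20}
flagged = {"a"}

print(round(amplification_ratio(impressions, flagged), 2))  # 3.6
```

Here the flagged post is 25% of the catalog but receives 90% of impressions, an amplification ratio of 3.6 — the kind of metric a transparency mandate (pathway 2) could require platforms to report.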

🧬 Integrated Synthesis

The systemic issue of social media's role in eating disorder relapse is rooted in the intersection of algorithmic design, commercial incentives, and cultural norms. Historically, beauty standards have been weaponized by media to sell products and services, a pattern now amplified by data-driven platforms. Indigenous and cross-cultural perspectives offer alternative models of health and body image that challenge the individualistic logic of Western diet culture. Scientific research confirms the harms of algorithmic curation, but that research is often disconnected from the ethical and regulatory frameworks needed to address them. Marginalized voices, particularly those of eating disorder survivors, must be integrated into platform governance so that design decisions reflect their lived realities. By combining ethical AI design, regulatory reform, and holistic education, we can begin to shift the systemic drivers of digital harm.
