
Disinformation as digital statecraft: How geopolitical power structures weaponize truth in the information age

Mainstream discourse frames disinformation as an external threat to be 'eradicated,' obscuring its systemic integration into statecraft, corporate influence operations, and platform capitalism. The focus on foreign actors ignores how domestic elites and tech oligarchies deploy similar tactics to suppress dissent, manipulate markets, and maintain power. Structural incentives—algorithmic amplification, ad revenue models, and regulatory capture—perpetuate the cycle, making disinformation a feature of late-stage capitalism rather than a bug. True solutions require dismantling the infrastructures that profit from chaos, not just chasing foreign scapegoats.

⚡ Power-Knowledge Audit

The narrative is produced by Western academic institutions (Cardiff University) and amplified by state-aligned media (Phys.org), framing disinformation as a foreign threat to justify securitization and surveillance. This serves the interests of intelligence agencies, tech monopolies, and political elites who benefit from centralized control over information flows. The framing obscures how domestic power structures—lobbying, media consolidation, and algorithmic governance—systemically generate and profit from disinformation, deflecting attention from structural accountability.

📐 Analysis Dimensions

Eight knowledge lenses applied to this story by the Cogniosynthetic Corrective Engine.

🔍 What's Missing

The original framing omits the role of corporate disinformation (e.g., fossil fuel industry climate denial, Big Pharma vaccine misinformation), the historical precedents of state propaganda (e.g., COINTELPRO, Operation Mockingbird), and the complicity of social media platforms in monetizing falsehoods. Marginalized communities—Black, Indigenous, and Global South populations—are disproportionately targeted by disinformation campaigns but rarely centered in solutions. Indigenous oral traditions, which often prioritize contextual truth over viral misinformation, are entirely absent from the discourse.


🛠️ Solution Pathways

  1. Dismantle the attention economy: Regulate platform algorithms and ad models

     Enforce algorithmic transparency laws (e.g., the EU Digital Services Act) to ban engagement-maximizing feeds that prioritize outrage over truth. Replace ad-driven revenue models with public-interest funding for platforms, as proposed by the 'Platform Cooperativism' movement. Pilot 'slow media' initiatives where content is verified before publication, disrupting the speed-to-market incentive for disinformation.

  2. Decolonize information systems: Center Indigenous and Global South epistemologies

     Fund Indigenous-led media literacy programs that teach critical thinking through oral traditions and communal verification. Partner with Global South researchers to develop context-aware fact-checking models that account for cultural nuances (e.g., satire, historical grievances). Establish a 'Truth Sovereignty Fund' to support marginalized communities in creating their own information infrastructures.

  3. Legislate informational harm: Treat disinformation as a public health crisis

     Classify systemic disinformation as a form of environmental or economic harm, enabling class-action lawsuits against platforms and state actors. Mandate 'truth audits' for political campaigns and corporate lobbying, similar to financial audits. Create an international tribunal (e.g., modeled on the ICC) to prosecute state-sponsored disinformation as a crime against democracy.

  4. Build community-owned verification networks: Federated, open-source alternatives

     Develop decentralized verification platforms (e.g., Mastodon-based federated networks) where communities set their own truth standards. Partner with local libraries, universities, and Indigenous councils to host 'truth hubs' that cross-validate information. Use blockchain to track the provenance of claims, enabling users to trace disinformation to its source (e.g., state actors, corporate lobbyists).
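The provenance-tracking idea above can be sketched without a full blockchain: a hash-chained ledger is enough to make a claim's history tamper-evident. The following is a minimal illustration, not a production design; the `ProvenanceLedger` class, its record fields, and the example claim and source names are all hypothetical.

```python
import hashlib
import json
import time


class ProvenanceLedger:
    """Hypothetical hash-chained ledger: each record commits to the previous
    record's hash, so any tampering with a claim's history is detectable."""

    GENESIS = "0" * 64  # placeholder hash for the first record's predecessor

    def __init__(self):
        self.chain = []

    def _hash(self, record):
        # Canonical JSON serialization so the hash is deterministic.
        payload = json.dumps(record, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

    def add_claim(self, claim, source):
        """Append a claim with its originating source, linked to the chain tip."""
        prev_hash = self.chain[-1]["hash"] if self.chain else self.GENESIS
        record = {
            "claim": claim,
            "source": source,
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        record["hash"] = self._hash(record)
        self.chain.append(record)
        return record["hash"]

    def trace(self, claim):
        """Return every record matching a claim, oldest first — i.e., where
        and from whom the claim entered the ledger."""
        return [r for r in self.chain if r["claim"] == claim]

    def verify(self):
        """Recompute every hash and check each back-link to its predecessor."""
        for i, record in enumerate(self.chain):
            body = {k: v for k, v in record.items() if k != "hash"}
            if record["hash"] != self._hash(body):
                return False  # record contents were altered
            expected_prev = self.chain[i - 1]["hash"] if i else self.GENESIS
            if record["prev_hash"] != expected_prev:
                return False  # chain linkage was broken
        return True
```

In use, `add_claim` records who introduced a claim, `trace` walks its history, and `verify` detects after-the-fact edits — the minimum a community-run 'truth hub' would need to hold sources accountable.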

🧬 Integrated Synthesis

Disinformation is not an aberration but a systemic feature of late-stage capitalism and digital statecraft, where power structures weaponize truth to maintain control. The Cardiff University expert’s framing—while highlighting foreign interference—obscures how domestic elites, tech monopolies, and regulatory failures perpetuate the crisis, from fossil fuel lobbyists to algorithmic amplification of hate speech. Historical parallels reveal this as a continuum of colonial-era propaganda, now industrialized by platforms like Meta and TikTok, which profit from chaos while claiming neutrality. Cross-cultural perspectives, from Māori oral traditions to African *griot* systems, offer alternatives to Western technocratic fixes, emphasizing relational truth and communal accountability. The path forward requires dismantling the attention economy, decolonizing information systems, and treating disinformation as a crime against democracy—solutions that center marginalized voices and future-proof against AI-generated propaganda. Without structural intervention, the cycle will persist, with disinformation evolving into a tool of authoritarianism and corporate control, eroding the very foundations of collective truth.
