📄 ACADEMIC WHITEPAPER (APA Style) — Version 1.0

Title: Structural Antifascism Through Technical Architecture: A Local‑First Approach for NGO Educational Infrastructure
Author: Branko May Trinkwald
Affiliation: Crumbforest Research Initiative
Date: February 2026
Length: ≈10 pages

Abstract

This paper introduces structural antifascism through architecture as a design paradigm for educational and humanitarian technology systems. Unlike policy‑based or ethics‑declaration approaches, structural antifascism treats architecture itself as the primary safeguard against authoritarian misuse. Drawing on 23 years of robotics pedagogy, field deployments in refugee settlements, and contemporary debates on algorithmic governance, this work argues that cloud‑centric educational AI systems reproduce the very asymmetries they claim to address: centralization, opacity, dependency, and extractive data practices.

We present a practical alternative: a local‑first, audit‑ready, child‑safe computing environment, built on low‑cost hardware, offline‑capable AI (Ollama), local vector search (Qdrant), minimal attack surface (Pelicase design), and verifiable, reproducible interaction logs. Through the case study of the Crumbforest Pelicase deployment, we illustrate how NGOs can implement sovereign educational infrastructures that resist authoritarian co‑optation by design. The paper concludes with recommendations for policy, architecture, and pedagogy in humanitarian contexts.

Keywords: antifascism, local‑first AI, NGO technology, auditability, child protection, decentralization, sovereignty, educational infrastructure


1. Introduction

Global NGOs increasingly rely on cloud‑based educational technology to deliver digital literacy, language learning, and computational training (UNESCO, 2024). Yet cloud infrastructures introduce systemic risks: surveillance, vendor lock‑in, opaque algorithmic decision‑making, and the concentration of informational power (Pasquale, 2015; Zuboff, 2019). When educational data is collected centrally, children’s questions, behavioral profiles, and learning pathways become extractive assets.

This paper argues that technical architecture—not institutional policy—determines whether technology can be co‑opted by authoritarian, corporate, or extractive actors. A system that is inherently non‑centralizable is resistant even if political conditions deteriorate. This is the principle of structural antifascism: systems that cannot be used fascistically.

The argument emerges from long‑term pedagogical practice (1999–2025), robotics education, and recent field work deploying low‑infrastructure digital learning systems in East Africa. The approach is grounded in practical pedagogy: children must learn in a space where every answer can be verified, logged, and understood. “Okay Google” provides answers without evidence; the Pelicase architecture provides evidence for every answer.


2. Background: Cloud Dependence and Authoritarian Compatibility

Commercial cloud platforms—Google Classroom, Microsoft 365 Education, Amazon’s education APIs—offer convenience but replicate a familiar structural pattern:

  Requirement for Authoritarian Control     Cloud Platform Property
  -------------------------------------     --------------------------
  Centralization of truth                   Centralized data centers
  Identity fixity                           Mandatory cloud accounts
  Behavioral surveillance                   Telemetry, logs, profiling
  Opaque decision logic                     Proprietary ML models
  Dependency                                Subscription-based access

These properties are compatible with both democratic and authoritarian regimes because their architecture derives from market optimization, not civic resilience.

Research on digital authoritarianism highlights how centralized platforms facilitate censorship, tracking, and coercion (Bradshaw & Howard, 2020). Even in benevolent contexts, dependency attenuates autonomy (Couldry & Mejias, 2019). NGOs working with vulnerable populations must assume worst-case scenarios—not best-case corporate assurances.


3. Structural Antifascism: Definition

We define structural antifascism as:

A design principle in which a system’s architecture prevents its use for centralized control, surveillance, or coercion, independent of operator intent.

Thus, antifascism becomes:
- Not ideological, but architectural.
- Not policy-based, but technical.
- Not dependent on trust, but on non-extractive design.
- Not enforced by ethics, but by system topology.

This aligns with scholarship on democratic infrastructure (Kelty, 2008), local-first computing (Kleppmann et al., 2022), and critical data studies (Benjamin, 2019).


4. Methodology: Pedagogy Meets Infrastructure

The Pelicase system was designed through:
1. Longitudinal observation (1999–2025) of children’s computational learning behavior.
2. Field deployments in low-resource settings (e.g., Nakivale refugee settlement).
3. Iterative architectural testing:
- transparent logs
- reproducible system states
- offline‑first operations
4. Security analysis based on child protection principles (“Krümelschutz”, German for “crumb protection”):
- no central identity provider
- no data exfiltration
- no hidden telemetry
- verifiable answer flows

The result is a bounded truth environment—a finite, self‑contained computational space where all actions are observable, reproducible, and auditable.
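The verifiable answer flow described above can be sketched in a few lines of Bash. The log path and the `log_answer` helper are illustrative assumptions, not the deployed system's actual interface; the point is that each interaction leaves a record a learner can independently recompute.

```shell
#!/usr/bin/env bash
# Sketch: append-only interaction log for a bounded truth environment.
# LOG path and log_answer() are illustrative, not the production interface.
set -euo pipefail

LOG="${LOG:-/tmp/pelicase-audit.log}"

# One tab-separated record per interaction:
# UTC timestamp, question text, SHA-256 hash of the answer.
log_answer() {
  local question="$1" answer="$2"
  printf '%s\t%s\t%s\n' "$(date -u +%FT%TZ)" "$question" \
    "$(printf '%s' "$answer" | sha256sum | cut -d' ' -f1)" >> "$LOG"
}

log_answer "What does ls -la show?" "A long listing, including hidden files."

# Any learner can recompute the hash and compare it to the logged record:
tail -n 1 "$LOG"
```

Because the hash is reproducible from the answer text alone, verification needs no trust in the operator, only in standard tools.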


5. System Architecture

5.1 Hardware Layer

  • Raspberry Pi 5: low power, robust, repairable
  • Pelicase enclosure: shock-proof, dust-proof, field-serviceable
  • Local WiFi access point: no Internet required
  • Solar compatibility for off-grid education

5.2 Software Stack

  • Ollama for local LLM inference
  • Qdrant for local vector retrieval
  • TTYD-based isolated terminals for Bash learning
  • Containerized services enabling reproducible deployments
  • Audit logging through local-only, read-only system journals
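The containerized layer can be sketched as a Compose file. Image tags, ports, volume paths, and the `ttyd` invocation below are assumptions for illustration, not the project's published configuration:

```yaml
# Illustrative sketch only; images, ports, and paths are assumptions.
services:
  ollama:                               # local LLM inference, no cloud calls
    image: ollama/ollama
    ports: ["11434:11434"]
    volumes: ["./models:/root/.ollama"] # models live on the device
  qdrant:                               # local vector retrieval
    image: qdrant/qdrant
    ports: ["6333:6333"]
    volumes: ["./qdrant:/qdrant/storage"]
  ttyd:                                 # isolated browser terminal for Bash
    image: tsl0922/ttyd
    command: ttyd --writable bash
    ports: ["7681:7681"]
```

Pinning image digests and disabling outbound networking on these services would further tighten the reproducibility and no-exfiltration guarantees.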

5.3 Security Model

  • No central cloud identity required
  • Per‑user isolated directories (chmod 700)
  • Verifiable logs (journalctl, ls -la, stat)
  • Attack surface minimized to local LAN
  • No third-party telemetry or analytics
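The per-user isolation step can be demonstrated with ordinary tools. The learner ID and base path below are illustrative (a real unit would use `/home` rather than `/tmp`):

```shell
#!/usr/bin/env bash
# Sketch: per-learner isolated directory. Names and paths are illustrative.
set -euo pipefail

LEARNER="crumb01"
BASE="/tmp/pelicase-demo"

mkdir -p "$BASE/$LEARNER"
chmod 700 "$BASE/$LEARNER"   # owner-only: no learner can read another's files

# The isolation is verifiable by the learners themselves:
stat -c '%a %U %n' "$BASE/$LEARNER"
ls -ld "$BASE/$LEARNER"
```

That the protection is checkable with `stat` and `ls`, rather than asserted by a dashboard, is itself part of the security model.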

6. Case Study: Crumbforest Pelicase Deployment

6.1 Context

The deployment occurred in early 2026 as part of an NGO‑supported digital literacy initiative. The goal was to equip 30 learners with foundational computing, Bash literacy, and safe access to offline AI tools.

6.2 Observations

Children demonstrated:
- faster conceptual adoption with verifiable commands (ls, cat, stat)
- increased agency when answers were checkable
- decreased dependency on “black box” outputs
- improved understanding of privacy and security

Instructors observed:
- drastically simplified maintenance
- no vendor lock‑in
- no data governance conflicts
- resilience to connectivity outages

Qualitative indicators suggest that auditability does more to build learner trust than answer accuracy alone.


7. Analysis: Why Architecture Matters for NGOs

7.1 Independence from Political Instability

Local‑first systems preserve data sovereignty, community autonomy, and operational continuity even if:
- the national Internet is censored or shut down
- cloud providers terminate or restrict services
- political regimes shift toward authoritarianism

7.2 Avoiding Extractive Data Economies

Cloud platforms monetize usage metrics, behavioral profiles, and linguistic patterns. Local systems collect nothing by default.

7.3 Educational Benefits

Auditability teaches causal reasoning, computational literacy, and epistemic humility (“Check the logs”). Children learn not only to use technology—but to understand it.


8. Discussion

Cloud-based educational AI is structurally incompatible with the requirements of vulnerable populations. No amount of ethical policy can compensate for architectural dependency.

In contrast, bounded truth environments—self-contained systems where every output is explainable and evidential—offer a blueprint for civic‑resilient infrastructure.

This suggests a broader theoretical claim:

Antifascist technology is local-first technology.

This claim does not imply isolationism; rather, it argues for a substrate of sovereignty onto which optional connectivity can be layered.


9. Recommendations for NGOs

  1. Adopt local-first architectures. Avoid systems that require cloud identity or constant connectivity.
  2. Mandate auditability. Every output should have a log, a file path, and a reproducible state.
  3. Prioritize child protection through invisibility. Do not store what you do not need.
  4. Deploy sovereign compute units: containers, Pelicases, classroom kits.
  5. Treat AI as an appliance, not a service. Local inference prevents data exfiltration and reduces operating cost.
  6. Promote computational literacy. Let learners “look under the hood” via Bash and transparent logs.

10. Conclusion

Structural antifascism is not a political slogan but a systems‑engineering paradigm. In educational and humanitarian contexts, it is insufficient to merely prohibit harmful uses; the architecture itself must make them impossible.

Local-first, transparent, verifiable infrastructures provide a pathway for NGOs to deploy safe, equitable, and resilient AI‑supported learning environments. The Crumbforest Pelicase demonstrates that this approach is not theoretical—it is practical, low-cost, and deployable today.

NGOs, educators, and technologists now face a choice:
Build systems that require trust—or systems that make trust unnecessary.


References

  • Benjamin, R. (2019). Race after technology: Abolitionist tools for the New Jim Code. Polity.
  • Bradshaw, S., & Howard, P. N. (2020). The global organization of social media disinformation campaigns. Journal of International Affairs, 71(1), 23–32.
  • Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
  • Kelty, C. M. (2008). Two bits: The cultural significance of free software. Duke University Press.
  • Kleppmann, M., Wiggins, A., van Hardenberg, P., & McGranaghan, M. (2022). Local-first software: You own your data, in spite of the cloud. Communications of the ACM, 65(12), 46–53.
  • Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
  • UNESCO. (2024). Digital learning for all? Global report on technology in education. UNESCO.
  • Zuboff, S. (2019). The age of surveillance capitalism. PublicAffairs.