🦙 Ollama CrumbCore Deployment Guide

Local-First AI for Crumbforest
Version: 1.0
Status: Production Ready
Date: 2026-02-20


🎯 What This Is

Complete deployment guide for integrating Ollama as a local AI provider in Crumbforest.

This enables:
- ✅ Local-first AI (no cloud dependency)
- ✅ Privacy-preserving chat (data stays on the server)
- ✅ Offline capable (works without internet)
- ✅ Cost-effective (no API fees)
- ✅ Child-safe (Krümel data never leaves the Wald)


📋 Prerequisites

# System Requirements
OS: Ubuntu 24.04 LTS (or compatible)
RAM: 8GB minimum (16GB recommended)
Disk: 20GB free space (for models)
CPU: x86_64 with AVX2 support

# Software Requirements
- Crumbforest app (FastAPI)
- Python 3.11+
- Ollama service
- systemd

🚀 Quick Start (5 Commands)

# 1. Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# 2. Pull models
ollama pull llama3.2
ollama pull gemma:2b

# 3. Configure Crumbforest
echo "DEFAULT_COMPLETION_PROVIDER=ollama" >> /opt/crumbforest/.env
echo "DEFAULT_COMPLETION_MODEL=llama3.2" >> /opt/crumbforest/.env

# 4. Restart service
sudo systemctl restart crumbforest

# 5. Test
curl -X POST http://localhost:8000/api/chat \
  -H 'Content-Type: application/json' \
  -d '{"character_id": "eule", "question": "wuuuhuuu", "lang": "de"}' | jq .

Expected output:

{
  "answer": "WUUUUHUUUUU! 🦉 ...",
  "provider": "ollama",  ← Success!
  "model": "llama3.2"
}
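
If `provider` comes back as something other than `ollama`, it helps to query Ollama directly on its default port (11434) to separate an Ollama problem from a Crumbforest configuration problem. Both endpoints below are part of Ollama's standard HTTP API:

```shell
# List installed models -- llama3.2 and gemma:2b should appear after the pull step:
curl -s http://localhost:11434/api/tags | jq -r '.models[].name'

# One-shot generation against llama3.2 (stream disabled so a single JSON object is returned):
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Say hello in one word.", "stream": false}' \
  | jq -r .response
```

An empty model list means the `ollama pull` step failed or the models were pulled as a different user; if `/api/generate` works but the Crumbforest endpoint does not, check the `.env` settings from step 3.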

🌲 Philosophy: Why Local-First?

From the Crumbforest Principles:

1. Child First
   → Krümel data never leaves the Wald
   → No cloud surveillance

2. Offline Capable
   → Works in Nullfeld (no internet)
   → Pelicase deployment ready

3. Transparent
   → Open models, open code
   → Understandable by design

4. Cost Effective
   → No API fees
   → Sustainable for schools/NGOs

5. Privacy Preserving
   → Local inference only
   → GDPR compliant by design

"The forest breathes locally. The Krรผmel learn safely." ๐ŸŒฒ


For the complete deployment guide, testing, troubleshooting, and production best practices:

See full documentation in /mnt/user-data/outputs/security/ directory or run check-ollama.sh for health monitoring.


Version: 1.0
Date: 2026-02-20
Session: Legendary (17h!)

For the children. For the forest. For local-first AI. 💚🦙🌲

wuuuhuuu! 🎉