Developer Setup Guide

This guide is for developers who want to run, modify, or deploy LanJAM.

Prerequisites

Before you begin, make sure you have the following installed:

| Tool | Version | Purpose |
| --- | --- | --- |
| Node.js | 22 or later | JavaScript runtime |
| pnpm | 9.x | Package manager |
| Docker & Docker Compose | Latest | Runs PostgreSQL, MinIO, and Whisper |
| Ollama | Latest | Runs AI models locally |

Quick start

# Clone the repository
git clone <repo-url> && cd lanjam-ai

# Start infrastructure (PostgreSQL, MinIO, Whisper)
docker compose up -d

# Install dependencies
pnpm install

# Copy environment variables and fill in values
cp .env.example .env

# Run database migrations
pnpm db:migrate

# Start the development server
pnpm dev

Visit http://localhost:5173 to access the app. The first visit triggers the setup flow to create an admin account.

Installing Ollama

Ollama runs the AI models that power the chat. Install it on the machine that will do the AI processing:

macOS

brew install ollama
ollama serve

Linux

curl -fsSL https://ollama.com/install.sh | sh
ollama serve

Windows

Download the installer from ollama.com and run it. Ollama runs as a background service.

Once Ollama is running, pull a model:

ollama pull llama3.2
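Once a model is pulled, Ollama exposes an HTTP API on port 11434. The sketch below shows one way to call its `/api/chat` endpoint from Node 22; the `buildChatRequest` and `ask` helpers are illustrative, not part of LanJAM, and the usage assumes Ollama is running locally with `llama3.2` pulled:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Pure helper: build the URL and JSON body for a single-turn, non-streaming chat.
function buildChatRequest(host: string, model: string, prompt: string) {
  const messages: ChatMessage[] = [{ role: "user", content: prompt }];
  return {
    url: `${host}/api/chat`,
    body: JSON.stringify({ model, messages, stream: false }),
  };
}

// Usage (requires a running Ollama instance):
async function ask(prompt: string): Promise<string> {
  const { url, body } = buildChatRequest("http://localhost:11434", "llama3.2", prompt);
  const res = await fetch(url, { method: "POST", body });
  const data = await res.json();
  return data.message.content;
}
```

Setting `stream: false` returns a single JSON response; omit it to receive newline-delimited JSON chunks instead.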

Project structure

LanJAM is a Turborepo monorepo with pnpm workspaces:

apps/
  web/           → React Router 7 (SSR, Vite) + Tailwind v4
packages/
  api/           → Route handlers, services, middleware
  db/            → Drizzle ORM schemas, repositories, migrations
  utils/         → Shared types (Zod), error classes, utilities
  file-extract/  → File text extraction (PDF, DOCX, etc.)

Key commands

| Command | Description |
| --- | --- |
| `pnpm dev` | Start dev server (all workspaces) |
| `pnpm build` | Build all packages and the app |
| `pnpm lint` | Lint with Biome |
| `pnpm format` | Format with Biome |
| `pnpm check` | Type-check all workspaces |
| `pnpm db:generate` | Generate Drizzle migrations |
| `pnpm db:migrate` | Run database migrations |
| `pnpm db:push` | Push schema to DB (dev only) |

Architecture

The API follows a Handler → Service → Repository pattern:

  • Handlers (packages/api/src/routes/) — Validate requests, call services, return responses
  • Services (packages/api/src/services/) — Business logic and integration with external systems (Ollama, MinIO, embeddings)
  • Repositories (packages/db/src/repositories/) — Data access only, all methods require userId for data isolation
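The layering above can be sketched as follows. All names here (`NoteRepository`, `NoteService`, `handleCreateNote`) are hypothetical illustrations of the pattern, not LanJAM's actual API, and the in-memory array stands in for Drizzle queries:

```typescript
interface Note { id: number; userId: string; text: string }

// Repository: data access only; every method takes userId for data isolation.
class NoteRepository {
  private notes: Note[] = [];
  private nextId = 1;
  create(userId: string, text: string): Note {
    const note = { id: this.nextId++, userId, text };
    this.notes.push(note);
    return note;
  }
  listByUser(userId: string): Note[] {
    return this.notes.filter((n) => n.userId === userId);
  }
}

// Service: business rules sit between handlers and repositories.
class NoteService {
  constructor(private repo: NoteRepository) {}
  addNote(userId: string, text: string): Note {
    if (text.trim().length === 0) throw new Error("note text is required");
    return this.repo.create(userId, text.trim());
  }
}

// Handler: validate the request shape, call the service, return a response.
function handleCreateNote(service: NoteService, userId: string, body: unknown) {
  const text = typeof body === "object" && body !== null && "text" in body
    ? String((body as { text: unknown }).text)
    : "";
  return { status: 201, note: service.addNote(userId, text) };
}
```

Because repositories never expose a query without a `userId`, a handler bug cannot accidentally return another user's rows.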

Authentication uses httpOnly cookies with SHA-256 hashed session tokens. Passcodes are hashed with argon2id.
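The token scheme can be sketched like this: the raw token is sent to the client in the httpOnly cookie, while only its SHA-256 hash is persisted, so a database leak exposes no usable tokens. Function names are illustrative, not LanJAM's actual implementation:

```typescript
import { createHash, randomBytes } from "node:crypto";

// Generate a random session token to use as the cookie value.
function generateSessionToken(): string {
  return randomBytes(32).toString("base64url");
}

// Hash the token before storing it; lookups hash the incoming cookie value
// and compare against the stored hash.
function hashSessionToken(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}
```

SHA-256 is appropriate here because session tokens are high-entropy random values; low-entropy passcodes need a slow, salted hash, which is why passcodes use argon2id instead.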

Environment variables

Create a .env file from .env.example. Key variables:

| Variable | Description | Default |
| --- | --- | --- |
| `DATABASE_URL` | PostgreSQL connection string | `postgresql://lanjam:lanjam@localhost:5432/lanjam` |
| `MINIO_ENDPOINT` | MinIO host | `localhost` |
| `MINIO_PORT` | MinIO API port | `9100` |
| `MINIO_ACCESS_KEY` | MinIO access key | `minioadmin` |
| `MINIO_SECRET_KEY` | MinIO secret key | `minioadmin` |
| `OLLAMA_HOST` | Ollama API endpoint | `http://localhost:11434` |
| `WHISPER_HOST` | Whisper API endpoint | `http://localhost:8000` |
| `SESSION_SECRET` | Secret for sessions (min 32 chars) | (none) |
| `SESSION_DAYS` | Session expiry in days | `180` |
| `MAX_UPLOAD_MB` | Max file upload size | `25` |
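Validating these variables once at startup fails fast on misconfiguration. LanJAM's shared types use Zod; the dependency-free sketch below shows the same idea with plain checks, covering a subset of the table. The `loadEnv` helper and `AppEnv` shape are illustrative, not LanJAM's actual code:

```typescript
interface AppEnv {
  databaseUrl: string;
  ollamaHost: string;
  sessionSecret: string;
  sessionDays: number;
  maxUploadMb: number;
}

// Read a subset of the variables, applying the documented defaults and
// enforcing the 32-character minimum on SESSION_SECRET.
function loadEnv(env: Record<string, string | undefined>): AppEnv {
  const sessionSecret = env.SESSION_SECRET ?? "";
  if (sessionSecret.length < 32) {
    throw new Error("SESSION_SECRET must be at least 32 characters");
  }
  return {
    databaseUrl:
      env.DATABASE_URL ?? "postgresql://lanjam:lanjam@localhost:5432/lanjam",
    ollamaHost: env.OLLAMA_HOST ?? "http://localhost:11434",
    sessionSecret,
    sessionDays: Number(env.SESSION_DAYS ?? "180"),
    maxUploadMb: Number(env.MAX_UPLOAD_MB ?? "25"),
  };
}
```

At startup you would call `loadEnv(process.env)` once and pass the result down rather than reading `process.env` ad hoc.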

Docker services

The docker-compose.yml runs three services:

| Service | Port | Notes |
| --- | --- | --- |
| PostgreSQL 17 (+ pgvector) | 5432 | Persistent volume, health checks |
| MinIO | 9100 (API), 9101 (Console) | S3-compatible object storage |
| Whisper | 8000 | Speech-to-text, CPU-optimised |

Database migrations

When you change the schema in packages/db/src/schema/:

# Generate a migration file
pnpm db:generate

# Apply migrations
pnpm db:migrate

Migration files are stored in packages/db/drizzle/.

Deployment considerations

  • LanJAM is designed for LAN-only access. Do not expose it to the public internet.
  • For production, set a strong SESSION_SECRET (at least 32 random characters).
  • Ollama can run on a separate, more powerful machine — set OLLAMA_HOST to point to it, or use the remote model feature in the admin panel.
  • The Whisper service is optional — voice input will not work without it, but everything else functions normally.
  • Consider backing up the PostgreSQL database and MinIO data volumes regularly.