Installation
Choose your installation method based on your needs:
- Development Mode - Fast iteration with hot reload (recommended for development)
- Docker - Quick deployment with one command
- Docker + HTTPS - Production deployment with automatic SSL certificates
Prerequisites
Choose based on your installation method:
Development Mode:
- Bun (recommended) or Node.js 20+
- Git
Docker:
- Docker and Docker Compose
- Git
For Offline AI (optional):
- Ollama - For running local models
- At least 8GB RAM recommended for local models
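To sanity-check the tooling before you continue, these version commands cover the prerequisites above (run only the ones relevant to your chosen method):

```sh
bun --version            # Bun (dev mode); or check Node.js 20+ with: node --version
git --version            # Git (all methods)
docker --version         # Docker (Docker deployments)
docker compose version   # Docker Compose v2 plugin
```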
Development Mode
Perfect for contributing, customizing, or exploring the codebase.
1. Clone and Install
```sh
git clone https://github.com/1337hero/faster-chat.git
cd faster-chat
bun install   # or npm install
```

2. Start Development Servers

```sh
bun run dev   # Starts both frontend and backend
```

On first run, the server will automatically:
- ✅ Generate a secure encryption key for API key storage (server/.env)
- ✅ Create required data directories
- ✅ Initialize the SQLite database
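For reference, the generated server/.env holds the encryption key as a plain key/value entry. A sketch of what it looks like (the value shown is a placeholder; yours will be a freshly generated 64-character hex string):

```sh
# server/.env (illustrative; the real key is generated automatically on first run)
API_KEY_ENCRYPTION_KEY=<64-char-hex-key>
```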
The application will be available at:
- Frontend: http://localhost:3000
- API Server: http://localhost:3001
3. First Account Setup
- Navigate to http://localhost:3000/login
- Click “Register” to create an account
- Your first account is automatically promoted to admin
Docker Deployment
Quick deployment with a single command.
1. Clone the Repository
```sh
git clone https://github.com/1337hero/faster-chat.git
cd faster-chat
```

2. Generate Encryption Key

```sh
echo "API_KEY_ENCRYPTION_KEY=$(node -e "console.log(require('crypto').randomBytes(32).toString('hex'))")" > server/.env
```
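If you prefer not to depend on Node.js on the host, an equivalent 32-byte hex key can be generated with OpenSSL instead:

```sh
# Alternative: same key format, generated with OpenSSL
echo "API_KEY_ENCRYPTION_KEY=$(openssl rand -hex 32)" > server/.env
```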
3. Start with Docker

```sh
docker compose up -d
```

Access at: http://localhost:8787
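To confirm the stack is up before opening a browser, the standard Compose status check works, along with a plain HTTP request from the host (assuming the default port):

```sh
docker compose ps               # The container should be listed as running
curl -I http://localhost:8787   # Should return HTTP response headers from the app
```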
Docker Configuration
Default Settings:
- Port: 8787 (configurable via APP_PORT in server/.env)
- Storage: SQLite database in the chat-data volume
- Runtime: Node.js 22 on Debian (native module compatibility)
Environment Variables (server/.env):
```sh
# Required: Encryption key for API keys
API_KEY_ENCRYPTION_KEY=...   # Auto-generated above

# Optional: Configure port
APP_PORT=8787                # Internal port (default: 8787)

# For local Ollama access from Docker
OLLAMA_BASE_URL=http://host.docker.internal:11434
```
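Note that host.docker.internal resolves automatically with Docker Desktop on macOS and Windows; on a Linux host it typically needs an explicit mapping to the host gateway. A minimal compose override sketch, assuming the app service is named app in docker-compose.yml:

```yaml
# docker-compose.override.yml (sketch; match the service name to your docker-compose.yml)
services:
  app:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```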
Useful Commands:

```sh
docker compose up -d           # Start
docker compose logs -f         # View logs
docker compose down            # Stop
docker compose up -d --build   # Rebuild
```
```sh
# Reset database
docker compose down
docker volume rm faster-chat_chat-data
docker compose up -d
```

Docker + HTTPS (Production)
Production deployment with automatic HTTPS certificates via Caddy.
1. Configure Domain
Edit Caddyfile and replace localhost with your domain:

```
yourdomain.com {
    reverse_proxy app:8787
}
```

2. Point DNS
Section titled “2. Point DNS”Create an A record pointing your domain to your server’s IP address.
3. Start with Caddy
```sh
docker compose -f docker-compose.yml -f docker-compose.caddy.yml up -d
```

Caddy Features:
- Automatic HTTPS with Let’s Encrypt
- Certificate auto-renewal
- HTTP/2 and HTTP/3 support
- Compression and security headers
- Only 13MB overhead (Alpine-based)
Access at: https://yourdomain.com
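Once Caddy has obtained a certificate, a quick check from any machine confirms HTTPS works end to end (curl validates the certificate by default):

```sh
curl -I https://yourdomain.com   # Should return response headers over a valid TLS connection
```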
Setting Up Ollama (Optional)
Run AI models completely offline on your local machine.
Install Ollama
macOS / Linux:

```sh
curl -fsSL https://ollama.ai/install.sh | sh
```

Windows: Download from ollama.ai
Pull a Model
```sh
# Fast, general-purpose model
ollama pull llama3.2

# Larger, more capable model
ollama pull llama3.1:70b
```

Start Ollama
```sh
ollama serve   # Usually runs automatically
```

Ollama runs on http://localhost:11434 by default.
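To confirm Ollama is reachable and your pulled models are available before pointing the app at it, either of these checks works:

```sh
ollama list                            # Lists locally pulled models
curl http://localhost:11434/api/tags   # Same list as JSON via Ollama's HTTP API
```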
Next Steps
- Quick Start Guide - Configure providers and start chatting
- Provider Configuration - Add OpenAI, Anthropic, or other cloud providers
- Admin Guide - Manage users and settings