# Classroom Setup for ChatCraft
This guide walks educators and students through the technical setup needed to use ChatCraft in a classroom environment. It assumes basic familiarity with Python and the command line.
## 💻 System Requirements
- Operating System: Linux, macOS, or WSL on Windows
- Python: 3.8+
- Ollama installed and running (for local LLM inference)
- Internet access (for first-time setup or optional remote models)
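Before installing anything, a quick check on each machine can save class time. This is a minimal sketch using only standard commands (the server check assumes Ollama's default port, 11434):

```bash
# Confirm Python 3.8+ is available
python3 --version

# Confirm the Ollama binary is on the PATH
command -v ollama

# Confirm the Ollama server is answering; /api/tags lists installed models
curl -s http://localhost:11434/api/tags
```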
## 🔧 Installation Steps

### 1. Clone the ChatCraft Repository
```bash
git clone https://github.com/teaching-repositories/chat.git
cd chat
```
### 2. Create a Virtual Environment
We recommend `uv` for fast installs, but `venv` or `virtualenv` also work.

```bash
uv venv .venv
source .venv/bin/activate
```
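If `uv` is not installed on a machine, the standard-library `venv` module gives the same result:

```bash
python3 -m venv .venv
source .venv/bin/activate
```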
### 3. Install ChatCraft (Editable Mode)

```bash
uv pip install -e '.[dev]'
```
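To confirm the editable install resolved, you can list the environment's packages. The grep pattern below assumes the distribution name contains "chat"; adjust it if the project publishes under a different name:

```bash
# Look for the ChatCraft entry among installed packages
uv pip list | grep -i chat
```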
## 🤖 Run Ollama
ChatCraft expects a local Ollama server by default.
### 1. Install Ollama

Follow the instructions at: https://ollama.com/download
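On Linux, Ollama also publishes a one-line install script, and on macOS a Homebrew package is available; check the download page for current instructions for your platform:

```bash
# Linux: official install script from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# macOS: Homebrew package, as an alternative to the app download
brew install ollama
```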
### 2. Start the Server

```bash
ollama run llama3
```

(Or substitute another model, such as `codellama`, `mistral`, etc.)
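If you prefer to start the server without opening an interactive chat, Ollama's `serve` and `pull` subcommands cover that:

```bash
# Run the HTTP server in the foreground
# (skip this if Ollama already runs as a background service)
ollama serve

# In another terminal, download a model ahead of time
ollama pull llama3
```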
## 🚀 Test the Setup

### CLI Check:

```bash
chat doctor
```
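`chat doctor` checks the environment; to verify that generation itself works end to end, you can send a one-shot prompt straight through Ollama:

```bash
# One-shot generation test against the local model
ollama run llama3 "Say hello in five words."
```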
### Interactive Mode:

```bash
chat interactive
```
## 🧪 Classroom Preparation
- ✅ Ensure all students have Python and Ollama installed
- ✅ Pre-install the models if bandwidth is limited (a pre-pull sketch follows this list)
- ✅ Use `just doctor` or `chat doctor` to verify a working setup
- ✅ Optionally, preload `.venv` and models on lab machines
- ✅ Use the offline bundle (`just bundle`) if internet access is restricted
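For the bandwidth point above, one approach is to pull every model once per lab machine before class. A minimal sketch, assuming the course uses `llama3` and `mistral` (edit the list for your lessons):

```bash
#!/usr/bin/env bash
# Pre-pull course models so students don't download them during class.
# The model names are examples; adjust the list for your course.
for model in llama3 mistral; do
    ollama pull "$model"
done
```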
## 📦 Optional: Offline Setup

Use the offline zip bundle:

```bash
just bundle
```

Then distribute the resulting `ChatCraft_Offline_Bundle.zip` to students for isolated environments.
## 🌐 Remote Ollama (Advanced)
You can point ChatCraft to a remote Ollama server by setting the environment variable:
```bash
export CHATCRAFT_SERVER_URL=http://your-ollama-server:11434
```
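Before class, it is worth confirming that each student machine can actually reach the shared server; Ollama's `/api/version` endpoint is a lightweight probe (the hostname below is the same placeholder as above):

```bash
# Should return a small JSON object with the server's version
curl -s http://your-ollama-server:11434/api/version
```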
## 🧰 Related Tools

- `just doctor` - Check environment
- `just repl` or `chat interactive` - Start the REPL
- `just build-all` - Build docs and mini-projects
- `mkdocs serve` - Live preview of the documentation site