Classroom Setup for HandsOnAI¶
This guide walks educators and students through the technical setup needed to use HandsOnAI in a classroom environment. It assumes basic familiarity with Python and supports any OpenAI-compatible LLM provider.
System Requirements¶
- Operating System: Linux, macOS, or WSL on Windows
- Python: 3.10+ recommended (3.6 minimum)
- Access to an OpenAI-compatible LLM provider:
- Local: Ollama for privacy and control
- Cloud: OpenAI, OpenRouter, Together AI, etc. for advanced models
- Internet access (for installation and cloud providers)
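A quick way to confirm a machine meets the Python requirement before installing (standard library only; the helper name here is ours, not part of HandsOnAI):

```python
import sys

def meets_requirement(version_info=sys.version_info, minimum=(3, 10)):
    """Return True if the interpreter version is at least `minimum`."""
    return tuple(version_info[:2]) >= minimum

if meets_requirement():
    print("Python version OK:", sys.version.split()[0])
else:
    print("Please upgrade Python: found", sys.version.split()[0])
```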
Installation Steps¶
Option 1: Simple Installation (Recommended for Students)¶
pip install hands-on-ai
Option 2: Development Installation (Recommended for Educators)¶
1. Clone the HandsOnAI Repository¶
git clone https://github.com/teaching-repositories/hands-on-ai.git
cd hands-on-ai
2. Create a Virtual Environment¶
We recommend uv for fast installs, but venv or virtualenv also works.
uv venv .venv
source .venv/bin/activate
3. Install HandsOnAI (Editable Mode)¶
uv pip install -e '.[dev]'
Provider Configuration¶
HandsOnAI works with any OpenAI-compatible provider. Choose the best option for your classroom:
Option 1: Local Ollama (Privacy & Control)¶
Best for: Schools concerned about data privacy, offline environments
- Install Ollama: https://ollama.com/download
- Start the server:
ollama run llama3
- No additional configuration needed; HandsOnAI defaults to http://localhost:11434
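Before class starts, it can help to confirm the local server is actually reachable. A small standard-library check can query Ollama's model-listing endpoint (`/api/tags` is part of Ollama's HTTP API; the helper function name is ours):

```python
import urllib.request
import urllib.error

def ollama_is_running(base_url="http://localhost:11434"):
    """Return True if an Ollama server answers at base_url."""
    try:
        # /api/tags lists locally installed models; any 200 means the server is up
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("Ollama reachable:", ollama_is_running())
```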
Option 2: Shared Classroom Server (Recommended)¶
Best for: Centralized management, consistent performance
import os
os.environ['HANDS_ON_AI_SERVER'] = 'https://your-classroom-server.edu'
os.environ['HANDS_ON_AI_API_KEY'] = input('Enter your API key: ')
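If students prefer to configure the environment once per terminal session rather than inside every script, the same variables can be exported from the shell instead. A minimal sketch, assuming bash (the variable names match those used above):

```shell
# Set once per terminal session (bash)
export HANDS_ON_AI_SERVER='https://your-classroom-server.edu'
# Prompt for the key so it never lands in shell history or shared files
read -s -p 'Enter your API key: ' HANDS_ON_AI_API_KEY
export HANDS_ON_AI_API_KEY
```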
Option 3: Cloud Providers (Advanced Models)¶
Best for: Access to latest models, minimal setup
import os
# OpenAI
os.environ['HANDS_ON_AI_SERVER'] = 'https://api.openai.com'
os.environ['HANDS_ON_AI_API_KEY'] = 'sk-your-openai-key'
# Or OpenRouter (access to many models)
os.environ['HANDS_ON_AI_SERVER'] = 'https://openrouter.ai/api'
os.environ['HANDS_ON_AI_API_KEY'] = 'sk-or-your-key'
os.environ['HANDS_ON_AI_MODEL'] = 'openai/gpt-4'
Test the Setup¶
Quick Python Test:¶
from hands_on_ai.chat import get_response
print(get_response("Hello! Are you working?"))
CLI Check:¶
handsonai doctor
Module Tests:¶
# Test chat module
chat "Tell me a joke"
# Test agent module
agent "What is 15 * 23? Use the calculator tool"
# Test RAG module
rag ask "What is machine learning?" --docs path/to/documents/
Classroom Preparation Checklist¶
For Local Ollama Setup:¶
- ✅ Ensure all students have Python 3.10+ installed
- ✅ Install Ollama on lab machines or student laptops
- ✅ Pre-download models if bandwidth is limited: ollama pull llama3
- ✅ Test with handsonai doctor on each machine
For Centralized Server Setup:¶
- ✅ Set up OpenAI-compatible server with authentication
- ✅ Generate API keys for each student or class
- ✅ Provide students with server URL and API keys
- ✅ Test connection from student machines
For Cloud Provider Setup:¶
- ✅ Set up accounts with chosen provider (OpenAI, OpenRouter, etc.)
- ✅ Configure billing and usage limits
- ✅ Distribute API keys securely to students
- ✅ Monitor usage to avoid unexpected costs
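When configuring billing and usage limits, a rough back-of-envelope estimate helps set expectations. A sketch under stated assumptions: all numbers below are hypothetical placeholders, not real provider prices, so substitute your provider's current per-token rates:

```python
def estimate_cost(students, prompts_per_student, tokens_per_prompt, price_per_1k_tokens):
    """Very rough cost estimate; replace the rate with your provider's pricing."""
    total_tokens = students * prompts_per_student * tokens_per_prompt
    return total_tokens / 1000 * price_per_1k_tokens

# Hypothetical class: 30 students, 50 prompts each, ~800 tokens per exchange
print(f"Estimated: ${estimate_cost(30, 50, 800, 0.002):.2f}")
```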
Student Setup Instructions¶
Provide students with this simple setup script:
# hands_on_ai_setup.py
import os

# Configuration (update with your classroom details)
os.environ['HANDS_ON_AI_SERVER'] = 'https://your-classroom-server.edu'
os.environ['HANDS_ON_AI_MODEL'] = 'llama3'

# Get API key securely
api_key = input('Enter your API key: ')
os.environ['HANDS_ON_AI_API_KEY'] = api_key

# Test the setup
try:
    from hands_on_ai.chat import get_response
    response = get_response("Hello! Confirm you're working correctly.")
    print("✅ Setup successful!")
    print(f"Response: {response}")
except Exception as e:
    print(f"❌ Setup failed: {e}")
    print("Please check your configuration and try again.")
Related Tools & Commands¶
Diagnostic Commands¶
- handsonai doctor - Check provider connection and configuration
- handsonai --help - View CLI options
Module Commands¶
- chat --help - Chat module help and options
- agent --help - Agent module help and options
- rag --help - RAG module help and options
Development Tools (if using dev installation)¶
- pytest - Run test suite
- mkdocs serve - Live preview of documentation site
- ruff check - Code linting
Quick Troubleshooting¶
"Connection refused" or "404 errors"¶
- ✅ Check if your provider server is running
- ✅ Verify the server URL is correct
- ✅ Confirm API key is valid (if required)
- ✅ Check firewall settings for classroom servers
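When chasing connection errors, it helps to print exactly the configuration HandsOnAI will see. A small diagnostic sketch (the environment variable names are those used throughout this guide; the masking helper is ours):

```python
import os

def describe_config(env=None):
    """Report the HandsOnAI variables, masking the API key for shared screens."""
    env = os.environ if env is None else env
    report = {}
    for var in ("HANDS_ON_AI_SERVER", "HANDS_ON_AI_API_KEY", "HANDS_ON_AI_MODEL"):
        value = env.get(var)
        if var == "HANDS_ON_AI_API_KEY" and value:
            value = value[:6] + "..."  # show just enough to identify the key
        report[var] = value or "(not set)"
    return report

for name, value in describe_config().items():
    print(f"{name} = {value}")
```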
"Model not found" errors¶
- ✅ Verify model name matches provider's format
- ✅ For Ollama: run ollama pull model-name first
- ✅ For cloud providers: check available models in their docs
Slow responses¶
- ✅ Try a smaller/faster model
- ✅ Check network connection for cloud providers
- ✅ Consider using local Ollama for faster responses
See Also¶
- Configuration Guide - Detailed configuration options
- Ollama Guide - Local Ollama setup instructions
- Education Guide - Pedagogical guidance
- Provider Compatibility (see main README) - Supported providers