
Classroom Setup for HandsOnAI

This guide walks educators and students through the technical setup needed to use HandsOnAI in a classroom environment. It assumes basic familiarity with Python and supports any OpenAI-compatible LLM provider.


💻 System Requirements

  • Operating System: Linux, macOS, or WSL on Windows
  • Python: 3.10+ recommended (3.6 minimum)
  • Access to an OpenAI-compatible LLM provider:
      • Local: Ollama for privacy and control
      • Cloud: OpenAI, OpenRouter, Together AI, etc. for advanced models
  • Internet access (for installation and cloud providers)
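If you want students to verify their interpreter before installing anything, a minimal version check can help — this is just a sketch, and the `python_ok` helper name and the 3.10 default are our choices, not part of HandsOnAI:

```python
import sys

def python_ok(min_version=(3, 10)):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info >= min_version

if __name__ == "__main__":
    found = sys.version.split()[0]
    if python_ok():
        print("Python version OK:", found)
    else:
        print("Python 3.10+ recommended; found", found)
```

Running this on each lab machine catches version mismatches before any `pip install` fails.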

🔧 Installation Steps

To install the latest release from PyPI:

pip install hands-on-ai

To work from the source tree instead (recommended for classroom development), follow the steps below.

1. Clone the HandsOnAI Repository

git clone https://github.com/teaching-repositories/hands-on-ai.git
cd hands-on-ai

2. Create a Virtual Environment

We recommend uv for fast installs, but venv or virtualenv also works.

uv venv .venv
source .venv/bin/activate

3. Install HandsOnAI (Editable Mode)

uv pip install -e '.[dev]'

This installs the core package, CLI entry points, and development tools.


๐ŸŒ Provider Configuration

HandsOnAI works with any OpenAI-compatible provider. Choose the best option for your classroom:

Option 1: Local Ollama (Privacy & Control)

Best for: Schools concerned about data privacy, offline environments

  1. Install Ollama: https://ollama.com/download
  2. Start the server and load a model: ollama run llama3 (the first run downloads the model; if the server is not already running, start it with ollama serve)
  3. No additional configuration needed - HandsOnAI defaults to http://localhost:11434
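Before class, you can confirm the default Ollama endpoint answers without involving HandsOnAI at all. This stdlib-only sketch (the `ollama_reachable` helper and its defaults are ours; the URL matches HandsOnAI's default above) returns True when the server responds:

```python
import urllib.request
import urllib.error

def ollama_reachable(url="http://localhost:11434", timeout=3):
    """Return True if a server answers at the given address with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200  # a running Ollama replies "Ollama is running"
    except (urllib.error.URLError, OSError):
        return False
```

A quick loop over lab-machine hostnames with this function gives a reachability report before students arrive.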

Option 2: Classroom Server (Centralized Management)

Best for: Centralized management, consistent performance

import os
os.environ['HANDS_ON_AI_SERVER'] = 'https://your-classroom-server.edu'
os.environ['HANDS_ON_AI_API_KEY'] = input('Enter your API key: ')

Option 3: Cloud Providers (Advanced Models)

Best for: Access to latest models, minimal setup

import os
# OpenAI
os.environ['HANDS_ON_AI_SERVER'] = 'https://api.openai.com'
os.environ['HANDS_ON_AI_API_KEY'] = 'sk-your-openai-key'

# Or OpenRouter (access to many models)
os.environ['HANDS_ON_AI_SERVER'] = 'https://openrouter.ai/api'  
os.environ['HANDS_ON_AI_API_KEY'] = 'sk-or-your-key'
os.environ['HANDS_ON_AI_MODEL'] = 'openai/gpt-4'

🚀 Test the Setup

Quick Python Test:

from hands_on_ai.chat import get_response
print(get_response("Hello! Are you working?"))

CLI Check:

handsonai doctor

This should report that your provider is reachable and list the available models.

Module Tests:

# Test chat module
chat "Tell me a joke"

# Test agent module  
agent "What is 15 * 23? Use the calculator tool"

# Test RAG module
rag ask "What is machine learning?" --docs path/to/documents/

🧪 Classroom Preparation Checklist

For Local Ollama Setup:

  • ✅ Ensure all students have Python 3.10+ installed
  • ✅ Install Ollama on lab machines or student laptops
  • ✅ Pre-download models if bandwidth is limited: ollama pull llama3
  • ✅ Test with handsonai doctor on each machine

For Centralized Server Setup:

  • ✅ Set up an OpenAI-compatible server with authentication
  • ✅ Generate API keys for each student or class
  • ✅ Provide students with the server URL and API keys
  • ✅ Test the connection from student machines

For Cloud Provider Setup:

  • ✅ Set up accounts with the chosen provider (OpenAI, OpenRouter, etc.)
  • ✅ Configure billing and usage limits
  • ✅ Distribute API keys securely to students
  • ✅ Monitor usage to avoid unexpected costs
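When distributing keys, avoid hardcoding them in shared scripts. One standard-library pattern (the `load_api_key` helper is our sketch, not a HandsOnAI API) is to prefer an already-set environment variable and fall back to a hidden prompt:

```python
import os
from getpass import getpass

def load_api_key(env_var="HANDS_ON_AI_API_KEY"):
    """Prefer an existing environment variable; otherwise prompt without echoing."""
    key = os.environ.get(env_var)
    if not key:
        # getpass hides the typed key, unlike input(), so it never shows on a projector
        key = getpass(f"{env_var} not set - enter your API key: ")
        os.environ[env_var] = key
    return key
```

Students who have exported the variable in their shell are never prompted; everyone else types the key once per session without it being echoed or saved.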

🎓 Student Setup Instructions

Provide students with this simple setup script:

# hands_on_ai_setup.py
import os

# Configuration (update with your classroom details)
os.environ['HANDS_ON_AI_SERVER'] = 'https://your-classroom-server.edu'
os.environ['HANDS_ON_AI_MODEL'] = 'llama3'

# Get API key securely
api_key = input('Enter your API key: ')
os.environ['HANDS_ON_AI_API_KEY'] = api_key

# Test the setup
try:
    from hands_on_ai.chat import get_response
    response = get_response("Hello! Confirm you're working correctly.")
    print("✅ Setup successful!")
    print(f"Response: {response}")
except Exception as e:
    print(f"โŒ Setup failed: {e}")
    print("Please check your configuration and try again.")

Diagnostic Commands

  • handsonai doctor - Check provider connection and configuration
  • handsonai --help - View CLI options

Module Commands

  • chat --help - Chat module help and options
  • agent --help - Agent module help and options
  • rag --help - RAG module help and options

Development Tools (if using dev installation)

  • pytest - Run test suite
  • mkdocs serve - Live preview of documentation site
  • ruff check - Code linting

🎯 Quick Troubleshooting

"Connection refused" or "404 errors"

  • ✅ Check that your provider server is running
  • ✅ Verify the server URL is correct
  • ✅ Confirm the API key is valid (if required)
  • ✅ Check firewall settings for classroom servers
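A quick way to debug these errors is to print the configuration HandsOnAI will actually see. This sketch (the `effective_config` helper is ours; the variable names come from the configuration examples above) masks the key so the output is safe to share:

```python
import os

def effective_config():
    """Collect the HandsOnAI-related environment variables, masking the key."""
    cfg = {}
    for var in ("HANDS_ON_AI_SERVER", "HANDS_ON_AI_MODEL", "HANDS_ON_AI_API_KEY"):
        value = os.environ.get(var)
        if var == "HANDS_ON_AI_API_KEY" and value:
            value = value[:6] + "..."  # never print full keys
        cfg[var] = value
    return cfg

if __name__ == "__main__":
    for var, value in effective_config().items():
        print(f"{var} = {value!r}")
```

A `None` value for the server or model usually means the variable was set in a different shell or notebook session than the one running the code.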

"Model not found" errors

  • ✅ Verify the model name matches the provider's format
  • ✅ For Ollama: run ollama pull model-name first
  • ✅ For cloud providers: check available models in their docs

Slow responses

  • ✅ Try a smaller/faster model
  • ✅ Check the network connection for cloud providers
  • ✅ Consider using local Ollama for faster responses

📚 See Also