3 Documentation and Deployment
3.1 Documentation Options: From pydoc to MkDocs
Documentation is often neglected in software development, yet it’s crucial for ensuring others (including your future self) can understand and use your code effectively. Python offers a spectrum of documentation options, from simple built-in tools to sophisticated documentation generators.
3.1.1 Starting Simple with Docstrings
The foundation of Python documentation is the humble docstring - a string literal that appears as the first statement in a module, function, class, or method:
```python
def calculate_discount(price: float, discount_percent: float) -> float:
    """Calculate the discounted price.

    Args:
        price: The original price
        discount_percent: The discount percentage (0-100)

    Returns:
        The price after discount

    Raises:
        ValueError: If discount_percent is negative or greater than 100
    """
    if not 0 <= discount_percent <= 100:
        raise ValueError("Discount percentage must be between 0 and 100")
    discount = price * (discount_percent / 100)
    return price - discount
```
Docstrings become particularly useful when following a consistent format. Common conventions include:
- Google style (shown above)
- NumPy style (similar to Google style but with different section headers; see the sketch after this list)
- reStructuredText (used by Sphinx)
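For comparison, here is the docstring from the example above rewritten in NumPy style. This is only a sketch of the header layout, not a prescribed format:

```python
def calculate_discount(price: float, discount_percent: float) -> float:
    """Calculate the discounted price.

    Parameters
    ----------
    price : float
        The original price.
    discount_percent : float
        The discount percentage (0-100).

    Returns
    -------
    float
        The price after discount.

    Raises
    ------
    ValueError
        If discount_percent is negative or greater than 100.
    """
```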
3.1.2 Viewing Documentation with pydoc
Python’s built-in `pydoc` module provides a simple way to access documentation:
```bash
# View module documentation in the terminal
python -m pydoc my_package.module

# Start an HTTP server to browse documentation
python -m pydoc -b
```
You can also generate basic HTML documentation:
```bash
# Create HTML for a specific module
python -m pydoc -w my_package.module

# Create HTML for an entire package
mkdir -p docs/html
python -m pydoc -w my_package
mv my_package*.html docs/html/
```
While simple, this approach has limitations:

- Minimal styling
- No cross-linking between documents
- Limited navigation options
For beginner projects, however, it provides a fast way to make documentation available with zero dependencies.
3.1.3 Simple Script for Basic Documentation Site
For slightly more organized documentation than plain pydoc, you can create a simple script that:

1. Generates pydoc HTML for all modules
2. Creates a basic index.html linking to them

Here’s a minimal example script (`build_docs.py`):
```python
import os
import importlib
import pkgutil
import pydoc


def generate_docs(package_name, output_dir="docs/api"):
    """Generate HTML documentation for all modules in a package."""
    # Ensure output directory exists
    os.makedirs(output_dir, exist_ok=True)

    # Import the package
    package = importlib.import_module(package_name)

    # Track all modules for index page
    modules = []

    # Walk through all modules in the package
    for _, modname, ispkg in pkgutil.walk_packages(package.__path__, package_name + '.'):
        try:
            # Generate HTML documentation
            html_path = os.path.join(output_dir, modname + '.html')
            with open(html_path, 'w') as f:
                pydoc_output = pydoc.HTMLDoc().document(importlib.import_module(modname))
                f.write(pydoc_output)
            modules.append((modname, os.path.basename(html_path)))
            print(f"Generated documentation for {modname}")
        except ImportError as e:
            print(f"Error importing {modname}: {e}")

    # Create index.html
    index_path = os.path.join(output_dir, 'index.html')
    with open(index_path, 'w') as f:
        f.write("<html><head><title>API Documentation</title></head><body>\n")
        f.write("<h1>API Documentation</h1>\n<ul>\n")
        for modname, html_file in sorted(modules):
            f.write(f'<li><a href="{html_file}">{modname}</a></li>\n')
        f.write("</ul></body></html>")
    print(f"Index created at {index_path}")


if __name__ == "__main__":
    # Change 'my_package' to your actual package name
    generate_docs('my_package')
```
This script generates slightly more organized documentation than raw pydoc but still leverages built-in tools.
3.1.4 Moving to MkDocs for Comprehensive Documentation
When your project grows and needs more sophisticated documentation, MkDocs provides an excellent balance of simplicity and features. MkDocs generates a static site from Markdown files, making it easy to write and maintain documentation.
3.1.4.1 Getting Started with MkDocs
First, install MkDocs and a theme:
```bash
pip install mkdocs mkdocs-material
```
Initialize a new documentation project:
```bash
mkdocs new .
```
This creates a `mkdocs.yml` configuration file and a `docs/` directory with an `index.md` file.
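If you are curious what `mkdocs new` produced, the layout is deliberately minimal:

```text
.
├── docs/
│   └── index.md
└── mkdocs.yml
```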
3.1.4.2 Basic Configuration
Edit `mkdocs.yml`:
```yaml
site_name: My Project
theme:
  name: material
  palette:
    primary: indigo
    accent: indigo
nav:
  - Home: index.md
  - User Guide:
      - Installation: user-guide/installation.md
      - Getting Started: user-guide/getting-started.md
  - API Reference: api-reference.md
  - Contributing: contributing.md
```
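Each `nav` entry maps to a Markdown file, so this configuration assumes a `docs/` tree like the following (you create the user-guide pages yourself):

```text
docs/
├── index.md
├── user-guide/
│   ├── installation.md
│   └── getting-started.md
├── api-reference.md
└── contributing.md
```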
3.1.4.3 Creating Documentation Content
MkDocs uses Markdown files for content. Create `docs/user-guide/installation.md`:
````markdown
# Installation

## Prerequisites

- Python 3.8 or later
- pip package manager

## Installation Steps

1. Install from PyPI:

   ```bash
   pip install my-package
   ```

2. Verify installation:

   ```bash
   python -c "import my_package; print(my_package.__version__)"
   ```
````
3.1.4.4 Testing Documentation Locally
Preview your documentation while writing:
```bash
mkdocs serve
```
This starts a development server at http://127.0.0.1:8000 that automatically refreshes when you update files.
3.1.4.5 Building and Deploying Documentation
Generate static HTML files:
```bash
mkdocs build
```
This creates a `site/` directory with the HTML documentation site.
For GitHub projects, you can publish to GitHub Pages:
```bash
mkdocs gh-deploy
```
3.1.5 Hosting Documentation with GitHub Pages
GitHub Pages provides a simple, free hosting solution for your project documentation that integrates seamlessly with your GitHub repository.
3.1.5.1 Setting Up GitHub Pages
There are two main approaches to hosting documentation on GitHub Pages:

- Repository site: Serves content from a dedicated branch (typically `gh-pages`)
- User/Organization site: Serves content from a special repository named `username.github.io`

For most Python projects, the repository site approach works best:

1. Go to your repository on GitHub
2. Navigate to Settings → Pages
3. Under “Source”, select your branch (either `main` or `gh-pages`)
4. Choose the folder that contains your documentation (`/` or `/docs`)
5. Click Save

Your documentation will be published at `https://username.github.io/repository-name/`.
3.1.5.2 Automating Documentation Deployment
MkDocs has built-in support for GitHub Pages deployment:
```bash
# Build and deploy documentation to GitHub Pages
mkdocs gh-deploy
```
This command:

1. Builds your documentation into the `site/` directory
2. Creates or updates the `gh-pages` branch
3. Pushes the built site to that branch
4. Lets GitHub automatically serve the content
For a fully automated workflow, integrate this into your GitHub Actions CI pipeline:
```yaml
name: Deploy Documentation
on:
  push:
    branches:
      - main
    paths:
      - 'docs/**'
      - 'mkdocs.yml'
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install mkdocs mkdocs-material mkdocstrings[python]
      - name: Deploy documentation
        run: mkdocs gh-deploy --force
```
This workflow automatically deploys your documentation whenever you push changes to documentation files on the main branch.
3.1.5.3 GitHub Pages with pydoc
Even if you’re using the simpler pydoc approach, you can still host the generated HTML on GitHub Pages:
1. Create a `docs/` folder in your repository
2. Generate HTML documentation with pydoc:

   ```bash
   python -m pydoc -w src/my_package/*.py
   mv *.html docs/
   ```

3. Add a simple `docs/index.html` that links to your module documentation
4. Configure GitHub Pages to serve from the `docs/` folder of your main branch
3.1.5.4 Custom Domains
For more established projects, you can use your own domain:
1. Purchase a domain from a registrar
2. Add a `CNAME` file to your documentation with your domain name
3. Configure your DNS settings according to GitHub’s instructions
4. Enable HTTPS in GitHub Pages settings
By hosting your documentation on GitHub Pages, you make it easily accessible to users and maintainable alongside your codebase. It’s a natural extension of the Git-based workflow we’ve established.
3.1.5.5 Enhancing MkDocs
MkDocs supports numerous plugins and extensions:
- Code highlighting: Built-in support for syntax highlighting
- Admonitions: Create warning, note, and info boxes
- Search: Built-in search functionality
- Table of contents: Automatic generation of section navigation
Example of enhanced configuration:
```yaml
site_name: My Project
theme:
  name: material
  features:
    - navigation.instant
    - navigation.tracking
    - navigation.expand
    - navigation.indexes
    - content.code.annotate
markdown_extensions:
  - admonition
  - pymdownx.highlight
  - pymdownx.superfences
  - toc:
      permalink: true
plugins:
  - search
  - mkdocstrings:
      handlers:
        python:
          selection:
            docstring_style: google
```
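As a taste of what the admonition extension enables, a note box is just indented Markdown under a `!!!` marker; a small example:

```markdown
!!! note "Compatibility"
    This feature requires Python 3.8 or later.
```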
3.1.6 Integrating API Documentation
MkDocs alone is great for manual documentation, but you can also integrate auto-generated API documentation:
3.1.6.1 Using mkdocstrings
Install mkdocstrings to include docstrings from your code:
```bash
pip install mkdocstrings[python]
```
Update `mkdocs.yml`:
```yaml
plugins:
  - search
  - mkdocstrings:
      handlers:
        python:
          selection:
            docstring_style: google
```
Then in your `docs/api-reference.md`:

```markdown
# API Reference

## Module my_package.core

This module contains core functionality.

::: my_package.core
    options:
      show_source: false
```
This automatically generates documentation from docstrings in your `my_package.core` module.
3.1.7 Documentation Best Practices
Regardless of which documentation tool you choose, follow these best practices:
- Start with a clear README: Include installation, quick start, and basic examples
- Document as you code: Write documentation alongside code, not as an afterthought
- Include examples: Show how to use functions and classes with realistic examples
- Document edge cases and errors: Explain what happens in exceptional situations
- Keep documentation close to code: Use docstrings for API details
- Maintain a changelog: Track major changes between versions (a minimal sketch follows this list)
- Consider different audiences: Write for both new users and experienced developers
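For the changelog in particular, a short reverse-chronological `CHANGELOG.md` is enough; a minimal sketch (the entries are, of course, hypothetical):

```markdown
# Changelog

## [0.2.0] - 2024-05-01
### Added
- Bulk discount calculation.

### Fixed
- Rounding error for discounts of exactly 100%.
```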
3.1.8 Choosing the Right Documentation Approach
| Approach | When to Use |
|----------|-------------|
| Docstrings only | For very small, personal projects |
| pydoc | For simple projects with minimal documentation needs |
| Custom pydoc script | Small to medium projects needing basic organization |
| MkDocs | Medium to large projects requiring structured, attractive documentation |
| Sphinx | Large, complex projects, especially with scientific or mathematical content |
For most applications, the journey often progresses from simple docstrings to MkDocs as the project grows. By starting with good docstrings from the beginning, you make each subsequent step easier.
In the next section, we’ll explore how to automate your workflow with CI/CD using GitHub Actions.
3.2 CI/CD Workflows with GitHub Actions
Continuous Integration (CI) and Continuous Deployment (CD) automate the process of testing, building, and deploying your code, ensuring quality and consistency throughout the development lifecycle. GitHub Actions provides a powerful and flexible way to implement CI/CD workflows directly within your GitHub repository.
3.2.1 Understanding CI/CD Basics
Before diving into implementation, let’s understand what each component achieves:
- Continuous Integration: Automatically testing code changes when pushed to the repository
- Continuous Deployment: Automatically deploying code to testing, staging, or production environments
A robust CI/CD pipeline typically includes:
- Running tests
- Verifying code quality (formatting, linting)
- Static analysis (type checking, security scanning)
- Building documentation
- Building and publishing packages or applications
- Deploying to environments
3.2.2 Setting Up GitHub Actions
GitHub Actions workflows are defined using YAML files stored in the `.github/workflows/` directory of your repository. Each workflow file defines a set of jobs and steps that execute in response to specified events.
Start by creating the directory structure:
```bash
mkdir -p .github/workflows
```
3.2.3 Basic Python CI Workflow
Let’s create a file named `.github/workflows/ci.yml`:
```yaml
name: Python CI

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.8", "3.9", "3.10"]
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
          cache: pip
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install -r requirements-dev.txt
      - name: Check formatting with Ruff
        run: ruff format --check .
      - name: Lint with Ruff
        run: ruff check .
      - name: Type check with mypy
        run: mypy src/
      - name: Run security checks with Bandit
        run: bandit -r src/ -x tests/
      - name: Test with pytest
        run: pytest --cov=src/ --cov-report=xml
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v3
        with:
          file: ./coverage.xml
          fail_ci_if_error: true
```
This workflow:
- Triggers on pushes to `main` and on pull requests
- Runs on the latest Ubuntu environment
- Tests against multiple Python versions
- Sets up caching to speed up dependency installation
- Runs our full suite of quality checks and tests
- Uploads coverage reports to Codecov (if you’ve set up this integration)
3.2.4 Using Dependency Caching
To speed up your workflow, GitHub Actions provides caching capabilities:
```yaml
- name: Set up Python ${{ matrix.python-version }}
  uses: actions/setup-python@v4
  with:
    python-version: ${{ matrix.python-version }}
    cache: pip  # Enable pip caching
```
For more specific control over caching:
```yaml
- name: Cache pip packages
  uses: actions/cache@v3
  with:
    path: ~/.cache/pip
    key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements*.txt') }}
    restore-keys: |
      ${{ runner.os }}-pip-
```
3.2.5 Adapting for Different Dependency Tools
If you’re using uv instead of pip, adjust your workflow:
```yaml
- name: Install uv
  run: curl -LsSf https://astral.sh/uv/install.sh | sh
- name: Install dependencies with uv
  run: |
    uv pip sync requirements.txt requirements-dev.txt
```
3.2.6 Building and Publishing Documentation
Add a job to build documentation with MkDocs:
```yaml
docs:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v3
    - name: Set up Python
      uses: actions/setup-python@v4
      with:
        python-version: "3.10"
    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        pip install mkdocs mkdocs-material mkdocstrings[python]
    - name: Build documentation
      run: mkdocs build --strict
    - name: Deploy to GitHub Pages
      if: github.event_name == 'push' && github.ref == 'refs/heads/main'
      uses: peaceiris/actions-gh-pages@v3
      with:
        github_token: ${{ secrets.GITHUB_TOKEN }}
        publish_dir: ./site
```
This job builds your documentation with MkDocs and deploys it to GitHub Pages when changes are pushed to the main branch.
3.2.7 Building and Publishing Python Packages
For projects that produce packages, add a job for publication to PyPI:
```yaml
publish:
  needs: [test, docs]  # Only run if test and docs jobs pass
  runs-on: ubuntu-latest
  # Only publish on tagged releases
  if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
  steps:
    - uses: actions/checkout@v3
    - name: Set up Python
      uses: actions/setup-python@v4
      with:
        python-version: "3.10"
    - name: Install build dependencies
      run: |
        python -m pip install --upgrade pip
        pip install build twine
    - name: Build package
      run: python -m build
    - name: Check package with twine
      run: twine check dist/*
    - name: Publish package
      uses: pypa/gh-action-pypi-publish@release/v1
      with:
        user: __token__
        password: ${{ secrets.PYPI_API_TOKEN }}
```
This job:

1. Only runs after tests and documentation have passed
2. Only triggers on tagged commits (releases)
3. Builds the package using the `build` package
4. Validates the package with `twine`
5. Publishes to PyPI using a secure token

You would need to add the `PYPI_API_TOKEN` to your repository secrets.
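One way to add that secret without clicking through the web UI is the GitHub CLI, assuming `gh` is installed and authenticated:

```bash
# Prompts for the token value and stores it as a repository secret
gh secret set PYPI_API_TOKEN
```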
3.2.8 Running Tests in Multiple Environments
For applications that need to support multiple operating systems or Python versions:
```yaml
test:
  runs-on: ${{ matrix.os }}
  strategy:
    matrix:
      os: [ubuntu-latest, windows-latest, macos-latest]
      python-version: ["3.8", "3.9", "3.10"]
  steps:
    # ... Steps as before ...
```
This configuration runs your tests on three operating systems with three Python versions each, for a total of nine environments.
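If some of those combinations are unnecessary (or known to fail), you can prune the matrix with an `exclude` entry; a sketch:

```yaml
strategy:
  matrix:
    os: [ubuntu-latest, windows-latest, macos-latest]
    python-version: ["3.8", "3.9", "3.10"]
    exclude:
      # Skip the oldest interpreter on macOS, for example
      - os: macos-latest
        python-version: "3.8"
```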
3.2.9 Branch Protection and Required Checks
To ensure code quality, set up branch protection rules on GitHub:
- Go to your repository → Settings → Branches
- Add a rule for your main branch
- Enable “Require status checks to pass before merging”
- Select the checks from your CI workflow
This prevents merging pull requests until all tests pass, maintaining your code quality standards.
3.2.10 Scheduled Workflows
Run your tests on a schedule to catch issues with external dependencies:
```yaml
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
  schedule:
    - cron: '0 0 * * 0'  # Weekly on Sundays at midnight
```
3.2.11 Notifications and Feedback
Configure notifications for workflow results:
```yaml
- name: Send notification
  if: always()
  uses: rtCamp/action-slack-notify@v2
  env:
    SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
    SLACK_TITLE: CI Result
    SLACK_MESSAGE: ${{ job.status }}
    SLACK_COLOR: ${{ job.status == 'success' && 'good' || 'danger' }}
```
This example sends notifications to Slack, but similar actions exist for other platforms.
3.2.12 A Complete CI/CD Workflow Example
Here’s a comprehensive workflow example bringing together many of the concepts we’ve covered:
```yaml
name: Python CI/CD Pipeline

on:
  push:
    branches: [ main ]
    tags: [ 'v*' ]
  pull_request:
    branches: [ main ]
  schedule:
    - cron: '0 0 * * 0'  # Weekly on Sundays

jobs:
  quality:
    name: Code Quality
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
          cache: pip
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements-dev.txt
      - name: Check formatting
        run: ruff format --check .
      - name: Lint with Ruff
        run: ruff check .
      - name: Type check
        run: mypy src/
      - name: Security scan
        run: bandit -r src/ -x tests/
      - name: Check for dead code
        run: vulture src/ --min-confidence 80

  test:
    name: Test
    needs: quality
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest]
        python-version: ["3.8", "3.9", "3.10"]
        include:
          - os: windows-latest
            python-version: "3.10"
          - os: macos-latest
            python-version: "3.10"
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
          cache: pip
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt -r requirements-dev.txt
      - name: Test with pytest
        run: pytest --cov=src/ --cov-report=xml
      - name: Upload coverage
        uses: codecov/codecov-action@v3
        with:
          file: ./coverage.xml

  docs:
    name: Documentation
    needs: quality
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      - name: Install docs dependencies
        run: |
          python -m pip install --upgrade pip
          pip install mkdocs mkdocs-material mkdocstrings[python]
      - name: Build docs
        run: mkdocs build --strict
      - name: Deploy docs
        if: github.event_name == 'push' && github.ref == 'refs/heads/main'
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./site

  publish:
    name: Publish Package
    needs: [test, docs]
    runs-on: ubuntu-latest
    if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      - name: Install build dependencies
        run: |
          python -m pip install --upgrade pip
          pip install build twine
      - name: Build package
        run: python -m build
      - name: Check package
        run: twine check dist/*
      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          user: __token__
          password: ${{ secrets.PYPI_API_TOKEN }}
      - name: Create GitHub Release
        uses: softprops/action-gh-release@v1
        with:
          files: dist/*
          generate_release_notes: true
```
This comprehensive workflow:

1. Checks code quality (formatting, linting, type checking, security, dead code)
2. Runs tests on multiple Python versions and operating systems
3. Builds and deploys documentation
4. Publishes packages to PyPI on tagged releases
5. Creates GitHub releases with release notes
3.2.13 CI/CD Best Practices
- Keep workflows modular: Split complex workflows into logical jobs
- Fail fast: Run quick checks (like formatting) before longer ones (like testing)
- Cache dependencies: Speed up workflows by caching pip packages
- Be selective: Only run necessary jobs based on changed files
- Test thoroughly: Include all environments your code supports
- Secure secrets: Use GitHub’s secret storage for tokens and keys
- Monitor performance: Watch workflow execution times and optimize slow steps
With these CI/CD practices in place, your development workflow becomes more reliable and automatic. Quality checks run on every change, documentation stays up to date, and releases happen smoothly and consistently.
In the final section, we’ll explore how to publish and distribute Python packages to make your work available to others.
3.3 Package Publishing and Distribution
When your Python project matures, you may want to share it with others through the Python Package Index (PyPI). Publishing your package makes it installable via `pip`, allowing others to easily use your work.
3.3.1 Preparing Your Package for Distribution
Before publishing, your project needs the right structure. Let’s ensure everything is ready:
3.3.1.1 1. Package Structure Review
A distributable package should have this basic structure:
```text
my_project/
├── src/
│   └── my_package/
│       ├── __init__.py
│       ├── module1.py
│       └── module2.py
├── tests/
├── docs/
├── pyproject.toml
├── LICENSE
└── README.md
```
3.3.1.2 2. Package Configuration with pyproject.toml
Modern Python packaging uses `pyproject.toml` for configuration:
```toml
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "my-package"
version = "0.1.0"
description = "A short description of my package"
readme = "README.md"
requires-python = ">=3.8"
license = {text = "MIT"}
authors = [
    {name = "Your Name", email = "your.email@example.com"}
]
classifiers = [
    "Programming Language :: Python :: 3",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
]
dependencies = [
    "requests>=2.25.0",
    "numpy>=1.20.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=7.0.0",
    "pytest-cov",
    "ruff",
    "mypy",
]
doc = [
    "mkdocs",
    "mkdocs-material",
    "mkdocstrings[python]",
]

[project.urls]
"Homepage" = "https://github.com/yourusername/my-package"
"Bug Tracker" = "https://github.com/yourusername/my-package/issues"

[project.scripts]
my-command = "my_package.cli:main"

[tool.setuptools]
package-dir = {"" = "src"}
packages = ["my_package"]
```
This configuration:

- Defines basic metadata (name, version, description)
- Lists dependencies (both required and optional)
- Sets up entry points for command-line scripts (see the sketch below)
- Specifies the package location (src layout)
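The `[project.scripts]` table assumes a `my_package/cli.py` module that exposes a `main()` function; here is a minimal sketch of what that hypothetical module might look like:

```python
# src/my_package/cli.py
import argparse


def main() -> None:
    """Entry point for the my-command console script."""
    parser = argparse.ArgumentParser(prog="my-command", description="Demo CLI")
    parser.add_argument("name", nargs="?", default="world", help="Who to greet")
    args = parser.parse_args()
    print(f"Hello, {args.name}!")


if __name__ == "__main__":
    main()
```

After installation, pip generates a `my-command` executable on your PATH that calls this function.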
3.3.1.3 3. Include Essential Files
Ensure you have these files:
```bash
# Create a LICENSE file (example: MIT License)
cat > LICENSE << EOF
MIT License

Copyright (c) $(date +%Y) Your Name

Permission is hereby granted...
EOF

# Create a comprehensive README.md with:
# - What the package does
# - Installation instructions
# - Basic usage examples
# - Links to documentation
# - Contributing guidelines
```
3.3.2 Building Your Package
With configuration in place, you’re ready to build distribution packages:
```bash
# Install build tools
pip install build

# Build both wheel and source distribution
python -m build
```
This creates two files in the `dist/` directory:

- A source distribution (`.tar.gz`)
- A wheel file (`.whl`)
Always check your distributions before publishing:
```bash
# Install twine
pip install twine

# Check the package
twine check dist/*
```
3.3.3 Publishing to Test PyPI
Before publishing to the real PyPI, test your package on TestPyPI:
1. Create a TestPyPI account at https://test.pypi.org/account/register/
2. Upload your package:

   ```bash
   twine upload --repository testpypi dist/*
   ```

3. Test installation from TestPyPI:

   ```bash
   pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ my-package
   ```
3.3.4 Publishing to PyPI
When everything works correctly on TestPyPI:
1. Create a PyPI account at https://pypi.org/account/register/
2. Upload your package:

   ```bash
   twine upload dist/*
   ```

Your package is now available to the world via `pip install my-package`!
3.3.5 Automating Package Publishing
To automate publishing with GitHub Actions, add a workflow that:

1. Builds the package
2. Uploads to PyPI when you create a release tag
```yaml
name: Publish Python Package

on:
  release:
    types: [created]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install build twine
      - name: Build and publish
        env:
          TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
          TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
        run: |
          python -m build
          twine upload dist/*
```
For better security, use API tokens instead of your PyPI password:

1. Generate a token from your PyPI account settings
2. Add it as a GitHub repository secret
3. Use the token in your workflow:
```yaml
- name: Build and publish
  env:
    TWINE_USERNAME: __token__
    TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
  run: |
    python -m build
    twine upload dist/*
```
3.3.6 Versioning Best Practices
Follow Semantic Versioning (MAJOR.MINOR.PATCH):

- MAJOR: Incompatible API changes
- MINOR: New functionality (backward-compatible)
- PATCH: Bug fixes (backward-compatible)
Track versions in one place, usually in `__init__.py`:

```python
# src/my_package/__init__.py
__version__ = "0.1.0"
```
Or with a dynamic version from your git tags using setuptools-scm:
```toml
[build-system]
requires = ["setuptools>=61.0", "wheel", "setuptools_scm[toml]>=6.2"]
build-backend = "setuptools.build_meta"

[tool.setuptools_scm]
# Uses git tags for versioning
```
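A third option is to look the version up from the installed package metadata at runtime, so the number is never written down twice; a minimal sketch using only the standard library (Python 3.8+), assuming the distribution is named `my-package`:

```python
# src/my_package/__init__.py
from importlib.metadata import version

# Reads the version recorded at install time, so pyproject.toml
# (or your git tags, via setuptools-scm) stays the single source of truth
__version__ = version("my-package")
```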
3.3.7 Creating Releases
A good release process includes:
- Update documentation:
  - Ensure README is current
  - Update changelog with notable changes
- Create a new version:
  - Update version number
  - Create a git tag:

    ```bash
    git tag -a v0.1.0 -m "Release version 0.1.0"
    git push origin v0.1.0
    ```

- Monitor the CI/CD pipeline:
  - Ensure tests pass
  - Verify package build succeeds
  - Confirm successful publication
- Announce the release:
  - Create GitHub release notes
  - Post in relevant community forums
  - Update documentation site
3.3.8 Package Maintenance
Once published, maintain your package responsibly:
- Monitor issues on GitHub or GitLab
- Respond to bug reports promptly
- Review and accept contributions from the community
- Regularly update dependencies to address security issues
- Create new releases when significant improvements are ready
3.3.9 Advanced Distribution Topics
As your package ecosystem grows, consider these advanced techniques:
3.3.9.1 1. Binary Extensions
For performance-critical components, you might include compiled C extensions:

- Use Cython to compile Python to C
- Configure with the `build-system` section in `pyproject.toml` (sketched below)
- Build platform-specific wheels
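A sketch of what that `build-system` table might look like for a setuptools project with Cython extensions (the version pins are illustrative):

```toml
[build-system]
# Cython must be present at build time to compile the extension modules
requires = ["setuptools>=61.0", "wheel", "Cython>=3.0"]
build-backend = "setuptools.build_meta"
```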
3.3.9.2 2. Namespace Packages
For large projects split across multiple packages:
```python
# src/myorg/packageone/__init__.py
# src/myorg/packagetwo/__init__.py
# Note: 'myorg' itself has no __init__.py, which is what
# makes it a namespace package (PEP 420)
```
3.3.9.3 3. Conditional Dependencies
For platform-specific dependencies:
```toml
dependencies = [
    "requests>=2.25.0",
    "numpy>=1.20.0",
    "pywin32>=300; platform_system == 'Windows'",
]
```
3.3.9.4 4. Data Files
Include non-Python files (data, templates, etc.):
```toml
[tool.setuptools]
package-dir = {"" = "src"}
packages = ["my_package"]
include-package-data = true
```
Create a `MANIFEST.in` file:

```text
include src/my_package/data/*.json
include src/my_package/templates/*.html
```
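At runtime, read those packaged files through `importlib.resources` rather than building filesystem paths by hand; a sketch (requires Python 3.9+ for `resources.files()`; the `data/config.json` path is hypothetical):

```python
import json
from importlib import resources

# Locate a data file inside the installed package, wherever pip put it
data_file = resources.files("my_package").joinpath("data/config.json")
with data_file.open() as f:
    config = json.load(f)
```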
By following these practices, you’ll create a professional, well-maintained package that others can easily discover, install, and use. Publishing your work to PyPI is not just about sharing code—it’s about participating in the Python ecosystem and contributing back to the community.
3.3.10 Modern vs. Traditional Python Packaging
Python packaging has evolved significantly over the years:
3.3.10.1 Traditional setup.py Approach
Historically, Python packages required a `setup.py` file:
```python
# setup.py
from setuptools import setup, find_packages

setup(
    name="my-package",
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "requests>=2.25.0",
        "numpy>=1.20.0",
    ],
)
```
This approach is still common and has advantages for:

- Compatibility with older tooling
- Dynamic build processes that need Python code
- Complex build requirements (e.g., C extensions, custom steps)
3.3.10.2 Modern pyproject.toml Approach
Since PEP 517/518, packages can use `pyproject.toml` exclusively:
```toml
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "my-package"
version = "0.1.0"
dependencies = [
    "requests>=2.25.0",
    "numpy>=1.20.0",
]
```
This declarative approach is recommended for new projects because it:

- Provides a standardized configuration format
- Supports multiple build systems (not just setuptools)
- Simplifies dependency specification
- Avoids executing Python code during installation
3.3.10.3 Which Approach Should You Use?
- For new, straightforward packages: Use `pyproject.toml` only
- For packages with complex build requirements: You may need both `pyproject.toml` and `setup.py`
- For maintaining existing packages: Consider gradually migrating to `pyproject.toml`

Many projects use a hybrid approach, with basic metadata in `pyproject.toml` and complex build logic in `setup.py`.