8 Beyond Scripts: Notebooks, Dashboards, and Interactive Python
Scripts aren’t the only way to ship Python. This chapter explores notebooks, dashboards, and interactive tools—legitimate ways to share, deploy, and deliver Python work.
8.1 Why This Chapter Appears Late (But Isn’t an Afterthought)
This chapter comes near the end of the book, but that’s intentional—not because notebooks are less important. We wanted you to first understand the complete script-based workflow: project structure, testing, documentation, packaging, and distribution. These fundamentals apply whether you’re writing scripts OR notebooks.
Now that you understand the “full” workflow, you can appreciate both when notebooks simplify things and when they create limitations. You’ll recognise that a requirements.txt matters for Binder just as it does for pip installs, that documentation practices transfer to notebook markdown cells, and that nbdev’s testing approach builds on pytest concepts you already know.
Notebooks are a first-class citizen in the Python ecosystem—not an alternative for people who can’t handle “real” development. Many professional data scientists, researchers, and educators ship exclusively through notebooks. This chapter gives you the complete picture.
Notebooks are often easier to share than packaged scripts—a Colab link gives anyone instant access with zero installation. But they expose your code by default, which creates friction for some audiences. Not everyone wants to scroll past Python cells to see results. Tools like Voilà and Mercury address this, but it’s worth knowing: the simplicity of notebooks comes with visibility trade-offs.
8.2 The Three Ways to Write Python
There are only three ways to write Python code:
- REPL - Interactive exploration (not for shipping)
- Scripts - Traditional apps, packages, CLI tools (this book’s main focus)
- Notebooks - Data analysis, reports, teaching, prototypes
This book has focused primarily on scripts. But for many Python practitioners—especially in data science, education, and research—notebooks ARE the deliverable. A well-structured notebook with a sharing link is shipping.
8.3 When Notebooks Make Sense
Notebooks excel when:
- The narrative matters - Analysis with explanation, teaching materials
- Exploration is the product - Data investigation, research findings
- Visuals are central - Charts, plots, interactive widgets
- Reproducibility is key - Share exact environment and execution order
- Zero-install is required - Viewers shouldn’t need to set up Python
Notebooks are less suited for:
- Production APIs or services
- CLI tools
- Reusable libraries (though nbdev challenges this)
- Long-running applications
8.5 Zero-Install Execution
The power of notebooks: viewers can RUN your code without installing anything.
8.5.1 Google Colab
The most accessible option. Add a badge to your README:
```markdown
[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/USER/REPO/blob/main/FILE)
```
Colab advantages:
- Zero setup for viewers
- Free GPU/TPU access
- Google Drive integration
- GitHub integration (open directly from repos)
Colab workflow with GitHub:
- Develop locally or in Colab
- Save to GitHub (File → Save a copy to GitHub)
- Share Colab link that opens from GitHub
- Viewers get latest version automatically
This gives you version control (GitHub) with zero-install execution (Colab)—a “clunky Dropbox” that’s actually better because it’s versioned.
8.5.2 Binder
mybinder.org turns any GitHub repo into interactive notebooks:
```markdown
[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/username/repo/main)
```
Binder advantages:
- Works with requirements.txt or environment.yml
- Full JupyterLab environment
- No Google account required
Binder limitations:
- Slower startup (builds environment)
- Sessions timeout
- Limited resources
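For Binder to build an interactive environment, the repository needs a dependency file at its root. A minimal requirements.txt might look like this (the package versions are illustrative):

```text
# requirements.txt — read by Binder when it builds the container image
pandas>=1.5
matplotlib>=3.6
```

Pin versions as tightly as your reproducibility needs demand: looser pins build faster but may drift; exact pins (`pandas==1.5.3`) freeze the environment viewers get.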
8.5.3 Kaggle Kernels
For data science work, Kaggle provides:
- Free GPU access
- Built-in datasets
- Community sharing
- Competition integration
8.6 Notebooks as Applications
Transform notebooks into interactive applications that hide the code.
8.6.1 Voilà
Voilà converts notebooks into standalone dashboards:
```bash
pip install voila
voila notebook.ipynb
```
- Renders only output cells (code hidden)
- Supports ipywidgets for interactivity
- Deploy on Heroku, Binder, or your own server
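To see what a Voilà-ready cell looks like, here is a minimal sketch. The function is plain Python (the data and names are invented for illustration); the ipywidgets wiring that Voilà would render as a live control is shown commented out, so the cell also runs outside a notebook:

```python
# A notebook cell that Voilà can serve as a dashboard control.
temperatures = [18.2, 19.5, 21.1, 22.8, 19.9, 17.4]  # illustrative data

def summarise(threshold):
    """Count readings above a threshold and report their share."""
    above = [t for t in temperatures if t > threshold]
    return {"above": len(above), "share": round(len(above) / len(temperatures), 2)}

# In a notebook, one line turns this into an interactive slider:
# from ipywidgets import interact
# interact(summarise, threshold=(15.0, 25.0, 0.5))

print(summarise(20.0))  # {'above': 2, 'share': 0.33}
```

With the `interact` line uncommented, Voilà hides the code and shows only the slider and its output.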
8.6.2 Mercury
Mercury turns notebooks into web apps:
Add a YAML header cell to the notebook:
```yaml
---
title: My Analysis
description: Interactive data explorer
params:
    date_range:
        input: slider
        min: 2020
        max: 2024
---
```
- Automatic widget generation from parameters
- PDF/HTML export
- Authentication support
- Self-hostable
8.6.3 Streamlit (Notebook-Adjacent)
Not notebooks, but similar rapid-development workflow:
```python
# app.py
import streamlit as st
import pandas as pd

st.title("Data Explorer")
data = pd.read_csv("data.csv")
st.dataframe(data)
```
Run it with:
```bash
streamlit run app.py
```
- Python scripts, not notebooks
- But similar “write and see” iteration
- Easy deployment via Streamlit Cloud
8.6.4 Panel and HoloViz
For more complex dashboards:
- Panel - Flexible dashboarding from notebooks
- HoloViews - High-level plotting
- hvPlot - Interactive pandas plots
8.7 Notebooks as Libraries: nbdev
nbdev flips the script: develop libraries FROM notebooks.
notebook.ipynb → Python package + docs + tests
We dedicate Chapter 9 to building a complete package with nbdev—TextKit, a text analysis library. That case study parallels the SimpleBot chapter (Chapter 4) but follows the notebook-first workflow. Think of nbdev as an alternative path to the same destination: instead of writing .py files and separate docs, you write notebooks that generate both. The end result—a published package—is the same.
The nbdev philosophy:
- Write code, tests, and documentation together
- Export specific cells to modules
- Generate API docs automatically
- Literate programming for Python
Basic workflow:
```python
#| export
import pandas as pd  # in a real notebook this import lives in an earlier cell

def process_data(df):
    """Clean and transform dataframe.

    Parameters
    ----------
    df : DataFrame
        Input data

    Returns
    -------
    DataFrame
        Cleaned data
    """
    return df.dropna()
```
```python
#| test
def test_process_data():
    df = pd.DataFrame({'a': [1, None, 3]})
    result = process_data(df)
    assert len(result) == 2
```
When to use nbdev:
- You naturally develop in notebooks
- Documentation and code should live together
- Teaching libraries where explanation matters
8.8 Low-Code Alternatives
8.8.1 Anvil
Anvil provides a different model:
- Drag-and-drop UI builder
- Write Python for event handlers
- Hosted deployment included
- Database and user management built-in
```python
# Behind a button click
def button_click(self, **event_args):
    self.label.text = "Hello, " + self.text_box.text
```
Anvil is good for:
- Internal business tools
- Forms and data entry
- Quick prototypes with real UIs
- Teaching event-driven programming
Trade-offs:
- Vendor lock-in
- Less “Pythonic” project structure
- Limited customisation
8.9 Notebook Best Practices for Shipping
8.9.1 Structure Your Notebooks
1. Title and Overview (Markdown)
2. Setup and Imports
3. Data Loading
4. Analysis Sections (numbered)
5. Conclusions
6. Appendix (helper functions, details)
8.9.2 Environment Management
Include a requirements cell:
```python
# Requirements: pandas>=1.5, matplotlib>=3.6, seaborn>=0.12
```
Or ship with requirements.txt / environment.yml for Binder.
8.9.3 Clear Outputs vs. Keep Outputs
| Approach | When |
|---|---|
| Clear outputs | Version control (smaller diffs) |
| Keep outputs | Sharing (viewers see results immediately) |
Consider: clear for development, render for sharing.
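Because .ipynb files are plain JSON, clearing outputs is simple enough to sketch with the standard library alone; dedicated tools do this more robustly, but the sketch below shows what "clearing" actually means (the notebook dict here is a minimal invented example):

```python
def clear_outputs(nb: dict) -> dict:
    """Remove outputs and execution counts from a notebook dict, in place."""
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    return nb

# Minimal notebook structure for illustration
nb = {
    "cells": [
        {"cell_type": "code", "source": "1 + 1",
         "outputs": [{"output_type": "execute_result"}], "execution_count": 3},
        {"cell_type": "markdown", "source": "# Title"},
    ]
}

cleaned = clear_outputs(nb)
print(cleaned["cells"][0]["outputs"])  # []
```

Markdown cells are untouched; only code cells carry outputs, which is why diffs shrink so much once they're stripped.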
8.9.4 Use nbstripout
Automatically strip outputs on commit:
```bash
pip install nbstripout
nbstripout --install
```
8.10 Comparison Table
| Tool | Type | Hosting | Best For |
|---|---|---|---|
| GitHub + nbviewer | View only | Free | Simple sharing |
| Colab | Interactive | Free (Google) | Zero-install, GPU |
| Binder | Interactive | Free | Reproducible environments |
| Voilà | Dashboard | Self-host/Binder | Hide code, show results |
| Mercury | Web app | Self-host | Parameterized reports |
| Streamlit | Web app | Streamlit Cloud | Rapid app development |
| nbdev | Library dev | PyPI | Literate programming |
| Anvil | Full app | Anvil servers | Low-code business apps |
8.11 When to Graduate from Notebooks
Notebooks are great, but sometimes you need to move to scripts:
- Tests are growing - pytest is better than notebook tests
- Reuse across projects - Package your code
- Production deployment - APIs, services, CLI tools
- Team collaboration - Notebooks have merge conflicts
The path: Notebook → extract functions to .py → package → tests → CI/CD.
Or use nbdev to keep working in notebooks while generating proper packages.
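The first step of that path, extracting functions, can be as small as moving one cell into a .py file and pairing it with a pytest-style test. A sketch (the module and function names are invented):

```python
# textstats.py: a function lifted out of a notebook cell

def word_count(text: str) -> int:
    """Count whitespace-separated words."""
    return len(text.split())


# test_textstats.py: a pytest-style test for the extracted function

def test_word_count():
    assert word_count("ship it today") == 3
    assert word_count("") == 0

test_word_count()  # pytest would discover and run this automatically
```

Once the function lives in a module, both your notebooks and your tests import it from one place, and the notebook shrinks to narrative plus calls.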
8.12 Summary
- Notebooks are a legitimate shipping format
- “Shipping” can mean a Colab link, not just a packaged app
- Multiple tools exist to share, execute, and transform notebooks
- Choose based on your audience: viewers, runners, or users
- Know when to graduate to scripts (or use nbdev to avoid the choice)
Many notebook environments now include built-in AI assistants. Google Colab has Gemini, and JupyterLab supports extensions for Copilot and other AI tools. These are particularly useful for exploratory data analysis — ask AI to generate plotting code, explain error messages, or suggest the next analysis step. The interactive, cell-by-cell nature of notebooks makes AI collaboration especially natural: generate code in one cell, review and modify it, then run it immediately.
Next: In Chapter 9, we’ll build TextKit — a complete Python package developed entirely in notebooks using nbdev.
8.13 Exercises
Share a notebook: Push a notebook to GitHub and create both an nbviewer link and a Colab badge.
Try Binder: Add a requirements.txt to a repo and create a Binder link.
Build a dashboard: Take an analysis notebook and convert it to a Voilà dashboard.
Explore nbdev: Create a simple function in a notebook and export it to a Python module using nbdev.