Project Checkpoint
In this course, the practical project track is a single study tracker application that grows step by step. In this opening checkpoint, the goal is to get a very small walking skeleton running.
A walking skeleton is a thin end-to-end slice of the application. For this part, that means:
- PostgreSQL starts,
- the application connects to the database,
- one route runs a SQL query,
- and one page renders seeded rows.
This chapter is intentionally copy-paste friendly. Use it to get the first version running. You do not need to understand every line yet.
Before getting into the files, it helps to see the application at a high level. A database is usually one part of a larger application, not the whole system by itself.
Tools You Need
If you need setup help or a reminder of the project structure, revisit the course's setup pages before continuing.
For this milestone, you need:
- Docker and Docker Compose
- a code editor
- a terminal
- and a browser
Keep the same project directory through the whole course. Later project chapters will extend this same scaffold.
Copy the Starter Files
Create a project folder, for example study-tracker, and inside it create these folders:
```sh
mkdir -p study-tracker/app/templates
mkdir -p study-tracker/migrations
```
At the end of this chapter, the project should look roughly like this:
```text
study-tracker/
├── app/
│   ├── __init__.py
│   ├── db.py
│   ├── main.py
│   ├── migrate.py
│   └── templates/
│       └── decks.html
├── migrations/
│   ├── 001_schema.sql
│   └── 002_seed.sql
├── compose.yaml
├── Dockerfile
├── requirements.txt
└── start.sh
```
requirements.txt
Copy this file into the project root:
```text
fastapi==0.135.2
uvicorn==0.42.0
jinja2==3.1.6
python-multipart==0.0.22
psycopg[binary]==3.3.3
```
Dockerfile
Copy this file into the project root:
```dockerfile
FROM python:3.14-bookworm

ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

WORKDIR /app

COPY requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r /app/requirements.txt

COPY app /app/app
COPY migrations /app/migrations
COPY start.sh /app/start.sh
RUN chmod +x /app/start.sh

EXPOSE 8000
CMD ["sh", "/app/start.sh"]
```
compose.yaml
Copy this file into the project root:
```yaml
services:
  db:
    image: postgres:18
    environment:
      POSTGRES_DB: studytracker
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres -d studytracker"]
      interval: 5s
      timeout: 5s
      retries: 12
  app:
    build: .
    depends_on:
      db:
        condition: service_healthy
    environment:
      DATABASE_URL: postgresql://postgres:postgres@db:5432/studytracker
    ports:
      - "8000:8000"
    volumes:
      - .:/app
    command: ["sh", "/app/start.sh"]
```
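The `DATABASE_URL` passed to the `app` service is an ordinary PostgreSQL connection URL. If its parts look opaque, a quick sketch using only the Python standard library shows how they decompose (the URL below is the one from this compose file):

```python
from urllib.parse import urlsplit

# The same connection URL that compose.yaml passes to the app service.
url = urlsplit("postgresql://postgres:postgres@db:5432/studytracker")

print(url.username)          # user: postgres
print(url.hostname)          # host: the compose service name "db"
print(url.port)              # port: 5432
print(url.path.lstrip("/"))  # database name: studytracker
```

Note that the host is `db`, the compose service name, not `localhost`: inside the compose network, services reach each other by service name.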
start.sh
Copy this file into the project root:
```sh
#!/bin/sh
set -eu

python3 - <<'PY'
import os
import time

import psycopg

database_url = os.environ["DATABASE_URL"]

for _ in range(60):
    try:
        with psycopg.connect(database_url):
            break
    except psycopg.OperationalError:
        time.sleep(1)
else:
    raise SystemExit("Database did not become available in time.")
PY

python3 -m app.migrate

exec python3 -m uvicorn app.main:app \
    --host 0.0.0.0 \
    --port 8000 \
    --reload \
    --reload-dir /app/app \
    --reload-dir /app/migrations
```
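The wait loop in `start.sh` uses Python's `for`/`else`: the `else` branch runs only if the loop finishes without hitting `break`. A self-contained sketch of the same pattern, with a fake connect function (illustrative only, not part of the project) that fails twice before succeeding:

```python
attempts = {"count": 0}

def fake_connect():
    # Stand-in for psycopg.connect: fails twice, then succeeds.
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("database not ready yet")
    return "connection"

for _ in range(60):
    try:
        conn = fake_connect()
        break
    except ConnectionError:
        pass  # in start.sh this is time.sleep(1)
else:
    # Runs only if all 60 attempts failed.
    raise SystemExit("Database did not become available in time.")

print(attempts["count"])  # 3: two failures, then one success
```

The real script retries for up to 60 seconds because the `db` container may accept connections slightly after its health check first passes.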
app/__init__.py
Create an empty app/__init__.py; it marks app as a Python package so imports like app.main and app.migrate work.
app/db.py
Copy this file into app/db.py:
```python
import os

import psycopg
from psycopg.rows import dict_row


def get_connection():
    return psycopg.connect(
        os.environ["DATABASE_URL"],
        row_factory=dict_row,
    )
```
app/migrate.py
Copy this file into app/migrate.py:
```python
from pathlib import Path

from .db import get_connection


def migration_directory() -> Path:
    return Path(__file__).resolve().parent.parent / "migrations"


def split_sql_statements(sql_text: str) -> list[str]:
    statements: list[str] = []
    current: list[str] = []
    for line in sql_text.splitlines():
        current.append(line)
        if ";" in line:
            statement = "\n".join(current).strip()
            if statement:
                statements.append(statement)
            current = []
    trailing = "\n".join(current).strip()
    if trailing:
        statements.append(trailing)
    return statements


def apply_migrations() -> None:
    with get_connection() as conn:
        conn.execute(
            """
            CREATE TABLE IF NOT EXISTS schema_migrations (
                filename TEXT PRIMARY KEY,
                applied_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
            )
            """
        )
        applied = {
            row["filename"]
            for row in conn.execute(
                "SELECT filename FROM schema_migrations"
            ).fetchall()
        }
        for path in sorted(migration_directory().glob("*.sql")):
            if path.name in applied:
                continue
            with conn.transaction():
                for statement in split_sql_statements(path.read_text(encoding="utf-8")):
                    conn.execute(statement)
                conn.execute(
                    "INSERT INTO schema_migrations (filename) VALUES (%s)",
                    (path.name,),
                )


if __name__ == "__main__":
    apply_migrations()
```
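The statement splitter is deliberately naive: it ends a statement on any line containing a semicolon, which is fine for the simple migrations in this chapter but would break on semicolons inside string literals. A quick standalone check of its behavior (the function is copied verbatim from above):

```python
def split_sql_statements(sql_text: str) -> list[str]:
    statements: list[str] = []
    current: list[str] = []
    for line in sql_text.splitlines():
        current.append(line)
        if ";" in line:
            statement = "\n".join(current).strip()
            if statement:
                statements.append(statement)
            current = []
    trailing = "\n".join(current).strip()
    if trailing:
        statements.append(trailing)
    return statements

# A two-statement file, like the migrations in this chapter.
sql = """CREATE TABLE a (id INT);
INSERT INTO a VALUES (1);"""

parts = split_sql_statements(sql)
print(len(parts))  # 2
print(parts[0])    # CREATE TABLE a (id INT);
```

Because `sorted()` orders the `*.sql` files by name, the `001_`/`002_` prefixes guarantee the schema is created before the seed data is inserted.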
app/main.py
Copy this file into app/main.py:
```python
from fastapi import FastAPI, Request
from fastapi.responses import RedirectResponse
from fastapi.templating import Jinja2Templates

from .db import get_connection

app = FastAPI()
templates = Jinja2Templates(directory="app/templates")


def render(request: Request, template_name: str, **context):
    return templates.TemplateResponse(
        request=request,
        name=template_name,
        context=context,
    )


@app.get("/")
def index():
    return RedirectResponse("/decks", status_code=303)


@app.get("/decks")
def list_decks(request: Request):
    with get_connection() as conn:
        decks = conn.execute(
            """
            SELECT id, name, description
            FROM decks
            ORDER BY name
            """
        ).fetchall()
    return render(request, "decks.html", decks=decks)
```
app/templates/decks.html
Copy this file into app/templates/decks.html:
```html
<!doctype html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <title>Study Tracker</title>
  </head>
  <body>
    <h1>Study Tracker</h1>
    <h2>Decks</h2>
    <ul>
      {% for deck in decks %}
        <li>
          <strong>{{ deck.name }}</strong>
          {% if deck.description %}
            - {{ deck.description }}
          {% endif %}
        </li>
      {% endfor %}
    </ul>
  </body>
</html>
```
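One detail worth noting: the rows are dicts (because of `dict_row` in app/db.py), yet the template reads `deck.name` with attribute syntax. Jinja2 falls back to item lookup when an attribute is missing, so both spellings work. A small standalone sketch of the same loop (assumes `jinja2` from requirements.txt is installed; the sample rows are made up):

```python
from jinja2 import Template

# Inline version of the loop from decks.html, with ";" separating decks.
template = Template(
    "{% for deck in decks %}"
    "{{ deck.name }}{% if deck.description %} - {{ deck.description }}{% endif %};"
    "{% endfor %}"
)

rows = [
    {"name": "SQL Basics", "description": "Starter deck for SQL concepts."},
    {"name": "Web Development", "description": None},  # falsy: description is skipped
]
print(template.render(decks=rows))
```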
migrations/001_schema.sql
Copy this file into migrations/001_schema.sql:
```sql
CREATE TABLE decks (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL UNIQUE,
    description TEXT,
    created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);
```
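Because `id` is `SERIAL`, PostgreSQL assigns it automatically, which is why the seed file below never mentions the column. A hedged example you could try later against this table (not part of the migrations; the deck name here is made up):

```sql
-- id and created_at are filled in by their column defaults;
-- RETURNING shows the id PostgreSQL just generated.
INSERT INTO decks (name, description)
VALUES ('Scratch Deck', 'Not part of the seed data.')
RETURNING id;
```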
migrations/002_seed.sql
Copy this file into migrations/002_seed.sql:
```sql
INSERT INTO decks (name, description)
VALUES
    ('SQL Basics', 'Starter deck for SQL concepts.'),
    ('Web Development', 'Starter deck for backend web ideas.');
```
Start and Verify
In the project root, run:
docker compose up --build
When startup succeeds, open http://localhost:8000/decks in the browser.
Before moving on, verify all of the following:
- /decks opens in the browser,
- the page shows the two seeded decks,
- app/main.py contains the first route and SQL query,
- app/templates/decks.html renders the page,
- and the migrations folder contains the schema and seed SQL files.
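To confirm the rows really come from the database rather than from hard-coded HTML, open a SQL prompt inside the running db container with `docker compose exec db psql -U postgres -d studytracker` and run:

```sql
SELECT id, name, description
FROM decks
ORDER BY name;
```

You should see the same two decks the page shows; editing a description here and reloading /decks is a quick end-to-end check.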
Return the Walking Skeleton
When the walking skeleton works, return a zip of the project milestone.
The root of the zip should contain:
- requirements.txt,
- the folder app,
- and the folder migrations.
You may also include compose.yaml, Dockerfile, and start.sh. These files are useful for local development, but the automated grader focuses on the Python application and the SQL migrations.
Do not add an extra enclosing project directory inside the zip. The listed files and folders should be directly at the root of the archive.
Keep This Project
Do not treat this as a temporary setup. Later parts extend the same project with new routes, SQL, forms, summaries, transactions, and other database features.
Common Pitfalls
- forgetting to create the migrations folder or the SQL files,
- running only PostgreSQL without the application,
- assuming the page is correct without checking that the rows come from the database,
- and zipping the whole enclosing project folder instead of the project contents.