
uv Monorepo Python Workspace Explained: A 10-Tooler Case Study

How uv resolves a 10-tooler Python monorepo with shared libraries and per-service dependencies, and why its workspace model outperforms Poetry on real-world FastAPI stacks.

Tags: uv monorepo, python workspace, uv vs poetry, fastapi monorepo, pydantic shared schemas

Running ten FastAPI services out of one repository used to be the kind of decision you regretted by the second quarter. Lock files drift, shared schema packages get stale, and every poetry install turns into a coffee break. After migrating the black_hat_rust monorepo from a Poetry-per-tooler layout to a single uv workspace, the install time dropped from roughly 42 seconds per service to under 4 seconds for the whole graph. This article walks through exactly how that works, what the directory layout looks like, and why uv's resolver model fits a polyrepo-inside-a-monorepo shape better than anything else in the Python ecosystem right now.

The shape of the problem

The monorepo has ten tooler services under toolers/, each a FastAPI app with its own lifecycle. They all depend on two shared packages under packages/py-shared/ for Pydantic models and on a growing web of third-party libraries: httpx, anthropic, playwright, pytrends, mcp, and friends. The root layout looks like this:

black_hat_rust/
  pyproject.toml              ← workspace root, no app code
  uv.lock                     ← single lock for the entire graph
  packages/
    py-shared/
      pyproject.toml          ← shared Pydantic schemas
      src/py_shared/
  toolers/
    tooler-core/pyproject.toml
    tooler-web/pyproject.toml
    tooler-search/pyproject.toml
    tooler-seo/pyproject.toml
    tooler-creative/pyproject.toml
    tooler-video-maker/pyproject.toml
    tooler-telegram/pyproject.toml
    tooler-memory/pyproject.toml
    tooler-storage/pyproject.toml
    tooler-scheduler/pyproject.toml

Every tooler needs py-shared at head, not at a pinned release. Every tooler has overlapping third-party deps (fastapi, pydantic, httpx) but also disjoint ones (playwright only in tooler-web, pytrends only in tooler-seo). Under Poetry this meant ten separate poetry.lock files, ten virtualenvs, and a py-shared = { path = "../../packages/py-shared", develop = true } dance repeated in each pyproject.toml. Every time a Pydantic model changed in py-shared, half the CI pipeline needed re-locking.
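For contrast, here is a hedged sketch of what one tooler's Poetry-era pyproject.toml looked like. The names and version bounds are illustrative, not copied from the repo:

```toml
# toolers/tooler-seo/pyproject.toml under the old Poetry layout (illustrative)
[tool.poetry]
name = "tooler-seo"
version = "0.1.0"

[tool.poetry.dependencies]
python = ">=3.12"
fastapi = ">=0.115"
pytrends = "*"
# The path-dependency dance: repeated in all ten toolers,
# each maintaining its own poetry.lock on top of it.
py-shared = { path = "../../packages/py-shared", develop = true }
```

Ten copies of this stanza meant ten places for the path, the develop flag, and the lock state to drift apart.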

What a uv workspace actually is

The uv documentation describes a workspace as a collection of packages that share a single lockfile and a single virtual environment. The official workspace guide makes the contract explicit: one uv.lock at the root, one .venv/ at the root, and every member package resolved together so there is exactly one version of every transitive dependency across the tree.

The root pyproject.toml declares the workspace like this:

[project]
name = "black-hat-rust"
version = "0.0.0"
requires-python = ">=3.12"

[tool.uv.workspace]
members = [
    "packages/py-shared",
    "toolers/tooler-core",
    "toolers/tooler-web",
    "toolers/tooler-search",
    "toolers/tooler-seo",
    "toolers/tooler-creative",
    "toolers/tooler-video-maker",
    "toolers/tooler-telegram",
    "toolers/tooler-memory",
    "toolers/tooler-storage",
    "toolers/tooler-scheduler",
]

[tool.uv.sources]
py-shared = { workspace = true }

Two things deserve attention. First, the members list is explicit. uv supports globs like toolers/* but you can pin exact paths when you want a tooler to opt out temporarily (useful while vendoring a legacy service). Second, [tool.uv.sources] tells the resolver that whenever any package requests py-shared, it should resolve to the local workspace member rather than hit PyPI. This is the key difference from a path = "../py-shared" editable install: the source is declared once at the workspace root, and every member inherits it automatically.
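If the explicit member list grows tedious, the same workspace can be declared with globs plus an exclude list; uv's workspace table supports both keys. A sketch, where tooler-legacy is a hypothetical service being parked outside the shared resolution:

```toml
[tool.uv.workspace]
members = ["packages/*", "toolers/*"]
# Temporarily opt a legacy service out of the shared lockfile:
exclude = ["toolers/tooler-legacy"]
```

The explicit list in the root pyproject.toml above trades brevity for the ability to see every member at a glance, which this repo prefers.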

Inside toolers/tooler-core/pyproject.toml, the dependencies read exactly like a normal standalone project:

[project]
name = "tooler-core"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "fastapi>=0.115",
    "uvicorn[standard]>=0.32",
    "httpx>=0.27",
    "anthropic>=0.40",
    "pydantic>=2.9",
    "py-shared",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

No path dependency, no editable flag, no conditional extras. The resolver sees py-shared in dependencies, looks it up in the workspace source table, and wires the local package in. Change a Pydantic model in packages/py-shared/src/py_shared/article.py, run the tooler, and the import picks up the new field on the next process start. No reinstall.

Why this beats Poetry for this shape

Poetry's workspace story, discussed in the still-open Poetry issue #2270, treats multi-package repos as a collection of independent projects that happen to live in the same git tree. Each subproject gets its own lockfile. Poetry's path-dependency documentation acknowledges that path deps to a local package work, but each consuming project still locks transitive deps on its own. If tooler-core locks pydantic==2.9.2 and tooler-seo locks pydantic==2.9.0, both are valid under their respective lockfiles but they produce divergent venvs. In CI you either install everything ten times or you accept that integration tests run against a Frankenstein environment that does not match any deployed service.

uv resolves the entire workspace as a single dependency graph. The Astral blog post announcing workspaces and the concepts documentation explain the resolver model: it picks one version of every package that satisfies every member's constraints simultaneously. If tooler-core wants pydantic>=2.9 and tooler-seo wants pydantic>=2.8, both end up on pydantic==2.9.x. This is the same guarantee you get from a cargo workspace in Rust and the same one bun enforces in its workspace mode. For a monorepo that ships ten services that talk to each other through shared Pydantic schemas, single-version resolution is not a nice-to-have; it is the reason the schemas can be trusted at runtime.

The speed difference is not subtle either. uv's resolver, documented in the Astral uv announcement, is written in Rust and parallelises network I/O aggressively. On the black_hat_rust workspace, a cold uv sync --all-packages on a fresh machine takes about 3.8 seconds once the wheels are cached in the shared ~/.cache/uv/ directory. The equivalent poetry install across ten projects took just over 40 seconds because each project rebuilt its own resolver state and re-downloaded overlapping wheels into ten separate virtualenvs.

Running a single tooler without installing ten

A common objection to workspace tools is that you do not want a developer who is only touching tooler-seo to pull Playwright, the Anthropic SDK, and every other heavyweight dep. uv solves this with --package:

cd /Users/longlanh/Workspace/y2026/Apr/black_hat_rust

uv sync --package tooler-seo

uv run --package tooler-seo uvicorn tooler_seo.app:app --reload

uv sync --package tooler-seo installs exactly the dependency closure of tooler-seo plus py-shared into the root .venv/, nothing else. If you later need tooler-web, uv sync --package tooler-web extends the same venv. The lockfile remains the workspace-wide one, so the version of httpx that both toolers see is identical. This matches the behaviour described in the uv commands reference and is the day-to-day workflow that replaces the poetry install + source .venv/bin/activate loop.

For CI, the monorepo keeps things even simpler. The workflow defined under scripts/ci/ runs uv sync --frozen --all-packages once, then each tooler's test suite runs inside the same venv:

- name: Install uv
  uses: astral-sh/setup-uv@v3

- name: Resolve workspace
  run: uv sync --frozen --all-packages

- name: Run tooler tests
  run: |
    for t in toolers/*/; do
      uv run --project "$t" pytest
    done

--frozen refuses to update uv.lock, which is the behaviour you want in CI (use --locked instead if you also want the job to fail when the lockfile is out of date with the pyproject files). The Astral setup-uv action handles caching the ~/.cache/uv/ directory between runs, so most pipelines stabilise at under 15 seconds of install time including cold cache restore.
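As a sketch of the caching setup, the setup-uv action exposes inputs for its built-in cache; the exact step below is illustrative rather than copied from the repo's workflow:

```yaml
- name: Install uv
  uses: astral-sh/setup-uv@v3
  with:
    # Cache ~/.cache/uv between runs, keyed on the lockfile
    enable-cache: true
    cache-dependency-glob: "uv.lock"
```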

The Pydantic schema story

packages/py-shared/ is where the workspace pattern pays its largest dividend. The shared package exposes Pydantic models that represent the wire format between toolers. tooler-core orchestrates, tooler-telegram receives webhooks, tooler-memory stores embeddings, and all three talk to each other through the same ArticleRequest, ArticleResponse, ToolInvocation types. When a model gains a new field, the change lands in packages/py-shared/src/py_shared/, all ten toolers pick it up on their next process boot, and the type checker (uv run mypy) flags any tooler that still constructs the old shape.
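As a rough sketch of what such a shared schema module might look like, assuming Pydantic v2. The field names here are illustrative; the real py_shared surface is larger:

```python
# packages/py-shared/src/py_shared/article.py (illustrative sketch)
from pydantic import BaseModel, Field


class ArticleRequest(BaseModel):
    """Wire format every tooler agrees on: one definition, ten consumers."""
    title: str
    keywords: list[str] = Field(default_factory=list)
    language: str = "en"  # new fields with defaults stay backward compatible


class ArticleResponse(BaseModel):
    slug: str
    word_count: int


# A tooler still constructing the old shape gets flagged by mypy at
# check time and by Pydantic validation at runtime:
req = ArticleRequest(title="uv workspaces", keywords=["uv", "monorepo"])
print(req.model_dump())
```

Because every tooler resolves py-shared from the same workspace source, there is no window where one service validates against an older copy of these models.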

Under Poetry this was the part that hurt most: every tooler needed its own re-lock after a schema change, and forgetting to re-lock one service caused runtime ValidationErrors in staging. Under uv there is exactly one lock, one venv, one source of truth. The Pydantic v2 migration, discussed in the Pydantic v2 migration guide, became a one-shot PR instead of ten coordinated ones.

Edge cases worth knowing

A few rough edges surfaced during the migration. First, uv resolves the workspace against a single effective requires-python (the intersection of every member's bound), so in practice the whole repo runs on one interpreter version. This is a feature for deployments (every tooler ships on the same interpreter) but it does require agreeing on a Python baseline across the whole repo. The black_hat_rust workspace pins >=3.12 at the root and lets individual toolers narrow further if needed, though none currently do.

Second, uv run respects --package for resolving which pyproject to read, but the working directory matters for relative imports. The monorepo convention is to either cd into the tooler directory or pass --project toolers/tooler-seo explicitly. Forgetting this is the most common onboarding mistake new contributors make.

Third, publishing a single member to a private index requires uv build --package <name> followed by uv publish. The workspace resolves py-shared from source during development, but at publish time the built wheel bakes in whatever version packages/py-shared/pyproject.toml declared. Keep py-shared versioned deliberately and bump it when its public surface changes.

When not to use a uv workspace

If the services in your monorepo genuinely need conflicting versions of the same library, a workspace will force you to reconcile that conflict instead of papering over it. For most FastAPI monorepos this is the right pressure, but if you are bridging two services that pin incompatible SDKs (say, one on openai<1.0 and one on openai>=1.0), you either need to split the repos or extract an adapter layer. The uv resolution docs cover the override and constraint knobs that can defer the fight, but the long-term answer is to let one version win.
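Those knobs live under [tool.uv] at the workspace root. A sketch of deferring a conflict with an override; the specific pins here are illustrative:

```toml
[tool.uv]
# Force one version across the whole graph, overriding members' pins.
# Buys time during a migration, but every member must actually work with it.
override-dependencies = ["openai>=1.0"]

# Constraints cap transitive versions without adding direct requirements.
constraint-dependencies = ["httpx<0.28"]
```

Overrides silence the resolver rather than fix the incompatibility, which is why the paragraph above treats them as a deferral, not a solution.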

For the black_hat_rust monorepo, single-version resolution is the whole point: ten services, one Pydantic, one httpx, one anthropic SDK, one place to look when something breaks. That is a trade the workspace model was designed for, and uv is the first Python tool that ships it without friction.