I mass-deleted requirements.txt files from a monorepo last month. Fourteen of them. Some had unpinned dependencies, some had pins from 2021, one had a comment that said # TODO: fix this next to a package that no longer exists on PyPI. Nobody cried. The CI pipeline didn’t break. We’d already moved everything to pyproject.toml and uv.

Python packaging has been a punchline for years. “It’s 2024 and we still can’t install packages properly” was a meme that wrote itself. But here’s the thing — it’s 2026 now, and the landscape genuinely changed. Not incrementally. Fundamentally. uv showed up and rewrote the rules. Poetry matured into something reliable. pyproject.toml won. The old setup.py + requirements.txt + virtualenv + pip stack isn’t dead, but it’s legacy. If you’re starting a new project today and reaching for that combo, you’re choosing the hard path for no reason.

I wrote about Poetry vs pip back in 2024. A lot of that still holds, but the ecosystem moved fast. This is where things stand now.


The War Story: Why Pinning Matters More Than You Think

Let me tell you about the deploy that ruined a Friday.

We had a data pipeline — nothing fancy, just a FastAPI service that pulled from Postgres, ran some transforms, pushed to S3. It’d been running fine for months. The requirements.txt looked like this:

fastapi>=0.100.0
uvicorn
sqlalchemy>=2.0
boto3
pydantic

Notice the problem? No upper bounds. No lock file. No pins on transitive dependencies. Every deploy was a roll of the dice — we just didn’t know it yet.

On a Friday afternoon (of course), someone merged a PR that touched nothing related to dependencies. The CI pipeline rebuilt the Docker image, pip grabbed the latest everything, and Pydantic v3 had dropped that morning. It was a major version bump with breaking changes to model serialization. Our entire API surface returned 500s. Every endpoint. In production.

The rollback took 40 minutes because we didn’t have the previous image tagged properly either. That’s a separate problem, but the root cause was simple: unpinned dependencies in a file that gave us zero reproducibility guarantees.

I’d been advocating for Poetry internally for months. After that Friday, I didn’t need to advocate anymore. The team moved to pyproject.toml with a lock file the following Monday. These days I’d reach for uv instead, but the lesson is the same — if your dependency versions aren’t locked, your builds aren’t reproducible, and your deploys are a coin flip.


uv Changed Everything

I don’t say this lightly. I’ve been writing Python for over a decade and I’m deeply skeptical of “this tool will fix everything” claims. But uv, from the Astral team (same folks behind Ruff), genuinely changed how I work with Python.

uv is a Python package and project manager written in Rust. It’s fast. Not “a bit faster than pip” fast — we’re talking 10-100x faster. Installing a fresh virtual environment with 200 packages takes seconds, not minutes. But speed isn’t even the main thing. It’s that uv replaces an entire stack of tools with one binary.

Here’s what uv replaces:

  • pip — package installation
  • pip-tools — dependency compilation and locking
  • virtualenv / venv — environment creation
  • pyenv — Python version management
  • pipx — running CLI tools in isolated environments

One tool. One command. Let me show you what a typical workflow looks like.

Start a new project:

uv init my-service
cd my-service

This gives you a pyproject.toml, a .python-version file, and a basic project structure. No boilerplate. No choosing between setuptools and flit and hatch.

Add dependencies:

uv add fastapi uvicorn sqlalchemy boto3
uv add --dev pytest ruff mypy

Each uv add updates your pyproject.toml and regenerates the uv.lock file. The lock file pins every transitive dependency with hashes. That Friday deploy disaster? Can’t happen.
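After those two commands, the dependency section of your pyproject.toml looks roughly like this. The exact lower bounds will differ, since uv records the latest compatible releases at the moment you run it:

```toml
[project]
dependencies = [
    "boto3>=1.35.0",
    "fastapi>=0.115.0",
    "sqlalchemy>=2.0.36",
    "uvicorn>=0.32.0",
]

[dependency-groups]
dev = [
    "mypy>=1.13",
    "pytest>=8.3",
    "ruff>=0.8.0",
]
```

Note that uv add --dev writes to [dependency-groups], the PEP 735 standard for development dependencies, rather than [project.optional-dependencies].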

Run your code:

uv run python -m my_service
uv run pytest
uv run mypy src/

uv run automatically creates and manages the virtual environment. You never touch source .venv/bin/activate again. It just works.

Need a specific Python version?

uv python install 3.12
uv python pin 3.12

uv downloads and manages Python installations. No pyenv, no deadsnakes PPA, no compiling from source. This is huge for CI/CD pipelines where you want deterministic Python versions without relying on whatever the runner image ships.


Poetry in 2026: Still Relevant, Still Good

Poetry didn’t disappear when uv showed up. It’s at version 2.x now, and it’s a mature, battle-tested tool that thousands of production projects depend on. If your team is already on Poetry and things are working, there’s no urgent reason to migrate.

Poetry’s strengths haven’t changed much since I compared it to pip:

poetry new my-library
cd my-library
poetry add requests httpx
poetry add --group dev pytest coverage
poetry install
poetry run pytest

The poetry.lock file is solid. Dependency resolution works. Virtual environment management is automatic. Publishing to PyPI is built in with poetry publish. For library authors especially, Poetry’s workflow is clean.

Where Poetry falls short compared to uv in 2026:

  • Speed. Poetry’s resolver is written in Python. It’s gotten faster over the years, but it’s still noticeably slower than uv on large dependency trees. On a project with 300+ transitive dependencies, I’ve seen poetry lock take 45 seconds where uv lock finishes in under 2.
  • Python version management. Poetry doesn’t manage Python installations. You still need pyenv or system packages.
  • Scope. Poetry is a dependency manager and build tool. uv is that plus a Python version manager plus a tool runner. Fewer moving parts.

That said, Poetry has one advantage that matters: ecosystem maturity. Every CI provider has Poetry examples. Every tutorial covers it. Your team already knows it. Switching tools has a cost, and “uv is faster” might not justify that cost for an established project.

My rule of thumb: new projects get uv. Existing Poetry projects stay on Poetry unless there’s a specific pain point.


pyproject.toml Won (Finally)

Remember when Python had setup.py, setup.cfg, requirements.txt, MANIFEST.in, Pipfile, Pipfile.lock, and tox.ini all in the same project? Each tool had its own config file, its own format, its own opinions.

pyproject.toml ended that. PEP 517, PEP 518, PEP 621 — the standards landed, the tools adopted them, and now there’s one file to rule them all. Here’s what a modern pyproject.toml looks like:

[project]
name = "my-service"
version = "1.2.0"
description = "Data pipeline service"
requires-python = ">=3.11"
dependencies = [
    "fastapi>=0.115.0,<1.0",
    "uvicorn>=0.30.0",
    "sqlalchemy>=2.0,<3.0",
    "boto3>=1.35.0",
    "pydantic>=2.0,<3.0",
]

[dependency-groups]
dev = [
    "pytest>=8.0",
    "ruff>=0.8.0",
    "mypy>=1.13",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.ruff]
line-length = 100
target-version = "py311"

[tool.mypy]
strict = true
python_version = "3.11"

[tool.pytest.ini_options]
testpaths = ["tests"]

One file. Project metadata, dependencies, dev dependencies, build system config, linter config, type checker config, test config. Everything. If you’re using Python type hints with mypy (and you should be), the config lives right here.

The [project] table follows PEP 621, which means it’s tool-agnostic. Whether you use uv, Poetry, pip, hatch, or flit to install your project, the metadata format is the same. Poetry historically kept its own dependency specification under [tool.poetry.dependencies], but since 2.0 it supports the standard [project] table as well.
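For comparison, here is a minimal Poetry 2.x pyproject.toml using the standard [project] table. This is a sketch; names and bounds are illustrative, and any Poetry-specific settings would live under [tool.poetry]:

```toml
[project]
name = "my-library"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
    "httpx>=0.27",
    "requests>=2.32",
]

[build-system]
requires = ["poetry-core>=2.0"]
build-backend = "poetry.core.masonry.api"
```

The dependencies here would install identically under uv or pip, which is exactly what PEP 621 buys you.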


Lock Files: The Non-Negotiable

After that Friday deploy, I became a lock file evangelist. I don’t care which tool generates it — uv.lock, poetry.lock, requirements.txt from pip-tools — but you need one, and it needs to be in version control.

A lock file pins every dependency, including transitive ones, to exact versions with integrity hashes. Here’s what uv lock produces (abbreviated):

# uv.lock
[[package]]
name = "fastapi"
version = "0.115.6"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "pydantic" },
    { name = "starlette" },
]

[[package]]
name = "pydantic"
version = "2.10.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "pydantic-core" },
    { name = "typing-extensions" },
]

Every version is exact. Every transitive dependency is captured. When you run uv sync on another machine or in CI, you get the exact same environment. Not “compatible versions” — the same versions.

pip-tools is the old-school way to get this with plain pip:

# Install pip-tools
uv tool install pip-tools

# Compile requirements.in to requirements.txt with pinned versions
pip-compile requirements.in -o requirements.txt --generate-hashes

# Install exactly what's in the lock file
pip-sync requirements.txt

It works, but it’s manual. You have to remember to recompile after changing requirements.in. uv and Poetry handle this automatically when you add or update dependencies.


Docker and CI: Where This All Comes Together

Packaging decisions hit hardest in two places: Docker builds and CI pipelines. Get it wrong and you’re waiting 10 minutes for every build. Get it right and deploys are fast and deterministic.

Here’s my standard Dockerfile for a Python service using uv. I covered multi-stage Docker builds in detail before — same principles apply here:

FROM python:3.12-slim AS builder

# Pin the uv image to an exact tag in real builds; :latest is shown for brevity
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv

WORKDIR /app
COPY pyproject.toml uv.lock ./

RUN uv sync --frozen --no-dev --no-install-project

COPY src/ src/
RUN uv sync --frozen --no-dev

FROM python:3.12-slim
WORKDIR /app
COPY --from=builder /app/.venv .venv
COPY --from=builder /app/src src
ENV PATH="/app/.venv/bin:$PATH"
CMD ["python", "-m", "my_service"]

The --frozen flag is critical. It tells uv to fail if the lock file is out of date instead of silently resolving new versions. That’s your safety net. The two-step uv sync means dependency layers get cached by Docker — you only reinstall packages when pyproject.toml or uv.lock change.
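One companion file worth adding, a suggestion rather than anything uv requires: a .dockerignore, so a local .venv or cache directory never leaks into the build context and busts the layer cache.

```text
.venv/
__pycache__/
*.pyc
.git/
.pytest_cache/
.mypy_cache/
```

Without this, a developer’s multi-hundred-megabyte local .venv gets shipped to the Docker daemon on every build, and any change inside it invalidates the COPY layers.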

For GitHub Actions CI, uv has first-class support:

- uses: astral-sh/setup-uv@v4
  with:
    # Pin an exact uv version in real pipelines; "latest" trades reproducibility for convenience
    version: "latest"

- run: uv sync --frozen
- run: uv run pytest
- run: uv run mypy src/
- run: uv run ruff check src/

Four run commands. Python version comes from .python-version in your repo. Dependencies come from uv.lock. No pip install, no pip cache, no actions/setup-python dance. It just works.
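Zoomed out, the complete workflow file is still short. This is a sketch; the workflow and job names are illustrative:

```yaml
name: ci

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v4

      # uv reads .python-version and uv.lock straight from the repo
      - run: uv sync --frozen
      - run: uv run pytest
      - run: uv run mypy src/
      - run: uv run ruff check src/
```

Compare that to the equivalent pip-based workflow with its setup-python step, cache configuration, and manual venv activation.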


What Should You Actually Use?

I’ll make this simple.

Use uv if: you’re starting a new project, you want one tool for everything, you care about speed, or you’re tired of managing pyenv + pip + virtualenv + pip-tools as separate pieces. uv is where the momentum is. Astral ships updates constantly, the community is growing fast, and the DX is the best I’ve seen in Python tooling.

Use Poetry if: your team already uses it and it’s working. Poetry 2.x is stable and capable. The resolver is slower than uv but it’s correct. The ecosystem support is deep. Don’t switch tools for the sake of switching.

Use pip-tools if: you’re in a constrained environment where you can’t install new tools, or you need to stay close to pip for organizational reasons. pip-compile + pip-sync gives you lock files with plain pip. It’s not glamorous but it works.

Don’t use raw pip + requirements.txt for anything beyond throwaway scripts. No resolver, no lock file, no environment management. It’s 2026. We have better options.

Whatever you pick, the non-negotiables are:

  1. pyproject.toml for project metadata — not setup.py
  2. A lock file in version control — uv.lock, poetry.lock, or compiled requirements.txt
  3. Pinned transitive dependencies with hashes
  4. Automated environment creation — no “works on my machine”

If you’re building async Python services, the packaging choice doesn’t affect your runtime code at all. FastAPI with uvicorn works identically whether you installed it with uv, Poetry, or pip. The difference is in everything around the code: how fast your CI runs, whether your deploys are reproducible, how painful onboarding is for new developers.


Where We’re Headed

The Python packaging ecosystem spent a decade fragmented. PEPs landed slowly, tools competed on incompatible formats, and everyone had a different “right way” to set up a project. That era is ending.

pyproject.toml is the standard. Lock files are expected. uv proved that Python tooling can be fast and unified. Poetry proved that dependency management can be correct and user-friendly. The community converged, and the result is that setting up a Python project in 2026 is genuinely pleasant for the first time.

I still have scars from that Friday deploy. But I also haven’t had an unpinned dependency break production since. The tools caught up. Use them.