🐍 Python · Mar 2026 · 9 rules · 18 min read

python-packaging

A complete guide to packaging and distributing modern Python projects using pyproject.toml, build backends (Hatchling, setuptools, Flit), wheel/sdist building, versioning, entry points, and publishing to PyPI or a private registry. Everything you need to make your Python code ready for distribution.

pyproject.toml · hatchling · wheel · sdist · twine · PyPI · SemVer · entry-points · src-layout · private-registry

Why pyproject.toml — Not setup.py

The era of setup.py is over. PEP 517, 518, 621, and 660 define a new standard that puts all packaging configuration in a single file: pyproject.toml. No more separate setup.cfg, MANIFEST.in, or the deprecated python setup.py install command.

1 config file · 9 rule files in this skill · 5+ build backends supported

The key benefit: you can use any build backend — Hatchling, setuptools, Flit, PDM, Poetry — without changing your basic workflow. All follow the same standard, and all are installable with pip install -e ..
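To make the interchangeability concrete, here is a sketch of what swapping backends looks like: only the [build-system] table changes, while all [project] metadata stays identical (the Flit requirement pin is an illustrative assumption):

# ── Hatchling ──
[build-system]
requires      = ["hatchling"]
build-backend = "hatchling.build"

# ── ...or Flit: same [project] table, different backend ──
# [build-system]
# requires      = ["flit_core>=3.4"]
# build-backend = "flit_core.buildapi"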

Choose the Right Build Backend

A build backend is the component that turns your source code into a wheel (.whl) or sdist (.tar.gz). There are five main options, each with its own characteristics:

Backend | Best for | Key features
Hatchling ⭐ | New projects, any size | Auto-include data files, dynamic versioning, hooks
setuptools | Legacy projects / migration, C extensions | Most mature, C/Cython support
Flit | Simple pure-Python libraries | Minimal config, zero boilerplate
PDM | Full project manager | Lock file, integrated virtual env
Poetry | Integrated dependency + build | Lock file, rich CLI, resolver

✓ Default recommendation: Hatchling

For new projects, start with Hatchling. It auto-discovers packages, automatically includes non-.py data files, supports dynamic versioning from Git tags, and offers an expressive [tool.hatch] configuration. No MANIFEST.in required.

Full pyproject.toml Structure

Every pyproject.toml needs two tables: [build-system] and [project]. Everything else — [project.urls], [project.optional-dependencies], [tool.*] — is optional but recommended.

toml · pyproject.toml
# ── Build backend ──────────────────────────────────────
[build-system]
requires      = ["hatchling"]
build-backend = "hatchling.build"

# ── Metadata PEP 621 ────────────────────────────────────
[project]
name            = "my-package"          # kebab-case, unique on PyPI
version         = "1.0.0"              # or dynamic (see below)
description     = "Short description"
readme          = "README.md"
license         = { file = "LICENSE" }
authors         = [{ name = "Name", email = "you@example.com" }]
requires-python = ">=3.10"
classifiers = [
  "Development Status :: 5 - Production/Stable",
  "Intended Audience :: Developers",
  "License :: OSI Approved :: MIT License",
  "Programming Language :: Python :: 3.10",
  "Programming Language :: Python :: 3.11",
  "Programming Language :: Python :: 3.12",
  "Typing :: Typed",
]
keywords      = ["keyword1", "keyword2"]
dependencies  = [
  "requests>=2.28",
  "pydantic>=2.0",
]

[project.urls]
Homepage      = "https://github.com/user/my-package"
Documentation = "https://my-package.readthedocs.io"
Repository    = "https://github.com/user/my-package"
Issues        = "https://github.com/user/my-package/issues"
Changelog     = "https://github.com/user/my-package/blob/main/CHANGELOG.md"

# ── Optional dependencies (extras) ─────────────────────
[project.optional-dependencies]
dev = [
  "pytest>=7.4",
  "pytest-cov>=4.1",
  "black>=23.0",
  "ruff>=0.1",
  "mypy>=1.5",
]
docs = [
  "mkdocs>=1.5",
  "mkdocs-material>=9.4",
]

# ── CLI entry points ────────────────────────────────────
[project.scripts]
my-tool = "my_package.cli:main"

# ── Hatchling: build targets ────────────────────────────
[tool.hatch.build.targets.wheel]
packages = ["src/my_package"]

# ── Other tool config ───────────────────────────────────
[tool.ruff]
line-length    = 88
target-version = "py310"

[tool.mypy]
python_version       = "3.10"
strict               = true
ignore_missing_imports = true

[tool.pytest.ini_options]
testpaths = ["tests"]
addopts   = "-ra -q"

Project Structure — src Layout

src layout is the standard recommended by PyPA. The difference from flat layout: the package is not at the root, but inside a src/ folder. This prevents the "works locally but fails after install" bug because imports in tests use the installed version, not the local directory version.

bash · project structure
my-package/
├── pyproject.toml          ← the only config file
├── README.md
├── LICENSE
├── CHANGELOG.md
├── .gitignore
├── src/
│   └── my_package/         ← underscore, not kebab-case
│       ├── __init__.py
│       ├── _version.py     ← optional: version string only
│       ├── py.typed        ← optional: typed package marker (PEP 561)
│       ├── core.py
│       ├── cli.py          ← if you have a CLI
│       └── data/           ← data files (JSON, templates, etc.)
│           └── config.json
└── tests/
    ├── conftest.py
    ├── test_core.py
    └── test_cli.py

In a well-structured __init__.py, expose the public API and version string:

python · src/my_package/__init__.py
# Version read from metadata (no need to hardcode in two places)
from importlib.metadata import version, PackageNotFoundError

try:
    __version__ = version("my-package")
except PackageNotFoundError:
    __version__ = "unknown"

# Public API
from .core import MyClass, main_function
from .utils import helper

__all__ = ["__version__", "MyClass", "main_function", "helper"]

Versioning — Single Source of Truth

One of the classic Python packaging problems: the version string scattered across __init__.py, pyproject.toml, CHANGELOG.md, and the Git tag. Use one of the three strategies below, and never mix them:

OPTION 01 — Simplest
Version in pyproject.toml only
Set version = "1.2.3" in [project], then read it via importlib.metadata in your code. No duplication. Best for projects that bump versions manually or with hatch version patch.
OPTION 02 — Most Common
Version in __init__.py, dynamic in pyproject
Set __version__ = "1.2.3" in __init__.py, then dynamic = ["version"] in pyproject with [tool.hatch.version] path = "src/my_package/__init__.py". Hatchling automatically reads the version string from that file.
OPTION 03 — CI/CD Automated
Version from Git tag (hatch-vcs / setuptools-scm)
No hardcoded version at all. The plugin reads the version from a Git tag at build time: git tag v1.2.3 && python -m build → wheel named my_package-1.2.3-py3-none-any.whl. Ideal for GitHub Actions triggered on tag pushes.
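A minimal sketch of the Git-tag strategy, following the hatch-vcs plugin's documented configuration:

[build-system]
requires      = ["hatchling", "hatch-vcs"]
build-backend = "hatchling.build"

[project]
name    = "my-package"
dynamic = ["version"]   # no hardcoded version anywhere

[tool.hatch.version]
source = "vcs"          # read the version from the latest Git tag at build time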
bash · bump version with hatch
# Install hatch once
pip install hatch

# Bump the version (hatch rewrites the version string in your configured file)
hatch version patch    # 1.2.3 → 1.2.4
hatch version minor    # 1.2.3 → 1.3.0
hatch version major    # 1.2.3 → 2.0.0
hatch version 2.0.0    # set directly to a specific version

# Or with bump-my-version (more flexible)
pip install bump-my-version
bump-my-version bump patch
bump-my-version bump --new-version "2.0.0rc1"
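The semantics of these bump commands can be sketched in plain Python. This is a simplified model of SemVer bumping only; the real tools also handle pre-release and build metadata:

```python
def bump(version: str, part: str) -> str:
    """Return the next SemVer string for a 'major', 'minor', or 'patch' bump."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"      # reset minor and patch
    if part == "minor":
        return f"{major}.{minor + 1}.0"  # reset patch
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part!r}")
```

For example, `bump("1.2.3", "minor")` returns `"1.3.0"`, matching the table of hatch commands above.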

Managing Dependencies Correctly

The version specifiers you write in dependencies determine whether your package is compatible with the wider ecosystem or creates conflicts in every environment.

toml · version specifiers
dependencies = [
  "requests",                         # ❌ any version — avoid
  "requests>=2.28",                   # ✓ minimum version — most common
  "requests>=2.28,<3.0",             # ✓ range — for libs that break often
  "requests~=2.28",                   # ✓ compatible: >=2.28, <3.0
  "requests==2.28.2",                # ✓ exact pin — OK for apps, avoid in libraries
  
  # Platform-specific dependencies
  "pywin32>=300; sys_platform == 'win32'",
  "pyobjc-framework-Cocoa>=9.0; sys_platform == 'darwin'",
  
  # Python version backports
  "tomli>=2.0; python_version < '3.11'",  # backport of the stdlib tomllib
]

Rule of thumb: Library vs Application

Library published to PyPI → use >=minimum, avoid upper bounds. Let the user resolve conflicts.
Application / CLI shipped to end users → exact pin or tight range is fine. Reproducibility matters more than flexibility.

Build: Creating Wheels and SDists

PyPI distributes packages in two formats: wheel (.whl) for fast installation without compiling, and sdist (.tar.gz) as the source distribution. Always build both before uploading.

bash · build commands
# Install build tools
pip install build twine
# Or faster with uv:
uv tool install build && uv tool install twine

# Build sdist + wheel at once
python -m build

# Output in dist/:
# dist/my_package-1.2.3-py3-none-any.whl   ← wheel
# dist/my_package-1.2.3.tar.gz              ← sdist

# Verify before uploading
twine check dist/*

# Install locally for testing
pip install dist/my_package-1.2.3-py3-none-any.whl

# Dev mode (editable) — source changes take effect immediately
pip install -e .
pip install -e ".[dev,docs]"    # with extras

Wheel names follow the standard format: {distribution}-{version}-{python}-{abi}-{platform}.whl. Pure Python packages produce py3-none-any.whl (installable on all platforms). Packages with C extensions produce platform-specific wheels such as cp311-cp311-manylinux2014_x86_64.whl — note that PyPI only accepts Linux wheels with a manylinux/musllinux platform tag, not plain linux_x86_64.
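To make the naming scheme concrete, here is a small helper that splits a wheel filename into its fields. This is an illustrative sketch for the common case; for production use, the third-party packaging library provides parse_wheel_filename, which also handles compressed tag sets:

```python
def parse_wheel_name(filename: str) -> dict[str, str]:
    """Split a wheel filename into its standard fields."""
    stem = filename.removesuffix(".whl")
    parts = stem.split("-")
    # 5 parts: dist-version-python-abi-platform (6 with an optional build tag).
    # Hyphens in distribution names are normalized to underscores in wheel
    # filenames, so splitting on "-" is safe here.
    if len(parts) == 6:
        dist, version, build, python, abi, platform = parts
    elif len(parts) == 5:
        dist, version, python, abi, platform = parts
        build = ""
    else:
        raise ValueError(f"not a valid wheel name: {filename}")
    return {"distribution": dist, "version": version, "build": build,
            "python": python, "abi": abi, "platform": platform}
```

For example, `parse_wheel_name("my_package-1.2.3-py3-none-any.whl")` yields `python="py3"`, `abi="none"`, `platform="any"` — the pure-Python case described above.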

Entry Points — CLI and Plugin System

Entry points let your package register CLI commands that are immediately runnable after pip install, and enable a plugin system that other packages can extend.

toml · pyproject.toml — entry points
# Console scripts — CLI commands
[project.scripts]
my-tool       = "my_package.cli:main"
my-tool-admin = "my_package.cli:admin"

# GUI scripts — no console window on Windows
[project.gui-scripts]
my-gui = "my_package.gui:main"

# Plugin system — extendable by other packages
[project.entry-points."my_package.plugins"]
csv-exporter  = "my_csv_plugin:CsvExporter"
json-exporter = "my_json_plugin.core:JsonExporter"
python · src/my_package/cli.py — with Click
import click

@click.group()
@click.version_option()
def cli() -> None:
    """Your tool description."""
    pass

@cli.command()
@click.argument("input", type=click.Path(exists=True))
@click.option("-o", "--output", default="output.txt", help="Output file")
@click.option("-v", "--verbose", is_flag=True)
def process(input: str, output: str, verbose: bool) -> None:
    """Process an input file."""
    if verbose:
        click.echo(f"Processing {input} → {output}")

def main() -> None:
    """Referenced by [project.scripts] as my_package.cli:main."""
    cli()

Publish to PyPI — Full Workflow

There are two authentication methods for PyPI: Trusted Publisher (OIDC), which needs no API token at all and works from GitHub Actions, and an API Token for manual uploads or other CI systems. Always test on TestPyPI first before going to production.

STEP 01
Bump the version and create a Git tag
PyPI does not allow uploading the same version twice. Bump first: hatch version minor, commit, then git tag v1.3.0 && git push --tags.
STEP 02
Clean build
Remove old dist: rm -rf dist/, then rebuild: python -m build. Verify: twine check dist/*.
STEP 03
Test on TestPyPI
twine upload --repository testpypi dist/*, then install from TestPyPI in a clean virtualenv: pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ my-package (the extra index lets pip pull your dependencies from the real PyPI, since they usually aren't mirrored on TestPyPI). Confirm that imports and CLI work correctly.
STEP 04
Upload to PyPI production
twine upload dist/* using a token from pypi.org/manage/account/token. Or use GitHub Actions with Trusted Publisher (zero token required).
yaml · .github/workflows/publish.yml — Trusted Publisher
name: Publish to PyPI
on:
  push:
    tags: ["v*"]

jobs:
  publish:
    runs-on: ubuntu-latest
    environment: pypi
    permissions:
      id-token: write       # required for OIDC — no token needed!

    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - name: Build
        run: |
          pip install build
          python -m build

      - name: Verify version matches tag
        run: |
          pip install dist/*.whl    # install first so the package is importable
          PKG_VER=$(python -c "import my_package; print(my_package.__version__)")
          TAG_VER="${GITHUB_REF#refs/tags/v}"
          [ "$PKG_VER" = "$TAG_VER" ] || { echo "Version mismatch"; exit 1; }

      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        # no password / token required!

Private Registry — Nexus, Artifactory, AWS CodeArtifact

For internal company packages that you don't want to publish to the public PyPI, you need a private registry. The best option depends on your existing infrastructure:

Registry | Best for | Hosting
Nexus Repository | Enterprise on-premise | Self-hosted
Artifactory (JFrog) | Enterprise cloud/on-prem | SaaS / Self-hosted
AWS CodeArtifact | AWS ecosystem | AWS managed
pypiserver | Small teams, quick setup | Self-hosted, Docker
Google Artifact Registry | GCP ecosystem | GCP managed
bash · upload and install from a private registry
# Upload to Nexus
TWINE_REPOSITORY_URL=https://nexus.company.com/repository/pypi-internal/ \
TWINE_USERNAME=myuser \
TWINE_PASSWORD=mypassword \
twine upload dist/*

# Install from private registry
pip install my-package \
  --index-url https://nexus.company.com/repository/pypi-all/simple/ \
  --trusted-host nexus.company.com

# Dual: private registry + PyPI fallback in pip.conf
# ~/.config/pip/pip.conf
[global]
index-url = https://nexus.company.com/repository/pypi-all/simple/
extra-index-url = https://pypi.org/simple/

Pre-Publish Checklist

  • Package name is unique on PyPI (check https://pypi.org/project/name/)
  • Version has been bumped (you cannot upload the same version twice)
  • twine check dist/* returns PASSED
  • README.md renders correctly (test: preview on GitHub)
  • License file exists and has content
  • requires-python is set correctly (minimum supported Python)
  • Tested install in a clean virtualenv
  • TestPyPI upload succeeded before pushing to PyPI production
  • Git tag created: git tag v1.x.x && git push --tags
  • CHANGELOG.md updated with a summary of changes

Anti-Patterns to Avoid

  • Using setup.py for new projects — migrate to pyproject.toml
  • Hardcoding version in multiple places — use a single source of truth
  • Forgetting to set requires-python — pip will happily install your package on unsupported Python versions, and users hit confusing runtime errors instead of a clear refusal
  • Committing API tokens or .pypirc to the repo — use env variables or secrets
  • Skipping TestPyPI for the first release — always test first
  • Using --extra-index-url carelessly — vulnerable to dependency confusion attacks
  • Building without rm -rf dist/ first — old wheels get uploaded too
  • Tight upper bounds in libraries: requests<3.0 — limits flexibility for users
AI Skill File

Download python-packaging Skill

This .skill file contains 9 complete rule files ready to use with Claude or any other AI tool as expert context for all questions around modern Python packaging.
