You can find the prerequisites to release Apache Airflow in README.md.

Collect ambiguities during the release (for a follow-up doc PR)

These instructions are imperfect. Every release uncovers at least one command that has drifted, one step that is under-documented, or one automation that silently did the wrong thing. As you run through this document, jot down any such observations in a scratch file kept outside the repo (anywhere that is not tracked by git — a note in your home directory, a scratchpad, a gist). Once the release has landed, turn those notes into a follow-up PR against this document.

Keeping the scratch file out of the repo avoids accidentally committing release-manager notes along with the release-prep PR, and makes it obvious that the notes are input to the next doc PR rather than something to keep around long-term.

Perform review of security issues that are marked for the release

We currently track security issues in the Security Issues repository. As a release manager, you should have access to it. Review all security issues marked for the release and ensure they have been addressed and resolved. Ping the security team (by commenting in the issues) if anything is missing or an issue does not appear to be addressed.

Additionally, the Dependabot alerts and code scanning alerts should be reviewed, and the security team should be pinged to review and resolve them.

Selecting what to put into the release

The first step of a release is to work out what is being included. This differs based on whether it is a major/minor or a patch release.

  • For a major or minor release, you want to include everything in main at the time of release; you'll turn this into a new release branch as part of the rest of the process.

  • For a patch release, you will be selecting specific commits to cherry-pick and backport into the existing release branch.

i18n workflow

Note

  1. The instructions in this section should be applied only for major/minor releases.
  2. It is recommended to delegate all operations in this task to another committer.
  3. Except for the dev list announcements, it is recommended to communicate them via the #i18n Slack channel as well.

Validating completeness of locale files

Before cutting the release candidate (RC), you should verify the completeness of all merged locale files. Generate a completeness output for all locale files – follow the instructions in section 8.1 of the internationalization (i18n) policy to do so.

Patch releases (v3-X-test branch)

For patch releases, post a reminder to the dev@airflow.apache.org list to complete missing phrases against the v3-X-test branch.

Subject:

[REMINDER] i18n phrases for Airflow ${VERSION} patch release (v${VERSION_BRANCH}-test)

Body (assuming delegation to another committer):

cat <<EOF
Hey fellow Airflowers,

I'm sending this reminder on behalf of the release managers.
We plan to cut the Airflow ${VERSION} RC soon/by <RELEASE_DATE> from v${VERSION_BRANCH}-test.

After running the i18n completeness script against the v${VERSION_BRANCH}-test branch, here is the current coverage across merged locales as of <CURRENT_DATE>:

<OUTPUT_OF_I18N_COMPLETENESS_SCRIPT>

Translation owners and engaged translators are kindly asked to add missing phrases in the v3-<X>-test branch ahead of the RC.

Notes:
1. Changes merged after the final patch release won't be included, and missing terms will fall back to English.
2. Please coordinate via the #i18n Slack channel if you need assistance or expect terminology changes.
3. Keep PRs small and focused to minimize review time on the patch branch.

Thanks for your cooperation!
<your name>
EOF

When it is time to cut the RC:

  • Regenerate the completeness output against v${VERSION_BRANCH}-test.
  • Post the final completeness output on the same thread.

Minor/Major releases

If the median completeness across all supported languages is below 90%, or upon other justifying circumstances (e.g., release of a critical UI feature), you should consider skipping the following instructions and applying an i18n translation freeze instead (see subsection below). Otherwise, you should announce the completeness status to the dev@airflow.apache.org mailing list.
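To check the 90% threshold quickly, here is a hypothetical helper; it assumes the completeness script's output contains one percentage (e.g. "87%") per locale, so adjust the `grep` to match the real output format:

```shell
# Compute the median coverage from a saved completeness output.
# Assumption: the file contains one "<locale>: <percent>%" line per locale.
median_coverage() {
    grep -oE '[0-9]+(\.[0-9]+)?%' "$1" | tr -d '%' | sort -n | awk '
        { v[NR] = $1 }
        END {
            if (NR % 2) print v[(NR + 1) / 2]
            else print (v[NR / 2] + v[NR / 2 + 1]) / 2
        }'
}

# Example with made-up data:
printf 'de: 91%%\nfr: 85%%\npl: 97%%\n' > /tmp/completeness.txt
median_coverage /tmp/completeness.txt   # -> 91
```

If the printed median is below 90, consider the translation freeze described below instead of the plain announcement.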

Subject:

[ANNOUNCEMENT] i18n completeness check for Airflow ${VERSION} RC

Body (assuming delegation to another committer):

cat <<EOF
Hey fellow Airflowers,

I'm sending this message on behalf of the release managers.
The release managers are planning to cut the Airflow ${VERSION} RC soon/by <RELEASE_DATE>.

After running the i18n completeness script, this is the coverage state of all merged locales as of <CURRENT_DATE>:

<OUTPUT_OF_I18N_COMPLETENESS_SCRIPT>

Code owners, translation owners, and engaged translators whose locales are currently below 90% coverage are kindly asked to complete their translations prior to the RC being cut.
This will help ensure that all languages included in the upcoming release remain complete and consistent.

Contributors are also encouraged to plan their PRs accordingly and avoid introducing large sets of new English terms close to the release date, to prevent unexpected translation work for code owners.

Important notes:
1. Locales that remain incomplete for two consecutive major or minor releases may be removed from the project, according to the i18n policy.
2. Any changes merged after the final release won't be included, and missing terms will fall back to English.
3. Code owners are responsible for ensuring that their assigned locales reach at least 90% coverage before the RC is cut.
4. Requests for assistance, coordination, or early heads-up on expected terminology changes may be shared in the #i18n Slack channel.
5. PRs introducing new translations may continue to be merged as usual, provided that coverage remains complete by the RC date.

Thanks for your cooperation!
<your name>
EOF

When it is time to cut the RC, you should:

  1. Generate an additional completeness output:
     a. If there are incomplete locales that were also incomplete in the previous major/minor release, contact the code owner and ask them to act according to the "Relinquishing translation/code ownership" section of the i18n policy (section 6.4).
     b. If there are other incomplete locales, note them as a reminder for the next major/minor release.
  2. Post the final completeness output on the same thread.

Applying an i18n translation freeze

Before cutting the release candidate (RC), you may announce a freeze to allow translators to complete translations for the upcoming release. During the freeze, no changes to the English locale file should be merged (enforced by CI checks), except for approved exemptions (see below).

In general, if the overall median coverage across all supported languages stays above 90%, a freeze is not required. However, if significant changes are introduced that lower the median coverage to or below this threshold, a freeze period can help translators complete their work without being overloaded. When a freeze is used, it should remain in effect until the median coverage reaches at least 90% again, or until the RC is cut, whichever comes first.

The freeze should be announced at least two weeks before it starts, to give translators time to get ready and contributors time to plan their PRs accordingly. To prepare the announcement, fetch the completeness output generated earlier. Send the announcement to the dev@airflow.apache.org mailing list – you may accompany it with a GitHub issue for tracking purposes.

Subject:

cat <<EOF
[ANNOUNCEMENT] English Translation freeze for Airflow ${VERSION} RC starting at <START_DATE>
EOF

Body (assuming delegation to another committer):

cat <<EOF
Hey fellow Airflowers,

I'm sending this message on behalf of the release managers.
The release managers are planning to cut the Airflow ${VERSION} RC soon/by <RELEASE_DATE>.

After running the i18n completeness script, this is the coverage state of all merged locales as of <CURRENT_DATE>:

<OUTPUT_OF_I18N_COMPLETENESS_SCRIPT>

To prevent overloading the translators and to ensure completeness of all translations by the release, a freeze upon the English locale will be applied starting <START_DATE>,
and until the RC is cut.
Code owners, translation owners, and engaged translators are asked to complete the coverage of their assigned locales during this time.
Contributors are also encouraged to plan their PRs accordingly, to avoid modifying the English locale during the freeze time.

Important notes:
1. Locales that remain incomplete for two consecutive major or minor releases may be removed from the project, according to the i18n policy.
2. Any changes merged after the final release won't be included, and missing terms will fall back to English.
3. Any PR that modifies the English locale during the freeze time will fail CI checks.
4. Requests for exemptions should be communicated in the #i18n Slack channel, and approved by at least 1 PMC member - guidelines for approval are available in the i18n policy.
5. PRs approved for an exemption will be labeled with `allow translation change`, and then the relevant CI check will pass. Translators are encouraged to complete the translations for the exempted terms during the freeze time.
6. Merging PRs for adding new translations could be done during the freeze time - designated code owners should validate that by the end of the freeze time, the coverage of the suggested translation is complete.


Thanks for your cooperation!
<your name>
EOF

When the freeze starts, merge a PR that sets the flag FAIL_WHEN_ENGLISH_TRANSLATION_CHANGED to True in the file selective_checks.py. If the freeze is extended beyond the originally announced date, post an update on the same thread. When it is time to cut the RC, you should:

  1. Generate an additional completeness output:
     a. If there are incomplete locales that were also incomplete in the previous completeness output, contact the code owner and ask them to act according to the "Relinquishing translation/code ownership" section of the i18n policy (section 6.4).
     b. If there are other incomplete locales, note them as a reminder for the next major/minor release.
  2. Create a PR for setting the flag back to False.
  3. Post on the same thread that the freeze is lifted, and share the final completeness output.
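The flag flip itself is a one-line change. The demo below performs it on a scratch copy; in the real repo the file lives (as referenced above) at dev/breeze/src/airflow_breeze/utils/selective_checks.py, and the exact assignment format is an assumption, so check the file before running sed on it:

```shell
# Demo of the freeze-flag flip on a scratch file. The assignment format
# ("NAME = False") is assumed; verify it in the real selective_checks.py.
f=$(mktemp)
printf 'FAIL_WHEN_ENGLISH_TRANSLATION_CHANGED = False\n' > "$f"
sed -i 's/\(FAIL_WHEN_ENGLISH_TRANSLATION_CHANGED = \)False/\1True/' "$f"
flag_line=$(cat "$f")
echo "$flag_line"   # -> FAIL_WHEN_ENGLISH_TRANSLATION_CHANGED = True
rm -f "$f"
```

Run `git diff` on the real file afterwards to confirm that exactly one line changed before committing the PR; the reverse substitution (True back to False) un-freezes when the RC is cut.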

Note

Release managers - do not hold the release process beyond the due date if there are still incomplete locales after the freeze. It is the responsibility of code owners to ensure the completeness of their locales by the due date.

Selecting what to cherry-pick

For obvious reasons, you can't cherry-pick every change from main into the release branch - some are incompatible without a large set of other changes, some are brand-new features, and some just don't need to be in a release.

In general, only security fixes, data-loss bugs, and regression fixes are essential to bring into a patch release, along with dependency changes (pyproject.toml) resulting from newer versions of packages that Airflow depends on. Other bugfixes can be added on a best-effort basis, but if something would be very difficult to backport (maybe it has a lot of conflicts, or heavily depends on a new feature or API that is not being backported), it is OK to leave it out of the release at your sole discretion as the release manager. If you do this, update the milestone on the issue to the "next" minor release.

Many issues will be marked with the target release as their Milestone; this is a good shortlist to start with for what to cherry-pick.

For a patch release, find other bug fixes that are not marked with the target release as their Milestone and mark them as well. You can accomplish this by running the following command:

./dev/airflow-github needs-categorization 2.3.2 HEAD

You will likely want to cherry-pick some of the latest doc changes in order to bring in clarifications and explanations added to the documentation. Usually you can see the list of such changes via:

git fetch upstream
git log --oneline upstream/v3-2-test | sed -n 's/.*\((#[0-9]*)\)$/\1/p' > /tmp/merged
git log --oneline --decorate upstream/v2-2-stable..upstream/main -- docs/apache-airflow docs/docker-stack/ | grep -vf /tmp/merged
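The sed step in the pipeline above extracts the trailing "(#NNNN)" PR reference from each one-line log subject; here is a quick sanity check on a sample line (the commit subject below is made up):

```shell
# Extract the trailing PR reference from a sample one-line git log subject.
pr_ref=$(echo 'abc1234 Fix scheduler race condition (#12345)' | \
    sed -n 's/.*\((#[0-9]*)\)$/\1/p')
echo "$pr_ref"   # -> (#12345)
```

The `-n` together with the `p` flag means subjects without a trailing "(#NNNN)" produce no output, which is what makes the `grep -vf /tmp/merged` filtering above work.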

Changes that are "doc-only" should be marked with the type:doc-only label so that they land in the documentation part of the changelog. The tool to review and assign the labels is described below.

Making the cherry picking

It is recommended to clone Airflow directly from apache/airflow (not your fork) into a dedicated release-manager checkout and run the commands on the relevant test branch there. This repo follows the standard convention that upstream → apache/airflow and origin → your fork (see contributing-docs/10_working_with_git.rst), so in this release-manager clone, add apache/airflow as upstream:

git remote add upstream https://github.com/apache/airflow.git
git fetch upstream

All the commands in this document assume upstream is the remote that tracks apache/airflow. If you previously set this up under a different name (e.g. apache), either rename it with git remote rename apache upstream or pass the alternative name via the --remote-name option where the commands accept it.

To see cherry picking candidates (unmerged PR with the appropriate milestone), from the test branch you can run:

./dev/airflow-github compare 3.1.3 --unmerged

You can start cherry-picking from the bottom of the list (older commits first).

When you cherry-pick, pick in chronological order onto the vX-Y-test release branch. You'll move the commits over to vX-Y-stable once the release is cut. Use the -x option to keep a reference to the original commit you cherry-picked from ("cherry picked from commit ..."):

git cherry-pick <hash-commit> -x
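The effect of the -x trailer can be seen end-to-end in a throwaway repository (all names, branches, and the PR number below are illustrative):

```shell
# Build a scratch repo, commit a "bugfix" on main, and cherry-pick it
# onto a release branch with -x to record the origin commit.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main
git config user.email rm@example.com
git config user.name RM
echo base > file.txt
git add file.txt && git commit -qm "base"
git branch release                       # release branch cut at "base"
echo fix >> file.txt
git add file.txt && git commit -qm "Fix scheduler bug (#12345)"
fix_sha=$(git rev-parse HEAD)
git checkout -q release
git cherry-pick -x "$fix_sha"            # clean pick, records the trailer
trailer=$(git log -1 --format=%B | grep -c "cherry picked from commit $fix_sha")
echo "trailer lines: $trailer"           # -> trailer lines: 1
```

If a real pick conflicts, resolve the files and run `git cherry-pick --continue`, or `git cherry-pick --abort` if the commit turns out not to be worth backporting (and then move its milestone to the next release).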

Collapse Cadwyn Migrations

Before cutting an RC, bump the HEAD date of the Cadwyn versioned API (the execution API for now: airflow-core/src/airflow/api_fastapi/execution_api) to reflect the tentative release date of Airflow. All Cadwyn migrations between the last release date and the tentative release date should be collapsed.

Refer to #49116 as a good example.

Reviewing cherry-picked PRs and assigning labels

We have a tool, ./dev/assign_cherry_picked_prs_with_milestone.py, that helps review cherry-picked PRs and assign labels.

It allows you to manually review and assign milestones and labels to cherry-picked PRs:

./dev/assign_cherry_picked_prs_with_milestone.py assign-prs --previous-release v2-2-stable --current-release upstream/v2-2-test --milestone-number 48

It summarises the state of each cherry-picked PR, including whether it is going to be excluded from the changelog, included in it, or included in its doc-only part. It also allows you to re-assign PRs to the target milestone and apply the changelog:skip or type:doc-only label.

You can also add the --skip-assigned flag if you want to automatically skip the assignment question for PRs that are already correctly assigned to the milestone, and you can skip the "Are you OK?" confirmation with the --assume-yes flag.

You can review the list of cherry-picked PRs and produce a nice summary with --print-summary (this flag implies --skip-assigned, so that the summary can be produced without questions):

./dev/assign_cherry_picked_prs_with_milestone.py assign-prs --previous-release v2-2-stable \
  --current-release upstream/v2-2-test --milestone-number 48 --skip-assigned --assume-yes --print-summary \
  --output-folder /tmp

This will produce a summary with nice links that you can use to review the cherry-picked changes, and it also writes files with lists of commits, separated by type, into the specified folder. In the case above, it will produce three files that you can use in the next step:

Changelog commits written in /tmp/changelog-changes.txt

Doc only commits written in /tmp/doc-only-changes.txt

Excluded commits written in /tmp/excluded-changes.txt

You can check, for example, which files were changed by "doc-only" or "excluded" changes, to make sure that no "sneaky" changes were misclassified by mistake.

git show --format=tformat:"" --stat --name-only $(cat /tmp/doc-only-changes.txt) | sort | uniq

Then, if you see a suspicious file (for example airflow/sensors/base.py), you can find details on where it came from:

git log upstream/v3-2-test --format="%H" -- airflow/sensors/base.py | grep -f /tmp/doc-only-changes.txt | xargs git show

And the URL to the PR it comes from:

git log upstream/v3-2-test --format="%H" -- airflow/sensors/base.py | grep -f /tmp/doc-only-changes.txt | \
    xargs -n 1 git log --oneline --max-count=1 | \
    sed s'/.*(#\([0-9]*\))$/https:\/\/github.com\/apache\/airflow\/pull\/\1/'

Prepare the Apache Airflow Package RC

Update the milestone

Before cutting an RC, we should look at the milestone and merge anything ready, or if we aren't going to include it in the release we should update the milestone for those issues. We should do that before cutting the RC so the milestone gives us an accurate view of what is going to be in the release as soon as we know what it will be.

Build RC artifacts

The Release Candidate artifacts we vote upon should be the exact ones that become the official release, without any modification other than renaming – i.e. the contents of the files must be identical between the voted release candidate and the final release. Because of this, the version in the built artifacts that will become the official Apache releases must not include the rcN suffix.

  • Set environment variables
# You can avoid repeating this command for every release if you will set it in .zshrc
# see https://unix.stackexchange.com/questions/608842/zshrc-export-gpg-tty-tty-says-not-a-tty
export GPG_TTY=$(tty)

# Set Version
export VERSION=3.1.3
export VERSION_SUFFIX=rc1
export VERSION_RC=${VERSION}${VERSION_SUFFIX}
export VERSION_BRANCH=3-1
export TASK_SDK_VERSION=1.1.3
export TASK_SDK_VERSION_RC=${TASK_SDK_VERSION}${VERSION_SUFFIX}
export PREVIOUS_VERSION=3.1.2
export SYNC_BRANCH=changes-3.1.2rc1 # sync branch, if different from the test branch

# Set AIRFLOW_REPO_ROOT to the path of your git repo
export AIRFLOW_REPO_ROOT=$(pwd)


# Example after cloning
git clone https://github.com/apache/airflow.git airflow
cd airflow
export AIRFLOW_REPO_ROOT=$(pwd)
  • Install breeze command:
uv tool install -e ./dev/breeze
  • Verify your GPG signing key is ready.

    Before you spend 10+ minutes building artifacts only to discover that signing fails, run these checks once:

    # 1. The apache.org key has a secret signing subkey available locally.
    gpg --list-secret-keys apache.org
    
    # 2. Signing actually works (exits 0, writes a .asc, verifies cleanly).
    echo test > /tmp/sign-check && \
        gpg --yes --armor --local-user apache.org \
            --output /tmp/sign-check.asc --detach-sig /tmp/sign-check && \
        gpg --verify /tmp/sign-check.asc /tmp/sign-check && \
        rm -f /tmp/sign-check /tmp/sign-check.asc && \
        echo "GPG signing OK"
    
    # 3. The fingerprint of your signing (sub)key appears in the Airflow KEYS file.
    #    Without this, PMC verifiers cannot validate the release.
    FINGERPRINT=$(gpg --list-keys --with-colons apache.org | awk -F: '/^fpr:/ {print $10; exit}')
    curl -fsS https://dist.apache.org/repos/dist/release/airflow/KEYS | \
        grep -q "${FINGERPRINT}" && echo "Key ${FINGERPRINT} is in KEYS" || \
        echo "MISSING: add your key to KEYS before releasing"

    If any of these fail, fix them before the build step. For first-time release managers, adding your key to the KEYS file is a separate PR against https://dist.apache.org/repos/dist/release/airflow/ (SVN).

    sign.sh defaults to SIGN_WITH=apache.org. If your apache.org uid resolves to multiple keys (rare), set SIGN_WITH explicitly to the fingerprint of the key you want to use.

  • For major/minor version release, run the following commands to create the 'test' and 'stable' branches.

    breeze release-management create-minor-branch --version-branch ${VERSION_BRANCH}
  • Check out the 'test' branch

    git checkout v${VERSION_BRANCH}-test
    git reset --hard upstream/v${VERSION_BRANCH}-test
  • Create a new branch from v${VERSION_BRANCH}-test

    git checkout -b ${SYNC_BRANCH}

    We sync this new branch to the stable branch so that people can continue to backport PRs to the test branch while the RC is being voted on. The new branch must be in sync with the point where you cut it off from the test branch.

  • Switch to the new branch in .github/workflows/ci-notification.yml workflow-status matrix

  • Set the Airflow version in airflow-core/src/airflow/__init__.py (without the RC tag).

  • Set the Task SDK version in task-sdk/src/airflow/sdk/__init__.py (without the RC tag)

  • The two steps below are temporary - until we finally split task-sdk and airflow-core:

    • Update the Task SDK version >= part in airflow-core/pyproject.toml to == TASK_SDK_VERSION without RC
    • Update the Task SDK version >= part in pyproject.toml to == TASK_SDK_VERSION without RC
  • Run git commit without a message to update versions in docs.

  • Add supported Airflow version to ./scripts/ci/prek/supported_versions.py and let prek do the job again.

  • Replace the versions in README.md about installation and verify that installation instructions work fine.

  • Update the build status badge in README.md to point to the new vX-Y-test branch (the 3.x row in the build status table). The uv run dev/update_github_branch_config.py X Y script does this automatically.

  • Add an entry for the default Python version to PROVIDERS_COMPATIBILITY_TESTS_MATRIX in src/airflow_breeze/global_constants.py with the new Airflow version and an empty exclusion list for providers. This list should be updated later, when providers with a minimum version requirement for the next version of Airflow are added.

  • Check that "Apache Airflow is tested with" (stable version) in README.md lists the same tested versions as the tip of the stable branch in dev/breeze/src/airflow_breeze/global_constants.py.

  • Create backport-to-vX-Y-test label:

    gh label create 'backport-to-vX-Y-test' --repo apache/airflow --description 'Backport to vX-Y-test' --color 0e8a16
  • Update .github/boring-cyborg.yml and add backport-to-vX-Y-test auto-assignment for the new branch.

  • Update the DEFAULT_BRANCHES list in dev/sync_fork.sh to replace the previous vX-Y-test entry with the newly cut vX-Y-test branch so contributors using the helper sync the current release branch by default.

  • Update .github/ configuration on main to add the new vX-Y-test branch (you can use the uv run dev/update_github_branch_config.py X Y helper script for this). The following files need updating:

    • .github/dependabot.yml — add target-branch: vX-Y-test entries for github-actions, pip, and npm ecosystems.
    • .github/workflows/milestone-tag-assistant.yml — add vX-Y-test to the push branches list.
    • .github/workflows/basic-tests.yml — update the release-management dry-run commands to test the new version.
    • .github/workflows/ci-notification.yml — switch the workflow-status matrix branch to the new branch.
  • Commit the above changes with the message Update version to ${VERSION}.

  • Build the release notes:

    Preview with:

    towncrier build --draft --version=${VERSION} --date=2021-12-15 --dir airflow-core --config airflow-core/newsfragments/config.toml

    Then remove the --draft flag to have towncrier build the release notes for real.

    If no significant changes were added in this release, add the header and put "No significant changes." (e.g. 2.1.4).

    This will partly generate the release notes based on the fragments (adjust to rst format). Not every PR creates a fragment documenting its change; to generate the body of the release notes based on the cherry-picked commits:

    ./dev/airflow-github changelog v2-3-stable v2-3-test
    
  • Commit the release note change.

  • PR from the 'test' branch to the 'stable' branch

    Cherry-picked commits often include provider dependency bumps (changes to >= constraints on apache-airflow-providers-* packages in pyproject.toml). CI blocks such changes by default — only Release Managers should perform them. To allow the PR to pass, add the allow provider dependency bump label (and skip common compat check if common.compat files changed). For example:

    gh pr create \
      --base v3-2-stable \
      --head v3-2-test \
      --title "Airflow ${VERSION}: test to stable" \
      --label "allow provider dependency bump" \
      --label "skip common compat check" \
      --body "Sync v3-2-test into v3-2-stable for Airflow ${VERSION} release." \
      --web

Tip

Shortcut for first RC candidates: When preparing the first RC candidate for a new minor release (e.g., 3.2.0rc1), it is unlikely to be approved on the first attempt — bugs are typically found during RC testing. In this case, the release manager can prepare the RC directly from the v3-X-test branch without opening a PR to v3-X-stable. This saves the overhead of creating and managing a PR that will likely need additional changes before GA. However, when using this shortcut, the release manager must verify that the v3-X-test push CI action ("Tests" workflow) has succeeded before cutting the RC. You can check this at: https://github.com/apache/airflow/actions/workflows/ci-amd-arm.yml?query=event%3Apush+branch%3Av3-2-test (adjust the branch filter for the relevant v3-X-test branch).

  • When the PR is approved (or when using the shortcut above), install dev/breeze in a virtualenv:

    uv pip install -e ./dev/breeze
  • Set GITHUB_TOKEN environment variable. Needed in patch release for generating issue for testing of the RC. You can generate the token by following this link

    export GITHUB_TOKEN="my_token"
  • Configure a short-lived PyPI token for this upload only. Until Trusted Publishing is deployed for apache-airflow on PyPI, the recommended practice is:

    1. Log in to https://pypi.org and create an API token right before the upload step. Scope caveat: you would ideally create a project-scoped token for apache-airflow alone, but PyPI only allows project-scoped tokens for projects you already own/maintain on that account. Most Airflow release managers do not have per-project owner rights on apache-airflow, so in practice you will need to create an account-wide ("all projects") token. That is acceptable only if you treat it as single-use and delete it immediately after the upload (step 4 below). Never keep an all-projects token on disk longer than the upload itself.
    2. Put it in ~/.pypirc (or export as TWINE_USERNAME=__token__ TWINE_PASSWORD=pypi-...).
    3. Run the start-rc-process command (below) — it uploads to PyPI under the hood.
    4. Immediately delete the token from the PyPI web UI after the upload completes. Do not keep long-lived release-manager tokens on disk.

    This is a defence-in-depth practice: the RM machine becomes a one-time release vehicle, not a persistent point of compromise.

  • Start the release candidate process by running the below command (If you have not generated a key yet, generate it by following instructions on http://www.apache.org/dev/openpgp.html#key-gen-generate-key):

    git checkout main
    git pull # Ensure that the script is up-to-date
    breeze release-management start-rc-process \
        --version ${VERSION_RC} \
        --previous-version ${PREVIOUS_VERSION} \
        --task-sdk-version ${TASK_SDK_VERSION_RC} \
        --sync-branch ${SYNC_BRANCH}

    Testing the start-rc-process command: Before running the actual release command, you can safely test it using:

    # Test with dry-run (shows what would be executed without doing it).
    # --remote-name defaults to "upstream" per the project convention, so only
    # pass it explicitly if you set apache/airflow up under a different name.
    breeze release-management start-rc-process \
        --version ${VERSION_RC} \
        --previous-version ${PREVIOUS_VERSION} \
        --task-sdk-version ${TASK_SDK_VERSION_RC} \
        --sync-branch ${SYNC_BRANCH} \
        --dry-run
  • Create issue in github for testing the release using this subject:

    cat <<EOF
    Status of testing of Apache Airflow ${VERSION_RC}
    EOF
  • Generate the body of the issue using the below command:

      breeze release-management generate-issue-content-core --previous-release ${PREVIOUS_VERSION} --current-release ${VERSION_RC}

Publish release candidate documentation (staging)

Documentation is an essential part of the product and should be made available to users. In our case, documentation for pre-release versions is published to the staging S3 bucket. The documentation source code and build tools are available in the apache/airflow repository, so you need to run several workflows to publish the documentation. More details can be found in the Docs README, which describes the architecture and workflows, including manual workflows for emergency cases.

We have two options for publishing the documentation:

  1. Using breeze commands
  2. Manually using GitHub Actions

Using breeze commands

You can use the breeze command to publish the documentation. The command does the following:

  1. Triggers Publish Docs to S3.
  2. Triggers workflow in apache/airflow-site to refresh
  3. Triggers S3 to GitHub Sync
breeze workflow-run publish-docs --ref <tag> --site-env <staging/live/auto> apache-airflow docker-stack task-sdk

# Example for RC
breeze workflow-run publish-docs --ref ${VERSION_RC} --site-env staging apache-airflow docker-stack task-sdk

The --ref parameter should be the tag of the release candidate you are publishing.

The --site-env parameter should be set to staging for pre-release versions or live for final releases. The default is auto: if the tag is an rc tag, it publishes to the staging bucket; otherwise it publishes to the live bucket.
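The auto selection amounts to the following sketch (an assumption based on the behavior described above, not the workflow's actual code):

```shell
# Sketch of the assumed auto selection: rc tags go to staging,
# everything else goes to live.
env_for_ref() {
    case "$1" in
        *rc*) echo staging ;;
        *)    echo live ;;
    esac
}

env_for_ref 3.1.3rc1   # -> staging
env_for_ref 3.1.3      # -> live
```

When in doubt, pass --site-env explicitly rather than relying on auto, especially when publishing from a branch instead of a tag.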

Other available parameters can be found with:

breeze workflow-run publish-docs --help

If you publish the documentation from a branch, you can specify the --airflow-version and --airflow-base-version parameters to indicate which version of Airflow to build the documentation for, as it cannot be derived automatically from a branch name. Normally, both are derived automatically from the tag name.

One of the interesting features of publishing this way is that you can also rebuild a historical version of the documentation with patches applied (if they apply cleanly).

You should specify the --apply-commits parameter with a comma-separated list of commits to apply; the workflow will apply those commits to the documentation before building it. (Don't forget to add --skip-write-to-stable-folder if you are publishing a previous version of the distribution.) Example:

breeze workflow-run publish-docs --ref 3.0.3 --site-env staging \
  --apply-commits 4ae273cbedec66c87dc40218c7a94863390a380d,e61e9618bdd6be8213d277b1427f67079fcb1d9b \
  --skip-write-to-stable-folder \
  apache-airflow docker-stack task-sdk

Manually using GitHub Actions

There are two steps to publish the documentation:

  1. Publish the documentation to the staging S3 bucket.

The release manager publishes the documentation using the GitHub Actions workflow Publish Docs to S3. By default, auto selection should publish to the staging bucket based on the tag you use – pre-release tags go to staging. You can also override this and specify the destination manually as live or staging.

You should specify 'apache-airflow docker-stack task-sdk' as the packages to be built.

After that step, the documentation should be available under the https://airflow.stage.apache.org// URL (RC PyPI packages are built with the staging URLs), but stable links and drop-down boxes are not updated yet.

  2. Invalidate the Fastly cache, and update the version drop-down and stable links with the new versions of the documentation.

In order to do it, you need to run the Build docs workflow in the airflow-site repository - but make sure to use the staging branch.

After that workflow completes, the new version should be available in the drop-down list, stable links should be updated, and the Fastly cache will be refreshed.

Prepare production Docker Image RC

Production Docker images should be manually prepared and pushed by the release manager or another committer who has access to Airflow's DockerHub. Note that we started releasing a multi-platform build, so you need to have an environment prepared to build multi-platform images. You can achieve it with:

  • GitHub Actions Manual Job (easiest)
  • Emulation (very slow)
  • Hardware builders if you have both AMD64 and ARM64 hardware locally

Building the image is triggered by running the Release PROD Images workflow.

When you trigger it you need to pass the Airflow version (including the right rc suffix). The workflow will normalize and verify the version passed. When you are testing, or want to release the image faster, you can also select the check-box to limit the build to the AMD64 platform only. This will speed up the build significantly.

Release prod image

The manual building is described in MANUALLY_BUILDING_IMAGES.md.

Prepare Vote email on the Apache Airflow release candidate

Subject:

cat <<EOF
[VOTE] Release Airflow ${VERSION} from ${VERSION_RC} & Task SDK ${TASK_SDK_VERSION} from ${TASK_SDK_VERSION_RC}
EOF

Body:

cat <<EOF
Hey fellow Airflowers,

The release candidates for Apache Airflow ${VERSION_RC} and Task SDK ${TASK_SDK_VERSION_RC} are now available for testing!

This email is calling for a vote on the release, which will last at least 72 hours, from Friday, October 8, 2021 at 4:00 pm UTC
until Monday, October 11, 2021 at 4:00 pm UTC, and until 3 binding +1 votes have been received.

https://www.timeanddate.com/worldclock/fixedtime.html?msg=8&iso=20211011T1600&p1=1440

Status of testing of the release is kept in TODO:URL_OF_THE_ISSUE_HERE

Consider this my +1 binding vote.

Airflow ${VERSION_RC} is available at:
https://dist.apache.org/repos/dist/dev/airflow/${VERSION_RC}/

"apache-airflow" Meta package:
- *apache-airflow-${VERSION}-source.tar.gz* is a source release that comes with INSTALL instructions.
- *apache-airflow-${VERSION}.tar.gz* is the binary Python "sdist" release.
- *apache_airflow-${VERSION}-py3-none-any.whl* is the binary Python wheel "binary" release.

"apache-airflow-core" package:
- *apache_airflow_core-${VERSION}.tar.gz* is the binary Python "sdist" release.
- *apache_airflow_core-${VERSION}-py3-none-any.whl* is the binary Python wheel "binary" release.

Task SDK ${TASK_SDK_VERSION} is available at:
https://dist.apache.org/repos/dist/dev/airflow/task-sdk/${TASK_SDK_VERSION}/

"apache-airflow-task-sdk" package:
- *apache_airflow_task_sdk-${TASK_SDK_VERSION}.tar.gz* is the binary Python "sdist" release.
- *apache_airflow_task_sdk-${TASK_SDK_VERSION}-py3-none-any.whl* is the binary Python wheel "binary" release.

Public keys are available at:
https://dist.apache.org/repos/dist/release/airflow/KEYS

Please vote accordingly:

[ ] +1 approve
[ ] +0 no opinion
[ ] -1 disapprove with the reason

Only votes from PMC members are binding, but all members of the community
are encouraged to test the release and vote with "(non-binding)".

The test procedure for PMC members is described in:
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-pmc-members

The test procedure for contributors and members of the community who would like to test this RC is described in:
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-contributors

Please note that the version number excludes the 'rcX' string, so it's now
simply ${VERSION} for Airflow package and ${TASK_SDK_VERSION} for Task SDK. This will allow us to rename the artifact without modifying
the artifact checksums when we actually release.

Docs (for preview):
https://airflow.staged.apache.org/docs/apache-airflow/${VERSION}

Release Notes:
- https://github.com/apache/airflow/blob/${VERSION_RC}/RELEASE_NOTES.rst
- https://airflow.staged.apache.org/docs/apache-airflow/${VERSION}/release_notes.html (Rendered HTML)

Testing Instructions using PyPI:
You can build a virtualenv that installs this and other required packages (e.g. task sdk), like this:

uv venv
uv pip install -U \\
  apache-airflow==${VERSION_RC} \\
  apache-airflow-core==${VERSION_RC} \\
  apache-airflow-task-sdk==${TASK_SDK_VERSION_RC}

Constraints files are at https://github.com/apache/airflow/tree/constraints-${VERSION_RC}

Cheers,
<your name>
EOF

Note: for RC2/RC3 you may refer to a shortened vote period, as agreed in the mailing list thread.

Verify the release candidate by PMC members

PMC members should perform the manual verification steps below.

Optionally, you can also run the automated Breeze verification via breeze release-management verify-rc-by-pmc as a cross-check after completing the manual steps (see Optional: Automated verification using Breeze).

Note

verify-rc-by-pmc is experimental and can change without notice. If you choose to use it, treat it only as an additional validation step after completing the manual verification below, and compare the results. Do not use it as the sole verification method.

PMC members should verify the releases in order to make sure the release is following the Apache Legal Release Policy.

At least 3 (+1) votes should be recorded in accordance with Votes on Package Releases

The legal checks include:

  • verifying if packages can be reproducibly built from sources
  • checking if the packages are present in the right dist folder on svn
  • verifying if all the sources have correct licences
  • verifying if release manager signed the releases with the right key
  • verifying if all the checksums are valid for the release

Reproducible package check

Airflow supports reproducible builds, which means that packages prepared from the same sources should be binary-identical. You should check whether the packages can be binary-reproduced when built from the sources.

Check out the airflow sources and build the packages in the dist folder (replace X.Y.Z and rc1 with the version and rc suffix you are checking):

VERSION=X.Y.Z
VERSION_SUFFIX=rc1
VERSION_RC=${VERSION}${VERSION_SUFFIX}
TASK_SDK_VERSION=X.Y.Z
TASK_SDK_VERSION_RC=${TASK_SDK_VERSION}${VERSION_SUFFIX}
git fetch upstream --tags
git checkout ${VERSION_RC}
export AIRFLOW_REPO_ROOT=$(pwd)
rm -rf dist/*
breeze release-management prepare-airflow-distributions --distribution-format both --version-suffix ""
breeze release-management prepare-task-sdk-distributions --distribution-format both --version-suffix ""
breeze release-management prepare-tarball --tarball-type apache_airflow --version ${VERSION} --version-suffix ${VERSION_SUFFIX}

The prepare-*-distributions commands by default use a Dockerized approach and build the packages inside a docker container. However, if you have hatch installed locally, you can pass the --use-local-hatch flag to build the packages with your local hatch installation instead:

breeze release-management prepare-airflow-distributions --distribution-format both --use-local-hatch --version-suffix ""
breeze release-management prepare-task-sdk-distributions --distribution-format both --use-local-hatch --version-suffix ""
breeze release-management prepare-tarball --tarball-type apache_airflow --version ${VERSION} --version-suffix ${VERSION_SUFFIX}

This is generally faster and requires fewer resources and less network bandwidth. Note that you have to run these commands before preparing the tarball, because preparing the packages cleans the apache-airflow artifacts from the dist folder (it uses hatch's -c build flag).

The prepare-*-distributions commands (no matter if docker or local hatch is used) should produce the reproducible .whl, .tar.gz packages in the dist folder.

The prepare-tarball command should produce reproducible -source.tar.gz tarball of sources.

Change to the directory where you have the packages from svn:

# First clone the repo if you do not have it
cd ..
[ -d asf-dist ] || svn checkout --depth=immediates https://dist.apache.org/repos/dist asf-dist
svn update --set-depth=infinity asf-dist/dev/airflow

# Then compare the packages
cd asf-dist/dev/airflow/${VERSION_RC}
for i in *.whl *.tar.gz
do
  echo "Checking if $(basename $i) is the same as ${AIRFLOW_REPO_ROOT}/dist/$(basename $i)"
  diff "$(basename $i)" "${AIRFLOW_REPO_ROOT}/dist/$(basename $i)" && echo "OK"
done
cd ../task-sdk/${TASK_SDK_VERSION_RC}
for i in *.whl *.tar.gz
do
  echo "Checking if $(basename $i) is the same as ${AIRFLOW_REPO_ROOT}/dist/$(basename $i)"
  diff "$(basename $i)" "${AIRFLOW_REPO_ROOT}/dist/$(basename $i)" && echo "OK"
done

The output should be empty (files are identical). In case the files are different, you should see:

Binary files apache_airflow-2.9.0.tar.gz and .../apache_airflow-2.9.0.tar.gz differ

SVN check

The files should be present in the sub-folder of Airflow dist

The following files should be present (9 files):

  • -source.tar.gz + .asc + .sha512
  • .tar.gz + .asc + .sha512
  • -py3-none-any.whl + .asc + .sha512
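A small shell loop can sanity-check that every artifact has its signature and checksum companions. This self-contained sketch simulates the expected layout with empty files (swap the touch lines for the real svn folder contents):

```shell
# Simulate the expected layout: 3 artifacts, each with .asc and .sha512 (9 files).
tmpdir=$(mktemp -d) && cd "$tmpdir"
for f in pkg-source.tar.gz pkg.tar.gz pkg-py3-none-any.whl; do
  touch "$f" "$f.asc" "$f.sha512"
done
# Verify every artifact has both companion files.
for f in *.tar.gz *.whl; do
  [ -f "$f.asc" ] && [ -f "$f.sha512" ] && echo "$f: signature and checksum present"
done
```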

As a PMC member, you should be able to clone the SVN repository or update it if you already checked it out:

cd ${AIRFLOW_REPO_ROOT}
cd ..
[ -d asf-dist ] || svn checkout --depth=immediates https://dist.apache.org/repos/dist asf-dist
svn update --set-depth=infinity asf-dist/dev/airflow

Set an environment variable: PATH_TO_AIRFLOW_SVN to the root of folder where you clone the SVN repository:

cd asf-dist/dev/airflow
export PATH_TO_AIRFLOW_SVN=$(pwd -P)

Optionally you can use the breeze release-management check-release-files command to verify that all expected files are present in SVN. This command may also help with verifying installation of the packages.

breeze release-management check-release-files airflow --version ${VERSION_RC} --path-to-airflow-svn=${PATH_TO_AIRFLOW_SVN}

You will see commands that you can execute to check installation of the distributions in containers.

breeze release-management check-release-files task-sdk --version ${TASK_SDK_VERSION_RC} --path-to-airflow-svn=${PATH_TO_AIRFLOW_SVN}

You will see commands that you can execute to check installation of the distributions in containers.

Licence check

This can be done with the Apache RAT tool.

Download the latest jar from https://creadur.apache.org/rat/download_rat.cgi (unpack the binary, the jar is inside)

You can run this command to do it for you (including checksum verification for your own security):

# Checksum value is taken from https://downloads.apache.org/creadur/apache-rat-0.18/apache-rat-0.18-bin.tar.gz.sha512
wget -q https://archive.apache.org/dist/creadur/apache-rat-0.18/apache-rat-0.18-bin.tar.gz -O /tmp/apache-rat-0.18-bin.tar.gz
echo "315b16536526838237c42b5e6b613d29adc77e25a6e44a866b2b7f8b162e03d3629d49c9faea86ceb864a36b2c42838b8ce43d6f2db544e961f2259e242748f4  /tmp/apache-rat-0.18-bin.tar.gz" | sha512sum -c -
tar -xzf /tmp/apache-rat-0.18-bin.tar.gz -C /tmp

Unpack the release source archive (the <package + version>-source.tar.gz file) to a folder

rm -rf /tmp/apache-airflow-src && mkdir -p /tmp/apache-airflow-src && tar -xzf ${PATH_TO_AIRFLOW_SVN}/${VERSION_RC}/apache_airflow*-source.tar.gz --strip-components 1 -C /tmp/apache-airflow-src

Run the check:

cp ${AIRFLOW_REPO_ROOT}/.rat-excludes /tmp/apache-airflow-src/.rat-excludes
java -jar /tmp/apache-rat-0.18/apache-rat-0.18.jar --input-exclude-file /tmp/apache-airflow-src/.rat-excludes /tmp/apache-airflow-src/ | grep -E "! |INFO: "

You should see no files reported as Unknown or with a wrong licence, and a summary of the check similar to:

INFO: Apache Creadur RAT 0.18 (Apache Software Foundation)
INFO: Excluding patterns: .git-blame-ignore-revs, .github/*, .git ...
INFO: Excluding MISC collection.
INFO: Excluding HIDDEN_DIR collection.
SLF4J(W): No SLF4J providers were found.
SLF4J(W): Defaulting to no-operation (NOP) logger implementation
SLF4J(W): See https://www.slf4j.org/codes.html#noProviders for further details.
INFO: RAT summary:
INFO:   Approved:  15615
INFO:   Archives:  2
INFO:   Binaries:  813
INFO:   Document types:  5
INFO:   Ignored:  2392
INFO:   License categories:  2
INFO:   License names:  2
INFO:   Notices:  216
INFO:   Standards:  15609
INFO:   Unapproved:  0
INFO:   Unknown:  0

There should be no files reported as Unknown or Unapproved. The files that are unknown or unapproved should be shown with a line starting with !.

For example:

! Unapproved:         1    A count of unapproved licenses.
! /CODE_OF_CONDUCT.md
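If you prefer the check to fail loudly rather than eyeballing the report, you can grep the captured output for such ! lines. This is a sketch where rat_output stands in for the output of the java -jar command above:

```shell
# Fail the licence check if RAT reported any unknown/unapproved files ("! " lines).
rat_output="INFO: RAT summary:
INFO:   Unapproved:  0
INFO:   Unknown:  0"
if printf '%s\n' "$rat_output" | grep -q '^! '; then
  echo "Licence check FAILED - inspect the ! lines in the report"
  exit_code=1
else
  echo "Licence check passed"
  exit_code=0
fi
```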

Signature check

Make sure you have imported into your GPG the PGP key of the person signing the release. You can find the valid keys in KEYS.

You can import the whole KEYS file:

wget https://dist.apache.org/repos/dist/release/airflow/KEYS
gpg --import KEYS

You can also import the keys individually from a keyserver. The below one uses Kaxil's key and retrieves it from the default GPG keyserver OpenPGP.org:

gpg --keyserver keys.openpgp.org --receive-keys CDE15C6E4D3A8EC4ECF4BA4B6674E08AD7DE406F

You should choose to import the key when asked.

Note that, being the default, the OpenPGP keyserver tends to be overloaded and might respond with errors or timeouts. Many of the release managers also uploaded their keys to the GNUPG.net keyserver, and you can retrieve keys from there.

gpg --keyserver keys.gnupg.net --receive-keys CDE15C6E4D3A8EC4ECF4BA4B6674E08AD7DE406F

Once you have the keys, the signatures can be verified after switching to the directory where you have the release packages:

cd ${PATH_TO_AIRFLOW_SVN}/${VERSION_RC}
echo
echo "Checking Airflow ${VERSION_RC} Signatures"
echo
for i in *.asc
do
   echo -e "Checking $i\n"; gpg --verify $i
done
cd ../task-sdk/${TASK_SDK_VERSION_RC}
echo
echo "Checking TaskSDK ${TASK_SDK_VERSION_RC} Signatures"
echo
for i in *.asc
do
   echo -e "Checking $i\n"; gpg --verify $i
done

This should produce results similar to the below. The "Good signature from ..." is indication that the signatures are correct. Do not worry about the "not certified with a trusted signature" warning. Most of the certificates used by release managers are self-signed, and that's why you get this warning. By importing the key either from the server in the previous step or from the KEYS page, you know that this is a valid key already. To suppress the warning you may edit the key's trust level by running gpg --edit-key <key id> trust and entering 5 to assign trust level ultimate.

Checking Airflow 3.1.8rc1 Signatures

Checking apache_airflow-3.1.8-py3-none-any.whl.asc

gpg: assuming signed data in 'apache_airflow-3.1.8-py3-none-any.whl'
gpg: Signature made Fri 06 Mar 2026 11:13:05 AM CET
gpg:                using EDDSA key 5055919906242571E5B0CC5A1846E140F733C4B2
gpg: Good signature from "Rahul Vats <rah.sharma11@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5055 9199 0624 2571 E5B0  CC5A 1846 E140 F733 C4B2
Checking apache_airflow-3.1.8-source.tar.gz.asc

gpg: assuming signed data in 'apache_airflow-3.1.8-source.tar.gz'
gpg: Signature made Fri 06 Mar 2026 11:13:06 AM CET
gpg:                using EDDSA key 5055919906242571E5B0CC5A1846E140F733C4B2
gpg: Good signature from "Rahul Vats <rah.sharma11@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5055 9199 0624 2571 E5B0  CC5A 1846 E140 F733 C4B2
Checking apache_airflow-3.1.8.tar.gz.asc

gpg: assuming signed data in 'apache_airflow-3.1.8.tar.gz'
gpg: Signature made Fri 06 Mar 2026 11:13:06 AM CET
gpg:                using EDDSA key 5055919906242571E5B0CC5A1846E140F733C4B2
gpg: Good signature from "Rahul Vats <rah.sharma11@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5055 9199 0624 2571 E5B0  CC5A 1846 E140 F733 C4B2
Checking apache_airflow_core-3.1.8-py3-none-any.whl.asc

gpg: assuming signed data in 'apache_airflow_core-3.1.8-py3-none-any.whl'
gpg: Signature made Fri 06 Mar 2026 11:13:05 AM CET
gpg:                using EDDSA key 5055919906242571E5B0CC5A1846E140F733C4B2
gpg: Good signature from "Rahul Vats <rah.sharma11@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5055 9199 0624 2571 E5B0  CC5A 1846 E140 F733 C4B2
Checking apache_airflow_core-3.1.8.tar.gz.asc

gpg: assuming signed data in 'apache_airflow_core-3.1.8.tar.gz'
gpg: Signature made Fri 06 Mar 2026 11:13:05 AM CET
gpg:                using EDDSA key 5055919906242571E5B0CC5A1846E140F733C4B2
gpg: Good signature from "Rahul Vats <rah.sharma11@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5055 9199 0624 2571 E5B0  CC5A 1846 E140 F733 C4B2

Checking TaskSDK 1.1.8rc1 Signatures

Checking apache_airflow_task_sdk-1.1.8-py3-none-any.whl.asc

gpg: assuming signed data in 'apache_airflow_task_sdk-1.1.8-py3-none-any.whl'
gpg: Signature made Fri 06 Mar 2026 11:13:05 AM CET
gpg:                using EDDSA key 5055919906242571E5B0CC5A1846E140F733C4B2
gpg: Good signature from "Rahul Vats <rah.sharma11@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5055 9199 0624 2571 E5B0  CC5A 1846 E140 F733 C4B2
Checking apache_airflow_task_sdk-1.1.8.tar.gz.asc

gpg: assuming signed data in 'apache_airflow_task_sdk-1.1.8.tar.gz'
gpg: Signature made Fri 06 Mar 2026 11:13:05 AM CET
gpg:                using EDDSA key 5055919906242571E5B0CC5A1846E140F733C4B2
gpg: Good signature from "Rahul Vats <rah.sharma11@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5055 9199 0624 2571 E5B0  CC5A 1846 E140 F733 C4B2

SHA512 sum check

Run this:

cd ${PATH_TO_AIRFLOW_SVN}/${VERSION_RC}
echo
echo "Checking Airflow ${VERSION_RC} Checksums"
echo
for i in *.sha512
do
    echo "Checking $i"; shasum -a 512 `basename $i .sha512 ` | diff - $i
done
cd ../task-sdk/${TASK_SDK_VERSION_RC}
echo
echo "Checking TaskSDK ${TASK_SDK_VERSION_RC} Checksums"
echo
for i in *.sha512
do
    echo "Checking $i"; shasum -a 512 `basename $i .sha512 ` | diff - $i
done

You should get output similar to:

Checking Airflow 3.1.8rc1 Checksums

Checking apache_airflow-3.1.8-py3-none-any.whl.sha512
Checking apache_airflow-3.1.8-source.tar.gz.sha512
Checking apache_airflow-3.1.8.tar.gz.sha512
Checking apache_airflow_core-3.1.8-py3-none-any.whl.sha512
Checking apache_airflow_core-3.1.8.tar.gz.sha512

Checking TaskSDK 1.1.8rc1 Checksums

Checking apache_airflow_task_sdk-1.1.8-py3-none-any.whl.sha512
Checking apache_airflow_task_sdk-1.1.8.tar.gz.sha512
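The mechanics of the loop above can be tried end-to-end without the release artifacts. This self-contained sketch uses sha512sum from GNU coreutils, which emits the same "&lt;hash&gt;  &lt;filename&gt;" format that shasum -a 512 does:

```shell
# Create a dummy artifact, record its checksum, then verify it the same way.
tmpdir=$(mktemp -d) && cd "$tmpdir"
printf 'artifact-bytes' > pkg.tar.gz
sha512sum pkg.tar.gz > pkg.tar.gz.sha512
# Same pattern as the release check: recompute and diff against the stored file.
sha512sum "$(basename pkg.tar.gz.sha512 .sha512)" | diff - pkg.tar.gz.sha512 && echo "OK"
```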

Optional: Automated verification using Breeze

If you want to run the automated cross-check, use breeze release-management verify-rc-by-pmc.

What the automation does (high level):

  • Validates expected SVN files, signatures, checksums, Apache RAT licenses, and reproducible builds.
  • Uses a detached git worktree for reproducible builds so it can build from the release tag without changing your current checkout (and still use the latest Breeze code).
  • Fails early if the SVN working copy is locked (to avoid hanging on svn commands).

If the automation output disagrees with your manual verification, treat the manual results as authoritative and report the discrepancy.

For the full command documentation see Breeze Command to verify RC.

Example (run all checks):

breeze release-management verify-rc-by-pmc \
  --distribution airflow \
  --version ${VERSION_RC} \
  --task-sdk-version ${TASK_SDK_VERSION_RC} \
  --path-to-airflow-svn ~/asf-dist/dev/airflow \
  --verbose

Example (only signatures + checksums):

breeze release-management verify-rc-by-pmc \
  --distribution airflow \
  --version ${VERSION_RC} \
  --task-sdk-version ${TASK_SDK_VERSION_RC} \
  --path-to-airflow-svn ~/asf-dist/dev/airflow \
  --checks signatures,checksums

Verify the release candidate by Contributors

This can be done by any of the Contributors (and we encourage it). In fact, it's best if the actual users of Apache Airflow test it in their own staging/test installations. Each release candidate is available on PyPI apart from the SVN packages, so everyone should be able to install the release candidate version.

But you can use any of the installation methods you prefer (you can even install it via the binary wheels downloaded from the SVN).

Installing release candidate in your local virtual environment

pip install apache-airflow==<VERSION>rc<X>

Optionally it can be followed with constraints

pip install apache-airflow==<VERSION>rc<X> \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-<VERSION>rc<X>/constraints-3.10.txt"

Note that the constraints file name contains the Python version that you are installing with.
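Spelled out, the constraints URL is built from the RC tag and the Python version of the environment you install into. A sketch with example values:

```shell
# Hypothetical values - substitute the RC you are testing and your Python version.
VERSION_RC="3.1.3rc1"
PYTHON_VERSION="3.10"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${VERSION_RC}/constraints-${PYTHON_VERSION}.txt"
echo "pip install apache-airflow==${VERSION_RC} --constraint \"${CONSTRAINT_URL}\""
```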


There is also an easy way of installing it with Breeze if you have the latest sources of Apache Airflow. Running the following command will use tmux inside breeze, create an admin user, and run the webserver & scheduler:

breeze start-airflow --use-airflow-version 3.1.3rc1 --python 3.10 --backend postgres

You can also choose different executors and extras to install when installing Airflow this way. For example, in order to run Airflow with CeleryExecutor and install the celery, google and amazon providers (as of Airflow 2.7.0, you need to have the celery provider installed to run Airflow with CeleryExecutor), you can run:

breeze start-airflow --use-airflow-version 3.1.3rc1 --python 3.10 --backend postgres \
  --executor CeleryExecutor --airflow-extras "celery,google,amazon"

Once you install and run Airflow, you should perform any verification you see as necessary to check that the Airflow works as you expected.

Breeze also allows you to easily build and install pre-release candidates including providers by following simple instructions described in Manually testing release candidate packages

Publish the final Apache Airflow release

Summarize the voting for the Apache Airflow release

Once the vote has passed, you will need to send the vote result to dev@airflow.apache.org:

Subject:

[RESULT][VOTE] Release Airflow 3.1.3 from 3.1.3rc1 & Task SDK 1.1.3 from 1.1.3rc1

Message:

Hello,

The vote to release Apache Airflow version 3.1.3 based on 3.1.3rc1 & Task SDK 1.1.3 from 1.1.3rc1 is now closed.

The vote PASSED with 6 binding "+1", 4 non-binding "+1" and 0 "-1" votes:

"+1" Binding votes:
- Kaxil Naik
- Jens Scheffler
- Jarek Potiuk
- Ash Berlin-Taylor
- Hussein Awala
- Amogh Desai

"+1" non-Binding votes:
- Wei Lee
- Pavankumar Gopidesu
- Ankit Chaurasia
- Rahul Vats

Vote thread: https://lists.apache.org/thread/f72gglg5vdxnfmjqtjlhwgvn2tnh4gx4

I will continue with the release process, and the release announcement will follow shortly.

Cheers,
<your name>

Publish release to SVN

You need to migrate the RC artifacts that passed to this repository: https://dist.apache.org/repos/dist/release/airflow/ (The migration should include renaming the files so that they no longer have the RC number in their filenames.)

The best way of doing this is to svn cp between the two repos (this avoids having to upload the binaries again, and gives a clearer history in the svn commit logs):

Before running start-release, configure a short-lived PyPI token for this upload only. Until Trusted Publishing is deployed for apache-airflow on PyPI, the recommended practice is:

  1. Log in to https://pypi.org and create an API token right before the upload step. Scope caveat: you would ideally create a project-scoped token for apache-airflow alone, but PyPI only allows project-scoped tokens for projects you already own/maintain on that account. Most Airflow release managers do not have per-project owner rights on apache-airflow, so in practice you will need to create an account-wide ("all projects") token. That is acceptable only if you treat it as single-use and delete it immediately after the upload (step 4 below). Never keep an all-projects token on disk longer than the upload itself.
  2. Put it in ~/.pypirc (or export as TWINE_USERNAME=__token__ TWINE_PASSWORD=pypi-...).
  3. Run the start-release command below — it uploads to PyPI under the hood.
  4. Immediately delete the token from the PyPI web UI after the upload completes. Do not keep long-lived release-manager tokens on disk.

This is a defence-in-depth practice: the RM machine becomes a one-time release vehicle, not a persistent point of compromise.
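For step 2, a minimal ~/.pypirc looks like the fragment below (a sketch; the token value is whatever you created in step 1, and it is single-use):

```ini
# ~/.pypirc - single-use credentials for the release upload only
[pypi]
username = __token__
password = pypi-...  ; the token from step 1; delete it from PyPI after the upload
```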

export VERSION=3.1.3
export TASK_SDK_VERSION=1.1.3
export PREVIOUS_RELEASE=3.1.2
# cd to the airflow repo directory and set the environment variable below
export AIRFLOW_REPO_ROOT=$(pwd)
# start the release process by running the below command
breeze release-management start-release \
    --version ${VERSION} \
    --task-sdk-version ${TASK_SDK_VERSION}

Note: The --task-sdk-version parameter is optional. If you are releasing Airflow without a corresponding Task SDK release, you can omit this parameter.

  1. Make sure to update the Airflow version to X.Y.1 in airflow/__init__.py on the v3-*-test branch after cherry-picking.

Manually prepare production Docker Image

Building the image is triggered by running the Release PROD Images workflow.

When you trigger it you need to pass:

  • Airflow Version
  • Optional "true" in skip latest field if you do not want to re-tag the latest image

Make sure you use v3-*-test branch to run the workflow.

Release prod image

Note that by default the just-released image is also tagged as latest, which is the usual way we release. For example, when you are releasing the 2.3.N image and 2.3 is our latest branch, the new image is marked as "latest".

In case you are releasing a critical bugfix release in one of the older branches (which has almost never happened so far), you should set the "skip" field to true.

Verify production images

for PYTHON in 3.10 3.11 3.12 3.13 3.14
do
    docker pull apache/airflow:${VERSION}-python${PYTHON}
    breeze prod-image verify --image-name apache/airflow:${VERSION}-python${PYTHON}
done
docker pull apache/airflow:${VERSION}
breeze prod-image verify --image-name apache/airflow:${VERSION}

Publish final documentation

Documentation is an essential part of the product and should be made available to users. In our case, documentation for released versions is published to the live S3 bucket, and the site is kept in a separate repository - apache/airflow-site - but the documentation source code and build tools live in the apache/airflow repository, so you need to run several workflows to publish the documentation. More details can be found in the Docs README, which shows the architecture and workflows, including manual workflows for emergency cases.

We have two options for publishing the documentation:

  1. Using breeze commands
  2. Manually using GitHub Actions

Using breeze commands

You can use the breeze command to publish the documentation. The command does the following:

  1. Triggers Publish Docs to S3.
  2. Triggers a workflow in apache/airflow-site to refresh the site
  3. Triggers S3 to GitHub Sync

# Example for final release
breeze workflow-run publish-docs --ref ${VERSION} --site-env live apache-airflow docker-stack task-sdk

The --ref parameter should be the tag of the final version you are publishing.

The --site-env parameter should be set to staging for pre-release versions or live for final releases. The default is auto: if the tag is an rc tag, the docs are published to the staging bucket, otherwise to the live bucket.

Other available parameters can be found with:

breeze workflow-run publish-docs --help

Manually using GitHub Actions

There are two steps to publish the documentation:

  1. Publish the documentation to the S3 bucket.

The release manager publishes the documentation using the GitHub Actions workflow Publish Docs to S3. By default, auto selection publishes to the live bucket based on the tag you use - pre-release tags go to staging. You can also override it and specify the destination manually as live or staging.

After that step, the documentation should be available under the https://airflow.apache.org/ URL (also linked directly from the PyPI packages), but stable links and drop-down boxes are not yet updated. That allows the Release Manager to verify that the documentation is published.

  2. Invalidate the Fastly cache, and update the version drop-down and stable links with the new versions of the documentation.

In order to do it, you need to run the Build docs workflow in airflow-site repository. Make sure to use main as the branch to run the workflow.

After that workflow completes, the new version should be available in the drop-down list, stable links should be updated, and the Fastly cache will be invalidated.

Notify developers of release

Subject:

cat <<EOF
[ANNOUNCE] Apache Airflow ${VERSION} Released
EOF

Body:

cat <<EOF
Dear Airflow community,

I'm happy to announce that Airflow ${VERSION} was just released.

The released sources and packages can be downloaded via https://airflow.apache.org/docs/apache-airflow/${VERSION}/installation/installing-from-sources.html

Other installation methods are described in https://airflow.apache.org/docs/apache-airflow/stable/installation/

We also made this version available on PyPI for convenience:
\`pip install apache-airflow\`
https://pypi.org/project/apache-airflow/${VERSION}/

The documentation is available at:
https://airflow.apache.org/docs/apache-airflow/${VERSION}/

Find the release notes here for more details:
https://airflow.apache.org/docs/apache-airflow/${VERSION}/release_notes.html

Container images are published at:
https://hub.docker.com/r/apache/airflow/tags/?page=1&name=${VERSION}

Cheers,
<your name>
EOF

Send the same email to announce@apache.org, except change the opening line to "Dear community,". It is more reliable to send it via the web UI at https://lists.apache.org/list.html?announce@apache.org (press "c" to compose a new thread).

Send announcements about security issues fixed in the release

The release manager should review and mark as READY all the security issues fixed in the release. Such issues are marked as affecting < <JUST_RELEASED_VERSION> in the CVE management tool at https://cveprocess.apache.org/. Then the release manager should announce the issues via the tool.

Once announced, each of the issues should be linked with a 'reference' with tag 'vendor advisory' pointing to the URL of the announcement published automatically by the CVE management tool. Note that announce@apache.org is moderated and the link to the email thread will not be published immediately; that's why it is recommended to add the link to the users@airflow.apache.org thread, which is usually published a few seconds after the CVE tool sends the emails.

The ASF Security team will be notified, will submit the issues to the CVE project, and will set their state to 'PUBLIC'.

Add release data to Apache Committee Report Helper

Add the release data (version and date) at: https://reporter.apache.org/addrelease.html?airflow

Update Announcements page

Update "Announcements" page at the Official Airflow website

Create release on GitHub

Create a new release on GitHub with the release notes and assets from the release svn.

Close the milestone

Before closing the milestone on GitHub, make sure that all PRs marked for it are either part of the release (were cherry-picked) or postponed to the next release, then close the milestone. Create the next one if it hasn't been created already (it probably has been). Update the new milestone in the "Currently we are working on" issue, and make sure to update the last-updated timestamp as well.

Close the testing status issue

Don't forget to thank the folks who tested and close the issue tracking the testing status.

Announce the release on the community slack

Post this in the #announce channel:

cat <<EOF
We've just released Apache Airflow $VERSION 🎉

📦 PyPI: https://pypi.org/project/apache-airflow/$VERSION/
📚 Docs: https://airflow.apache.org/docs/apache-airflow/$VERSION/
🛠 Release Notes: https://airflow.apache.org/docs/apache-airflow/$VERSION/release_notes.html
🐳 Docker Image: "docker pull apache/airflow:$VERSION"
🚏 Constraints: https://github.com/apache/airflow/tree/constraints-$VERSION

Thanks to all the contributors who made this possible.
EOF
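The heredoc above relies on $VERSION being expanded by the shell. A small sketch (hypothetical helper, not part of the repo tooling) wraps it in a function so the same text can be re-rendered for both the Slack and social-media posts; the VERSION value below is only an example:

```shell
# Hypothetical helper: render the announcement with the version substituted,
# so the pasted text needs no manual edits.
VERSION=3.0.2   # example; use the version you just released

render_announcement() {
  cat <<EOF
We've just released Apache Airflow ${VERSION} 🎉

📦 PyPI: https://pypi.org/project/apache-airflow/${VERSION}/
📚 Docs: https://airflow.apache.org/docs/apache-airflow/${VERSION}/
🛠 Release Notes: https://airflow.apache.org/docs/apache-airflow/${VERSION}/release_notes.html
EOF
}

render_announcement
```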

Announce about the release in social media

Announcements are made from the official Apache Airflow accounts.

cat <<EOF
We've just released Apache Airflow $VERSION 🎉

📦 PyPI: https://pypi.org/project/apache-airflow/$VERSION/
📚 Docs: https://airflow.apache.org/docs/apache-airflow/$VERSION/
🛠 Release Notes: https://airflow.apache.org/docs/apache-airflow/$VERSION/release_notes.html
🐳 Docker Image: "docker pull apache/airflow:$VERSION"

Thanks to all the contributors who made this possible.
EOF

Post on social media about the release:

Make sure to attach the release image generated with Figma to the post. If you don't have access to the account, ask a PMC member to post.


Update main with the latest release details

This includes:

  • Modify ./scripts/ci/prek/supported_versions.py and let prek do the job.
  • For major/minor release, update version in airflow/__init__.py and docker-stack-docs/ to the next likely major version release.
    • The new version should be the current major release + 1.0
  • Sync RELEASE_NOTES.rst (including deleting relevant newsfragments) and README.md changes.
  • Updating Dockerfile with the new version.
  • Updating 1-airflow_bug_report.yml issue template in .github/ISSUE_TEMPLATE/ with the new version.
  • Update PROVIDERS_COMPATIBILITY_TESTS_MATRIX in src/airflow_breeze/global_constants.py so that latest compatibility check uses the latest released version of Airflow.
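The version bump in the second bullet can be sketched with sed; this is a hypothetical illustration operating on a stand-in file (the real path and assignment format may differ between Airflow versions, so always review the diff before committing):

```shell
# Hypothetical sketch: bump a `__version__ = "..."` assignment to the next
# likely version, as done for airflow/__init__.py. Uses a temp file here
# rather than touching the real repo.
NEXT_VERSION=4.0.0

tmpfile=$(mktemp)
printf '__version__ = "3.2.0"\n' > "$tmpfile"   # stand-in for airflow/__init__.py

# Replace whatever three-part version is currently assigned with the new one.
sed -i.bak -E \
  "s/^__version__ = \"[0-9]+\.[0-9]+\.[0-9]+\"/__version__ = \"${NEXT_VERSION}\"/" \
  "$tmpfile"

cat "$tmpfile"
```

The `-i.bak` form works with both GNU and BSD sed; remove the `.bak` backup once the change is verified.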

Update default Airflow version in the helm chart

Update the values of airflowVersion, defaultAirflowTag and appVersion in the helm chart so the next helm chart release will use the latest released version. You'll need to update chart/values.yaml, chart/values.schema.json and chart/Chart.yaml.

Add or adjust significant chart/newsfragments to express that the default version of Airflow has changed.

In chart/Chart.yaml, make sure the screenshot annotations are still all valid URLs.
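The values.yaml part of the bump can be sketched with sed as well; this hypothetical example runs against a stand-in file with the two key names from the step above (chart/values.schema.json and chart/Chart.yaml still need the same bump by hand):

```shell
# Hypothetical sketch: bump airflowVersion and defaultAirflowTag in a
# values.yaml-style file. Always review the resulting diff.
NEW_AIRFLOW_VERSION=3.0.2

tmpvalues=$(mktemp)
cat > "$tmpvalues" <<'EOF'
airflowVersion: "3.0.1"
defaultAirflowTag: "3.0.1"
EOF

sed -i.bak -E \
  -e "s/^(airflowVersion:) \"[^\"]+\"/\1 \"${NEW_AIRFLOW_VERSION}\"/" \
  -e "s/^(defaultAirflowTag:) \"[^\"]+\"/\1 \"${NEW_AIRFLOW_VERSION}\"/" \
  "$tmpvalues"

cat "$tmpvalues"
```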

Update airflow/config_templates/config.yml file

File airflow/config_templates/config.yml contains documentation on all configuration options available in Airflow. The version_added fields must be updated when a new Airflow version is released.

  • Get a diff between the released versions and the current local file on main branch:

    ./dev/validate_version_added_fields_in_config.py
  • Update airflow/config_templates/config.yml with the details, and commit it.

API clients

After releasing Airflow core, check whether a follow-up API clients release is needed.

Clients are released in a separate process, with their own vote.

Clients can be found here:

API Clients versioning policy

Clients and Core versioning are completely decoupled. Clients also follow SemVer and are updated when core introduces changes relevant to the clients. Most of the time, if the OpenAPI specification has changed, the clients need to be released.

To determine if you should release API clients, you can run from the airflow repository:

./dev/airflow-github api-clients-policy 2.3.2 2.4.0

All clients follow SemVer, so check what the appropriate new version for each client should be: depending on the current client version and the type of changes added (feature, fix, breaking change, etc.), increment the version number accordingly.
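The SemVer bump decision can be expressed as a tiny helper; this is a hypothetical illustration, not part of the Airflow tooling:

```shell
# Hypothetical helper: given the current version and the kind of change,
# print the next SemVer version for a client.
bump_semver() {
  ver=$1 part=$2
  major=${ver%%.*}; rest=${ver#*.}
  minor=${rest%%.*}; patch=${rest#*.}
  case "$part" in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "${major}.$((minor + 1)).0" ;;
    patch) echo "${major}.${minor}.$((patch + 1))" ;;
  esac
}

bump_semver 2.6.1 minor   # -> 2.7.0 (new feature)
bump_semver 2.6.1 major   # -> 3.0.0 (breaking change)
bump_semver 2.6.1 patch   # -> 2.6.2 (fix only)
```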

Releasing the clients

According to the policy above, if we have to release clients:

Remove dependabot workflows for previous minor release

When you release a new minor version, remove the dependabot configuration for the previous minor version to avoid confusion and unnecessary updates. For example, if you release 3.3.0, remove the v3-2-test entries from .github/workflows/dependabot.yml.

Additional processes

Those processes are related to the release of Airflow but should be run in exceptional situations.

Fixing released documentation

Sometimes we want to rebuild the documentation with fixes that were merged in main or the v3-X-stable branch, for example HTML layout changes, typo fixes, or formatting fixes.

In this case the process is as follows:

  • When you want to re-publish 3.X.Y docs, create (or pull if already created) 3.X.Y-docs branch
  • Cherry-pick changes you want to add and push to the main apache/airflow repo
  • Run the publishing workflow.

If you are re-publishing docs for the latest released version of Airflow (which should be the most common case), run this:

breeze workflow-run publish-docs --site-env live --ref 3.X.Y-docs \
   --skip-tag-validation --airflow-version 3.X.Y \
   apache-airflow

If you are re-publishing docs for an older version of Airflow, skip writing to the stable folder:

breeze workflow-run publish-docs --site-env live --ref 3.X.Y-docs \
   --skip-tag-validation --airflow-version 3.X.Y \
   --skip-write-to-stable-folder \
   apache-airflow
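The two commands above differ only in the version string and one flag. A hypothetical wrapper can substitute the version once and toggle the flag; it only prints the command for review rather than running breeze:

```shell
# Hypothetical sketch: build the publish-docs command from variables.
# It echoes the command for review; run it manually once it looks right.
DOCS_VERSION=3.0.2   # example value; use the docs branch version
LATEST=false         # set to true when re-publishing the latest version

extra=""
if [ "$LATEST" != "true" ]; then
  extra="--skip-write-to-stable-folder"
fi

cmd="breeze workflow-run publish-docs --site-env live --ref ${DOCS_VERSION}-docs \
--skip-tag-validation --airflow-version ${DOCS_VERSION} ${extra} apache-airflow"

echo "$cmd"
```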