Changes release image preparation to use PyPI packages #12990

Merged
6 changes: 2 additions & 4 deletions BREEZE.rst
@@ -1275,8 +1275,7 @@ This is the current syntax for `./breeze <./breeze>`_:
If specified, installs Airflow directly from PIP released version. This happens at
image building time in production image and at container entering time for CI image. One of:

1.10.14 1.10.13 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4
1.10.3 1.10.2 none wheel
1.10.14 1.10.12 1.10.11 1.10.10 1.10.9 none wheel

When 'none' is used, you can install airflow from local packages. When building image,
airflow package should be added to 'docker-context-files' and
@@ -2379,8 +2378,7 @@ This is the current syntax for `./breeze <./breeze>`_:
If specified, installs Airflow directly from PIP released version. This happens at
image building time in production image and at container entering time for CI image. One of:

1.10.14 1.10.13 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4
1.10.3 1.10.2 none wheel
1.10.14 1.10.12 1.10.11 1.10.10 1.10.9 none wheel

When 'none' is used, you can install airflow from local packages. When building image,
airflow package should be added to 'docker-context-files' and
44 changes: 32 additions & 12 deletions Dockerfile
@@ -167,15 +167,17 @@ ENV AIRFLOW_CONSTRAINTS_LOCATION=${AIRFLOW_CONSTRAINTS_LOCATION}
ENV PATH=${PATH}:/root/.local/bin
RUN mkdir -p /root/.local/bin

ARG AIRFLOW_PRE_CACHED_PIP_PACKAGES="true"
ENV AIRFLOW_PRE_CACHED_PIP_PACKAGES=${AIRFLOW_PRE_CACHED_PIP_PACKAGES}

RUN if [[ -f /docker-context-files/.pypirc ]]; then \
cp /docker-context-files/.pypirc /root/.pypirc; \
fi

RUN pip install --upgrade "pip==${PIP_VERSION}"

# By default we do not use pre-cached packages, but in the CI/Breeze environment we override this to speed up
# builds when setup.py/setup.cfg change. This is purely an optimisation of CI/Breeze builds.
ARG AIRFLOW_PRE_CACHED_PIP_PACKAGES="false"
ENV AIRFLOW_PRE_CACHED_PIP_PACKAGES=${AIRFLOW_PRE_CACHED_PIP_PACKAGES}

# In case of the Production build image segment we want to pre-install the master version of airflow
# dependencies from GitHub so that we do not have to reinstall them from scratch every time.
RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
@@ -188,39 +190,55 @@ RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
&& pip uninstall --yes apache-airflow; \
fi

ARG AIRFLOW_SOURCES_FROM="."
# By default we install the latest airflow from PyPI, so we do not need to copy the sources of Airflow,
# but in case of Breeze/CI builds we use the latest sources and override
# SOURCES_FROM/TO with "." and "/opt/airflow" respectively
ARG AIRFLOW_SOURCES_FROM="empty"
ENV AIRFLOW_SOURCES_FROM=${AIRFLOW_SOURCES_FROM}

ARG AIRFLOW_SOURCES_TO="/opt/airflow"
ARG AIRFLOW_SOURCES_TO="/empty"
ENV AIRFLOW_SOURCES_TO=${AIRFLOW_SOURCES_TO}

COPY ${AIRFLOW_SOURCES_FROM} ${AIRFLOW_SOURCES_TO}

ARG CASS_DRIVER_BUILD_CONCURRENCY
ENV CASS_DRIVER_BUILD_CONCURRENCY=${CASS_DRIVER_BUILD_CONCURRENCY}

# This is the airflow version that is put in the label of the image build
ARG AIRFLOW_VERSION
ENV AIRFLOW_VERSION=${AIRFLOW_VERSION}

ARG ADDITIONAL_PYTHON_DEPS=""
ENV ADDITIONAL_PYTHON_DEPS=${ADDITIONAL_PYTHON_DEPS}

ARG AIRFLOW_INSTALL_SOURCES="."
ENV AIRFLOW_INSTALL_SOURCES=${AIRFLOW_INSTALL_SOURCES}
# Determines the way airflow is installed. By default we install airflow from the PyPI `apache-airflow` package,
# but it can also be `.` for a local installation, or a GitHub URL pointing to a specific branch or tag
# of Airflow. Note that for a local source installation you need to have the local sources of
# Airflow checked out together with the Dockerfile, and AIRFLOW_SOURCES_FROM and AIRFLOW_SOURCES_TO
# set to "." and "/opt/airflow" respectively.
ARG AIRFLOW_INSTALLATION_METHOD="apache-airflow"
ENV AIRFLOW_INSTALLATION_METHOD=${AIRFLOW_INSTALLATION_METHOD}

# By default the latest released version of airflow is installed (when empty), but this value can be
# overridden to install a specific version of airflow this way.
ARG AIRFLOW_INSTALL_VERSION=""
ENV AIRFLOW_INSTALL_VERSION=${AIRFLOW_INSTALL_VERSION}

# We can set this value to true in case we want to install .whl/.tar.gz packages placed in the
# docker-context-files folder. This can be done both for additional packages you want to install
# and for airflow itself (you have to set INSTALL_FROM_PYPI to false in this case)
ARG INSTALL_FROM_DOCKER_CONTEXT_FILES=""
ENV INSTALL_FROM_DOCKER_CONTEXT_FILES=${INSTALL_FROM_DOCKER_CONTEXT_FILES}

# By default we install the latest airflow from PyPI. You can set it to false if you want to install
# Airflow from the .whl or .tar.gz packages placed in the `docker-context-files` folder.
ARG INSTALL_FROM_PYPI="true"
ENV INSTALL_FROM_PYPI=${INSTALL_FROM_PYPI}

ARG SLUGIFY_USES_TEXT_UNIDECODE=""
ENV SLUGIFY_USES_TEXT_UNIDECODE=${SLUGIFY_USES_TEXT_UNIDECODE}

ARG INSTALL_PROVIDERS_FROM_SOURCES="true"
# By default we install providers from PyPI, but in case of a Breeze build we want to install providers
# from local sources without the need of preparing provider packages upfront. This value is
# automatically overridden by the Breeze scripts.
ARG INSTALL_PROVIDERS_FROM_SOURCES="false"
ENV INSTALL_PROVIDERS_FROM_SOURCES=${INSTALL_PROVIDERS_FROM_SOURCES}

WORKDIR /opt/airflow
@@ -230,7 +248,7 @@ RUN if [[ ${INSTALL_MYSQL_CLIENT} != "true" ]]; then \
AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS/mysql,}; \
fi; \
if [[ ${INSTALL_FROM_PYPI} == "true" ]]; then \
pip install --user "${AIRFLOW_INSTALL_SOURCES}[${AIRFLOW_EXTRAS}]${AIRFLOW_INSTALL_VERSION}" \
pip install --user "${AIRFLOW_INSTALLATION_METHOD}[${AIRFLOW_EXTRAS}]${AIRFLOW_INSTALL_VERSION}" \
--constraint "${AIRFLOW_CONSTRAINTS_LOCATION}"; \
fi; \
if [[ -n "${ADDITIONAL_PYTHON_DEPS}" ]]; then \
@@ -273,6 +291,7 @@ LABEL org.apache.airflow.distro="debian" \
org.apache.airflow.module="airflow" \
org.apache.airflow.component="airflow" \
org.apache.airflow.image="airflow-build-image" \
org.apache.airflow.version="${AIRFLOW_VERSION}" \
org.apache.airflow.buildImage.buildId=${BUILD_ID} \
org.apache.airflow.buildImage.commitSha=${COMMIT_SHA}

@@ -434,6 +453,7 @@ LABEL org.apache.airflow.distro="debian" \
org.apache.airflow.module="airflow" \
org.apache.airflow.component="airflow" \
org.apache.airflow.image="airflow" \
org.apache.airflow.version="${AIRFLOW_VERSION}" \
org.apache.airflow.uid="${AIRFLOW_UID}" \
org.apache.airflow.gid="${AIRFLOW_GID}" \
org.apache.airflow.mainImage.buildId=${BUILD_ID} \
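Taken together, the new build args select the installation source at `docker build` time. A minimal sketch of a build that installs Airflow from packages dropped into `docker-context-files` instead of PyPI (the image tag is illustrative; the arg names and defaults are those defined in the Dockerfile above):

```shell script
# Sketch: install Airflow and extra packages from .whl/.tar.gz files placed
# in ./docker-context-files rather than from PyPI (tag name is hypothetical).
docker build . \
    --build-arg INSTALL_FROM_PYPI="false" \
    --build-arg INSTALL_FROM_DOCKER_CONTEXT_FILES="true" \
    --build-arg AIRFLOW_PRE_CACHED_PIP_PACKAGES="false" \
    --tag my-airflow:local
```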
1 change: 1 addition & 0 deletions Dockerfile.ci
@@ -396,6 +396,7 @@ LABEL org.apache.airflow.distro="debian" \
org.apache.airflow.module="airflow" \
org.apache.airflow.component="airflow" \
org.apache.airflow.image="airflow-ci" \
org.apache.airflow.version="${AIRFLOW_VERSION}" \
org.apache.airflow.uid="0" \
org.apache.airflow.gid="0" \
org.apache.airflow.buildId=${BUILD_ID} \
4 changes: 2 additions & 2 deletions IMAGES.rst
@@ -310,7 +310,7 @@ additional apt dev and runtime dependencies.
docker build . -f Dockerfile.ci \
--build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
--build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
--build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
--build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \
--build-arg AIRFLOW_VERSION="1.10.14" \
--build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
--build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
@@ -345,7 +345,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
docker build . -f Dockerfile.ci \
--build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
--build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
--build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
--build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \
--build-arg AIRFLOW_VERSION="1.10.14" \
--build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
--build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
8 changes: 0 additions & 8 deletions breeze-complete
@@ -52,18 +52,10 @@ _breeze_allowed_package_formats="wheel sdist both"

_breeze_allowed_install_airflow_versions=$(cat <<-EOF
1.10.14
1.10.13
1.10.12
1.10.11
1.10.10
1.10.9
1.10.8
1.10.7
1.10.6
1.10.5
1.10.4
1.10.3
1.10.2
none
wheel
EOF
122 changes: 111 additions & 11 deletions dev/README_RELEASE_AIRFLOW.md
@@ -23,6 +23,7 @@
- [Prepare the Apache Airflow Package RC](#prepare-the-apache-airflow-package-rc)
- [Build RC artifacts](#build-rc-artifacts)
- [Prepare PyPI convenience "snapshot" packages](#prepare-pypi-convenience-snapshot-packages)
- [\[Optional\] - Manually prepare production Docker Image](#%5Coptional%5C---manually-prepare-production-docker-image)
- [Prepare Vote email on the Apache Airflow release candidate](#prepare-vote-email-on-the-apache-airflow-release-candidate)
- [Verify the release candidate by PMCs](#verify-the-release-candidate-by-pmcs)
- [SVN check](#svn-check)
@@ -35,6 +36,7 @@
- [Publish release to SVN](#publish-release-to-svn)
- [Prepare PyPI "release" packages](#prepare-pypi-release-packages)
- [Update CHANGELOG.md](#update-changelogmd)
- [\[Optional\] - Manually prepare production Docker Image](#%5Coptional%5C---manually-prepare-production-docker-image-1)
- [Publish documentation](#publish-documentation)
- [Notify developers of release](#notify-developers-of-release)
- [Update Announcements page](#update-announcements-page)
@@ -112,18 +114,12 @@ The Release Candidate artifacts we vote upon should be the exact ones we vote against
${AIRFLOW_REPO_ROOT}/dev/sign.sh apache_airflow-${VERSION}-py2.py3-none-any.whl
```

- Push Tags

```shell script
git push origin ${VERSION}
```

- Tag & Push latest constraints files
- Tag & Push the latest constraints files. This pushes the constraints with the rc suffix (this is expected)!

```shell script
git checkout constraints-1-10
git tag -s "constraints-${VERSION%rc?}"
git push origin "constraints-${VERSION%rc?}"
git tag -s "constraints-${VERSION}"
git push origin "constraints-${VERSION}"
```
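Note the change from the previous instructions: `${VERSION%rc?}` used bash suffix removal to strip the `rc` part, while the commands above keep it. A quick illustration of the difference (the version value is just an example):

```shell script
# "%rc?" strips a trailing "rc" plus one character; plain expansion keeps it.
VERSION="1.10.14rc2"
echo "constraints-${VERSION%rc?}"   # -> constraints-1.10.14 (suffix stripped)
echo "constraints-${VERSION}"       # -> constraints-1.10.14rc2 (suffix kept)
```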

- Push the artifacts to ASF dev dist repo
@@ -146,7 +142,7 @@ svn commit -m "Add artifacts for Airflow ${VERSION}"
## Prepare PyPI convenience "snapshot" packages

At this point we have the artefact that we vote on, but as a convenience to developers we also want to
publish "snapshots" of the RC builds to pypi for installing via pip. To do this we need to
publish "snapshots" of the RC builds to pypi for installing via pip. Those packages are also
used to build the production docker image in DockerHub, so we need to upload them
before we push the tag to GitHub. Pushing the tag to GitHub automatically triggers the image build in
DockerHub.

To do this we need to

- Edit the `setup.py` to include the RC suffix.

@@ -182,6 +183,51 @@ https://pypi.python.org/pypi/apache-airflow
It is important to stress that this snapshot should not be named "release", and it
is not supposed to be used by, or advertised to, end-users who do not read the devlist.

- Push Tag for the release candidate

This step should only be done now and not before, because it triggers an automated build of
the production docker image, using the packages that are currently released in PyPI
(both airflow and latest provider packages).

```shell script
git push origin ${VERSION}
```

## \[Optional\] - Manually prepare production Docker Image

Production Docker images should be built automatically within 2-3 hours after the release tag has been
pushed. If this does not happen, please log in to DockerHub and check the status of the builds:
[Build Timeline](https://hub.docker.com/repository/docker/apache/airflow/timeline)

If needed, you can also build and push the images manually:

Airflow 2+:

```shell script
export DOCKER_REPO=docker.io/apache/airflow
for python_version in "3.6" "3.7" "3.8"
do
  (
    export DOCKER_TAG=${VERSION}-python${python_version}
    ./scripts/ci/images/ci_build_dockerhub.sh
  )
done
```

This will wipe the Breeze cache and docker-context-files in order to make sure the build is "clean".
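The parentheses around the loop body run each iteration in a subshell, so the exported `DOCKER_TAG` never leaks into the parent shell. A runnable sketch of that pattern (with the build script replaced by `echo`):

```shell script
# Each iteration exports DOCKER_TAG inside a subshell; the parent shell
# never sees the variable afterwards (build script replaced by echo here).
for python_version in "3.6" "3.7" "3.8"
do
  (
    export DOCKER_TAG="example-python${python_version}"
    echo "would build ${DOCKER_TAG}"
  )
done
echo "after the loop DOCKER_TAG is '${DOCKER_TAG:-unset}'"
```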

Airflow 1.10:

```shell script
for python_version in "2.7" "3.5" "3.6" "3.7" "3.8"
do
./breeze build-image --production-image --python ${python_version} \
--image-tag apache/airflow:${VERSION}-python${python_version} --build-cache-local
docker push apache/airflow:${VERSION}-python${python_version}
done
docker tag apache/airflow:${VERSION}-python3.6 apache/airflow:${VERSION}
docker push apache/airflow:${VERSION}
```


## Prepare Vote email on the Apache Airflow release candidate

- Use the dev/airflow-jira script to generate a list of Airflow JIRAs that were closed in the release.
@@ -251,6 +297,7 @@ Cheers,
<your name>
```


# Verify the release candidate by PMCs

The PMCs should verify the releases in order to make sure the release is following the
Expand Down Expand Up @@ -482,7 +529,7 @@ You need to migrate the RC artifacts that passed to this repository:
https://dist.apache.org/repos/dist/release/airflow/
(The migration should include renaming the files so that they no longer have the RC number in their filenames.)

The best way of doing this is to svn cp between the two repos (this avoids having to upload the binaries again, and gives a clearer history in the svn commit logs):

```shell script
# First clone the repo
@@ -552,6 +599,59 @@ At this point we release an official package:

- Update CHANGELOG.md with the details, and commit it.

- Re-Tag & Push the constraints files with the final release version.

```shell script
git checkout constraints-${RC}
git tag -s "constraints-${VERSION}"
git push origin "constraints-${VERSION}"
```
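Here `RC` is assumed to hold the tag of the candidate that passed the vote, and `VERSION` the final version. Hypothetical example values:

```shell script
# Hypothetical values illustrating how RC and VERSION relate in the
# commands above: stripping the rc suffix from RC yields VERSION.
RC="1.10.14rc4"
VERSION="1.10.14"
echo "checkout constraints-${RC}, then tag constraints-${VERSION}"
```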

- Push Tag for the final version

This step should only be done now and not before, because it triggers an automated build of
the production docker image, using the packages that are currently released in PyPI
(both airflow and latest provider packages).

```shell script
git push origin ${VERSION}
```

## \[Optional\] - Manually prepare production Docker Image

Production Docker images should be built automatically within 2-3 hours after the release tag has been
pushed. If this does not happen, please log in to DockerHub and check the status of the builds:
[Build Timeline](https://hub.docker.com/repository/docker/apache/airflow/timeline)

If needed, you can also build and push the images manually:

Airflow 2+:

```shell script
export DOCKER_REPO=docker.io/apache/airflow
for python_version in "3.6" "3.7" "3.8"
do
  (
    export DOCKER_TAG=${VERSION}-python${python_version}
    ./scripts/ci/images/ci_build_dockerhub.sh
  )
done
```

This will wipe the Breeze cache and docker-context-files in order to make sure the build is "clean".


Airflow 1.10:

```shell script
for python_version in "2.7" "3.5" "3.6" "3.7" "3.8"
do
./breeze build-image --production-image --python ${python_version} \
--image-tag apache/airflow:${VERSION}-python${python_version} --build-cache-local
docker push apache/airflow:${VERSION}-python${python_version}
done
docker tag apache/airflow:${VERSION}-python3.6 apache/airflow:${VERSION}
docker push apache/airflow:${VERSION}
```

## Publish documentation

Documentation is an essential part of the product and should be made available to users.