Commit faee201

Changes release image preparation to use PyPI packages
Since we have released all the provider packages to PyPI (as RC versions), we can now change the mechanism that prepares the production image to use released packages for tagged builds. The "branch" production images are still prepared using the CI images and .whl packages built from sources, but the release images are built from officially released PyPI packages.

Some corrections and updates were also made to the release process:

* the constraint tags pushed when an RC candidate is sent should contain the rcN suffix.
* there was a missing step about pushing the release tag once the release is out.
* pushing the tag to GitHub should be done after the PyPI packages are uploaded, so that the automated image building in DockerHub can use those packages.
* added a note that in case we release provider packages that depend on the just-released airflow version, they should be released after airflow is in PyPI but before the tag is pushed to GitHub (also to allow the image to be built automatically from the released packages).

Fixes: #12970
1 parent 3fbc8e6 commit faee201

12 files changed: +364 -214 lines
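
To make the new mechanism concrete, here is a hedged sketch (not part of the commit) of a release-style production image build that installs Airflow from the released PyPI packages. It uses build args that this change introduces or re-defaults in the Dockerfile below; the exact set of required args and the version shown are illustrative only:

```shell script
# A minimal sketch of a "release" build: install apache-airflow from PyPI,
# do not copy local sources and do not install providers from sources.
docker build . \
  --build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \
  --build-arg AIRFLOW_VERSION="1.10.14" \
  --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
  --build-arg AIRFLOW_PRE_CACHED_PIP_PACKAGES="false" \
  --build-arg INSTALL_PROVIDERS_FROM_SOURCES="false" \
  --tag apache/airflow:1.10.14
```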

BREEZE.rst (+2 -4)

@@ -1275,8 +1275,7 @@ This is the current syntax for `./breeze <./breeze>`_:
 If specified, installs Airflow directly from PIP released version. This happens at
 image building time in production image and at container entering time for CI image. One of:

-1.10.14 1.10.13 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4
-1.10.3 1.10.2 none wheel
+1.10.14 1.10.12 1.10.11 1.10.10 1.10.9 none wheel

 When 'none' is used, you can install airflow from local packages. When building image,
 airflow package should be added to 'docker-context-files' and
@@ -2379,8 +2378,7 @@ This is the current syntax for `./breeze <./breeze>`_:
 If specified, installs Airflow directly from PIP released version. This happens at
 image building time in production image and at container entering time for CI image. One of:

-1.10.14 1.10.13 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4
-1.10.3 1.10.2 none wheel
+1.10.14 1.10.12 1.10.11 1.10.10 1.10.9 none wheel

 When 'none' is used, you can install airflow from local packages. When building image,
 airflow package should be added to 'docker-context-files' and

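As a usage illustration (not part of the diff), the trimmed list above is the set of values Breeze accepts for installing a released Airflow version; the flag name below is assumed from the help text shown in BREEZE.rst:

```shell script
# Enter Breeze with a released Airflow version installed from PyPI instead of
# the local sources; "1.10.14" must be one of the allowed values listed above.
./breeze --install-airflow-version 1.10.14
```
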
Dockerfile (+32 -12)

@@ -167,15 +167,17 @@ ENV AIRFLOW_CONSTRAINTS_LOCATION=${AIRFLOW_CONSTRAINTS_LOCATION}
 ENV PATH=${PATH}:/root/.local/bin
 RUN mkdir -p /root/.local/bin

-ARG AIRFLOW_PRE_CACHED_PIP_PACKAGES="true"
-ENV AIRFLOW_PRE_CACHED_PIP_PACKAGES=${AIRFLOW_PRE_CACHED_PIP_PACKAGES}
-
 RUN if [[ -f /docker-context-files/.pypirc ]]; then \
     cp /docker-context-files/.pypirc /root/.pypirc; \
     fi

 RUN pip install --upgrade "pip==${PIP_VERSION}"

+# By default we do not use pre-cached packages, but in CI/Breeze environment we override this to speed up
+# builds in case setup.py/setup.cfg changed. This is pure optimisation of CI/Breeze builds.
+ARG AIRFLOW_PRE_CACHED_PIP_PACKAGES="false"
+ENV AIRFLOW_PRE_CACHED_PIP_PACKAGES=${AIRFLOW_PRE_CACHED_PIP_PACKAGES}
+
 # In case of Production build image segment we want to pre-install master version of airflow
 # dependencies from GitHub so that we do not have to always reinstall it from the scratch.
 RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
@@ -188,39 +190,55 @@ RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
     && pip uninstall --yes apache-airflow; \
     fi

-ARG AIRFLOW_SOURCES_FROM="."
+# By default we install the latest airflow from PyPI so we do not need to copy the sources of Airflow,
+# but in case of Breeze/CI builds we use the latest sources and we override those
+# SOURCES_FROM/TO with "." and "/opt/airflow" respectively
+ARG AIRFLOW_SOURCES_FROM="empty"
 ENV AIRFLOW_SOURCES_FROM=${AIRFLOW_SOURCES_FROM}

-ARG AIRFLOW_SOURCES_TO="/opt/airflow"
+ARG AIRFLOW_SOURCES_TO="/empty"
 ENV AIRFLOW_SOURCES_TO=${AIRFLOW_SOURCES_TO}

 COPY ${AIRFLOW_SOURCES_FROM} ${AIRFLOW_SOURCES_TO}

 ARG CASS_DRIVER_BUILD_CONCURRENCY
 ENV CASS_DRIVER_BUILD_CONCURRENCY=${CASS_DRIVER_BUILD_CONCURRENCY}

+# This is the airflow version that is put in the label of the image build
 ARG AIRFLOW_VERSION
 ENV AIRFLOW_VERSION=${AIRFLOW_VERSION}

 ARG ADDITIONAL_PYTHON_DEPS=""
 ENV ADDITIONAL_PYTHON_DEPS=${ADDITIONAL_PYTHON_DEPS}

-ARG AIRFLOW_INSTALL_SOURCES="."
-ENV AIRFLOW_INSTALL_SOURCES=${AIRFLOW_INSTALL_SOURCES}
+# Determines the way airflow is installed. By default we install airflow from the PyPI `apache-airflow` package,
+# but it can also be `.` for local installation or a GitHub URL pointing to a specific branch or tag
+# of Airflow. Note that for local source installation you need to have the local sources of
+# Airflow checked out together with the Dockerfile and AIRFLOW_SOURCES_FROM and AIRFLOW_SOURCES_TO
+# set to "." and "/opt/airflow" respectively.
+ARG AIRFLOW_INSTALLATION_METHOD="apache-airflow"
+ENV AIRFLOW_INSTALLATION_METHOD=${AIRFLOW_INSTALLATION_METHOD}

+# By default the latest released version of airflow is installed (when empty) but this value can be overridden
+# and we can install a specific version of airflow this way.
 ARG AIRFLOW_INSTALL_VERSION=""
 ENV AIRFLOW_INSTALL_VERSION=${AIRFLOW_INSTALL_VERSION}

+# We can set this value to true in case we want to install .whl/.tar.gz packages placed in the
+# docker-context-files folder. This can be done both for additional packages you want to install
+# and for airflow as well (you have to set INSTALL_FROM_PYPI to false in this case)
 ARG INSTALL_FROM_DOCKER_CONTEXT_FILES=""
 ENV INSTALL_FROM_DOCKER_CONTEXT_FILES=${INSTALL_FROM_DOCKER_CONTEXT_FILES}

+# By default we install the latest airflow from PyPI. You can set it to false if you want to install
+# Airflow from the .whl or .tar.gz packages placed in the `docker-context-files` folder.
 ARG INSTALL_FROM_PYPI="true"
 ENV INSTALL_FROM_PYPI=${INSTALL_FROM_PYPI}

-ARG SLUGIFY_USES_TEXT_UNIDECODE=""
-ENV SLUGIFY_USES_TEXT_UNIDECODE=${SLUGIFY_USES_TEXT_UNIDECODE}
-
-ARG INSTALL_PROVIDERS_FROM_SOURCES="true"
+# By default we install providers from PyPI but in case of Breeze builds we want to install providers
+# from local sources without the need of preparing provider packages upfront. This value is
+# automatically overridden by Breeze scripts.
+ARG INSTALL_PROVIDERS_FROM_SOURCES="false"
 ENV INSTALL_PROVIDERS_FROM_SOURCES=${INSTALL_PROVIDERS_FROM_SOURCES}

 WORKDIR /opt/airflow
@@ -230,7 +248,7 @@ RUN if [[ ${INSTALL_MYSQL_CLIENT} != "true" ]]; then \
     AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS/mysql,}; \
     fi; \
     if [[ ${INSTALL_FROM_PYPI} == "true" ]]; then \
-    pip install --user "${AIRFLOW_INSTALL_SOURCES}[${AIRFLOW_EXTRAS}]${AIRFLOW_INSTALL_VERSION}" \
+    pip install --user "${AIRFLOW_INSTALLATION_METHOD}[${AIRFLOW_EXTRAS}]${AIRFLOW_INSTALL_VERSION}" \
     --constraint "${AIRFLOW_CONSTRAINTS_LOCATION}"; \
     fi; \
     if [[ -n "${ADDITIONAL_PYTHON_DEPS}" ]]; then \
@@ -273,6 +291,7 @@ LABEL org.apache.airflow.distro="debian" \
     org.apache.airflow.module="airflow" \
     org.apache.airflow.component="airflow" \
     org.apache.airflow.image="airflow-build-image" \
+    org.apache.airflow.version="${AIRFLOW_VERSION}" \
     org.apache.airflow.buildImage.buildId=${BUILD_ID} \
     org.apache.airflow.buildImage.commitSha=${COMMIT_SHA}

@@ -434,6 +453,7 @@ LABEL org.apache.airflow.distro="debian" \
     org.apache.airflow.module="airflow" \
     org.apache.airflow.component="airflow" \
     org.apache.airflow.image="airflow" \
+    org.apache.airflow.version="${AIRFLOW_VERSION}" \
     org.apache.airflow.uid="${AIRFLOW_UID}" \
     org.apache.airflow.gid="${AIRFLOW_GID}" \
     org.apache.airflow.mainImage.buildId=${BUILD_ID} \

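For contrast with the new PyPI defaults above, a hedged sketch (not part of the diff) of how a Breeze/CI-style build would override these args to install from the checked-out sources; the image tag is illustrative only:

```shell script
# Build from local sources instead of PyPI: copy the checked-out sources into
# the image and install from ".", as the Dockerfile comments above describe.
docker build . \
  --build-arg AIRFLOW_INSTALLATION_METHOD="." \
  --build-arg AIRFLOW_SOURCES_FROM="." \
  --build-arg AIRFLOW_SOURCES_TO="/opt/airflow" \
  --build-arg INSTALL_PROVIDERS_FROM_SOURCES="true" \
  --build-arg AIRFLOW_PRE_CACHED_PIP_PACKAGES="true" \
  --tag apache/airflow:master-python3.6-build
```
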
Dockerfile.ci (+1)

@@ -396,6 +396,7 @@ LABEL org.apache.airflow.distro="debian" \
     org.apache.airflow.module="airflow" \
     org.apache.airflow.component="airflow" \
     org.apache.airflow.image="airflow-ci" \
+    org.apache.airflow.version="${AIRFLOW_VERSION}" \
     org.apache.airflow.uid="0" \
     org.apache.airflow.gid="0" \
     org.apache.airflow.buildId=${BUILD_ID} \

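Once an image carries the new org.apache.airflow.version label added above, it can be read back with docker inspect; a small sketch (the tag shown is only an example):

```shell script
# Print the airflow version recorded in the image label added by this commit.
docker inspect \
  --format '{{ index .Config.Labels "org.apache.airflow.version" }}' \
  apache/airflow:1.10.14
```
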
IMAGES.rst (+2 -2)

@@ -310,7 +310,7 @@ additional apt dev and runtime dependencies.
 docker build . -f Dockerfile.ci \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
-    --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
+    --build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \
     --build-arg AIRFLOW_VERSION="1.10.14" \
     --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
@@ -345,7 +345,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
 docker build . -f Dockerfile.ci \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
-    --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
+    --build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \
     --build-arg AIRFLOW_VERSION="1.10.14" \
     --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \

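The Dockerfile changes above also allow installing from locally provided packages instead of PyPI; a hedged sketch of that mode (file names and tag are illustrative only):

```shell script
# Place previously built packages where the Dockerfile expects them ...
mkdir -p docker-context-files
cp dist/apache_airflow-1.10.14-py2.py3-none-any.whl docker-context-files/
# ... and disable the PyPI install path in favour of docker-context-files,
# as described in the INSTALL_FROM_DOCKER_CONTEXT_FILES comment above.
docker build . \
  --build-arg INSTALL_FROM_PYPI="false" \
  --build-arg INSTALL_FROM_DOCKER_CONTEXT_FILES="true" \
  --tag apache/airflow:1.10.14-local
```
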
breeze-complete (-8)

@@ -52,18 +52,10 @@ _breeze_allowed_package_formats="wheel sdist both"
 
 _breeze_allowed_install_airflow_versions=$(cat <<-EOF
 1.10.14
-1.10.13
 1.10.12
 1.10.11
 1.10.10
 1.10.9
-1.10.8
-1.10.7
-1.10.6
-1.10.5
-1.10.4
-1.10.3
-1.10.2
 none
 wheel
 EOF

dev/README_RELEASE_AIRFLOW.md (+111 -11)

@@ -23,6 +23,7 @@
 - [Prepare the Apache Airflow Package RC](#prepare-the-apache-airflow-package-rc)
 - [Build RC artifacts](#build-rc-artifacts)
 - [Prepare PyPI convenience "snapshot" packages](#prepare-pypi-convenience-snapshot-packages)
+- [\[Optional\] - Manually prepare production Docker Image](#%5Coptional%5C---manually-prepare-production-docker-image)
 - [Prepare Vote email on the Apache Airflow release candidate](#prepare-vote-email-on-the-apache-airflow-release-candidate)
 - [Verify the release candidate by PMCs](#verify-the-release-candidate-by-pmcs)
 - [SVN check](#svn-check)
@@ -35,6 +36,7 @@
 - [Publish release to SVN](#publish-release-to-svn)
 - [Prepare PyPI "release" packages](#prepare-pypi-release-packages)
 - [Update CHANGELOG.md](#update-changelogmd)
+- [\[Optional\] - Manually prepare production Docker Image](#%5Coptional%5C---manually-prepare-production-docker-image-1)
 - [Publish documentation](#publish-documentation)
 - [Notify developers of release](#notify-developers-of-release)
 - [Update Announcements page](#update-announcements-page)
@@ -112,18 +114,12 @@ The Release Candidate artifacts we vote upon should be the exact ones we vote ag
 ${AIRFLOW_REPO_ROOT}/dev/sign.sh apache_airflow-${VERSION}-py2.py3-none-any.whl
 ```
 
-- Push Tags
-
-```shell script
-git push origin ${VERSION}
-```
-
-- Tag & Push latest constraints files
+- Tag & Push latest constraints files. This pushes constraints with rc suffix (this is expected)!
 
 ```shell script
 git checkout constraints-1-10
-git tag -s "constraints-${VERSION%rc?}"
-git push origin "constraints-${VERSION%rc?}"
+git tag -s "constraints-${VERSION}"
+git push origin "constraints-${VERSION}"
 ```
 
 - Push the artifacts to ASF dev dist repo
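
As an aside to make the constraint-tag change above concrete, here is how the old and new commands expand (the VERSION value is only an example):

```shell script
VERSION=1.10.14rc1
# New behaviour in this change: the rc suffix is kept in the constraints tag.
echo "constraints-${VERSION}"        # -> constraints-1.10.14rc1
# Old behaviour: ${VERSION%rc?} strips a trailing "rc" plus one character.
echo "constraints-${VERSION%rc?}"    # -> constraints-1.10.14
```
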
@@ -146,7 +142,12 @@ svn commit -m "Add artifacts for Airflow ${VERSION}"
 ## Prepare PyPI convenience "snapshot" packages
 
 At this point we have the artefact that we vote on, but as a convenience to developers we also want to
-publish "snapshots" of the RC builds to pypi for installing via pip. To do this we need to
+publish "snapshots" of the RC builds to pypi for installing via pip. Also those packages
+are used to build the production docker image in DockerHub, so we need to upload the packages
+before we push the tag to GitHub. Pushing the tag to GitHub automatically triggers image building in
+DockerHub.
+
+To do this we need to
 
 - Edit the `setup.py` to include the RC suffix.
 
@@ -182,6 +183,51 @@ https://pypi.python.org/pypi/apache-airflow
 It is important to stress that this snapshot should not be named "release", and it
 is not supposed to be used by and advertised to the end-users who do not read the devlist.
 
+- Push Tag for the release candidate
+
+This step should only be done now and not before, because it triggers an automated build of
+the production docker image, using the packages that are currently released in PyPI
+(both airflow and latest provider packages).
+
+```shell script
+git push origin ${VERSION}
+```
+
+## \[Optional\] - Manually prepare production Docker Image
+
+Production Docker images should be automatically built in 2-3 hours after the release tag has been
+pushed. If this did not happen - please log in to DockerHub and check the status of builds:
+[Build Timeline](https://hub.docker.com/repository/docker/apache/airflow/timeline)
+
+In case you need to, you can also build and push the images manually:
+
+Airflow 2+:
+
+```shell script
+export DOCKER_REPO=docker.io/apache/airflow
+for python_version in "3.6" "3.7" "3.8"
+do
+  (
+    export DOCKER_TAG=${VERSION}-python${python_version}
+    ./scripts/ci/images/ci_build_dockerhub.sh
+  )
+done
+```
+
+This will wipe Breeze cache and docker-context-files in order to make sure the build is "clean".
+
+Airflow 1.10:
+
+```shell script
+for python_version in "2.7" "3.5" "3.6" "3.7" "3.8"
+do
+  ./breeze build-image --production-image --python ${python_version} \
+    --image-tag apache/airflow:${VERSION}-python${python_version} --build-cache-local
+  docker push apache/airflow:${VERSION}-python${python_version}
+done
+docker tag apache/airflow:${VERSION}-python3.6 apache/airflow:${VERSION}
+docker push apache/airflow:${VERSION}
+```
+
+
 ## Prepare Vote email on the Apache Airflow release candidate
 
 - Use the dev/airflow-jira script to generate a list of Airflow JIRAs that were closed in the release.
@@ -251,6 +297,7 @@ Cheers,
 <your name>
 ```
 
+
 # Verify the release candidate by PMCs
 
 The PMCs should verify the releases in order to make sure the release is following the
@@ -482,7 +529,7 @@ You need to migrate the RC artifacts that passed to this repository:
 https://dist.apache.org/repos/dist/release/airflow/
 (The migration should include renaming the files so that they no longer have the RC number in their filenames.)
 
-The best way of doing this is to svn cp between the two repos (this avoids having to upload the binaries again, and gives a clearer history in the svn commit logs):
+The best way of doing this is to svn cp between the two repos (this avoids having to upload the binaries again, and gives a clearer history in the svn commit logs):
 
 ```shell script
 # First clone the repo
@@ -552,6 +599,59 @@ At this point we release an official package:
 
 - Update CHANGELOG.md with the details, and commit it.
 
+- Re-Tag & Push the constraints files with the final release version.
+
+```shell script
+git checkout constraints-${RC}
+git tag -s "constraints-${VERSION}"
+git push origin "constraints-${VERSION}"
+```
+
+- Push Tag for the final version
+
+This step should only be done now and not before, because it triggers an automated build of
+the production docker image, using the packages that are currently released in PyPI
+(both airflow and latest provider packages).
+
+```shell script
+git push origin ${VERSION}
+```
+
+## \[Optional\] - Manually prepare production Docker Image
+
+Production Docker images should be automatically built in 2-3 hours after the release tag has been
+pushed. If this did not happen - please log in to DockerHub and check the status of builds:
+[Build Timeline](https://hub.docker.com/repository/docker/apache/airflow/timeline)
+
+In case you need to, you can also build and push the images manually:
+
+Airflow 2+:
+
+```shell script
+export DOCKER_REPO=docker.io/apache/airflow
+for python_version in "3.6" "3.7" "3.8"
+do
+  (
+    export DOCKER_TAG=${VERSION}-python${python_version}
+    ./scripts/ci/images/ci_build_dockerhub.sh
+  )
+done
+```
+
+This will wipe Breeze cache and docker-context-files in order to make sure the build is "clean".
+
+
+Airflow 1.10:
+
+```shell script
+for python_version in "2.7" "3.5" "3.6" "3.7" "3.8"
+do
+  ./breeze build-image --production-image --python ${python_version} \
+    --image-tag apache/airflow:${VERSION}-python${python_version} --build-cache-local
+  docker push apache/airflow:${VERSION}-python${python_version}
+done
+docker tag apache/airflow:${VERSION}-python3.6 apache/airflow:${VERSION}
+docker push apache/airflow:${VERSION}
+```
+
 ## Publish documentation
 
 Documentation is an essential part of the product and should be made available to users.

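After the automated (or manual) builds described above complete, a quick hedged smoke test of the published image could look like this (the tag shown is only an example, and it assumes the airflow binary is on PATH inside the production image):

```shell script
# Pull the image built from the released PyPI packages and confirm the version it reports.
docker pull apache/airflow:1.10.14
docker run --rm --entrypoint airflow apache/airflow:1.10.14 version
```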