RELEASE.md  (+4 -5)
@@ -1,11 +1,11 @@
 # Release 0.17.8

 ## Major features and improvements
+* Documented distribution of Kedro pipelines with Dask.

-* Added option to `SparkDataSet` to specify a `schema` load argument that allows for supplying a user-defined schema as opposed to relying on the schema inference of Spark.
+## Bug fixes and other changes

-## Thanks for supporting contributions
-[Laurens Vijnck](https://github.com/lvijnck)
+## Upcoming deprecations for Kedro 0.18.0

 # Release 0.17.7

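As an aside, the removed `SparkDataSet` bullet above refers to a `schema` load argument for supplying a user-defined schema instead of relying on Spark's inference. To illustrate what an explicit schema buys you, here is a minimal sketch in plain PySpark (not the Kedro API; the file path and columns are made up):

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()

# A user-defined schema: column names and types are fixed up front, so Spark
# skips the extra pass over the data that schema inference requires.
schema = StructType(
    [
        StructField("name", StringType(), nullable=False),
        StructField("price", DoubleType(), nullable=True),
    ]
)

# With an explicit schema, malformed rows fail fast instead of silently
# coercing every column to an inferred (often wrong) type.
df = spark.read.csv("data/01_raw/products.csv", header=True, schema=schema)
df.printSchema()
```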
@@ -24,7 +24,6 @@
 * Added `astro-iris` as alias for `astro-airflow-iris`, so that old tutorials can still be followed.
 * Added details about [Kedro's Technical Steering Committee and governance model](https://kedro.readthedocs.io/en/0.17.7/14_contribution/technical_steering_committee.html).

-
 ## Upcoming deprecations for Kedro 0.18.0
 * `kedro pipeline pull` and `kedro pipeline package` will be deprecated. Please use `kedro micropkg` instead.

@@ -415,7 +414,7 @@ Check your source directory. If you defined a different source directory (`sourc

 ## Major features and improvements

-* Added documentation with a focus on single machine and distributed environment deployment; the series includes Docker, Argo, Prefect, Kubeflow, AWS Batch, AWS Sagemaker and extends our section on Databricks
+* Added documentation with a focus on single machine and distributed environment deployment; the series includes Docker, Argo, Prefect, Kubeflow, AWS Batch, AWS Sagemaker and extends our section on Databricks.
 * Added [kedro-starter-spaceflights](https://github.com/kedro-org/kedro-starter-spaceflights/) alias for generating a project: `kedro new --starter spaceflights`.
docs/source/10_deployment/04_argo.md  (+1 -1)
@@ -1,6 +1,6 @@
 # Deployment with Argo Workflows

-This page explains how to convert your Kedro pipeline to use [Argo Workflows](https://github.com/argoproj/argo-workflows), an opensource container-native workflow engine for orchestrating parallel jobs on [Kubernetes](https://kubernetes.io/).
+This page explains how to convert your Kedro pipeline to use [Argo Workflows](https://github.com/argoproj/argo-workflows), an open-source container-native workflow engine for orchestrating parallel jobs on [Kubernetes](https://kubernetes.io/).
docs/source/10_deployment/05_prefect.md  (+2 -2)
@@ -1,8 +1,8 @@
 # Deployment with Prefect

-This page explains how to run your Kedro pipeline using [Prefect Core](https://www.prefect.io/products/core/), an opensource workflow management system.
+This page explains how to run your Kedro pipeline using [Prefect Core](https://www.prefect.io/products/core/), an open-source workflow management system.

-In scope of this deployment we are interested in [Prefect Server](https://docs.prefect.io/orchestration/server/overview.html#what-is-prefect-server) which is an open-source backend that makes it easy to monitor and execute your Prefect flows and automatically extends the Prefect Core.
+In scope of this deployment, we are interested in [Prefect Server](https://docs.prefect.io/orchestration/server/overview.html#what-is-prefect-server), an open-source backend that makes it easy to monitor and execute your Prefect flows and automatically extends the Prefect Core.

 ```eval_rst
 .. note:: Prefect Server ships out-of-the-box with a fully featured user interface.
docs/source/10_deployment/07_aws_batch.md  (+11 -7)
@@ -118,12 +118,14 @@ Now that all the resources are in place, it's time to submit jobs to Batch progr

 #### Create a custom runner

-Create a new Python package `runner` in your `src` folder, i.e. `kedro_tutorial/src/kedro_tutorial/runner/`. Make sure there is an `__init__.py` file at this location and add another file named `batch_runner.py`, which will contain the implementation of your custom runner, `AWSBatchRunner`. The `AWSBatchRunner` will submit and monitor jobs asynchronously, surfacing any errors that occur on Batch.
+Create a new Python package `runner` in your `src` folder, i.e. `kedro_tutorial/src/kedro_tutorial/runner/`. Make sure there is an `__init__.py` file at this location, and add another file named `batch_runner.py`, which will contain the implementation of your custom runner, `AWSBatchRunner`. The `AWSBatchRunner` will submit and monitor jobs asynchronously, surfacing any errors that occur on Batch.

-Make sure the `__init__.py` file in the `runner` folder includes the following import:
+Make sure the `__init__.py` file in the `runner` folder includes the following import and declaration:

 ```python
-from .batch_runner import AWSBatchRunner  # NOQA
+from .batch_runner import AWSBatchRunner
+
+__all__ = ["AWSBatchRunner"]
 ```

 Copy the contents of the script below into `batch_runner.py`:
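Two notes on this hunk. Replacing the `# NOQA` comment with an `__all__` declaration makes the re-export an explicit part of the package's public interface, rather than a silenced unused-import warning. And the full `batch_runner.py` script lives in the docs page, not in this diff; as a rough sketch of the core mechanic it relies on — one Batch job per Kedro node, with upstream nodes wired in as job dependencies via `boto3` — something like the function below, where the queue name, job definition, and container command are illustrative assumptions:

```python
from typing import List

import boto3

client = boto3.client("batch")


def submit_node_job(
    node_name: str,
    upstream_job_ids: List[str],
    job_queue: str = "spaceflights_queue",  # assumed Batch job queue
    job_definition: str = "kedro_run",  # assumed Batch job definition
) -> str:
    """Submit one AWS Batch job that runs a single Kedro node; return its job ID."""
    response = client.submit_job(
        # Batch job names allow letters, digits, hyphens and underscores only.
        jobName=node_name.replace(".", "-"),
        jobQueue=job_queue,
        jobDefinition=job_definition,
        # Batch starts this job only after every upstream node's job succeeds,
        # which is how the pipeline's topology is preserved on the cluster.
        dependsOn=[{"jobId": job_id} for job_id in upstream_job_ids],
        containerOverrides={"command": ["kedro", "run", "--node", node_name]},
    )
    return response["jobId"]
```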
...

-You'll need to set the Batch-related configuration that the runner will use. Add a `parameters.yml` file inside the `conf/aws_batch/` directory created as part of the prerequistes steps, which will include the following keys:
+You'll need to set the Batch-related configuration that the runner will use. Add a `parameters.yml` file inside the `conf/aws_batch/` directory created as part of the prerequisites, with the following keys: