
Commit b0d68f6 (parent 4311e34)

🎨 🐛 Fix broken annotations in python 3.9 due to #633 and upgrade all type annotations to the python 3.9 standard

25 files changed (+135, −134 lines)
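The diff itself is mechanical, but the underlying breakage is worth spelling out: PEP 585 builtin generics such as ``dict[str, Any]`` already work on python 3.9, while the PEP 604 union syntax ``dict[str, Any] | None`` introduced in #633 only works from python 3.10, so those signatures fail at import time on 3.9. A minimal sketch of the failure mode and of the ``Optional`` spelling this commit switches to (the function ``f`` and its ``metadata`` parameter are illustrative, not taken from the codebase):

```python
import sys
from typing import Any, Optional, Union

# PEP 585 (python >= 3.9): builtin types are subscriptable.
ok_on_39 = dict[str, Any]

# PEP 604 (python >= 3.10): the `X | None` operator on types.
# On python 3.9 a signature like
#     def f(metadata: dict[str, Any] | None = None): ...
# raises TypeError at import time, because annotations are
# evaluated eagerly when the function is defined.
if sys.version_info >= (3, 10):
    also_ok_on_310 = dict[str, Any] | None

# The 3.9-compatible spelling used throughout this commit:
def f(metadata: Optional[dict[str, Any]] = None) -> dict[str, Any]:
    return metadata or {}

# Optional[X] is just sugar for Union[X, None], so both spellings
# describe exactly the same type.
assert Optional[dict[str, Any]] == Union[dict[str, Any], None]
```

Another 3.9-safe route would be ``from __future__ import annotations``, which defers annotation evaluation, but it does not help code that inspects annotations at runtime, so the ``Optional`` spelling is the safer blanket fix here.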

CHANGELOG.md (+4)

@@ -7,6 +7,10 @@
 - :sparkles: Add the ``tracking.disable_tracking.disable_autologging`` configuration option in ``mlflow.yml`` to disable autologging by default. This simplifies the workflow for Databricks users who have autologging activated by default, which conflicts with ``kedro-mlflow`` ([#610](https://github.com/Galileo-Galilei/kedro-mlflow/issues/610)).
 - :sparkles: Add ``tracking.experiment.create_experiment_kwargs.artifact_location`` and ``tracking.experiment.create_experiment_kwargs.tags`` configuration options in ``mlflow.yml`` to enable advanced configuration of the mlflow experiment created at runtime by ``kedro-mlflow`` ([#557](https://github.com/Galileo-Galilei/kedro-mlflow/issues/557)).
 
+### Fixed
+
+- :bug: Fix type annotations introduced in [#633](https://github.com/Galileo-Galilei/kedro-mlflow/pull/633) which are not compatible with ``python==3.9``.
+
 ## [0.14.3] - 2025-02-17
 
 ### Added

docs/source/03_experiment_tracking/01_experiment_tracking/02_version_parameters.md (+1, −1)

@@ -2,7 +2,7 @@
 
 ## Automatic parameters tracking
 
-Parameters tracking is automatic when the ``MlflowHook`` is added to [the hook list of the ``ProjectContext``](https://kedro-mlflow.readthedocs.io/en/latest/source/02_getting_started/01_installation/02_setup.html). The `mlflow.yml` configuration file has a parameter called ``flatten_dict_params`` which enables you to [log the (key, value) pairs of a ``Dict`` parameter as distinct parameters](https://kedro-mlflow.readthedocs.io/en/latest/source/05_API/01_python_objects/02_Hooks.html).
+Parameters tracking is automatic when the ``MlflowHook`` is added to [the hook list of the ``ProjectContext``](https://kedro-mlflow.readthedocs.io/en/latest/source/02_getting_started/01_installation/02_setup.html). The `mlflow.yml` configuration file has a parameter called ``flatten_dict_params`` which enables you to [log the (key, value) pairs of a ``dict`` parameter as distinct parameters](https://kedro-mlflow.readthedocs.io/en/latest/source/05_API/01_python_objects/02_Hooks.html).
 
 You **do not need any additional configuration** to benefit from parameters versioning.

docs/source/03_experiment_tracking/01_experiment_tracking/05_version_metrics.md (+3, −3)

@@ -172,7 +172,7 @@ my_model_metrics:
 Let's assume that you have a node which doesn't have any inputs and returns a dictionary with metrics to log:
 
 ```python
-def metrics_node() -> Dict[str, Union[float, List[float]]]:
+def metrics_node() -> dict[str, Union[float, list[float]]]:
     return {
         "metric1": {"value": 1.1, "step": 1},
         "metric2": [{"value": 1.1, "step": 1}, {"value": 1.2, "step": 2}],

@@ -181,8 +181,8 @@ def metrics_node() -> Dict[str, Union[float, List[float]]]:
 As you can see above, ``kedro_mlflow.io.metrics.MlflowMetricsHistoryDataset`` can take metrics as:
 
-- ``Dict[str, key]``
-- ``List[Dict[str, key]]``
+- ``dict[str, key]``
+- ``list[dict[str, key]]``
 
 To store metrics we need to define a metrics dataset in the Kedro Catalog:

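Both accepted shapes from the hunk above can be exercised as plain python (a hedged illustration; the metric names and values are hypothetical):

```python
from typing import Union

def metrics_node() -> dict[str, Union[dict, list[dict]]]:
    # One value at a single step, or a full history across several steps.
    return {
        "metric1": {"value": 1.1, "step": 1},
        "metric2": [{"value": 1.1, "step": 1}, {"value": 1.2, "step": 2}],
    }

metrics = metrics_node()
assert metrics["metric1"]["step"] == 1
assert [m["value"] for m in metrics["metric2"]] == [1.1, 1.2]
```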
docs/source/04_pipeline_as_model/01_pipeline_as_custom_model/02_scikit_learn_like_pipeline.md (+1, −1)

@@ -23,7 +23,7 @@ You can configure your project as follows:
 from kedro_mlflow_tutorial.pipelines.ml_app.pipeline import create_ml_pipeline
 
 
-def register_pipelines(self) -> Dict[str, Pipeline]:
+def register_pipelines(self) -> dict[str, Pipeline]:
     ml_pipeline = create_ml_pipeline()
     training_pipeline_ml = pipeline_ml_factory(
         training=ml_pipeline.only_nodes_with_tags(

docs/source/04_pipeline_as_model/02_framework_ml/03_framework_solutions.md (+1, −1)

@@ -24,7 +24,7 @@ from kedro_mlflow_tutorial.pipelines.ml_app.pipeline import create_ml_pipeline
 
 class ProjectHooks:
     @hook_impl
-    def register_pipelines(self) -> Dict[str, Pipeline]:
+    def register_pipelines(self) -> dict[str, Pipeline]:
         ml_pipeline = create_ml_pipeline()
 
         # convert your two pipelines to a PipelineML object

docs/source/05_API/01_python_objects/01_Datasets.md (+5, −5)

@@ -62,8 +62,8 @@ The ``MlflowModelTrackingDataset`` accepts the following arguments:
 - run_id (Optional[str], optional): MLflow run ID to use to load the model from or save the model to. It plays the same role as "filepath" for standard mlflow datasets. Defaults to None.
 - artifact_path (str, optional): the run relative path to the model.
 - pyfunc_workflow (str, optional): Either `python_model` or `loader_module`. See [mlflow workflows](https://www.mlflow.org/docs/latest/python_api/mlflow.pyfunc.html#workflows).
-- load_args (Dict[str, Any], optional): Arguments to `load_model` function from specified `flavor`. Defaults to None.
-- save_args (Dict[str, Any], optional): Arguments to `log_model` function from specified `flavor`. Defaults to None.
+- load_args (dict[str, Any], optional): Arguments to `load_model` function from specified `flavor`. Defaults to None.
+- save_args (dict[str, Any], optional): Arguments to `log_model` function from specified `flavor`. Defaults to None.
 
 You can either only specify the flavor:

@@ -122,8 +122,8 @@ The ``MlflowModelLocalFileSystemDataset`` accepts the following arguments:
 - flavor (str): Built-in or custom MLflow model flavor module. Must be Python-importable.
 - filepath (str): Path to store the dataset locally.
 - pyfunc_workflow (str, optional): Either `python_model` or `loader_module`. See [mlflow workflows](https://www.mlflow.org/docs/latest/python_api/mlflow.pyfunc.html#workflows).
-- load_args (Dict[str, Any], optional): Arguments to `load_model` function from specified `flavor`. Defaults to None.
-- save_args (Dict[str, Any], optional): Arguments to `save_model` function from specified `flavor`. Defaults to None.
+- load_args (dict[str, Any], optional): Arguments to `load_model` function from specified `flavor`. Defaults to None.
+- save_args (dict[str, Any], optional): Arguments to `save_model` function from specified `flavor`. Defaults to None.
 - version (Version, optional): Kedro version to use. Defaults to None.
 
 The use is very similar to ``MlflowModelTrackingDataset``, but you have to specify a local ``filepath`` instead of a `run_id`:

@@ -168,7 +168,7 @@ The ``MlflowModelRegistryDataset`` accepts the following arguments:
 - ``alias`` (str): A valid alias, which is used instead of stage to filter models since mlflow 2.9.0. Will raise an error if both ``stage_or_version`` and ``alias`` are provided.
 - ``flavor`` (str): Built-in or custom MLflow model flavor module. Must be Python-importable.
 - ``pyfunc_workflow`` (str, optional): Either `python_model` or `loader_module`. See [mlflow workflows](https://www.mlflow.org/docs/latest/python_api/mlflow.pyfunc.html#workflows).
-- ``load_args`` (Dict[str, Any], optional): Arguments to `load_model` function from specified `flavor`. Defaults to None.
+- ``load_args`` (dict[str, Any], optional): Arguments to `load_model` function from specified `flavor`. Defaults to None.
 
 We assume you have registered an mlflow model first, either [with the ``MlflowClient``](https://mlflow.org/docs/latest/model-registry.html#adding-an-mlflow-model-to-the-model-registry) or [within the mlflow ui](https://mlflow.org/docs/latest/model-registry.html#ui-workflow), e.g.:

docs/source/05_API/01_python_objects/03_Pipelines.md (+1, −1)

@@ -14,7 +14,7 @@ Example within kedro template:
 from PYTHON_PACKAGE.pipelines import data_science as ds
 
 
-def create_pipelines(**kwargs) -> Dict[str, Pipeline]:
+def create_pipelines(**kwargs) -> dict[str, Pipeline]:
     data_science_pipeline = ds.create_pipeline()
     training_pipeline = pipeline_ml_factory(
         training=data_science_pipeline.only_nodes_with_tags(

kedro_mlflow/framework/cli/cli.py (+2, −2)

@@ -4,7 +4,7 @@
 from pathlib import Path
 from platform import python_version
 from tempfile import TemporaryDirectory
-from typing import Dict, Optional, Union
+from typing import Optional, Union
 
 import click
 import mlflow

@@ -295,7 +295,7 @@ def modelify(
     flag_infer_input_example: Optional[bool],
     run_id: Optional[str],
     run_name: Optional[str],
-    copy_mode: Optional[Union[str, Dict[str, str]]],
+    copy_mode: Optional[Union[str, dict[str, str]]],
     artifact_path: str,
     code_path: str,
     conda_env: Optional[str],

kedro_mlflow/framework/hooks/mlflow_hook.py (+30, −30)

@@ -3,7 +3,7 @@
 from logging import Logger, getLogger
 from pathlib import Path
 from tempfile import TemporaryDirectory
-from typing import Any, Dict, Union
+from typing import Any, Union
 
 import mlflow
 from kedro.config import MissingConfigException

@@ -145,9 +145,9 @@ def after_context_created(
     def after_catalog_created(
         self,
         catalog: DataCatalog,
-        conf_catalog: Dict[str, Any],
-        conf_creds: Dict[str, Any],
-        feed_dict: Dict[str, Any],
+        conf_catalog: dict[str, Any],
+        conf_creds: dict[str, Any],
+        feed_dict: dict[str, Any],
         save_version: str,
         load_versions: str,
     ):

@@ -197,7 +197,7 @@ def after_catalog_created(
 
     @hook_impl
     def before_pipeline_run(
-        self, run_params: Dict[str, Any], pipeline: Pipeline, catalog: DataCatalog
+        self, run_params: dict[str, Any], pipeline: Pipeline, catalog: DataCatalog
     ) -> None:
         """Hook to be invoked before a pipeline runs.
         Args:

@@ -208,14 +208,14 @@ def before_pipeline_run(
                 "project_path": str,
                 "env": str,
                 "kedro_version": str,
-                "tags": Optional[List[str]],
-                "from_nodes": Optional[List[str]],
-                "to_nodes": Optional[List[str]],
-                "node_names": Optional[List[str]],
-                "from_inputs": Optional[List[str]],
-                "load_versions": Optional[List[str]],
+                "tags": Optional[list[str]],
+                "from_nodes": Optional[list[str]],
+                "to_nodes": Optional[list[str]],
+                "node_names": Optional[list[str]],
+                "from_inputs": Optional[list[str]],
+                "load_versions": Optional[list[str]],
                 "pipeline_name": str,
-                "extra_params": Optional[Dict[str, Any]],
+                "extra_params": Optional[dict[str, Any]],
             }
         pipeline: The ``Pipeline`` that will be run.
         catalog: The ``DataCatalog`` to be used during the run.

@@ -282,7 +282,7 @@ def before_pipeline_run(
 
     @hook_impl
     def before_node_run(
-        self, node: Node, catalog: DataCatalog, inputs: Dict[str, Any], is_async: bool
+        self, node: Node, catalog: DataCatalog, inputs: dict[str, Any], is_async: bool
     ) -> None:
         """Hook to be invoked before a node runs.
         This hook logs all the parameters of the nodes in mlflow.

@@ -342,7 +342,7 @@ def before_node_run(
         for k, v in params_inputs.items():
             self._log_param(k, v)
 
-    def _log_param(self, name: str, value: Union[Dict, int, bool, str]) -> None:
+    def _log_param(self, name: str, value: Union[dict, int, bool, str]) -> None:
         str_value = str(value)
         str_value_length = len(str_value)
         if str_value_length <= MAX_PARAM_VAL_LENGTH:

@@ -368,7 +368,7 @@ def _log_param(self, name: str, value: Union[Dict, int, bool, str]) -> None:
     @hook_impl
     def after_pipeline_run(
         self,
-        run_params: Dict[str, Any],
+        run_params: dict[str, Any],
         pipeline: Pipeline,
         catalog: DataCatalog,
     ) -> None:

@@ -381,14 +381,14 @@ def after_pipeline_run(
                 "project_path": str,
                 "env": str,
                 "kedro_version": str,
-                "tags": Optional[List[str]],
-                "from_nodes": Optional[List[str]],
-                "to_nodes": Optional[List[str]],
-                "node_names": Optional[List[str]],
-                "from_inputs": Optional[List[str]],
-                "load_versions": Optional[List[str]],
+                "tags": Optional[list[str]],
+                "from_nodes": Optional[list[str]],
+                "to_nodes": Optional[list[str]],
+                "node_names": Optional[list[str]],
+                "from_inputs": Optional[list[str]],
+                "load_versions": Optional[list[str]],
                 "pipeline_name": str,
-                "extra_params": Optional[Dict[str, Any]],
+                "extra_params": Optional[dict[str, Any]],
             }
         pipeline: The ``Pipeline`` that was run.
         catalog: The ``DataCatalog`` used during the run.

@@ -451,7 +451,7 @@ def after_pipeline_run(
     def on_pipeline_error(
         self,
         error: Exception,
-        run_params: Dict[str, Any],
+        run_params: dict[str, Any],
         pipeline: Pipeline,
         catalog: DataCatalog,
     ):

@@ -467,14 +467,14 @@ def on_pipeline_error(
                 "project_path": str,
                 "env": str,
                 "kedro_version": str,
-                "tags": Optional[List[str]],
-                "from_nodes": Optional[List[str]],
-                "to_nodes": Optional[List[str]],
-                "node_names": Optional[List[str]],
-                "from_inputs": Optional[List[str]],
-                "load_versions": Optional[List[str]],
+                "tags": Optional[list[str]],
+                "from_nodes": Optional[list[str]],
+                "to_nodes": Optional[list[str]],
+                "node_names": Optional[list[str]],
+                "from_inputs": Optional[list[str]],
+                "load_versions": Optional[list[str]],
                 "pipeline_name": str,
-                "extra_params": Optional[Dict[str, Any]]
+                "extra_params": Optional[dict[str, Any]]
             }
         pipeline: (Not used) The ``Pipeline`` that was run.
         catalog: (Not used) The ``DataCatalog`` used during the run.
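The ``_log_param`` hunk above only shows the length check against ``MAX_PARAM_VAL_LENGTH``, mlflow's cap on parameter value length. A hedged sketch of that guard (the 500-character limit and the truncation fallback are assumptions for illustration; the constant's value differs across mlflow versions, and the plugin's actual fallback is not shown in this diff):

```python
MAX_PARAM_VAL_LENGTH = 500  # assumed cap; varies across mlflow versions

def param_value_to_log(value) -> str:
    """Stringify a param the way a _log_param-style helper might,
    truncating oversized values so the mlflow backend does not reject
    them (illustrative fallback, not the plugin's exact behaviour)."""
    str_value = str(value)
    if len(str_value) <= MAX_PARAM_VAL_LENGTH:
        return str_value
    return str_value[:MAX_PARAM_VAL_LENGTH]

assert param_value_to_log({"lr": 0.01}) == "{'lr': 0.01}"
assert len(param_value_to_log("x" * 10_000)) == MAX_PARAM_VAL_LENGTH
```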

kedro_mlflow/framework/hooks/utils.py (+1, −3)

@@ -1,5 +1,3 @@
-from typing import Dict
-
 from kedro_mlflow.config.kedro_mlflow_config import KedroMlflowConfig
 
 
@@ -43,7 +41,7 @@ def _generate_kedro_command(
     return kedro_cmd
 
 
-def _flatten_dict(d: Dict, recursive: bool = True, sep: str = ".") -> Dict:
+def _flatten_dict(d: dict, recursive: bool = True, sep: str = ".") -> dict:
     def expand(key, value):
         if isinstance(value, dict):
             new_value = (
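The hunk above shows only the head of ``_flatten_dict``, the helper that turns nested parameter dicts into dotted keys (the behaviour behind ``flatten_dict_params`` in ``mlflow.yml``). A self-contained sketch of what such a helper does (an illustrative reimplementation, not the plugin's exact code; the ``recursive=False`` handling here is a guess):

```python
def flatten_dict(d: dict, recursive: bool = True, sep: str = ".") -> dict:
    """Flatten nested dicts into one level, joining keys with `sep`."""
    flat: dict = {}
    for key, value in d.items():
        if isinstance(value, dict) and recursive:
            # Flatten the sub-dict first, then prefix its keys.
            for sub_key, sub_value in flatten_dict(value, recursive, sep).items():
                flat[f"{key}{sep}{sub_key}"] = sub_value
        else:
            flat[key] = value
    return flat

assert flatten_dict({"a": {"b": 1, "c": {"d": 2}}, "e": 3}) == {"a.b": 1, "a.c.d": 2, "e": 3}
```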

kedro_mlflow/io/artifacts/mlflow_artifact_dataset.py (+5, −5)

@@ -1,6 +1,6 @@
 import shutil
 from pathlib import Path
-from typing import Any, Dict, Union
+from typing import Any, Optional, Union
 
 import mlflow
 from kedro.io import AbstractVersionedDataset

@@ -15,11 +15,11 @@ class MlflowArtifactDataset(AbstractVersionedDataset):
 
     def __new__(
         cls,
-        dataset: Union[str, Dict],
+        dataset: Union[str, dict],
         run_id: str = None,
         artifact_path: str = None,
-        credentials: Dict[str, Any] = None,
-        metadata: Dict[str, Any] | None = None,
+        credentials: dict[str, Any] = None,
+        metadata: Optional[dict[str, Any]] = None,
     ):
         dataset_obj, dataset_args = parse_dataset_definition(config=dataset)

@@ -169,7 +169,7 @@ def _save(self, data: Any) -> None:  # pragma: no cover
         """
         pass
 
-    def _describe(self) -> Dict[str, Any]:  # pragma: no cover
+    def _describe(self) -> dict[str, Any]:  # pragma: no cover
         """
         MlflowArtifactDataset is a factory for DataSet
         and consequently does not implement abstract methods

kedro_mlflow/io/metrics/mlflow_abstract_metric_dataset.py (+6, −6)

@@ -1,4 +1,4 @@
-from typing import Any, Dict, Union
+from typing import Any, Optional, Union
 
 import mlflow
 from kedro.io import AbstractDataset

@@ -10,9 +10,9 @@ def __init__(
         self,
         key: str = None,
         run_id: str = None,
-        load_args: Dict[str, Any] = None,
-        save_args: Dict[str, Any] = None,
-        metadata: Dict[str, Any] | None = None,
+        load_args: dict[str, Any] = None,
+        save_args: dict[str, Any] = None,
+        metadata: Optional[dict[str, Any]] = None,
     ):
         """Initialise MlflowMetricsHistoryDataset.

@@ -81,11 +81,11 @@ def _exists(self) -> bool:
         flag_exist = self.key in run.data.metrics.keys() if run else False
         return flag_exist
 
-    def _describe(self) -> Dict[str, Any]:
+    def _describe(self) -> dict[str, Any]:
         """Describe MLflow metrics dataset.
 
         Returns:
-            Dict[str, Any]: Dictionary with MLflow metrics dataset description.
+            dict[str, Any]: Dictionary with MLflow metrics dataset description.
         """
         return {
             "key": self.key,

kedro_mlflow/io/metrics/mlflow_metric_dataset.py (+4, −4)

@@ -1,5 +1,5 @@
 from copy import deepcopy
-from typing import Any, Dict
+from typing import Any, Optional
 
 from mlflow.tracking import MlflowClient

@@ -16,9 +16,9 @@ def __init__(
         self,
         key: str = None,
         run_id: str = None,
-        load_args: Dict[str, Any] = None,
-        save_args: Dict[str, Any] = None,
-        metadata: Dict[str, Any] | None = None,
+        load_args: dict[str, Any] = None,
+        save_args: dict[str, Any] = None,
+        metadata: Optional[dict[str, Any]] = None,
     ):
         """Initialise MlflowMetricDataset.
         Args:

kedro_mlflow/io/metrics/mlflow_metric_history_dataset.py (+5, −5)

@@ -1,4 +1,4 @@
-from typing import Any, Dict, List, Union
+from typing import Any, Optional, Union
 
 from mlflow.tracking import MlflowClient

@@ -12,9 +12,9 @@ def __init__(
         self,
         key: str = None,
         run_id: str = None,
-        load_args: Dict[str, Any] = None,
-        save_args: Dict[str, Any] = None,
-        metadata: Dict[str, Any] | None = None,
+        load_args: dict[str, Any] = None,
+        save_args: dict[str, Any] = None,
+        metadata: Optional[dict[str, Any]] = None,
     ):
         """Initialise MlflowMetricDataset.
         Args:

@@ -51,7 +51,7 @@ def _load(self):
 
     def _save(
         self,
-        data: Union[List[int], Dict[int, float], List[Dict[str, Union[float, str]]]],
+        data: Union[list[int], dict[int, float], list[dict[str, Union[float, str]]]],
     ):
         if self._logging_activated:
             self._validate_run_id()
