UPDATING.md (+17-5)
@@ -22,6 +22,18 @@ under the License.
This file documents any backwards-incompatible changes in Airflow and
assists users migrating to a new version.

## CP

### Ability to patch Pool.DEFAULT_POOL_NAME in BaseOperator

It was not possible to patch the pool in BaseOperator, because the signature sets the default value of pool
to Pool.DEFAULT_POOL_NAME.

While using SubDagOperator in unit tests (without initializing the SQLite DB), it threw the
following error:

```
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: slot_pool.
```

This is fixed in https://github.com/apache/airflow/pull/8587
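The Python behaviour behind this can be illustrated with a minimal sketch, using a stand-in `Pool` class rather than Airflow's own: a default argument is evaluated once at definition time, so patching `Pool.DEFAULT_POOL_NAME` later has no effect on a default baked into the signature. The `run_task` helpers below are hypothetical, and the "fixed" variant only mirrors the spirit of the PR.

```python
from unittest import mock

class Pool:
    """Stand-in for airflow.models.Pool (illustration only)."""
    DEFAULT_POOL_NAME = "default_pool"

# Old style: the default is bound once, when the function is defined.
def run_task(pool=Pool.DEFAULT_POOL_NAME):
    return pool

# Lazy style: resolve the default inside the body, so tests can patch it.
def run_task_fixed(pool=None):
    return pool if pool is not None else Pool.DEFAULT_POOL_NAME

with mock.patch.object(Pool, "DEFAULT_POOL_NAME", "test_pool"):
    print(run_task())        # still "default_pool": patch not picked up
    print(run_task_fixed())  # "test_pool": patch takes effect
```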

## Airflow 1.10.4

### Python 2 support is going away
@@ -36,12 +48,12 @@ If you have a specific task that still requires Python 2 then you can use the Py

### Changes to GoogleCloudStorageHook

* the discovery-based api (`googleapiclient.discovery`) used in `GoogleCloudStorageHook` is now replaced by the recommended client-based api (`google-cloud-storage`). To understand the difference between the two libraries, read https://cloud.google.com/apis/docs/client-libraries-explained. PR: [#5054](https://github.com/apache/airflow/pull/5054)
* as a part of this replacement, the `multipart` & `num_retries` parameters for the `GoogleCloudStorageHook.upload` method have been deprecated.

  The client library uses multipart upload automatically if the object/blob size is more than 8 MB - [source code](https://github.com/googleapis/google-cloud-python/blob/11c543ce7dd1d804688163bc7895cf592feb445f/storage/google/cloud/storage/blob.py#L989-L997). The client also handles retries automatically.
* the `generation` parameter is deprecated in `GoogleCloudStorageHook.delete` and `GoogleCloudStorageHook.insert_object_acl`.

Updating to `google-cloud-storage >= 1.16` changes the signature of the upstream `client.get_bucket()` method from `get_bucket(bucket_name: str)` to `get_bucket(bucket_or_name: Union[str, Bucket])`. This method is not directly exposed by the airflow hook, but any code accessing the connection directly (`GoogleCloudStorageHook().get_conn().get_bucket(...)` or similar) will need to be updated.
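The shape of that signature change can be sketched with a stand-in `Bucket` class (not the real `google.cloud.storage` client): the first parameter was renamed from `bucket_name` to `bucket_or_name`, so callers that passed it as a keyword break, while positional calls keep working.

```python
from typing import Union

class Bucket:
    """Stand-in for google.cloud.storage.Bucket (illustration only)."""
    def __init__(self, name: str):
        self.name = name

# Signature shape in google-cloud-storage >= 1.16: accepts a bucket
# name or an existing Bucket instance.
def get_bucket(bucket_or_name: Union[str, Bucket]) -> Bucket:
    if isinstance(bucket_or_name, Bucket):
        return bucket_or_name
    return Bucket(bucket_or_name)

# Positional calls keep working:
print(get_bucket("my-bucket").name)

# Code written against the old keyword breaks:
try:
    get_bucket(bucket_name="my-bucket")
except TypeError as exc:
    print(exc)  # unexpected keyword argument 'bucket_name'
```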
@@ -305,7 +317,7 @@ then you need to change it like this
    @property
    def is_active(self):
        return self.active

### Support autodetected schemas to GoogleCloudStorageToBigQueryOperator

GoogleCloudStorageToBigQueryOperator now supports schema auto-detection when you load data into BigQuery. Unfortunately, some changes may be required.
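A sketch of what auto-detection implies at the BigQuery load-job level (the keys follow the BigQuery jobs API; the project, bucket, and table names are placeholders): with `autodetect` enabled, no explicit schema is supplied, so callers that previously had to build a schema-field list can drop it.

```python
# Placeholder names throughout; the keys mirror the BigQuery load-job
# configuration that schema auto-detection implies.
load_config = {
    "load": {
        "sourceUris": ["gs://my-bucket/data/*.csv"],
        "destinationTable": {
            "projectId": "my-project",
            "datasetId": "my_dataset",
            "tableId": "my_table",
        },
        "sourceFormat": "CSV",
        # With autodetect enabled, BigQuery infers column names and
        # types from the source files; no "schema" key is supplied.
        "autodetect": True,
    }
}

assert "schema" not in load_config["load"]
```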