What happens?
I get the error "Attempting to execute an unsuccessful or closed pending query result" when trying to call read_ndjson_objects with no limit.
To Reproduce
I am downloading data from openstreetmap.org and attempting to load it. The following fails:
drop table if exists acquire.openaddress_parcel;
create table if not exists acquire.openaddress_parcel as (
  select * from duckdb.query('SELECT cast(json->''$.properties.hash'' as text) as hash, cast(json->''$.properties.pid'' as text) as pid, st_astext(st_geomfromgeojson(json->''$.geometry'')) as geom FROM read_ndjson_objects(''/mnt/shared_data/openaddress/us/**/*-parcel*.geojson'')')
);
This produces the following error:
ERROR: (PGDuckDB/Duckdb_ExecCustomScan) Invalid Input Error: Attempting to execute an unsuccessful or closed pending query result
Error: Invalid Error: basic_string::_M_construct null not valid
SQL state: XX000
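For reference, the inner query could also be tried in a standalone DuckDB session to check whether plain DuckDB hits the same error. This is an untested sketch (I have only run the query through pg_duckdb), and it assumes the spatial extension is available:

-- Untested sketch: the same scan run in the duckdb CLI, outside Postgres.
-- Assumes the spatial extension is installed; the JSON reader autoloads.
INSTALL spatial;
LOAD spatial;
SELECT cast(json->'$.properties.hash' as text) as hash,
       cast(json->'$.properties.pid' as text) as pid,
       st_astext(st_geomfromgeojson(json->'$.geometry')) as geom
FROM read_ndjson_objects('/mnt/shared_data/openaddress/us/**/*-parcel*.geojson');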
However, if I add a limit, even a very large one, it completes with no error:
drop table if exists acquire.openaddress_parcel;
create table if not exists acquire.openaddress_parcel as (
  select * from duckdb.query('SELECT cast(json->''$.properties.hash'' as text) as hash, cast(json->''$.properties.pid'' as text) as pid, st_astext(st_geomfromgeojson(json->''$.geometry'')) as geom FROM read_ndjson_objects(''/mnt/shared_data/openaddress/us/**/*-parcel*.geojson'') limit 1000000')
);
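When relying on the limit workaround, a quick sanity check (nothing beyond standard counts and the same duckdb.query call used above) confirms the limit did not silently truncate the load:

-- Compare the loaded row count against a direct count over the same files;
-- if the two numbers match, 'limit 1000000' did not drop any rows.
select count(*) from acquire.openaddress_parcel;
select * from duckdb.query('SELECT count(*) AS n FROM read_ndjson_objects(''/mnt/shared_data/openaddress/us/**/*-parcel*.geojson'')');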
OS:
Linux on Docker on Mac
pg_duckdb Version (if built from source use commit hash):
0.3.1
Postgres Version (if built from source use commit hash):
17
Hardware:
Mac
Full Name:
Travis Webb
Affiliation:
myself
What is the latest build you tested with? If possible, we recommend testing with the latest nightly build.
I have tested with a stable release
Did you include all relevant data sets for reproducing the issue?
No - I cannot easily share my data sets due to their large size
Did you include all code required to reproduce the issue?
Yes, I have
Did you include all relevant configuration (e.g., CPU architecture, Linux distribution) to reproduce the issue?
Yes, I have