
Attempting to execute an unsuccessful or closed pending query result #610

Open
2 tasks done
tjwebb opened this issue Feb 17, 2025 · 2 comments
Labels
bug Something isn't working

Comments


tjwebb commented Feb 17, 2025

What happens?

I get the error "Attempting to execute an unsuccessful or closed pending query result" when trying to call read_ndjson_objects with no limit.

To Reproduce

I am downloading data from openstreetmap.org and attempting to load it.

The following query fails:

drop table if exists acquire.openaddress_parcel;
create table if not exists acquire.openaddress_parcel as (
    select * from duckdb.query('
        SELECT
            cast(json->''$.properties.hash'' as text) as hash,
            cast(json->''$.properties.pid'' as text) as pid,
            st_astext(st_geomfromgeojson(json->''$.geometry'')) as geom
        FROM read_ndjson_objects(''/mnt/shared_data/openaddress/us/**/*-parcel*.geojson'')
    '));

This produces the following error:

ERROR:  (PGDuckDB/Duckdb_ExecCustomScan) Invalid Input Error: Attempting to execute an unsuccessful or closed pending query result
Error: Invalid Error: basic_string::_M_construct null not valid 

SQL state: XX000

But if I add a LIMIT, even a very large one, the query completes with no error:

drop table if exists acquire.openaddress_parcel;
create table if not exists acquire.openaddress_parcel as (
    select * from duckdb.query('
        SELECT
            cast(json->''$.properties.hash'' as text) as hash,
            cast(json->''$.properties.pid'' as text) as pid,
            st_astext(st_geomfromgeojson(json->''$.geometry'')) as geom
        FROM read_ndjson_objects(''/mnt/shared_data/openaddress/us/**/*-parcel*.geojson'')
        limit 1000000
    '));

OS:

Linux on Docker on Mac

pg_duckdb Version (if built from source use commit hash):

0.3.1

Postgres Version (if built from source use commit hash):

17

Hardware:

Mac

Full Name:

Travis Webb

Affiliation:

myself

What is the latest build you tested with? If possible, we recommend testing with the latest nightly build.

I have tested with a stable release

Did you include all relevant data sets for reproducing the issue?

No - I cannot easily share my data sets due to their large size

Did you include all code required to reproduce the issue?

  • Yes, I have

Did you include all relevant configuration (e.g., CPU architecture, Linux distribution) to reproduce the issue?

  • Yes, I have
JelteF (Collaborator) commented Feb 25, 2025

Could you check if this query works in plain DuckDB?

JelteF added the bug label on Feb 25, 2025
JelteF (Collaborator) commented Feb 25, 2025

Also, are there any other errors in the postgres log or something?
