We have Iceberg tables, created with both the Hadoop catalog and the JDBC catalog, stored on HDFS. When I query a table by pointing iceberg_scan directly at its metadata.json file, DuckDB throws a "no such file or directory" error:

IO Error: Cannot open file "hdfs:///warehouse/user_journey_analytics/engagement_analytics/metadata/00065-83cdaa3e-0d01-40e2-acd4-431f560f0aab.metadata.json": No such file or directory

This is my query for the JDBC-catalog table:

SELECT count(*) FROM iceberg_scan('hdfs:///warehouse/user_journey_analytics/engagement_analytics/metadata/00065-83cdaa3e-0d01-40e2-acd4-431f560f0aab.metadata.json');

and for the Hadoop-catalog table:

SELECT count(*) FROM iceberg_scan('hdfs:///warehouse/broadcast/aggregated_analytics/metadata/v20.metadata.json');

I installed DuckDB on the Hadoop box to avoid network calls and query the tables directly; that is why the paths start with hdfs:///.
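The error message reports "No such file or directory" even though the file exists in HDFS, which suggests DuckDB's iceberg extension is not resolving the hdfs:// scheme at all rather than failing to find the file. A minimal sketch of one possible workaround, assuming the table directory can first be copied out of HDFS to local disk (the /tmp/aggregated_analytics path below is hypothetical, chosen only for illustration):

```sql
-- Assumes the table directory was copied to local disk beforehand, e.g.:
--   hdfs dfs -get /warehouse/broadcast/aggregated_analytics /tmp/aggregated_analytics
-- (/tmp/aggregated_analytics is a hypothetical local path)
INSTALL iceberg;
LOAD iceberg;

-- Scan the same metadata file through the local filesystem instead of hdfs://
SELECT count(*)
FROM iceberg_scan('/tmp/aggregated_analytics/metadata/v20.metadata.json');
```

This avoids the hdfs:// scheme entirely, at the cost of staleness: the local copy must be refreshed whenever the table gains a new snapshot.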
UtkarshSharma2612 changed the title from "Not able to iceberg using duckdb extension for hdfs warehouse" to "Not able to query iceberg using duckdb extension for hdfs warehouse" on Jun 18, 2024.