Paddy0523 changed the title from "[Bug] GenericCatalog uses localFileIO when hadoop configuration is loaded" to "[Bug] HiveCatalog uses LocalFileIO when hadoop configuration is loaded" on Sep 15, 2023.
I am facing a quite similar problem. Although I have set hive.metastore.warehouse.dir to s3://<mybucket>/, Flink still tries to write the databases and tables to the local filesystem. Have you overcome this issue yet, @Paddy0523?
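In case it helps, here is a workaround sketch, not a verified fix: instead of relying on hive.metastore.warehouse.dir, pass the warehouse URI, scheme included, directly in the catalog DDL. The 'type' = 'paimon' / 'metastore' = 'hive' / 'warehouse' option names below are taken from the Paimon Hive catalog documentation and should be checked against your Paimon version.

CREATE CATALOG my_catalog WITH (
    'type' = 'paimon',
    'metastore' = 'hive',
    -- Full URI with an explicit scheme, so the location cannot
    -- silently resolve to the local filesystem.
    'warehouse' = 's3://<mybucket>/warehouse',
    'hive-conf-dir' = '/opt/Hive/hive_pkg/conf/'
);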
Search before asking
Paimon version
0.6-SNAPSHOT
Compute Engine
Flink 1.16
Minimal reproduce step
CREATE CATALOG my_catalog WITH (
    'type' = 'paimon-generic',
    'hive-conf-dir' = '/opt/Hive/hive_pkg/conf/',
    'hadoop-conf-dir' = '/opt/Hadoop/hadoop_pkg/etc/hadoop/'
);

CREATE DATABASE my_catalog.paimon_test;

CREATE TABLE my_catalog.paimon_test.word_count (
    word STRING PRIMARY KEY NOT ENFORCED,
    cnt BIGINT
) WITH (
    'connector' = 'paimon'
);
What doesn't meet your expectations?
I expected the Paimon table files to be created on HDFS, but they were actually created under a local path.
Anything else?
It seems that, because of my configuration, the warehouse path does not start with "hdfs://", so it is treated as a local path. A hedged workaround sketch follows below.
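If the missing scheme is indeed the cause, one sketch of a workaround is to give the catalog a warehouse URI with an explicit hdfs:// scheme. Whether the generic catalog honors a 'warehouse' option here is an assumption, not verified against 0.6-SNAPSHOT, and <namenode>:<port> is a placeholder for the cluster's fs.defaultFS value.

CREATE CATALOG my_catalog WITH (
    'type' = 'paimon-generic',
    -- Hypothetical host/port; substitute your cluster's fs.defaultFS.
    'warehouse' = 'hdfs://<namenode>:<port>/user/hive/warehouse',
    'hive-conf-dir' = '/opt/Hive/hive_pkg/conf/',
    'hadoop-conf-dir' = '/opt/Hadoop/hadoop_pkg/etc/hadoop/'
);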
Are you willing to submit a PR?