
[Bug] Writing table SQL issue. Field names must be unique. #2121

Closed
1 of 2 tasks
phoesh opened this issue Oct 12, 2023 · 1 comment
Labels
bug Something isn't working

Comments

phoesh commented Oct 12, 2023

Search before asking

  • I searched in the issues and found nothing similar.

Paimon version

0.5.0-incubating

Compute Engine

Flink 1.17.1

Minimal reproduce step

CREATE TABLE IF NOT EXISTS paimon_t1min
(
    rowtime                    TIMESTAMP_LTZ(3),
    pdate                      STRING,
    symbol                     STRING,
    event_id                   STRING,
    price_change_t1min         DECIMAL(38, 17),
    price_change_percent_t1min DECIMAL(38, 6),
    PRIMARY KEY (symbol, event_id) NOT ENFORCED
) PARTITIONED BY (pdate)
WITH (
    'bucket' = '20',
    'bucket-key' = 'event_id',
    'format' = 'csv',
    'changelog-producer' = 'full-compaction',
    'changelog-producer.compaction-interval' = '5 s',
    'merge-engine' = 'partial-update',
    'partial-update.ignore-delete' = 'true',
    'partition.expiration-time' = '7 d',
    'partition.expiration-check-interval' = '1 d',
    'partition.timestamp-formatter' = 'yyyy-MM-dd',
    'partition.timestamp-pattern' = '$pdate',
    'snapshot.num-retained.max' = '20'
);

INSERT INTO paimon_t1min
SELECT rowtime, pdate, symbol, event_id, price_change_t1min, price_change_percent_t1min
FROM t1min;
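One way to verify which primary and partition keys Paimon actually registered for the table is to inspect the schema from the Flink SQL client. A sketch using standard Flink SQL statements, assuming the table above has already been created:

```sql
-- Show the table definition as the catalog registered it,
-- including the primary key and partitioning:
SHOW CREATE TABLE paimon_t1min;

-- Column-level view of the same schema:
DESCRIBE paimon_t1min;
```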

What doesn't meet your expectations?

I developed my Flink streaming app with Paimon and tested it on localhost.
Paimon parses the partition keys as ["symbol", "pdate"] and the primary keys as ["symbol", "event_id", "rowtime"], even though my DDL declares the partition key as (pdate) and the primary key as (symbol, event_id).
So I hit the error below.
Is something wrong with my SQL queries?
java.lang.IllegalArgumentException: Field names must be unique. Found duplicates: [symbol]
at org.apache.paimon.types.RowType.validateFields(RowType.java:159) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.paimon.types.RowType.&lt;init&gt;(RowType.java:65) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.paimon.types.RowType.&lt;init&gt;(RowType.java:69) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.paimon.schema.TableSchema.projectedLogicalRowType(TableSchema.java:253) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.paimon.flink.sink.index.GlobalDynamicBucketSink.build(GlobalDynamicBucketSink.java:77) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.paimon.flink.sink.FlinkSinkBuilder.buildDynamicBucketSink(FlinkSinkBuilder.java:102) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.paimon.flink.sink.FlinkSinkBuilder.build(FlinkSinkBuilder.java:90) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.paimon.flink.sink.FlinkTableSinkBase.lambda$getSinkRuntimeProvider$0(FlinkTableSinkBase.java:140) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.paimon.flink.PaimonDataStreamSinkProvider.consumeDataStream(PaimonDataStreamSinkProvider.java:41) ~[paimon-flink-1.17-0.5.0-incubating.jar:0.5.0-incubating]
at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecSink.applySinkProvider(CommonExecSink.java:507) ~[?:?]
at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecSink.createSinkTransformation(CommonExecSink.java:218) ~[?:?]
at org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink.translateToPlanInternal(StreamExecSink.java:176) ~[?:?]
at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:161) ~[?:?]
at org.apache.flink.table.planner.delegation.StreamPlanner.$anonfun$translateToPlan$1(StreamPlanner.scala:85) ~[?:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) ~[scala-library-2.12.7.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:937) ~[scala-library-2.12.7.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:937) ~[scala-library-2.12.7.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) ~[scala-library-2.12.7.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:70) ~[scala-library-2.12.7.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:69) ~[scala-library-2.12.7.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[scala-library-2.12.7.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:233) ~[scala-library-2.12.7.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:226) ~[scala-library-2.12.7.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:104) ~[scala-library-2.12.7.jar:?]
at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:84) ~[?:?]
at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:197) ~[?:?]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1803) ~[flink-table-api-java-1.17.1.jar:1.17.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:881) ~[flink-table-api-java-1.17.1.jar:1.17.1]
at org.apache.flink.table.api.internal.StatementSetImpl.execute(StatementSetImpl.java:109) ~[flink-table-api-java-1.17.1.jar:1.17.1]
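The trace goes through GlobalDynamicBucketSink, which Paimon selects for cross-partition update, i.e. when the declared primary key does not contain the partition key. A possible workaround (a sketch, not a confirmed fix; the table name is hypothetical, the column names follow the DDL above) is to include the partition field in the primary key so the sink avoids the cross-partition path:

```sql
-- Sketch: include the partition field (pdate) in the primary key so the
-- primary key contains the partition key; Paimon then does not need the
-- cross-partition (GlobalDynamicBucketSink) path that raised the error.
CREATE TABLE IF NOT EXISTS paimon_t1min_pk_with_partition
(
    rowtime                    TIMESTAMP_LTZ(3),
    pdate                      STRING,
    symbol                     STRING,
    event_id                   STRING,
    price_change_t1min         DECIMAL(38, 17),
    price_change_percent_t1min DECIMAL(38, 6),
    PRIMARY KEY (pdate, symbol, event_id) NOT ENFORCED
) PARTITIONED BY (pdate)
WITH (
    'bucket' = '20',
    'bucket-key' = 'event_id'
);
```

Whether this semantics is acceptable depends on the use case: with pdate in the primary key, the same (symbol, event_id) pair in different partitions is treated as different rows.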

Anything else?

No response

Are you willing to submit a PR?

  • I'm willing to submit a PR!
phoesh added the bug label on Oct 12, 2023
@JingsongLi (Contributor) commented

Try the latest 0.8.0 version.
