[Bug] Exception while writing the consumer file causes streaming read task with consumerId to fail #2144
Closed
Labels: bug (Something isn't working)
Search before asking
Paimon version
0.5-SNAPSHOT
Compute Engine
Flink 1.17
Minimal reproduce step
STEP 1: when a checkpoint completes, org.apache.paimon.consumer.ConsumerManager.recordConsumer reports an error:
java.lang.IllegalArgumentException: Self-suppression not permitted
at java.lang.Throwable.addSuppressed(Throwable.java:1043) ~[?:1.8.0_191]
at org.apache.paimon.consumer.ConsumerManager.recordConsumer(ConsumerManager.java:61) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at org.apache.paimon.table.source.InnerStreamTableScanImpl.notifyCheckpointComplete(InnerStreamTableScanImpl.java:240) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at java.util.OptionalLong.ifPresent(OptionalLong.java:142) ~[?:1.8.0_191]
at org.apache.paimon.flink.source.operator.MonitorFunction.notifyCheckpointComplete(MonitorFunction.java:207) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
Caused by: java.io.IOException: Could not get block locations. Source file "/xxxx/jarvis_run_history_compute/consumer/consumer-sr_jarvis_run_history_compute" - Aborting...block==null
at org.apache.hadoop.hdfs.DataStreamer.setupPipelineForAppendOrRecovery(DataStreamer.java:1508) ~[hadoop-hdfs-client-3.3.1.jar:?]
STEP 2: the content of the consumer file is now blank.
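A blank consumer file like this is the typical symptom of overwriting the file in place and failing partway through. A minimal sketch of the usual mitigation, write to a temporary file and rename it over the target, is shown below. The class and method names here are hypothetical illustrations, not Paimon's actual API, and a real fix on HDFS would have to go through the table's FileIO abstraction rather than java.nio:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class AtomicConsumerWrite {

    // Hypothetical sketch: write the consumer JSON to a sibling temp file,
    // then rename it over the target. Readers then see either the old
    // complete content or the new complete content, never an empty file,
    // even if the write itself throws midway.
    static void writeConsumerAtomically(Path consumerFile, String json) throws IOException {
        Path tmp = consumerFile.resolveSibling(consumerFile.getFileName() + ".tmp");
        Files.write(tmp, json.getBytes(StandardCharsets.UTF_8));
        Files.move(tmp, consumerFile,
                StandardCopyOption.REPLACE_EXISTING,
                StandardCopyOption.ATOMIC_MOVE);
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("consumer-demo");
        Path consumerFile = dir.resolve("consumer-test");
        writeConsumerAtomically(consumerFile, "{\"nextSnapshot\":5}");
        System.out.println(new String(
                Files.readAllBytes(consumerFile), StandardCharsets.UTF_8));
    }
}
```

Note that ATOMIC_MOVE combined with REPLACE_EXISTING is honored on POSIX local filesystems; on HDFS the equivalent would be a create-then-rename through the Hadoop FileSystem API.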
STEP 3: the streaming read task with consumerId then fails:
2023-10-17 10:39:18,528 WARN org.apache.flink.runtime.taskmanager.Task [] - Source: default_catalog.default_database.jarvis_run_history_compute-Monitor (1/1)#3 (c4e704687898bfd0f5d0bd2644629fad_bc764cd8ddf7a0cff126f51c16239658_0_3) switched from RUNNING to FAILED with failure cause:
java.io.UncheckedIOException: org.apache.paimon.shade.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: No content to map due to end-of-input
at [Source: (String)""; line: 1, column: 0]
at org.apache.paimon.utils.JsonSerdeUtil.fromJson(JsonSerdeUtil.java:59) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at org.apache.paimon.consumer.Consumer.fromJson(Consumer.java:54) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at org.apache.paimon.consumer.Consumer.fromPath(Consumer.java:64) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at org.apache.paimon.consumer.ConsumerManager.consumer(ConsumerManager.java:53) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at org.apache.paimon.table.source.AbstractInnerTableScan.createStartingScanner(AbstractInnerTableScan.java:88) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at org.apache.paimon.table.source.InnerStreamTableScanImpl.plan(InnerStreamTableScanImpl.java:81) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at org.apache.paimon.flink.source.operator.MonitorFunction.run(MonitorFunction.java:180) ~[paimon-flink-1.17-0.5-SNAPSHOT.jar:0.5-SNAPSHOT]
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110) ~[flink-dist-1.17.0.jar:1.17.0]
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67) ~[flink-dist-1.17.0.jar:1.17.0]
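The MismatchedInputException above comes from handing the empty string straight to the JSON parser. A complementary mitigation on the read side would be to treat a missing or blank consumer file as "no consumer recorded" and fall back to the scan's normal startup mode. The sketch below is a hypothetical illustration of that idea, not Paimon's actual Consumer/ConsumerManager code:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Optional;

public class TolerantConsumerRead {

    // Hypothetical sketch: instead of parsing whatever bytes are in the
    // consumer file, first check for the blank-file case produced by a
    // failed write, and report "no consumer" so the caller can fall back
    // to its configured startup behavior rather than crash-loop.
    static Optional<String> readConsumer(Path consumerFile) throws IOException {
        if (!Files.exists(consumerFile)) {
            return Optional.empty();
        }
        String content = new String(
                Files.readAllBytes(consumerFile), StandardCharsets.UTF_8).trim();
        if (content.isEmpty()) {
            // Blank file left behind by a failed write: ignore it.
            return Optional.empty();
        }
        return Optional.of(content);
    }

    public static void main(String[] args) throws IOException {
        Path blank = Files.createTempFile("consumer", "");
        // A blank file yields an empty Optional instead of a parse error.
        System.out.println(readConsumer(blank).isPresent());
    }
}
```

The trade-off is that silently ignoring a blank file loses the consumer offset (as the report notes); a production fix would likely combine this with atomic writes so the blank-file state cannot arise in the first place.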
What doesn't meet your expectations?
The streaming read task failed and kept restarting, and it lost the consumer offset info.
Anything else?
No response
Are you willing to submit a PR?