Postgres destination accepts only 10k rows from a table containing json #48410
Unanswered
D1111111 asked this question in Connector Questions
Replies: 1 comment
-
Howdy ✋
I have a Postgres -> Postgres connection that fails to load more than 10k rows; it appears to be stuck in some kind of loop. After several hours or days of loading, I have to cancel the sync manually. The job history shows a larger number of rows extracted (around 30k), but only 10k inserted.
The problem is a json column in the source table. The values are not particularly large, a couple of kB per row at most. When I remove that json column from the pipeline, the rows load as expected.
I have tried all versions of the Postgres destination connector; the problem is the same.
This is an example of the log; these lines are repeated over and over...
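A quick way to check per-row json sizes in the source table is a query along these lines (table and column names below are placeholders; `octet_length` of the text cast measures the serialized json, while `pg_column_size` would report the stored, possibly TOASTed, size):

```sql
-- Hypothetical names: source_table with a json column "payload".
-- octet_length(payload::text) is the serialized json size in bytes.
SELECT id,
       octet_length(payload::text) AS payload_bytes
FROM source_table
ORDER BY payload_bytes DESC
LIMIT 20;
```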
-
The issue is that some json values are larger than 25 MB. When such records are removed, the loader works as expected. Can the loader be configured to handle bigger payloads?
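Not an official connector setting, but one workaround worth sketching: if the destination chokes on payloads above a threshold, you can expose a view in the source database that nulls out oversized values and sync from the view instead. Table and column names are hypothetical:

```sql
-- Hypothetical workaround: source_table with a json column "payload".
-- Rows whose serialized json exceeds 25 MB get a NULL payload so the
-- sync can proceed; all other rows pass through unchanged.
CREATE VIEW source_table_capped AS
SELECT id,
       CASE
         WHEN octet_length(payload::text) > 25 * 1024 * 1024 THEN NULL
         ELSE payload
       END AS payload
FROM source_table;
```

This loses the oversized payloads, of course; it only helps if those records can be handled separately or dropped.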