[doc] Adding Considerations for Spark's Insert Overwrite #2111

Merged
10 changes: 7 additions & 3 deletions docs/content/how-to/writing-tables.md
@@ -127,7 +127,11 @@ INSERT INTO MyTable SELECT ...

{{< /tabs >}}

## Overwriting the Whole Table
## Overwriting
Note: If `spark.sql.sources.partitionOverwriteMode` is set to `dynamic` in Spark,
`spark.sql.extensions` should be set to `org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions`
to ensure that `INSERT OVERWRITE` works correctly on Paimon tables.
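
As an illustrative aside (not part of this PR's diff): a minimal Scala sketch of a `SparkSession` configured as the note describes. The app name is a placeholder; only the two config keys and their values come from the note.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch, assuming Spark with the Paimon Spark runtime on the
// classpath: register the Paimon session extensions so that INSERT
// OVERWRITE behaves correctly under dynamic partition overwrite mode.
val spark = SparkSession.builder()
  .appName("paimon-insert-overwrite") // placeholder app name
  .config("spark.sql.extensions",
    "org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions")
  .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
  .getOrCreate()
```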
### Overwriting the Whole Table

For unpartitioned tables, Paimon supports overwriting the whole table.

@@ -153,7 +157,7 @@ INSERT OVERWRITE MyTable SELECT ...

{{< /tabs >}}
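
For illustration (again, not part of the diff): the whole-table overwrite issued through the session built in the sketch above. `MyTable` is assumed to be an existing unpartitioned Paimon table, and `MySourceTable` is a hypothetical source.

```scala
// Hedged sketch: replaces the entire contents of the unpartitioned
// table MyTable with the result of the query.
spark.sql("INSERT OVERWRITE MyTable SELECT * FROM MySourceTable")
```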

## Overwriting a Partition
### Overwriting a Partition

For partitioned tables, Paimon supports overwriting a partition.

@@ -179,7 +183,7 @@ INSERT OVERWRITE MyTable PARTITION (key1 = value1, key2 = value2, ...) SELECT ...

{{< /tabs >}}
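
A corresponding sketch for the static partition case; the partition column `dt` and its value are illustrative stand-ins for the `key1 = value1, key2 = value2, ...` pattern above.

```scala
// Hedged sketch: overwrites a single, explicitly named partition of a
// partitioned Paimon table; other partitions are left untouched.
spark.sql(
  "INSERT OVERWRITE MyTable PARTITION (dt = '2023-10-01') " +
    "SELECT * FROM MySourceTable")
```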

## Dynamic Overwrite
### Dynamic Overwrite

{{< tabs "dynamic-overwrite" >}}

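
And for the dynamic case, a hedged sketch: with `spark.sql.sources.partitionOverwriteMode=dynamic` and the Paimon extensions registered (see the note above), omitting the `PARTITION` clause replaces only the partitions that the query actually produces.

```scala
// Hedged sketch: under dynamic partition overwrite mode, only the
// partitions present in the query result are overwritten; partitions
// not touched by the query keep their existing data.
spark.sql("INSERT OVERWRITE MyTable SELECT * FROM MySourceTable")
```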