Commit
Adding Considerations for Spark's Insert Overwrite
wg1026688210 committed Oct 10, 2023
1 parent cef2fc1 commit af33530
Showing 1 changed file with 7 additions and 3 deletions.
10 changes: 7 additions & 3 deletions docs/content/how-to/writing-tables.md
@@ -127,7 +127,11 @@ INSERT INTO MyTable SELECT ...

{{< /tabs >}}

## Overwriting the Whole Table
## Overwriting
Note: If `spark.sql.sources.partitionOverwriteMode` is set to `dynamic` in Spark,
then to ensure that `INSERT OVERWRITE` on Paimon tables works correctly,
`spark.sql.extensions` should be set to `org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions`.
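For example, both settings can be supplied via `spark-defaults.conf` (or equivalent `--conf` flags at session startup). This is a sketch; it assumes the Paimon Spark connector jar is already on the classpath:

```properties
# Register Paimon's session extensions so INSERT OVERWRITE is handled correctly.
spark.sql.extensions                      org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
# Opt in to dynamic partition overwrite mode.
spark.sql.sources.partitionOverwriteMode  dynamic
```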
### Overwriting the Whole Table

For unpartitioned tables, Paimon supports overwriting the whole table.

@@ -153,7 +157,7 @@ INSERT OVERWRITE MyTable SELECT ...

{{< /tabs >}}
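Concretely, a whole-table overwrite in Spark SQL can look like the following sketch, where `MyTable` matches the surrounding examples and `my_source` is a hypothetical source table:

```sql
-- Replaces the entire contents of the unpartitioned table MyTable
-- with the result of the query.
INSERT OVERWRITE MyTable SELECT * FROM my_source;
```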

## Overwriting a Partition
### Overwriting a Partition

For partitioned tables, Paimon supports overwriting a partition.

@@ -179,7 +183,7 @@ INSERT OVERWRITE MyTable PARTITION (key1 = value1, key2 = value2, ...) SELECT ..

{{< /tabs >}}
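As a concrete sketch of the syntax above, assuming a hypothetical partition key `dt` and source table `my_source`:

```sql
-- Replaces only the named partition; data in other partitions is untouched.
INSERT OVERWRITE MyTable PARTITION (dt = '2023-10-09')
SELECT k, v FROM my_source WHERE dt = '2023-10-09';
```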

## Dynamic Overwrite
### Dynamic Overwrite

{{< tabs "dynamic-overwrite" >}}
