Commit
Zouxxyy committed Nov 15, 2023
1 parent bb2f7d8 commit 55b0236
Showing 3 changed files with 0 additions and 13 deletions.
1 change: 0 additions & 1 deletion docs/content/how-to/querying-tables.md
@@ -192,7 +192,6 @@ Paimon supports that use Spark SQL to do the incremental query that implemented
 To enable this needs these configs below:
 
 ```text
---conf spark.sql.catalog.spark_catalog=org.apache.paimon.spark.SparkGenericCatalog
 --conf spark.sql.extensions=org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
 ```
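After this change only the session-extensions entry is required for the incremental query feature. A hedged sketch of wiring that single setting up programmatically instead of via `--conf` (the extensions class name comes from the docs above; the app name and master are illustrative, not part of this commit):

```scala
// Sketch: a SparkSession configured with only the remaining required
// setting. appName/master are illustrative placeholders.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("paimon-example")
  .master("local[*]")
  .config(
    "spark.sql.extensions",
    "org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions")
  .getOrCreate()
```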
3 changes: 0 additions & 3 deletions docs/content/how-to/writing-tables.md
@@ -209,7 +209,6 @@ INSERT OVERWRITE MyTable /*+ OPTIONS('dynamic-partition-overwrite' = 'false') */
 Spark's default overwrite mode is static partition overwrite. To enable dynamic overwritten needs these configs below:
 
 ```text
---conf spark.sql.catalog.spark_catalog=org.apache.paimon.spark.SparkGenericCatalog
 --conf spark.sql.extensions=org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
 ```
@@ -371,7 +370,6 @@ UPDATE MyTable SET b = 1, c = 2 WHERE a = 'myTable';
 To enable update needs these configs below:
 ```text
---conf spark.sql.catalog.spark_catalog=org.apache.paimon.spark.SparkGenericCatalog
 --conf spark.sql.extensions=org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
 ```
@@ -478,7 +476,6 @@ Important table properties setting:
 To enable delete needs these configs below:
 ```text
---conf spark.sql.catalog.spark_catalog=org.apache.paimon.spark.SparkGenericCatalog
 --conf spark.sql.extensions=org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
 ```
@@ -18,19 +18,10 @@
 package org.apache.paimon.spark.sql
 
 import org.apache.paimon.spark.PaimonSparkTestBase
-import org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
 
-import org.apache.spark.SparkConf
 import org.apache.spark.sql.{DataFrame, Row}
 
 class TableValuedFunctionsTest extends PaimonSparkTestBase {
 
-  override protected def sparkConf: SparkConf = {
-    super.sparkConf
-      .set("spark.sql.catalog.spark_catalog", "org.apache.paimon.spark.SparkGenericCatalog")
-      .set("spark.sql.extensions", classOf[PaimonSparkSessionExtensions].getName)
-  }
-
   withPk.foreach {
     hasPk =>
       bucketModes.foreach {
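With the `sparkConf` override deleted, the suite relies on whatever configuration its base class supplies. A hypothetical sketch of a suite written in the post-commit style (the class and table names are illustrative, and the availability of `spark` and a `test(...)` method is an assumption about what `PaimonSparkTestBase` provides, not confirmed by this diff):

```scala
// Hypothetical sketch: a suite after this commit needs no sparkConf
// override; catalog and extension settings are assumed to come from
// PaimonSparkTestBase itself.
import org.apache.paimon.spark.PaimonSparkTestBase

class MyPaimonSqlTest extends PaimonSparkTestBase {
  test("query a Paimon table") { // table name is illustrative
    spark.sql("SELECT * FROM MyTable").show()
  }
}
```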
