forked from data-integrations/google-cloud
Showing 7 changed files with 413 additions and 1 deletion.
@@ -0,0 +1,130 @@
@BigTable
Feature: BigTable source - Verification of BigTable to BigTable successful data transfer without using connections

  @BIGTABLE_SOURCE_TEST @BIGTABLE_SINK_TEST
  Scenario: To verify data is getting transferred from BigTable source table to BigTable sink table
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "Bigtable" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "Bigtable" from the plugins list as: "Sink"
    Then Connect plugins: "Bigtable" and "Bigtable2" to establish connection
    Then Navigate to the properties page of plugin: "Bigtable"
    Then Enter input plugin property: "referenceName" with value: "CBTSourceReferenceName"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "instance" with value: "bigtableInstance"
    Then Enter input plugin property: "table" with value: "bigtableSourceTable"
    Then Replace input plugin property: "keyAlias" with value: "id"
    Then Enter key value pairs for plugin property: "columnMappings" with values from json: "cbtsourceMappings"
    Then Select Macro action of output schema property: "outputSchemaMacroInput" and set the value to "cbtSourceOutputSchema"
    Then Validate "Bigtable" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Bigtable2"
    Then Enter input plugin property: "referenceName" with value: "CBTSinkReferenceName"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "instance" with value: "bigtableTargetInstance"
    Then Enter input plugin property: "table" with value: "bigtableTargetTable"
    Then Replace input plugin property: "keyAlias" with value: "id"
    Then Enter key value pairs for plugin property: "columnMappings" with values from json: "cbtsinkMappings"
    Then Validate "Bigtable" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Enter runtime argument value "cbtSourceOutputSchema" for key "cbtSourceOutputSchema"
    Then Run the preview of pipeline with runtime arguments
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "cbtSourceOutputSchema" for key "cbtSourceOutputSchema"
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Verify the pipeline status is "Succeeded"
    Then Validate OUT record count is equal to IN record count

  @BIGTABLE_SOURCE_TEST @EXISTING_BIGTABLE_SINK
  Scenario: To verify data is getting transferred from BigTable source table to existing BigTable sink
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "Bigtable" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "Bigtable" from the plugins list as: "Sink"
    Then Connect plugins: "Bigtable" and "Bigtable2" to establish connection
    Then Navigate to the properties page of plugin: "Bigtable"
    Then Enter input plugin property: "referenceName" with value: "CBTSourceReferenceName"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "instance" with value: "bigtableInstance"
    Then Enter input plugin property: "table" with value: "bigtableSourceTable"
    Then Replace input plugin property: "keyAlias" with value: "id"
    Then Enter key value pairs for plugin property: "columnMappings" with values from json: "cbtsourceMappings"
    Then Select Macro action of output schema property: "outputSchemaMacroInput" and set the value to "cbtSourceOutputSchema"
    Then Validate "Bigtable" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Bigtable2"
    Then Enter input plugin property: "referenceName" with value: "CBTSinkReferenceName"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "instance" with value: "bigtableTargetInstance"
    Then Enter input plugin property: "table" with value: "bigtableTargetExistingTable"
    Then Replace input plugin property: "keyAlias" with value: "id"
    Then Enter key value pairs for plugin property: "columnMappings" with values from json: "cbtsinkMappings"
    Then Validate "Bigtable" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Enter runtime argument value "cbtSourceOutputSchema" for key "cbtSourceOutputSchema"
    Then Run the preview of pipeline with runtime arguments
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "cbtSourceOutputSchema" for key "cbtSourceOutputSchema"
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Verify the pipeline status is "Succeeded"
    Then Validate OUT record count is equal to IN record count

  @BIGTABLE_SOURCE_TEST @BIGTABLE_SINK_TEST
  Scenario: To verify data is getting transferred from not existing BigTable source table to BigTable sink table
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "Bigtable" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "Bigtable" from the plugins list as: "Sink"
    Then Connect plugins: "Bigtable" and "Bigtable2" to establish connection
    Then Navigate to the properties page of plugin: "Bigtable"
    Then Enter input plugin property: "referenceName" with value: "CBTSourceReferenceName"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "instance" with value: "bigtableInstance"
    Then Enter input plugin property: "table" with value: "bigtableSourceTable"
    Then Replace input plugin property: "keyAlias" with value: "id"
    Then Enter key value pairs for plugin property: "columnMappings" with values from json: "cbtsourceMappings"
    Then Select Macro action of output schema property: "outputSchemaMacroInput" and set the value to "cbtSourceOutputSchema"
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Bigtable2"
    Then Enter input plugin property: "referenceName" with value: "CBTSinkReferenceName"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "instance" with value: "bigtableTargetInstance"
    Then Enter input plugin property: "table" with value: "bigtableTargetTable"
    Then Replace input plugin property: "keyAlias" with value: "id"
    Then Enter key value pairs for plugin property: "columnMappings" with values from json: "cbtsinkMappings"
    Then Validate "Bigtable" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Enter runtime argument value "cbtSourceOutputSchema" for key "cbtSourceOutputSchema"
    Then Run the preview of pipeline with runtime arguments
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "cbtSourceOutputSchema" for key "cbtSourceOutputSchema"
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Verify the pipeline status is "Succeeded"
    Then Validate OUT record count is equal to IN record count
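The scenario-level tags above (@BIGTABLE_SOURCE_TEST, @BIGTABLE_SINK_TEST, @EXISTING_BIGTABLE_SINK) are the usual place where a suite like this attaches setup and teardown of the Bigtable test tables. Below is a minimal sketch of how such tagged hooks could be wired with cucumber-java; the class name, method names, and bodies are illustrative assumptions and are not part of this commit.

package io.cdap.plugin.bigtable.stepsdesign;

import io.cucumber.java.After;
import io.cucumber.java.Before;

/**
 * Illustrative sketch only: tagged hooks that a suite like this typically uses to
 * provision and clean up the Bigtable test tables named in the scenarios above.
 * The class, methods, and bodies are assumptions, not part of this commit.
 */
public class BigtableTestSetupHooks {

  @Before(order = 1, value = "@BIGTABLE_SOURCE_TEST")
  public void createSourceTable() {
    // Placeholder: create the table referenced by "bigtableSourceTable" and load the
    // sample rows that the record-count validation step compares against.
  }

  @After(order = 1, value = "@BIGTABLE_SINK_TEST")
  public void dropSinkTable() {
    // Placeholder: drop the table referenced by "bigtableTargetTable" so repeated runs start clean.
  }
}

Scoping the hooks by tag expression keeps setup cost low: only scenarios carrying the matching tag trigger the corresponding table provisioning or cleanup.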
src/e2e-test/java/io/cdap/plugin/bigtable/runners/TestRunner.java (22 additions & 0 deletions)
@@ -0,0 +1,22 @@
package io.cdap.plugin.bigtable.runners;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
 * Test Runner to execute Bigtable testcases.
 */
@RunWith(Cucumber.class)
@CucumberOptions(
  features = {"src/e2e-test/features"},
  glue = {"io.cdap.plugin.bigtable.stepsdesign", "io.cdap.plugin.common.stepsdesign", "stepsdesign"},
  tags = {"@BigTable"},
  monochrome = true,
  plugin = {"pretty", "html:target/cucumber-html-report/bigtable",
            "json:target/cucumber-reports/cucumber-bigtable.json",
            "junit:target/cucumber-reports/cucumber-bigtable.xml"}
)
public class TestRunner {
}
src/e2e-test/java/io/cdap/plugin/bigtable/runners/package-info.java (1 addition & 0 deletions)

@@ -0,0 +1 @@
package io.cdap.plugin.bigtable.runners;