Commit
Merge pull request data-integrations#1363 from cloudsufi/SpannerE2EAdditional

Spanner To Spanner Additional Cases as per ITN class.
Showing 7 changed files with 272 additions and 0 deletions.
119 changes: 119 additions & 0 deletions
src/e2e-test/features/spanner/source/SpannerToSpanner_Additional.feature
@@ -0,0 +1,119 @@
@Spanner_Source
Feature: Spanner source - Verification of Additional Spanner to Spanner successful data transfer without using connections

  @SPANNER_TEST @EXISTING_SPANNER_SINK
  Scenario: To verify data is getting transferred from no schema Spanner source to Spanner sink having existing table
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "Spanner" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "Spanner" from the plugins list as: "Sink"
    Then Connect plugins: "Spanner" and "Spanner2" to establish connection
    Then Navigate to the properties page of plugin: "Spanner"
    Then Enter Spanner property reference name
    Then Enter Spanner property projectId "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "instanceId" with value: "spannerInstance"
    Then Enter input plugin property: "databaseName" with value: "spannerDatabase"
    Then Enter input plugin property: "tableName" with value: "spannerSourceTable"
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Spanner2"
    Then Enter Spanner property reference name
    Then Enter Spanner property projectId "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "instanceId" with value: "spannerInstance"
    Then Enter input plugin property: "databaseName" with value: "spannerDatabase"
    Then Enter input plugin property: "tableName" with value: "spannerExistingTargetTable"
    Then Enter Spanner sink property primary key "spannerSinkPrimaryKeySpanner"
    Then Validate "Spanner" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Verify the pipeline status is "Succeeded"
    Then Validate records transferred to already existing target spanner table with record counts of source table

  @SPANNER_TEST @EXISTING_SPANNER_SINK
  Scenario: To verify data is getting transferred from Spanner source to Spanner sink having existing table
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "Spanner" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "Spanner" from the plugins list as: "Sink"
    Then Connect plugins: "Spanner" and "Spanner2" to establish connection
    Then Navigate to the properties page of plugin: "Spanner"
    Then Enter Spanner property reference name
    Then Enter Spanner property projectId "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "instanceId" with value: "spannerInstance"
    Then Enter input plugin property: "databaseName" with value: "spannerDatabase"
    Then Enter input plugin property: "tableName" with value: "spannerSourceTable"
    Then Validate output schema with expectedSchema "spannerSourceSchema"
    Then Validate "Spanner" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Spanner2"
    Then Enter Spanner property reference name
    Then Enter Spanner property projectId "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "instanceId" with value: "spannerInstance"
    Then Enter input plugin property: "databaseName" with value: "spannerDatabase"
    Then Enter input plugin property: "tableName" with value: "spannerExistingTargetTable"
    Then Enter Spanner sink property primary key "spannerSinkPrimaryKeySpanner"
    Then Validate "Spanner" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Verify the pipeline status is "Succeeded"
    Then Validate records transferred to already existing target spanner table with record counts of source table

  @SPANNER_TEST @SPANNER_SINK_TEST
  Scenario: To verify data is getting transferred from no schema Spanner source to Spanner non existing sink table
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "Spanner" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "Spanner" from the plugins list as: "Sink"
    Then Connect plugins: "Spanner" and "Spanner2" to establish connection
    Then Navigate to the properties page of plugin: "Spanner"
    Then Enter Spanner property reference name
    Then Enter Spanner property projectId "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "instanceId" with value: "spannerInstance"
    Then Enter input plugin property: "databaseName" with value: "spannerDatabase"
    Then Enter input plugin property: "tableName" with value: "spannerSourceTable"
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Spanner2"
    Then Enter Spanner property reference name
    Then Enter Spanner property projectId "projectId"
    Then Override Service account details if set in environment variables
    Then Enter input plugin property: "instanceId" with value: "spannerInstance"
    Then Enter input plugin property: "databaseName" with value: "spannerTargetDatabase"
    Then Enter input plugin property: "tableName" with value: "spannerTargetTable"
    Then Enter Spanner sink property primary key "spannerSinkPrimaryKeySpanner"
    Then Validate "Spanner" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Verify the pipeline status is "Succeeded"
    Then Validate records transferred to target spanner table with record counts of source spanner table
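Each scenario above ends by comparing the record count of the target table with that of the source table. The sketch below shows one way such a check could be done with the Google Cloud Spanner Java client; it is an illustration only, the project, instance, database, and table identifiers are placeholders, and the repository's actual step definitions may implement the comparison differently.

```java
import com.google.cloud.spanner.DatabaseClient;
import com.google.cloud.spanner.DatabaseId;
import com.google.cloud.spanner.ResultSet;
import com.google.cloud.spanner.Spanner;
import com.google.cloud.spanner.SpannerOptions;
import com.google.cloud.spanner.Statement;

public class SpannerRecordCountCheck {

  // Returns the number of rows in the given table.
  static long countRows(DatabaseClient client, String table) {
    try (ResultSet rs = client.singleUse()
        .executeQuery(Statement.of("SELECT COUNT(*) FROM " + table))) {
      rs.next();
      return rs.getLong(0);
    }
  }

  public static void main(String[] args) {
    // Placeholder identifiers; the e2e suite reads these from its property files.
    String project = "my-project";
    String instance = "spanner-instance";
    String database = "spanner-database";

    Spanner spanner = SpannerOptions.newBuilder().setProjectId(project).build().getService();
    try {
      DatabaseClient client =
          spanner.getDatabaseClient(DatabaseId.of(project, instance, database));
      long sourceCount = countRows(client, "SourceTable");       // placeholder source table
      long targetCount = countRows(client, "ExistingSinkTable"); // table from the testdata DDL below
      if (sourceCount != targetCount) {
        throw new AssertionError(
            "Record count mismatch: source=" + sourceCount + ", target=" + targetCount);
      }
    } finally {
      spanner.close();
    }
  }
}
```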
55 changes: 55 additions & 0 deletions
src/e2e-test/features/spanner/source/SpannerToSpanner_WithMacro.feature
@@ -0,0 +1,55 @@
@Spanner_Source
Feature: Spanner source - Verification of Spanner to Spanner Successful data transfer with macro arguments

  @SPANNER_SINK_TEST @SPANNER_TEST
  Scenario: To verify data is getting transferred from Spanner to Spanner with macro arguments
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "Spanner" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "Spanner" from the plugins list as: "Sink"
    Then Connect plugins: "Spanner" and "Spanner2" to establish connection
    Then Navigate to the properties page of plugin: "Spanner"
    Then Enter Spanner property reference name
    Then Enter Spanner property projectId "projectId"
    Then Override Service account details if set in environment variables
    Then Click on the Macro button of Property: "instanceId" and set the value to: "macroStringInstance"
    Then Click on the Macro button of Property: "databaseName" and set the value to: "macroStringDatabase"
    Then Click on the Macro button of Property: "tableName" and set the value to: "macroStringSourceTable"
    Then Validate "Spanner" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Spanner2"
    Then Enter Spanner property reference name
    Then Enter Spanner property projectId "projectId"
    Then Override Service account details if set in environment variables
    Then Click on the Macro button of Property: "instanceId" and set the value to: "macroStringInstance"
    Then Click on the Macro button of Property: "databaseName" and set the value to: "macroStringTargetDatabase"
    Then Click on the Macro button of Property: "tableName" and set the value to: "macroStringTargetTable"
    Then Click on the Macro button of Property: "keys" and set the value to: "macroStringPrimaryKey"
    Then Validate "Spanner" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Enter runtime argument value "spannerInstance" for key "macroStringInstance"
    Then Enter runtime argument value "spannerDatabase" for key "macroStringDatabase"
    Then Enter runtime argument value "spannerSourceTable" for key "macroStringSourceTable"
    Then Enter runtime argument value "spannerTargetDatabase" for key "macroStringTargetDatabase"
    Then Enter runtime argument value "spannerTargetTable" for key "macroStringTargetTable"
    Then Enter runtime argument value "spannerSinkPrimaryKeySpanner" for key "macroStringPrimaryKey"
    Then Run the preview of pipeline with runtime arguments
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Close the preview
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "spannerInstance" for key "macroStringInstance"
    Then Enter runtime argument value "spannerDatabase" for key "macroStringDatabase"
    Then Enter runtime argument value "spannerSourceTable" for key "macroStringSourceTable"
    Then Enter runtime argument value "spannerTargetDatabase" for key "macroStringTargetDatabase"
    Then Enter runtime argument value "spannerTargetTable" for key "macroStringTargetTable"
    Then Enter runtime argument value "spannerSinkPrimaryKeySpanner" for key "macroStringPrimaryKey"
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Verify the pipeline status is "Succeeded"
    Then Validate records transferred to target spanner table with record counts of source spanner table
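In this scenario, setting a property through the Macro button stores a placeholder such as ${macroStringInstance} in the plugin configuration, and the pipeline only receives concrete values when runtime arguments with those keys are supplied at preview and run time, as the steps above do. A minimal sketch of the resulting key-to-value map is shown below; the values are placeholders, since the e2e suite resolves them from its plugin-properties file (entries such as spannerInstance and spannerSourceTable referenced in the scenario text).

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SpannerMacroRuntimeArguments {

  // Runtime arguments that resolve the macros used in the scenario above.
  // The values on the right are placeholders; the e2e suite looks them up in its
  // plugin-properties file rather than hard-coding them.
  static Map<String, String> runtimeArguments() {
    Map<String, String> args = new LinkedHashMap<>();
    args.put("macroStringInstance", "spanner-instance");
    args.put("macroStringDatabase", "source-database");
    args.put("macroStringSourceTable", "SourceTable");
    args.put("macroStringTargetDatabase", "target-database");
    args.put("macroStringTargetTable", "TargetTable");
    args.put("macroStringPrimaryKey", "EmployeeDepartment,EmployeeName");
    return args;
  }
}
```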
1 change: 1 addition & 0 deletions
src/e2e-test/resources/testdata/SpannerCreateExistingSinkTableQueries.txt
@@ -0,0 +1 @@
CREATE TABLE ExistingSinkTable ( EmployeeDepartment STRING(1024), EmployeeName STRING(1024), Salary INT64, Workhours INT64, DateOfBirth DATE, AgeInYears FLOAT64, IsActive BOOL, InTime TIMESTAMP, Punch BYTES(25), Activities ARRAY<STRING(4)>, Numbers ARRAY<INT64>, Attendance ARRAY<BOOL>, FloatNumbers ARRAY<FLOAT64>, BytesArray ARRAY<BYTES(25)>, DateArray ARRAY<DATE>, TimestampArray ARRAY<TIMESTAMP>) PRIMARY KEY(EmployeeDepartment, EmployeeName)
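The @EXISTING_SPANNER_SINK scenarios above expect this table to exist before the pipeline runs. A minimal sketch of applying the DDL with the Spanner DatabaseAdminClient is shown below; the project, instance, and database identifiers are placeholders, and the repository's actual test hooks may create the table differently.

```java
import com.google.cloud.spanner.DatabaseAdminClient;
import com.google.cloud.spanner.Spanner;
import com.google.cloud.spanner.SpannerOptions;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Collections;

public class CreateExistingSinkTable {

  public static void main(String[] args) throws Exception {
    // Placeholder identifiers; a real test setup would read these from configuration.
    String project = "my-project";
    String instance = "spanner-instance";
    String database = "spanner-database";

    // Read the CREATE TABLE statement from the testdata file shown above.
    String ddl = new String(Files.readAllBytes(Paths.get(
        "src/e2e-test/resources/testdata/SpannerCreateExistingSinkTableQueries.txt")),
        StandardCharsets.UTF_8).trim();

    Spanner spanner = SpannerOptions.newBuilder().setProjectId(project).build().getService();
    try {
      DatabaseAdminClient admin = spanner.getDatabaseAdminClient();
      // updateDatabaseDdl applies the statement; get() blocks until the operation completes.
      admin.updateDatabaseDdl(instance, database, Collections.singletonList(ddl), null).get();
    } finally {
      spanner.close();
    }
  }
}
```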