Update Required e2e Tests #1296

Merged
@@ -1,6 +1,7 @@
 @BigQuery_Sink
 Feature: BigQuery sink - Validate BigQuery sink plugin error scenarios

+@BigQuery_Sink_Required
 Scenario Outline:Verify BigQuery Sink properties validation errors for mandatory fields
 Given Open Datafusion Project to configure pipeline
 When Sink is BigQuery
2 changes: 1 addition & 1 deletion src/e2e-test/features/bigquery/sink/GCSToBigQuery.feature
@@ -1,7 +1,7 @@
 @BigQuery_Sink
 Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfer

-@CMEK @GCS_CSV_TEST @BQ_SINK_TEST
+@CMEK @GCS_CSV_TEST @BQ_SINK_TEST @BigQuery_Sink_Required
 Scenario:Validate successful records transfer from GCS to BigQuery
 Given Open Datafusion Project to configure pipeline
 When Source is GCS
@@ -179,7 +179,7 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
 Then Verify the pipeline status is "Succeeded"
 Then Verify the partition table is created with partitioned on field "bqPartitionFieldTime"

-@BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST
+@BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST @BigQuery_Source_Required
 Scenario:Validate successful records transfer from BigQuery to BigQuery with all the datatypes
 Given Open Datafusion Project to configure pipeline
 When Source is BigQuery
@@ -1,7 +1,7 @@
 @BigQuery_Source
 Feature: BigQuery source - Verification of BigQuery to BigQuery successful data transfer using connections

-@BQ_SOURCE_TEST @BQ_SINK_TEST @BQ_CONNECTION
+@BQ_SOURCE_TEST @BQ_SINK_TEST @BQ_CONNECTION @BigQuery_Source_Required
 Scenario: To verify data transfer from BigQuery to BigQuery with pipeline connection created from wrangler
 Given Open Wrangler connections page
 Then Click plugin property: "addConnection" button
2 changes: 1 addition & 1 deletion src/e2e-test/features/bigqueryexecute/BQExecute.feature
@@ -70,7 +70,7 @@ Feature: BigQueryExecute - Verify data transfer using BigQuery Execute plugin
 Then Open and capture logs
 Then Verify the pipeline status is "Succeeded"

-@BQ_EXECUTE_DDL_CREATE_TEST
+@BQ_EXECUTE_DDL_CREATE_TEST @BQExecute_Required
 Scenario: Verify BQExecute plugin functionality for DDL query - Create table
 Given Open Datafusion Project to configure pipeline
 When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
4 changes: 2 additions & 2 deletions src/e2e-test/features/gcs/sink/GCSSink.feature
@@ -35,7 +35,7 @@ Feature: GCS sink - Verification of GCS Sink plugin
 Then Verify data is transferred to target GCS bucket
 Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled

-@GCS_SINK_TEST @BQ_SOURCE_TEST
+@GCS_SINK_TEST @BQ_SOURCE_TEST @GCS_Sink_Required
 Scenario Outline: To verify data is getting transferred successfully from BigQuery to GCS for different formats
 Given Open Datafusion Project to configure pipeline
 When Source is BigQuery
@@ -172,7 +172,7 @@ Feature: GCS sink - Verification of GCS Sink plugin
 Then Verify the pipeline status is "Succeeded"
 Then Verify data is transferred to target GCS bucket with path suffix "gcsPathSuffix"

-@GCS_DATATYPE_TEST @GCS_SINK_TEST
+@GCS_DATATYPE_TEST @GCS_SINK_TEST @GCS_Sink_Required
 Scenario: To verify data is getting transferred from GCS to GCS with supported DataTypes
 Given Open Datafusion Project to configure pipeline
 When Source is GCS
2 changes: 1 addition & 1 deletion src/e2e-test/features/gcs/sink/GCSSinkError.feature
@@ -48,7 +48,7 @@ Feature: GCS sink - Verify GCS Sink plugin error scenarios
 Then Click on the Validate button
 Then Verify that the Plugin Property: "path" is displaying an in-line error message: "errorMessageInvalidBucketName"

-@BQ_SOURCE_DATATYPE_TEST @GCS_SINK_TEST
+@BQ_SOURCE_DATATYPE_TEST @GCS_SINK_TEST @GCS_Sink_Required
 Scenario: To verify error message when unsupported format is used in GCS sink with multiple datatypes provided in source table
 Given Open Datafusion Project to configure pipeline
 When Source is BigQuery
@@ -66,7 +66,7 @@ Feature: GCS source - Verification of GCS to GCS successful data transfer using
 Then Click plugin property: "Delete" button
 Then Verify connection: "gcsConnectionName" of type: "GCS" is deleted successfully

-@GCS_CSV_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+@GCS_CSV_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required
 Scenario: To verify data is getting transferred from GCS to GCS with use connection functionality
 Given Open Datafusion Project to configure pipeline
 When Select plugin: "GCS" from the plugins list as: "Source"
1 change: 1 addition & 0 deletions src/e2e-test/features/gcs/source/GCSourceSchema.feature
@@ -20,6 +20,7 @@ Feature: GCS source - Validate GCS plugin output schema for different formats
 | GcsPath    | FileFormat | ExpectedSchema   |
 | gcsTsvFile | tsv        | gcsTsvFileSchema |

+@GCS_Source_Required
 Scenario Outline:GCS Source output schema validation for blob, parquet, avro and text format
 Given Open Datafusion Project to configure pipeline
 When Source is GCS
@@ -1,7 +1,7 @@
 @Spanner_Source @SPANNER_TEST
 Feature: Spanner source - Verification of Spanner to Spanner successful data transfer without using connections

-@SPANNER_SINK_TEST @SPANNER_TEST
+@SPANNER_SINK_TEST @SPANNER_TEST @Spanner_Source_Required
 Scenario: To verify data is getting transferred from Spanner to Spanner without using connection functionality
 Given Open Datafusion Project to configure pipeline
 When Select plugin: "Spanner" from the plugins list as: "Source"
2 changes: 1 addition & 1 deletion src/e2e-test/features/spanner/source/SpannertoGCS.feature
@@ -1,7 +1,7 @@
 @Spanner_Source @SPANNER_TEST
 Feature: Spanner Source - Verification of Spanner to GCS successful data transfer

-@GCS_SINK_TEST
+@GCS_SINK_TEST @Spanner_Source_Required
 Scenario: Verify data is getting transferred from Spanner to GCS successfully
 Given Open Datafusion Project to configure pipeline
 When Source is Spanner
@@ -0,0 +1,37 @@
/*
* Copyright © 2021 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.gcs.runners.sinkrunner;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
* Test Runner to execute only required GCS sink cases.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
    features = {"src/e2e-test/features"},
    glue = {"io.cdap.plugin.gcs.stepsdesign", "io.cdap.plugin.bigquery.stepsdesign",
            "stepsdesign", "io.cdap.plugin.common.stepsdesign"},
    tags = {"@GCS_Sink_Required"},
    monochrome = true,
    plugin = {"pretty", "html:target/cucumber-html-report/gcs-sink",
              "json:target/cucumber-reports/cucumber-gcs-sink.json",
              "junit:target/cucumber-reports/cucumber-gcs-sink.xml"}
)
public class TestRunnerRequired {
}
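The `tags = {"@GCS_Sink_Required"}` option above is what restricts this runner to the scenarios tagged in the feature-file changes earlier in the diff. As an illustrative sketch only (not Cucumber's actual tag-expression engine, which also supports `and`, `or`, and `not` operators), the effect of such a single-tag filter amounts to a set-membership check over each scenario's tags; the class and scenario tag sets below are hypothetical:

```java
import java.util.Set;

// Sketch of how a single-tag Cucumber filter such as
// tags = {"@GCS_Sink_Required"} selects scenarios.
// This is an illustration, not Cucumber's implementation.
public class TagFilterSketch {

    // A scenario is selected when its tag set contains the required tag.
    static boolean isSelected(Set<String> scenarioTags, String requiredTag) {
        return scenarioTags.contains(requiredTag);
    }

    public static void main(String[] args) {
        // Hypothetical tag sets mirroring the feature files in this change.
        Set<String> taggedScenario = Set.of("@GCS_SINK_TEST", "@BQ_SOURCE_TEST", "@GCS_Sink_Required");
        Set<String> untaggedScenario = Set.of("@GCS_SINK_TEST", "@BQ_SOURCE_TEST");

        System.out.println(isSelected(taggedScenario, "@GCS_Sink_Required"));   // true
        System.out.println(isSelected(untaggedScenario, "@GCS_Sink_Required")); // false
    }
}
```

This also shows why the PR only appends tags to existing annotation lines: a scenario keeps all its previous tags (so existing runners still pick it up) while gaining membership in the new required subset.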