diff --git a/src/e2e-test/features/bigquery/sink/BigQueryToBigQuerySink.feature b/src/e2e-test/features/bigquery/sink/BigQueryToBigQuerySink.feature
index 540b7e3d73..6a1afdfcf1 100644
--- a/src/e2e-test/features/bigquery/sink/BigQueryToBigQuerySink.feature
+++ b/src/e2e-test/features/bigquery/sink/BigQueryToBigQuerySink.feature
@@ -196,7 +196,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
     Then Verify the pipeline status is "Succeeded"
     Then Verify the partition table is created with partitioned on field "bqPartitionFieldTime"

-  @BQ_EXISTING_SOURCE_DATATYPE_TEST @BQ_EXISTING_SINK_DATATYPE_TEST @EXISTING_BQ_CONNECTION
+  @BQ_EXISTING_SOURCE_DATATYPE_TEST @BQ_EXISTING_SINK_DATATYPE_TEST @EXISTING_BQ_CONNECTION @BigQuery_Sink_Required @ITN_TEST
   Scenario: Validate user is able to read the records from BigQuery source(existing table),source table here has more columns than BigQuery sink(existing table) with update button schema with use connection functionality
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
@@ -244,7 +244,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
     Then Verify the pipeline status is "Succeeded"
     Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bgInsertDatatypeFile"

-  @BQ_INSERT_SOURCE_TEST @BQ_UPDATE_SINK_TEST @EXISTING_BQ_CONNECTION
+  @BQ_INSERT_SOURCE_TEST @BQ_UPDATE_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Sink_Required @ITN_TEST
   Scenario:Validate successful records transfer from BigQuery to BigQuery with Advanced Operations Update without updating the destination table schema with use connection functionality
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
@@ -295,7 +295,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
     Then Verify the pipeline status is "Succeeded"
     Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

-  @BQ_INSERT_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
+  @BQ_INSERT_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Sink_Required @ITN_TEST
   Scenario:Validate successful records transfer from BigQuery to BigQuery with Advanced operations Upsert without updating the destination table schema with use connection functionality
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
diff --git a/src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature b/src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature
index 6e78173267..299a48125b 100644
--- a/src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature
+++ b/src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature
@@ -263,7 +263,7 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
     Then Verify the pipeline status is "Succeeded"
     Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

-  @BQ_EXISTING_SOURCE_TEST @BQ_EXISTING_SINK_TEST @EXISTING_BQ_CONNECTION
+  @BQ_EXISTING_SOURCE_TEST @BQ_EXISTING_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Source_Required @ITN_TEST
   Scenario: Validate user is able to read data from BigQuery source(existing table) and store them in BigQuery sink(existing table) with use connection functionality
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
@@ -310,7 +310,7 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
     Then Verify the pipeline status is "Succeeded"
     Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqExpectedFile"

-  @BQ_EXISTING_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
+  @BQ_EXISTING_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Source_Required @ITN_TEST
   Scenario: Validate user is able to read data from BigQuery source(existing table) without clicking on the validate button of BigQuery source and store them in BigQuery sink(new table) with use connection functionality
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
diff --git a/src/e2e-test/features/gcs/source/GCSToGCSAdditonalTests.feature b/src/e2e-test/features/gcs/source/GCSToGCSAdditonalTests.feature
index 554db41e7c..5381fd80ab 100644
--- a/src/e2e-test/features/gcs/source/GCSToGCSAdditonalTests.feature
+++ b/src/e2e-test/features/gcs/source/GCSToGCSAdditonalTests.feature
@@ -1,7 +1,7 @@
 @GCS_Source
 Feature: GCS source - Verification of GCS to GCS Additional Tests successful

-  @GCS_AVRO_FILE @GCS_SINK_TEST
+  @GCS_AVRO_FILE @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
   Scenario: To verify data is getting transferred from GCS to GCS using Avro and Json file format with different data types
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
@@ -43,7 +43,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
     Then Verify the pipeline status is "Succeeded"
     Then Validate the data transferred from GCS Source to GCS Sink with Expected avro file and target data in GCS bucket

-  @GCS_AVRO_FILE @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+  @GCS_AVRO_FILE @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required @ITN_TEST
   Scenario: To verify data is getting transferred from GCS to GCS using Avro and Json file format with different data types using connection
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
@@ -86,7 +86,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
     Then Verify the pipeline status is "Succeeded"
     Then Validate the data transferred from GCS Source to GCS Sink with Expected avro file and target data in GCS bucket

-  @GCS_CSV @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+  @GCS_CSV @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required @ITN_TEST
   Scenario: To verify data is getting transferred from GCS Source to GCS Sink using Schema Detection On Single File with connection
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
@@ -131,7 +131,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
     Then Verify the pipeline status is "Succeeded"
     Then Validate the data from GCS Source to GCS Sink with expected csv file and target data in GCS bucket

-  @GCS_CSV @GCS_SINK_TEST
+  @GCS_CSV @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
   Scenario: To verify data is getting transferred from GCS Source to GCS Sink using Schema Detection On Single File without connection
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
@@ -174,7 +174,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
     Then Verify the pipeline status is "Succeeded"
     Then Validate the data from GCS Source to GCS Sink with expected csv file and target data in GCS bucket

-  @GCS_CSV @GCS_SINK_TEST
+  @GCS_CSV @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
   Scenario: To verify the pipeline is getting failed from GCS to GCS when default schema is not cleared in GCS source On Single File
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
@@ -204,7 +204,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
     Then Open and capture logs
     Then Verify the pipeline status is "Failed"

-  @GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+  @GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required @ITN_TEST
   Scenario: To verify the pipeline is getting failed from GCS Source to GCS Sink On Multiple File having different schemas with connection
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
@@ -236,7 +236,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
     Then Wait till pipeline is in running state
     Then Verify the pipeline status is "Failed"

-  @GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST
+  @GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
   Scenario: To verify the pipeline is getting failed from GCS Source to GCS Sink On Multiple File having different schemas without connection
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
@@ -271,7 +271,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
       | Level | Message |
       | ERROR | errorMessageMultipleFileWithFirstRowAsHeaderEnabled |

-  @GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST
+  @GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
   Scenario: To verify the pipeline is getting failed from GCS to GCS when default schema is not cleared in GCS source On Multiple File
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
@@ -305,7 +305,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
       | Level | Message |
       | ERROR | errorMessageMultipleFileWithoutClearDefaultSchema |

-  @GCS_MULTIPLE_FILES_REGEX_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+  @GCS_MULTIPLE_FILES_REGEX_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required @ITN_TEST
   Scenario: To verify data is getting transferred from GCS to GCS On Multiple File with filter regex using connection
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
@@ -349,7 +349,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
     Then Verify the pipeline status is "Succeeded"
     Then Validate the data from GCS Source to GCS Sink with expected json file and target data in GCS bucket

-  @GCS_MULTIPLE_FILES_REGEX_TEST @GCS_SINK_TEST
+  @GCS_MULTIPLE_FILES_REGEX_TEST @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
   Scenario: To verify data is getting transferred from GCS to GCS On Multiple File with filter regex without using connection
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "GCS" from the plugins list as: "Source"
diff --git a/src/e2e-test/features/gcscopy/GCSCopy.feature b/src/e2e-test/features/gcscopy/GCSCopy.feature
index cc81c7f875..6d4614fcb2 100644
--- a/src/e2e-test/features/gcscopy/GCSCopy.feature
+++ b/src/e2e-test/features/gcscopy/GCSCopy.feature
@@ -15,7 +15,7 @@
 @GCSCopy
 Feature:GCSCopy - Verification of successful objects copy from one bucket to another

-  @CMEK @GCS_CSV_TEST @GCS_SINK_TEST
+  @CMEK @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
   Scenario:Validate successful copy object from one bucket to another new bucket along with data validation with default subdirectory and overwrite toggle button as false.
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -37,7 +37,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
     Then Validate GCSCopy successfully copies object "gcsCsvFile" to destination bucket
     Then Validate the data of GCS Copy source bucket and destination bucket "gcsCopyCsvExpectedFilePath"

-  @GCS_READ_RECURSIVE_TEST @GCS_SINK_TEST @GCSCopy_Required
+  @GCS_READ_RECURSIVE_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
   Scenario: Validate successful copy objects from one bucket to another with Copy All Subdirectories set to true along with data validation.
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -59,7 +59,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
     Then Close the pipeline logs
     Then Validate GCSCopy copies subdirectories along with its files to the destination bucket

-  @GCS_READ_RECURSIVE_TEST @GCS_SINK_TEST @GCSCopy_Required
+  @GCS_READ_RECURSIVE_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
   Scenario: Validate successful copy objects from one bucket to another with Copy All Subdirectories set to false along with data validation.
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -81,7 +81,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
     Then Close the pipeline logs
     Then Validate GCSCopy did not copy subdirectories along with its files to the destination bucket

-  @GCS_CSV_TEST @GCS_SINK_EXISTING_BUCKET_TEST @GCSCopy_Required
+  @GCS_CSV_TEST @GCS_SINK_EXISTING_BUCKET_TEST @GCSCopy_Required @ITN_TEST
   Scenario: Validate successful copy objects from one bucket to another existing bucket with Overwrite Existing Files set to true along with data validation.
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -104,7 +104,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
     Then Validate GCSCopy successfully copies object "gcsCsvFile" to destination bucket
     Then Validate the data of GCS Copy source bucket and destination bucket "gcsCopyCsvExpectedFilePath"

-  @GCS_CSV_TEST @GCS_SINK_EXISTING_BUCKET_TEST
+  @GCS_CSV_TEST @GCS_SINK_EXISTING_BUCKET_TEST @GCSCopy_Required @ITN_TEST
   Scenario: Validate successful copy objects from one bucket to another existing bucket with Overwrite Existing Files set to false along with data validation.
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -127,7 +127,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
     Then Validate GCSCopy failed to copy object "gcsCsvFile" to destination bucket
     Then Validate the data of GCS Copy source bucket and destination bucket "gcsCopyCsvExpectedFilePath"

-  @GCS_CSV_TEST @GCS_SINK_TEST
+  @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
   Scenario:Validate successful Copy object from one bucket to another new bucket with location set to non-default value
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
diff --git a/src/e2e-test/features/gcscopy/GCSCopyErrorScenarios.feature b/src/e2e-test/features/gcscopy/GCSCopyErrorScenarios.feature
index 5ce468b81f..ffd6877602 100644
--- a/src/e2e-test/features/gcscopy/GCSCopyErrorScenarios.feature
+++ b/src/e2e-test/features/gcscopy/GCSCopyErrorScenarios.feature
@@ -15,6 +15,7 @@
 @GCSCopy
 Feature: GCSCopy - Validate GCSCopy plugin error scenarios

+  @GCSCopy_Required @ITN_TEST
   Scenario:Verify GCSCopy plugin properties validation errors for mandatory fields
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -25,7 +26,7 @@ Feature: GCSCopy - Validate GCSCopy plugin error scenarios
       | sourcePath |
       | destPath |

-  @GCS_SINK_TEST
+  @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
   Scenario:Verify GCSCopy plugin error message for invalid bucket name in Source Path
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -36,7 +37,7 @@ Feature: GCSCopy - Validate GCSCopy plugin error scenarios
     Then Click on the Validate button
     Then Verify that the Plugin Property: "sourcePath" is displaying an in-line error message: "errorMessageInvalidSourcePath"

-  @GCS_CSV_TEST
+  @GCS_CSV_TEST @GCSCopy_Required @ITN_TEST
   Scenario:Verify GCSCopy plugin error message for invalid bucket name in Destination Path
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -47,7 +48,7 @@ Feature: GCSCopy - Validate GCSCopy plugin error scenarios
     Then Click on the Validate button
     Then Verify that the Plugin Property: "destPath" is displaying an in-line error message: "errorMessageInvalidDestPath"

-  @GCS_CSV_TEST @GCS_SINK_TEST
+  @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
   Scenario:Verify GCSCopy plugin error message for invalid Encryption Key Name
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
diff --git a/src/e2e-test/features/gcscopy/GCSCopy_WithMacro.feature b/src/e2e-test/features/gcscopy/GCSCopy_WithMacro.feature
index e33faa48c5..4ffca46355 100644
--- a/src/e2e-test/features/gcscopy/GCSCopy_WithMacro.feature
+++ b/src/e2e-test/features/gcscopy/GCSCopy_WithMacro.feature
@@ -15,7 +15,7 @@
 @GCSCopy
 Feature:GCSCopy - Verification of successful objects copy from one bucket to another with macro arguments

-  @CMEK @GCS_CSV_TEST @GCS_SINK_TEST
+  @CMEK @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
   Scenario:Validate successful copy object from one bucket to another new bucket with macro arguments
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
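
For context: @ITN_TEST and the per-plugin @..._Required tags added above are plain Cucumber tags, so scenario selection happens through a tag expression at run time. Below is a minimal sketch of a JUnit runner that would execute only the required BigQuery-sink ITN scenarios. The class name, package, feature path, glue package, and report plugin are hypothetical (this repo's actual runners may differ), and the tags attribute is written in the Cucumber-JVM 4.x form, where it is a string array of tag expressions (in 5.x+ it is a single string).

    package io.cdap.plugin.bigquery.runners; // hypothetical package

    import io.cucumber.junit.Cucumber;
    import io.cucumber.junit.CucumberOptions;
    import org.junit.runner.RunWith;

    // Hypothetical runner: selects only scenarios carrying both
    // @BigQuery_Sink_Required and @ITN_TEST, i.e. the tags added in this change.
    @RunWith(Cucumber.class)
    @CucumberOptions(
        features = {"src/e2e-test/features/bigquery/sink"},      // assumed feature path
        glue = {"io.cdap.plugin.bigquery.stepsdesign"},          // assumed step-definition package
        tags = {"@BigQuery_Sink_Required and @ITN_TEST"},        // tag expression over the new tags
        plugin = {"pretty", "html:target/cucumber-html-report"}  // assumed reporting plugins
    )
    public class BigQuerySinkRequiredTestRunner {
    }

The same filter can be applied without a dedicated runner class, e.g. mvn verify -Dcucumber.filter.tags="@ITN_TEST" on Cucumber-JVM 5.x+, or -Dcucumber.options="--tags @ITN_TEST" on older versions.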