Merge pull request data-integrations#1330 from cloudsufi/ITN_TAG

Added ITN Required Tag

Vipinofficial11 authored Nov 8, 2023
2 parents ae01662 + 3b832d3 commit 4d207bb

Showing 6 changed files with 26 additions and 25 deletions.
@@ -196,7 +196,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
Then Verify the pipeline status is "Succeeded"
Then Verify the partition table is created with partitioned on field "bqPartitionFieldTime"

-@BQ_EXISTING_SOURCE_DATATYPE_TEST @BQ_EXISTING_SINK_DATATYPE_TEST @EXISTING_BQ_CONNECTION
+@BQ_EXISTING_SOURCE_DATATYPE_TEST @BQ_EXISTING_SINK_DATATYPE_TEST @EXISTING_BQ_CONNECTION @BigQuery_Sink_Required @ITN_TEST
Scenario: Validate user is able to read the records from BigQuery source(existing table),source table here has more columns than BigQuery sink(existing table) with update button schema with use connection functionality
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
@@ -244,7 +244,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
Then Verify the pipeline status is "Succeeded"
Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bgInsertDatatypeFile"

-@BQ_INSERT_SOURCE_TEST @BQ_UPDATE_SINK_TEST @EXISTING_BQ_CONNECTION
+@BQ_INSERT_SOURCE_TEST @BQ_UPDATE_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Sink_Required @ITN_TEST
Scenario:Validate successful records transfer from BigQuery to BigQuery with Advanced Operations Update without updating the destination table schema with use connection functionality
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
@@ -295,7 +295,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
Then Verify the pipeline status is "Succeeded"
Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

-@BQ_INSERT_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
+@BQ_INSERT_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Sink_Required @ITN_TEST
Scenario:Validate successful records transfer from BigQuery to BigQuery with Advanced operations Upsert without updating the destination table schema with use connection functionality
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
@@ -263,7 +263,7 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
Then Verify the pipeline status is "Succeeded"
Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

-@BQ_EXISTING_SOURCE_TEST @BQ_EXISTING_SINK_TEST @EXISTING_BQ_CONNECTION
+@BQ_EXISTING_SOURCE_TEST @BQ_EXISTING_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Source_Required @ITN_TEST
Scenario: Validate user is able to read data from BigQuery source(existing table) and store them in BigQuery sink(existing table) with use connection functionality
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
@@ -310,7 +310,7 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
Then Verify the pipeline status is "Succeeded"
Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqExpectedFile"

-@BQ_EXISTING_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
+@BQ_EXISTING_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Source_Required @ITN_TEST
Scenario: Validate user is able to read data from BigQuery source(existing table) without clicking on the validate button of BigQuery source and store them in BigQuery sink(new table) with use connection functionality
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
20 changes: 10 additions & 10 deletions src/e2e-test/features/gcs/source/GCSToGCSAdditonalTests.feature
@@ -1,7 +1,7 @@
@GCS_Source
Feature: GCS source - Verification of GCS to GCS Additional Tests successful

-@GCS_AVRO_FILE @GCS_SINK_TEST
+@GCS_AVRO_FILE @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
Scenario: To verify data is getting transferred from GCS to GCS using Avro and Json file format with different data types
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
@@ -43,7 +43,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
Then Verify the pipeline status is "Succeeded"
Then Validate the data transferred from GCS Source to GCS Sink with Expected avro file and target data in GCS bucket

-@GCS_AVRO_FILE @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+@GCS_AVRO_FILE @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required @ITN_TEST
Scenario: To verify data is getting transferred from GCS to GCS using Avro and Json file format with different data types using connection
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
@@ -86,7 +86,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
Then Verify the pipeline status is "Succeeded"
Then Validate the data transferred from GCS Source to GCS Sink with Expected avro file and target data in GCS bucket

-@GCS_CSV @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+@GCS_CSV @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required @ITN_TEST
Scenario: To verify data is getting transferred from GCS Source to GCS Sink using Schema Detection On Single File with connection
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
@@ -131,7 +131,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
Then Verify the pipeline status is "Succeeded"
Then Validate the data from GCS Source to GCS Sink with expected csv file and target data in GCS bucket

-@GCS_CSV @GCS_SINK_TEST
+@GCS_CSV @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
Scenario: To verify data is getting transferred from GCS Source to GCS Sink using Schema Detection On Single File without connection
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
@@ -174,7 +174,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
Then Verify the pipeline status is "Succeeded"
Then Validate the data from GCS Source to GCS Sink with expected csv file and target data in GCS bucket

-@GCS_CSV @GCS_SINK_TEST
+@GCS_CSV @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
Scenario: To verify the pipeline is getting failed from GCS to GCS when default schema is not cleared in GCS source On Single File
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
@@ -204,7 +204,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
Then Open and capture logs
Then Verify the pipeline status is "Failed"

-@GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+@GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required @ITN_TEST
Scenario: To verify the pipeline is getting failed from GCS Source to GCS Sink On Multiple File having different schemas with connection
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
@@ -236,7 +236,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
Then Wait till pipeline is in running state
Then Verify the pipeline status is "Failed"

-@GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST
+@GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
Scenario: To verify the pipeline is getting failed from GCS Source to GCS Sink On Multiple File having different schemas without connection
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
@@ -271,7 +271,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
| Level | Message |
| ERROR | errorMessageMultipleFileWithFirstRowAsHeaderEnabled |

-@GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST
+@GCS_MULTIPLE_FILES_TEST @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
Scenario: To verify the pipeline is getting failed from GCS to GCS when default schema is not cleared in GCS source On Multiple File
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
@@ -305,7 +305,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
| Level | Message |
| ERROR | errorMessageMultipleFileWithoutClearDefaultSchema |

-@GCS_MULTIPLE_FILES_REGEX_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION
+@GCS_MULTIPLE_FILES_REGEX_TEST @GCS_SINK_TEST @EXISTING_GCS_CONNECTION @GCS_Source_Required @ITN_TEST
Scenario: To verify data is getting transferred from GCS to GCS On Multiple File with filter regex using connection
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
@@ -349,7 +349,7 @@ Feature: GCS source - Verification of GCS to GCS Additional Tests successful
Then Verify the pipeline status is "Succeeded"
Then Validate the data from GCS Source to GCS Sink with expected json file and target data in GCS bucket

-@GCS_MULTIPLE_FILES_REGEX_TEST @GCS_SINK_TEST
+@GCS_MULTIPLE_FILES_REGEX_TEST @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
Scenario: To verify data is getting transferred from GCS to GCS On Multiple File with filter regex without using connection
Given Open Datafusion Project to configure pipeline
When Select plugin: "GCS" from the plugins list as: "Source"
12 changes: 6 additions & 6 deletions src/e2e-test/features/gcscopy/GCSCopy.feature
@@ -15,7 +15,7 @@
@GCSCopy
Feature:GCSCopy - Verification of successful objects copy from one bucket to another

-@CMEK @GCS_CSV_TEST @GCS_SINK_TEST
+@CMEK @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario:Validate successful copy object from one bucket to another new bucket along with data validation with default subdirectory and overwrite toggle button as false.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -37,7 +37,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
Then Validate GCSCopy successfully copies object "gcsCsvFile" to destination bucket
Then Validate the data of GCS Copy source bucket and destination bucket "gcsCopyCsvExpectedFilePath"

-@GCS_READ_RECURSIVE_TEST @GCS_SINK_TEST @GCSCopy_Required
+@GCS_READ_RECURSIVE_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario: Validate successful copy objects from one bucket to another with Copy All Subdirectories set to true along with data validation.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -59,7 +59,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
Then Close the pipeline logs
Then Validate GCSCopy copies subdirectories along with its files to the destination bucket

-@GCS_READ_RECURSIVE_TEST @GCS_SINK_TEST @GCSCopy_Required
+@GCS_READ_RECURSIVE_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario: Validate successful copy objects from one bucket to another with Copy All Subdirectories set to false along with data validation.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -81,7 +81,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
Then Close the pipeline logs
Then Validate GCSCopy did not copy subdirectories along with its files to the destination bucket

-@GCS_CSV_TEST @GCS_SINK_EXISTING_BUCKET_TEST @GCSCopy_Required
+@GCS_CSV_TEST @GCS_SINK_EXISTING_BUCKET_TEST @GCSCopy_Required @ITN_TEST
Scenario: Validate successful copy objects from one bucket to another existing bucket with Overwrite Existing Files set to true along with data validation.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -104,7 +104,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
Then Validate GCSCopy successfully copies object "gcsCsvFile" to destination bucket
Then Validate the data of GCS Copy source bucket and destination bucket "gcsCopyCsvExpectedFilePath"

-@GCS_CSV_TEST @GCS_SINK_EXISTING_BUCKET_TEST
+@GCS_CSV_TEST @GCS_SINK_EXISTING_BUCKET_TEST @GCSCopy_Required @ITN_TEST
Scenario: Validate successful copy objects from one bucket to another existing bucket with Overwrite Existing Files set to false along with data validation.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -127,7 +127,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano
Then Validate GCSCopy failed to copy object "gcsCsvFile" to destination bucket
Then Validate the data of GCS Copy source bucket and destination bucket "gcsCopyCsvExpectedFilePath"

-@GCS_CSV_TEST @GCS_SINK_TEST
+@GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario:Validate successful Copy object from one bucket to another new bucket with location set to non-default value
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
7 changes: 4 additions & 3 deletions src/e2e-test/features/gcscopy/GCSCopyErrorScenarios.feature
@@ -15,6 +15,7 @@
@GCSCopy
Feature: GCSCopy - Validate GCSCopy plugin error scenarios

+@GCSCopy_Required @ITN_TEST
Scenario:Verify GCSCopy plugin properties validation errors for mandatory fields
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -25,7 +26,7 @@ Feature: GCSCopy - Validate GCSCopy plugin error scenarios
| sourcePath |
| destPath |

-@GCS_SINK_TEST
+@GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario:Verify GCSCopy plugin error message for invalid bucket name in Source Path
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -36,7 +37,7 @@ Feature: GCSCopy - Validate GCSCopy plugin error scenarios
Then Click on the Validate button
Then Verify that the Plugin Property: "sourcePath" is displaying an in-line error message: "errorMessageInvalidSourcePath"

-@GCS_CSV_TEST
+@GCS_CSV_TEST @GCSCopy_Required @ITN_TEST
Scenario:Verify GCSCopy plugin error message for invalid bucket name in Destination Path
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
@@ -47,7 +48,7 @@ Feature: GCSCopy - Validate GCSCopy plugin error scenarios
Then Click on the Validate button
Then Verify that the Plugin Property: "destPath" is displaying an in-line error message: "errorMessageInvalidDestPath"

-@GCS_CSV_TEST @GCS_SINK_TEST
+@GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario:Verify GCSCopy plugin error message for invalid Encryption Key Name
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
2 changes: 1 addition & 1 deletion src/e2e-test/features/gcscopy/GCSCopy_WithMacro.feature
@@ -15,7 +15,7 @@
@GCSCopy
Feature:GCSCopy - Verification of successful objects copy from one bucket to another with macro arguments

-@CMEK @GCS_CSV_TEST @GCS_SINK_TEST
+@CMEK @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario:Validate successful copy object from one bucket to another new bucket with macro arguments
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
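With the `@ITN_TEST` and per-plugin `*_Required` tags in place, a runner can select just these scenarios via a Cucumber tag expression. A minimal sketch, assuming the project's Cucumber-JVM runner honors the standard `cucumber.filter.tags` system property (the exact Maven goal or profile for this repo may differ):

```shell
# Run only scenarios tagged for ITN (assumed invocation; adjust the
# goal/profile to match the project's actual e2e test setup)
mvn clean test -Dcucumber.filter.tags="@ITN_TEST"

# Narrow to one plugin's required set, e.g. the GCSCopy scenarios
mvn clean test -Dcucumber.filter.tags="@ITN_TEST and @GCSCopy_Required"
```

Tag expressions compose with `and`, `or`, and `not`, so the same suite can also exclude a group, e.g. `"@ITN_TEST and not @CMEK"`.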