forked from cdapio/hydrator-plugins
Commit
This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
Showing 6 changed files with 156 additions and 0 deletions.
79 changes: 79 additions & 0 deletions
core-plugins/src/e2e-test/features/deduplicate/Deduplicate_RuntimeErrorScenarios.feature
@@ -0,0 +1,79 @@
@Deduplicate
Feature: Deduplicate - Verify Deduplicate Plugin Runtime Error Scenarios

  @GCS_DEDUPLICATE_TEST @FILE_SINK_TEST
  Scenario: Verify the Pipeline Fails When the Unique Field Column is Empty
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "File" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Analytics"
    When Select plugin: "Deduplicate" from the plugins list as: "Analytics"
    Then Connect plugins: "File" and "Deduplicate" to establish connection
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "File" from the plugins list as: "Sink"
    Then Connect plugins: "Deduplicate" and "File2" to establish connection
    Then Navigate to the properties page of plugin: "File"
    Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
    Then Enter input plugin property: "path" with value: "gcsDeduplicateTest"
    Then Select dropdown plugin property: "format" with option value: "csv"
    Then Click plugin property: "skipHeader"
    Then Click on the Get Schema button
    Then Verify the Output Schema matches the Expected Schema: "deduplicateOutputSchema"
    Then Validate "File" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Deduplicate"
    Then Validate "Deduplicate" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "File2"
    Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
    Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
    Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm-ss"
    Then Select dropdown plugin property: "format" with option value: "csv"
    Then Validate "File" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Failed"
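The scenario above asserts that a Deduplicate stage with no unique fields configured causes the run to fail. As a rough sketch of why (this is illustrative Python, not the plugin's actual Java implementation), deduplication keeps one record per unique-key tuple, so an empty field list leaves nothing to key on and is rejected up front:

```python
import csv
import io

def deduplicate(rows, unique_fields):
    """Keep the first record seen for each unique-field tuple.

    Hypothetical sketch of what a Deduplicate stage does; not CDAP code.
    """
    if not unique_fields:
        # Mirrors the scenario above: an empty unique-field list is invalid,
        # so the stage rejects the configuration and the pipeline fails.
        raise ValueError("uniqueFields must not be empty")
    seen = set()
    out = []
    for row in rows:
        key = tuple(row[f] for f in unique_fields)
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out

data = csv.DictReader(io.StringIO("first,last\nbob,smith\nbob,smith\nalice,smith\n"))
print(len(deduplicate(list(data), ["first", "last"])))  # 2
```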

  @GCS_DEDUPLICATE_TEST @FILE_SINK_TEST
  Scenario: Verify the pipeline fails from File to File using the Deduplicate plugin with an invalid partition count and an invalid unique field as macro arguments
    Given Open Datafusion Project to configure pipeline
    When Select plugin: "File" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Analytics"
    When Select plugin: "Deduplicate" from the plugins list as: "Analytics"
    Then Connect plugins: "File" and "Deduplicate" to establish connection
    Then Navigate to the properties page of plugin: "File"
    Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
    Then Enter input plugin property: "path" with value: "gcsDeduplicateTest"
    Then Select dropdown plugin property: "format" with option value: "csv"
    Then Click plugin property: "skipHeader"
    Then Click on the Get Schema button
    Then Verify the Output Schema matches the Expected Schema: "deduplicateOutputSchema"
    Then Validate "File" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Deduplicate"
    Then Click on the Macro button of Property: "deduplicateUniqueFields" and set the value to: "deduplicateUniqueFields"
    Then Click on the Macro button of Property: "deduplicateNumPartitions" and set the value to: "deduplicateNumberOfPartitions"
    Then Validate "Deduplicate" plugin properties
    Then Close the Plugin Properties page
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "File" from the plugins list as: "Sink"
    Then Connect plugins: "Deduplicate" and "File2" to establish connection
    Then Navigate to the properties page of plugin: "File2"
    Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
    Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
    Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm-ss"
    Then Select dropdown plugin property: "format" with option value: "csv"
    Then Validate "File" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "invalidUniqueField" for key "deduplicateUniqueFields"
    Then Enter runtime argument value "deduplicateInvalidNumberOfPartitions" for key "deduplicateNumberOfPartitions"
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Failed"
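The second scenario drives the failure through macros: the unique-fields and partition-count properties are left as `${...}` placeholders and only resolved from runtime arguments, so invalid values surface when the run starts rather than at deploy time. A hedged sketch of that resolve-then-validate flow (simplified illustration, not the CDAP macro engine):

```python
import re

def resolve_macros(config, runtime_args):
    """Substitute ${key} placeholders in string properties from runtime arguments.

    Simplified sketch of macro resolution; not the CDAP implementation.
    """
    def sub(value):
        return re.sub(r"\$\{(\w+)\}", lambda m: runtime_args[m.group(1)], value)
    return {k: sub(v) for k, v in config.items()}

def validate(config):
    # The partition count must parse as an integer; a runtime argument like
    # the scenario's invalid number-of-partitions value fails here, at run time.
    int(config["numPartitions"])

config = {"uniqueFields": "${deduplicateUniqueFields}",
          "numPartitions": "${deduplicateNumberOfPartitions}"}
args = {"deduplicateUniqueFields": "firstname",
        "deduplicateNumberOfPartitions": "not-a-number"}
resolved = resolve_macros(config, args)
try:
    validate(resolved)
except ValueError:
    print("pipeline run fails: invalid numPartitions")
```

Deploy-time validation cannot catch this because the placeholder values are unknown until the run is started with arguments, which is why the scenario asserts the "Failed" status only after launching the pipeline.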
2 changes: 2 additions & 0 deletions
...plugins/src/e2e-test/resources/testdata/expected_outputs/CSV_DEDUPLICATE_TEST7.Output.csv
@@ -0,0 +1,2 @@
alice,smith,1.5,34567
bob,smith,50.23,12345
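The two rows above are the records the test expects to survive deduplication. The actual input for CSV_DEDUPLICATE_TEST7 is not part of this diff, but with a hypothetical input containing a duplicate key in the first column, first-occurrence deduplication on that column would yield exactly these lines:

```python
# Hypothetical input; the real CSV_DEDUPLICATE_TEST7 source data is not shown here.
input_lines = [
    "alice,smith,1.5,34567",
    "alice,jones,2.0,99999",   # duplicate key "alice": dropped
    "bob,smith,50.23,12345",
]

def dedupe_on_first_column(lines):
    """Keep the first line seen for each value of column 0."""
    seen, out = set(), []
    for line in lines:
        key = line.split(",")[0]
        if key not in seen:
            seen.add(key)
            out.append(line)
    return out

print("\n".join(dedupe_on_first_column(input_lines)))
# alice,smith,1.5,34567
# bob,smith,50.23,12345
```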