E2E test added for GCS multipart upload
neerajsinghal05 committed Nov 30, 2023
1 parent 34e3956 commit 57e8143
Showing 3 changed files with 61 additions and 1 deletion.
10 changes: 10 additions & 0 deletions pom.xml
@@ -1210,6 +1210,16 @@
      <version>1.2.8</version>
      <scope>runtime</scope>
    </dependency>
    <dependency>
      <groupId>com.google.apis</groupId>
      <artifactId>google-api-services-storage</artifactId>
      <version>v1-rev20220604-1.32.1</version>
    </dependency>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>google-cloud-storage</artifactId>
      <version>2.8.0</version>
    </dependency>
  </dependencies>

</profile>
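For orientation (this is not part of the commit): the two artifacts above are the Google Cloud Storage client libraries that the new test hooks further down rely on. A minimal sketch of bootstrapping the google-cloud-storage client, with credentials and project resolved from the environment, looks like this:

import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class StorageBootstrapExample {
  public static void main(String[] args) {
    // Application Default Credentials and the default project are resolved from
    // the environment (e.g. GOOGLE_APPLICATION_CREDENTIALS, GOOGLE_CLOUD_PROJECT).
    Storage storage = StorageOptions.getDefaultInstance().getService();
    System.out.println("Using project: " + storage.getOptions().getProjectId());
  }
}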
40 changes: 40 additions & 0 deletions src/e2e-test/features/bigquery/source/BigQueryToGCS.feature
@@ -116,3 +116,43 @@ Feature: BigQuery source - Verification of BigQuery to GCS successful data transfer
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Verify data is transferred to target GCS bucket

@BQ_SOURCE_DATATYPE_TEST @GCS_SINK_MULTI_PART_UPLOAD
Scenario: Validate successful records transfer from BigQuery to GCS with bucket having delete multi part upload policy enabled
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "GCS" from the plugins list as: "Sink"
Then Navigate to the properties page of plugin: "BigQuery"
And Enter input plugin property: "referenceName" with value: "Reference"
And Replace input plugin property: "project" with value: "projectId"
And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
And Replace input plugin property: "dataset" with value: "dataset"
Then Override Service account details if set in environment variables
And Enter input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate output schema with expectedSchema "bqSourceSchemaDatatype"
Then Validate "BigQuery" plugin properties
Then Close the BigQuery properties
Then Navigate to the properties page of plugin: "GCS"
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter GCS sink property path
Then Select dropdown plugin property: "select-format" with option value: "json"
Then Validate "GCS" plugin properties
Then Close the Plugin Properties page
Then Connect source as "BigQuery" and sink as "GCS" to establish connection
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Validate the values of records transferred to GCS bucket is equal to the values from source BigQuery table
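(Note: the @GCS_SINK_MULTI_PART_UPLOAD tag on this scenario is what connects it to the hooks in the Java changes below — the tagged @Before hook creates the bucket with the lifecycle policy before the run, and the tagged @After hook deletes it once the scenario finishes.)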
12 changes: 11 additions & 1 deletion
@@ -211,7 +211,7 @@ public static void createTargetGCSBucketWithCSVFile() throws IOException, URISyntaxException
    BeforeActions.scenario.write("GCS target bucket name - " + gcsTargetBucketName);
  }

-  @After(order = 1, value = "@GCS_SINK_TEST or @GCS_SINK_EXISTING_BUCKET_TEST")
+  @After(order = 1, value = "@GCS_SINK_TEST or @GCS_SINK_EXISTING_BUCKET_TEST or @GCS_SINK_MULTI_PART_UPLOAD")
  public static void deleteTargetBucketWithFile() {
    deleteGCSBucket(gcsTargetBucketName);
    PluginPropertyUtils.removePluginProp("gcsTargetBucketName");
@@ -1030,4 +1030,14 @@ public static void createSinkBQExistingDatatypeTable() throws IOException, InterruptedException
    PluginPropertyUtils.addPluginProp("bqTargetTable", bqTargetTable);
    BeforeActions.scenario.write("BQ Target Table " + bqTargetTable + " updated successfully");
  }
  private static String createGCSBucketLifeCycle() throws IOException, URISyntaxException {
    // Creates the target bucket with a 30-day lifecycle rule, i.e. the
    // delete-multipart-upload policy exercised by @GCS_SINK_MULTI_PART_UPLOAD.
    String bucketName = StorageClient.createBucketwithLifeCycle("00000000-e2e-" + UUID.randomUUID(), 30).getName();
    PluginPropertyUtils.addPluginProp("gcsTargetBucketName", bucketName);
    return bucketName;
  }

  @Before(order = 1, value = "@GCS_SINK_MULTI_PART_UPLOAD")
  public static void createBucketWithLifeCycle() throws IOException, URISyntaxException {
    gcsTargetBucketName = createGCSBucketLifeCycle();
    BeforeActions.scenario.write("GCS target bucket name - " + gcsTargetBucketName);
  }
}
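The commit does not show StorageClient.createBucketwithLifeCycle itself. As a sketch only: with the google-cloud-storage dependency added in pom.xml, a helper of that shape could be written as below. The AbortIncompleteMPUpload lifecycle action and age condition are part of the public client API, but the exact signature and rule used by the project's real StorageClient are assumptions.

import com.google.cloud.storage.Bucket;
import com.google.cloud.storage.BucketInfo;
import com.google.cloud.storage.BucketInfo.LifecycleRule;
import com.google.cloud.storage.BucketInfo.LifecycleRule.LifecycleAction;
import com.google.cloud.storage.BucketInfo.LifecycleRule.LifecycleCondition;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.util.Collections;

public final class StorageClientSketch {

  // Hypothetical stand-in for StorageClient.createBucketwithLifeCycle(name, ageDays):
  // creates a bucket with a lifecycle rule that aborts (deletes) incomplete
  // multipart uploads once they are more than ageDays days old.
  public static Bucket createBucketwithLifeCycle(String bucketName, int ageDays) {
    Storage storage = StorageOptions.getDefaultInstance().getService();
    LifecycleRule deleteIncompleteMultipartUploads = new LifecycleRule(
        LifecycleAction.newAbortIncompleteMPUploadAction(),
        LifecycleCondition.newBuilder().setAge(ageDays).build());
    return storage.create(
        BucketInfo.newBuilder(bucketName)
            .setLifecycleRules(Collections.singletonList(deleteIncompleteMultipartUploads))
            .build());
  }
}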
