E2E Enhancements
satyamsufi committed Feb 28, 2023
1 parent fe35c1b commit 59ac3cc
Showing 13 changed files with 128 additions and 86 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/e2e.yml
@@ -55,12 +55,14 @@ jobs:
secrets: |-
ZENDESK_EMAIL:cdapio-github-builds/ZENDESK_EMAIL
ZENDESK_API_TOKEN:cdapio-github-builds/ZENDESK_API_TOKEN
ZENDESK_PASSWORD:cdapio-github-builds/ZENDESK_PASSWORD
- name: Run tests
run: python3 e2e/src/main/scripts/run_e2e_test.py
env:
ZENDESK_EMAIL: ${{ steps.secrets.outputs.ZENDESK_EMAIL }}
ZENDESK_API_TOKEN: ${{ steps.secrets.outputs.ZENDESK_API_TOKEN }}
ZENDESK_PASSWORD: ${{ steps.secrets.outputs.ZENDESK_PASSWORD }}

- name: Upload report
uses: actions/upload-artifact@v3
12 changes: 11 additions & 1 deletion pom.xml
@@ -572,6 +572,9 @@
<ZENDESK_API_TOKEN>
${ZENDESK_API_TOKEN}
</ZENDESK_API_TOKEN>
<ZENDESK_PASSWORD>
${ZENDESK_PASSWORD}
</ZENDESK_PASSWORD>
</environmentVariables>

</configuration>
@@ -631,9 +634,16 @@
<dependency>
<groupId>io.cdap.tests.e2e</groupId>
<artifactId>cdap-e2e-framework</artifactId>
<version>0.1.0-SNAPSHOT</version>
<version>0.2.0-SNAPSHOT</version>
<scope>test</scope>
</dependency>

<dependency>
<groupId>io.rest-assured</groupId>
<artifactId>rest-assured</artifactId>
<version>5.1.1</version>
</dependency>

</dependencies>
</profile>
<profile>
11 changes: 6 additions & 5 deletions src/e2e-test/features/zendesksource/RunTime.feature
@@ -17,7 +17,7 @@
@Regression
Feature: Zendesk Source - Run time scenarios

@TS-ZD-RNTM-01 @BQ_SINK @FILE_PATH @BQ_SINK_CLEANUP
@TS-ZD-RNTM-01 @BQ_SINK @BQ_SINK_CLEANUP @CREATE_GROUP @DELETE_GROUP
Scenario: Verify user should be able to preview and deploy the pipeline when plugin is configured for a Non hierarchical object
When Open Datafusion Project to configure pipeline
And Select plugin: "Zendesk" from the plugins list as: "Source"
@@ -38,6 +38,7 @@ Feature: Zendesk Source - Run time scenarios
And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
And Enter input plugin property: "dataset" with value: "dataset"
And Enter input plugin property: "table" with value: "bqtarget.table"
And Replace input plugin property: "serviceFilePath" with value: "file.path"
And Validate "BigQuery" plugin properties
And Close the Plugin Properties page
And Preview and run the pipeline
@@ -52,9 +53,9 @@ Feature: Zendesk Source - Run time scenarios
And Open and capture logs
And Verify the pipeline status is "Succeeded"
And Close the pipeline logs
Then Validate record created in Sink application for Single object is equal to expected output file "groupsTestOutputFile"
Then Validate record created in Sink application for Single object is equal to expected output file

@TS-ZD-RNTM-02 @BQ_SINK @FILE_PATH @BQ_SINK_CLEANUP
@TS-ZD-RNTM-02 @BQ_SINK @BQ_SINK_CLEANUP @CREATE_GROUP @DELETE_GROUP
Scenario: Verify user should be able to preview and deploy the pipeline when plugin is configured for Advanced properties
When Open Datafusion Project to configure pipeline
And Select plugin: "Zendesk" from the plugins list as: "Source"
@@ -91,10 +92,10 @@ Feature: Zendesk Source - Run time scenarios
And Open and capture logs
And Verify the pipeline status is "Succeeded"
And Close the pipeline logs
Then Validate record created in Sink application for Single object is equal to expected output file "groupsTestOutputFile"
Then Validate record created in Sink application for Single object is equal to expected output file

@TS-ZD-RNTM-03
Scenario: Verify user should be able to preview and deploy and Run the pipeline when plugin is configured for a herarchical object with File Sink
Scenario: Verify user should be able to preview and deploy and Run the pipeline when plugin is configured for a hierarchical object with File Sink
When Open Datafusion Project to configure pipeline
And Select plugin: "Zendesk" from the plugins list as: "Source"
And Navigate to the properties page of plugin: "Zendesk"
8 changes: 4 additions & 4 deletions src/e2e-test/features/zendesksource/RunTimeWithMacros.feature
@@ -17,7 +17,7 @@
@Regression
Feature: Zendesk Source - Run time scenarios

@TS-ZD-RNTM-MACRO-01 @BQ_SINK @BQ_SINK_CLEANUP @FILE_PATH
@TS-ZD-RNTM-MACRO-01 @BQ_SINK @BQ_SINK_CLEANUP @TEST_DATA @DELETE_TEST_DATA
Scenario: Verify user should be able to preview and deploy the pipeline when plugin is configured for a Non hierarchical Object with macros
When Open Datafusion Project to configure pipeline
And Select plugin: "Zendesk" from the plugins list as: "Source"
@@ -69,9 +69,9 @@ Feature: Zendesk Source - Run time scenarios
And Open and capture logs
And Verify the pipeline status is "Succeeded"
And Close the pipeline logs
Then Validate record created in Sink application for Single object is equal to expected output file "groupsTestOutputFile"
Then Validate record created in Sink application for Single object is equal to expected output file

@TS-ZD-RNTM-MACRO-02 @BQ_SINK @BQ_SINK_CLEANUP @FILE_PATH
@TS-ZD-RNTM-MACRO-02 @BQ_SINK @BQ_SINK_CLEANUP @TEST_DATA @DELETE_TEST_DATA
Scenario: Verify user should be able to preview and deploy the pipeline when plugin is configured for Advanced Properties with macros
When Open Datafusion Project to configure pipeline
And Select plugin: "Zendesk" from the plugins list as: "Source"
@@ -116,7 +116,7 @@ Feature: Zendesk Source - Run time scenarios
And Open and capture logs
And Verify the pipeline status is "Succeeded"
And Close the pipeline logs
Then Validate record created in Sink application for Single object is equal to expected output file "groupsTestOutputFile"
Then Validate record created in Sink application for Single object is equal to expected output file

@TS-ZD-RNTM-MACRO-03 @BQ_SINK @BQ_SINK_CLEANUP
Scenario: Verify pipeline failure message in logs when user provides invalid Credentials with Macros
37 changes: 21 additions & 16 deletions src/e2e-test/java/io/cdap/plugin/tests/hooks/TestSetupHooks.java
@@ -18,6 +18,7 @@
import com.google.cloud.bigquery.BigQueryException;
import io.cdap.e2e.utils.BigQueryClient;
import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cdap.plugin.zendesk.actions.DataValidationHelper;
import io.cucumber.java.After;
import io.cucumber.java.Before;
import org.apache.commons.lang3.RandomStringUtils;
@@ -26,12 +27,27 @@
import stepsdesign.BeforeActions;
import java.io.IOException;
import java.nio.file.Paths;
import java.util.Base64;

/**
* Represents Test Setup and Clean up hooks.
*/
public class TestSetupHooks {
private static boolean firstFileSinkTestFlag = true;
public static String testdata_Group = "";
public static String cred = "";


@Before(order = 1, value = "@CREATE_GROUP")
public void createGroup() {
Base64.Encoder encoder = Base64.getUrlEncoder();
String email = System.getenv("ZENDESK_EMAIL");
String password = System.getenv("ZENDESK_PASSWORD");
String auth = email + ":" + password;
String encodedAuth = encoder.encodeToString(auth.getBytes());
cred = "Basic " + encodedAuth;
String jsonBody = "{\"group\": {\"name\": \"My Group" + RandomStringUtils.randomAlphanumeric(10) + "\"}}";
testdata_Group = DataValidationHelper.createGroup(cred, jsonBody);
}

@Before(order = 1, value = "@BQ_SINK")
public void setTempTargetBQDataset() {
@@ -77,19 +93,8 @@ public void deleteMultiSourceTargetBQTable() throws IOException, InterruptedExce
}
}
}

@Before(order = 1, value = "@FILE_PATH")
public static void setFileAbsolutePath() {

if (firstFileSinkTestFlag) {
PluginPropertyUtils.addPluginProp("groupsTestOutputFile", Paths.get(TestSetupHooks.class.getResource
("/" + PluginPropertyUtils.pluginProp("groupsTestOutputFile")).getPath()).toString());
PluginPropertyUtils.addPluginProp("multiObjectsOutputFile", Paths.get(TestSetupHooks.class.getResource
("/" + PluginPropertyUtils.pluginProp("multiObjectsOutputFile")).getPath()).toString());
PluginPropertyUtils.addPluginProp("multiObjectsOutputFile1", Paths.get(TestSetupHooks.class.getResource
("/" + PluginPropertyUtils.pluginProp("multiObjectsOutputFile1")).getPath()).toString());

firstFileSinkTestFlag = false;
}
@After(order = 2, value = "@DELETE_GROUP")
public void deleteGroup() {
DataValidationHelper.deleteGroup(cred);
}
}
}
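For orientation, the new @CREATE_GROUP hook above builds a Basic authorization header from the ZENDESK_EMAIL and ZENDESK_PASSWORD environment variables, posts a randomly named group through DataValidationHelper, and keeps the raw JSON response in testdata_Group for later validation. The sketch below shows that flow end to end; it is not part of this commit, and the response shape ({"group": {"id": ..., "name": ...}}) is an assumption based on the public Zendesk Groups API, with a purely illustrative sample payload.

// Sketch (not part of this commit): how the @CREATE_GROUP test data is built and consumed.
// Assumes the create-group response follows the public Zendesk Groups API shape.
import com.google.gson.Gson;
import com.google.gson.JsonObject;
import java.util.Base64;

public class GroupTestDataSketch {
  public static void main(String[] args) {
    String email = System.getenv("ZENDESK_EMAIL");
    String password = System.getenv("ZENDESK_PASSWORD");
    String cred = "Basic " + Base64.getUrlEncoder()
        .encodeToString((email + ":" + password).getBytes());

    // In the real hook this JSON comes back from DataValidationHelper.createGroup(cred, jsonBody).
    String testdataGroup = "{\"group\": {\"id\": 123456789, \"name\": \"My GroupAb12Cd34Ef\"}}";

    // The validation step later extracts the generated id to look the record up in BigQuery.
    JsonObject group = new Gson().fromJson(testdataGroup, JsonObject.class).getAsJsonObject("group");
    System.out.println("Authorization header: " + cred);
    System.out.println("Created group id: " + group.get("id").getAsBigInteger());
  }
}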
50 changes: 50 additions & 0 deletions src/e2e-test/java/io/cdap/plugin/zendesk/actions/DataValidationHelper.java
@@ -0,0 +1,50 @@
/*
* Copyright © 2022 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/

package io.cdap.plugin.zendesk.actions;

import io.restassured.http.ContentType;
import io.restassured.response.Response;
import io.restassured.response.ResponseBody;
import static io.restassured.RestAssured.given;


/**
* Zendesk utility - enhancements.
*/
public class DataValidationHelper {
private static String baseURI = "https://cloudsufi.zendesk.com/api/v2";
public static String createGroup(String cred, String jsonBody) {
Response response = given()
.header("authorization", cred)
.accept(ContentType.JSON)
.contentType(ContentType.JSON)
.and()
.body(jsonBody)
.when()
.post(baseURI + "/groups.json")
.then().extract().response();

ResponseBody responseBody = response;

return responseBody.asString();
}
public static void deleteGroup(String cred) {
Response response1 = given()
.header("authorization", cred)
.delete(baseURI + "/groups/" + ZendeskPropertiesPageActions.uniqueId + ".json");
}
}
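One note on the helper above: it extracts the response body without asserting on the HTTP status, so a failed create would only surface later when the BigQuery validation finds no matching record. A fail-fast variant is sketched below; it is not part of this commit, and the 201 status code is an assumption about the Zendesk create-group endpoint.

// Sketch (not part of this commit): the same create-group call with an explicit
// status-code assertion so a bad credential or payload fails the hook immediately.
import io.restassured.http.ContentType;
import io.restassured.response.Response;

import static io.restassured.RestAssured.given;

public class DataValidationHelperSketch {
  private static final String BASE_URI = "https://cloudsufi.zendesk.com/api/v2";

  public static String createGroupChecked(String cred, String jsonBody) {
    Response response = given()
        .header("authorization", cred)
        .accept(ContentType.JSON)
        .contentType(ContentType.JSON)
        .body(jsonBody)
        .when()
        .post(BASE_URI + "/groups.json")
        .then()
        .statusCode(201) // assumed success code for POST /groups.json
        .extract().response();
    return response.asString();
  }
}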
src/e2e-test/java/io/cdap/plugin/zendesk/actions/ZendeskPropertiesPageActions.java
Expand Up @@ -26,6 +26,7 @@
import io.cdap.e2e.utils.ElementHelper;
import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cdap.e2e.utils.SeleniumHelper;
import io.cdap.plugin.tests.hooks.TestSetupHooks;
import io.cdap.plugin.utils.enums.Subdomains;
import io.cdap.plugin.zendesk.locators.ZendeskPropertiesPage;
import org.junit.Assert;
@@ -49,6 +50,8 @@ public class ZendeskPropertiesPageActions {
private static Gson gson = new Gson();
private static List<String> bigQueryrows = new ArrayList<>();

public static BigInteger uniqueId;

static {
SeleniumHelper.getPropertiesLocators(ZendeskPropertiesPage.class);
}
@@ -91,31 +94,21 @@ public static void selectDropdowWithMultipleOptionsForObjectsToSkip(List<String>
ElementHelper.clickUsingActions(CdfPluginPropertiesLocators.pluginPropertiesPageHeader);
}

public static void verifyIfRecordCreatedInSinkForSingleObjectIsCorrect(String expectedOutputFile)
public static void verifyIfRecordCreatedInSinkForSingleObjectIsCorrect()
throws IOException, InterruptedException {
List<String> expectedOutput = new ArrayList<>();
try (BufferedReader bf1 = Files.newBufferedReader(Paths.get(PluginPropertyUtils.pluginProp(expectedOutputFile)))) {
String line;
while ((line = bf1.readLine()) != null) {
expectedOutput.add(line);
}
}

for (int expectedRow = 0; expectedRow < expectedOutput.size(); expectedRow++) {
JsonObject expectedOutputAsJson = gson.fromJson(expectedOutput.get(expectedRow), JsonObject.class);
BigInteger uniqueId = expectedOutputAsJson.get("id").getAsBigInteger();
JsonObject expectedOutputAsJson = gson.fromJson(TestSetupHooks.testdata_Group, JsonObject.class);
JsonObject group = (JsonObject) expectedOutputAsJson.get("group");
System.out.println(group);
uniqueId = group.get("id").getAsBigInteger();
getBigQueryTableData(PluginPropertyUtils.pluginProp("dataset"),
PluginPropertyUtils.pluginProp("bqtarget.table"), uniqueId);

}
for (int row = 0; row < bigQueryrows.size() && row < expectedOutput.size(); row++) {
Assert.assertTrue(ZendeskPropertiesPageActions.compareValueOfBothResponses(expectedOutput.get(row),
bigQueryrows.get(row)));
Assert.assertTrue(ZendeskPropertiesPageActions.compareValueOfBothResponses(group.toString(),
bigQueryrows.get(0)));
}
}


public static void verifyIfRecordCreatedInSinkForMultipleObjectsAreCorrect(String expectedOutputFile)
throws IOException, InterruptedException {
throws IOException, InterruptedException {
List<String> expectedOutput = new ArrayList<>();
try (BufferedReader bf1 = Files.newBufferedReader(Paths.get(PluginPropertyUtils.pluginProp(expectedOutputFile)))) {
String line;
@@ -126,21 +119,22 @@ public static void verifyIfRecordCreatedInSinkForMultipleObjectsAreCorrect(Strin

List<String> bigQueryDatasetTables = new ArrayList<>();
TableResult tablesSchema = ZendeskPropertiesPageActions.getTableNamesFromDataSet
(PluginPropertyUtils.pluginProp("dataset"));
(PluginPropertyUtils.pluginProp("dataset"));
tablesSchema.iterateAll().forEach(value -> bigQueryDatasetTables.add(value.get(0).getValue().toString()));
System.out.println(bigQueryDatasetTables.size());

for (int expectedRow = 0; expectedRow < expectedOutput.size(); expectedRow++) {
JsonObject expectedOutputAsJson = gson.fromJson(expectedOutput.get(expectedRow), JsonObject.class);
BigInteger uniqueId = expectedOutputAsJson.get("id").getAsBigInteger();
getBigQueryTableData(PluginPropertyUtils.pluginProp("dataset"),
bigQueryDatasetTables.get(0), uniqueId);
bigQueryDatasetTables.get(0), uniqueId);
}
for (int row = 0; row < bigQueryrows.size() && row < expectedOutput.size(); row++) {
Assert.assertTrue(ZendeskPropertiesPageActions.compareValueOfBothResponses(expectedOutput.get(row),
bigQueryrows.get(row)));
bigQueryrows.get(row)));
}
}
}


static boolean compareValueOfBothResponses(String zendeskResponse, String bigQueryResponse) {
Type type = new TypeToken<Map<String, Object>>() {
@@ -149,13 +143,17 @@ static boolean compareValueOfBothResponses(String zendeskResponse, String bigQue
Map<String, Object> bigQueryResponseInMap = gson.fromJson(bigQueryResponse, type);
MapDifference<String, Object> mapDifference = Maps.difference(zendeskResponseInmap, bigQueryResponseInMap);
logger.info("Assertion :" + mapDifference);

return mapDifference.areEqual();
}

public static void getBigQueryTableData(String dataset, String table, BigInteger uniqueId)
throws IOException, InterruptedException {
String projectId = PluginPropertyUtils.pluginProp("projectId");

// Altering the table to get the field names in sync with Zendesk response.
String alterQuery = "ALTER TABLE `" + projectId + "." + dataset + "." + table +
"` RENAME COLUMN updatedAt to updated_at,RENAME COLUMN createdAt to created_at ";
BigQueryClient.executeQuery(alterQuery);
String selectQuery = "SELECT TO_JSON(t) FROM `" + projectId + "." + dataset + "." + table + "` AS t WHERE " +
"id=" + uniqueId + " ";
TableResult result = BigQueryClient.getQueryResult(selectQuery);
@@ -169,5 +167,5 @@ public static TableResult getTableNamesFromDataSet(String bqTargetDataset) throw

return BigQueryClient.getQueryResult(selectQuery);
}
}
}
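To summarize the validation path above: the Zendesk record and the BigQuery row (fetched with SELECT TO_JSON(t) ... WHERE id=<uniqueId>) are both deserialized into Map<String, Object> and diffed with Guava, which is why the ALTER TABLE step first renames createdAt/updatedAt to match Zendesk's created_at/updated_at field names. A minimal standalone sketch of that comparison follows; it is not part of this commit and the sample payloads are illustrative only.

// Sketch (not part of this commit): the map-based comparison used by
// compareValueOfBothResponses, shown on two hand-written payloads.
import com.google.common.collect.MapDifference;
import com.google.common.collect.Maps;
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import java.lang.reflect.Type;
import java.util.Map;

public class ResponseComparisonSketch {
  public static void main(String[] args) {
    Gson gson = new Gson();
    Type type = new TypeToken<Map<String, Object>>() { }.getType();

    String zendeskJson = "{\"id\":123,\"name\":\"My Group\",\"created_at\":\"2023-02-28\"}";
    String bigQueryJson = "{\"id\":123,\"name\":\"My Group\",\"created_at\":\"2023-02-28\"}";

    Map<String, Object> zendesk = gson.fromJson(zendeskJson, type);
    Map<String, Object> bigQuery = gson.fromJson(bigQueryJson, type);

    // areEqual() is what the test assertion ultimately checks;
    // entriesDiffering() is handy when debugging a mismatch.
    MapDifference<String, Object> difference = Maps.difference(zendesk, bigQuery);
    System.out.println("Equal: " + difference.areEqual());
    System.out.println("Differing entries: " + difference.entriesDiffering());
  }
}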

@@ -44,10 +44,10 @@ public void fillObjectsToSkipListWithBelowListedObjects(DataTable table) {
ZendeskPropertiesPageActions.selectDropdowWithMultipleOptionsForObjectsToSkip(tablesList);
}

@Then("Validate record created in Sink application for Single object is equal to expected output file {string}")
public void verifyIfNewRecordCreatedInSinkApplicationForObjectIsCorrect(String expectedOutputFile)
@Then("Validate record created in Sink application for Single object is equal to expected output file")
public void verifyIfNewRecordCreatedInSinkApplicationForObjectIsCorrect()
throws IOException, InterruptedException {
ZendeskPropertiesPageActions.verifyIfRecordCreatedInSinkForSingleObjectIsCorrect(expectedOutputFile);
ZendeskPropertiesPageActions.verifyIfRecordCreatedInSinkForSingleObjectIsCorrect();
}

@Then("Validate record created in Sink application for Multi object is equal to expected output file {string}")
3 changes: 3 additions & 0 deletions src/e2e-test/resources/pluginParameters.properties
@@ -1,6 +1,7 @@
#Credentialls
admin.email = ZENDESK_EMAIL
admin.apitoken = ZENDESK_API_TOKEN
password = ZENDESK_PASSWORD

#Valid Porperties
admin.subdomain = cloudsufi
@@ -28,6 +29,7 @@ projectId=cdf-athena
datasetprojectId=cdf-athena
dataset=enterprise_test_automation
bqtarget.table=target-table
file.path=/Users/tiwarisatyam/Downloads/cdf-athena-48053fab02c2.json

groups=Groups
ticket_comments=TicketComments
@@ -170,3 +172,4 @@ schema.objecttopull.satisfactionrating = [{"key":"id","value":"long"},\
{"key":"createdAt","value":"string"},{"key":"updatedAt","value":"string"},\
{"key":"comment","value":"string"},{"key":"reason","value":"string"},\
{"key":"reasonId","value":"long"},{"key":"reasonCode","value":"long"}]

This file was deleted.
