
E2E cloud datastore #49

Merged 1 commit on Apr 12, 2024
2 changes: 1 addition & 1 deletion .github/workflows/e2e.yml
@@ -40,7 +40,7 @@ jobs:
)
strategy:
matrix:
-      tests: [bigquery, common, gcs, pubsub, spanner, gcscreate, gcsdelete, gcsmove, bigqueryexecute, gcscopy]
+      tests: [bigquery, common, gcs, pubsub, spanner, gcscreate, gcsdelete, gcsmove, bigqueryexecute, gcscopy, datastore]
fail-fast: false
steps:
# Pinned 1.0.0 version
149 changes: 149 additions & 0 deletions src/e2e-test/features/datastore/runtime.feature
@@ -0,0 +1,149 @@
# Copyright © 2024 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@DataStore
Review comment (Collaborator): Please add the datastore folder name in the e2e.yml file to execute the tests on GitHub.
Reply (Author): Added.

Feature: DataStore - Verification of Datastore to Datastore Successful Data Transfer

@DATASTORE_SOURCE_ENTITY @datastore_Required
Scenario: To verify data is getting transferred from Datastore to Datastore successfully using filter and custom index
Given Open Datafusion Project to configure pipeline
Then Select plugin: "Datastore" from the plugins list as: "Source"
And Navigate to the properties page of plugin: "Datastore"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter key value pairs for plugin property: "filters" with values from json: "filterOptions"
Then Enter kind for datastore plugin
Then Select dropdown plugin property: "keyType" with option value: "None"
Then Click on the Get Schema button
Then Validate "Datastore" plugin properties
Then Close the Plugin Properties page
And Select Sink plugin: "Datastore" from the plugins list
Then Connect plugins: "Datastore" and "Datastore2" to establish connection
Then Navigate to the properties page of plugin: "Datastore2"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "refName"
Then Enter kind for datastore plugin
Then Select dropdown plugin property: "indexStrategy" with option value: "Custom"
Then Enter Value for plugin property table key : "indexedProperties" with values: "propertyName"
Then Validate "datastore2" plugin properties
Then Close the Plugin Properties page
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Validate OUT record count is equal to IN record count
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Review comment (@itsmekumari, Mar 13, 2024): Add a data validation step in all scenarios; refer to ITN if added.
Reply (Author): Right now we are not going with a data validation step for Datastore.

Then Validate The Data From Datastore To Datastore With Actual And Expected File for: "dsExpectedFile"

@DATASTORE_SOURCE_ENTITY @datastore_Required
Scenario: To verify data is getting transferred from Datastore to Datastore using URL-safe key
Given Open Datafusion Project to configure pipeline
Then Select plugin: "Datastore" from the plugins list as: "Source"
And Navigate to the properties page of plugin: "Datastore"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter key value pairs for plugin property: "filters" with values from json: "filterOptions"
Then Enter kind for datastore plugin
Then Select dropdown plugin property: "keyType" with option value: "URL-safe key"
Then Enter input plugin property: "keyAlias" with value: "fieldName"
Then Click on the Get Schema button
Then Validate "Datastore" plugin properties
Then Close the Plugin Properties page
And Select Sink plugin: "Datastore" from the plugins list
Then Connect plugins: "Datastore" and "Datastore2" to establish connection
Then Navigate to the properties page of plugin: "Datastore2"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "refName"
Then Select dropdown plugin property: "keyType" with option value: "URL-safe key"
Then Enter input plugin property: "keyAlias" with value: "fieldName"
Then Enter kind for datastore plugin
Then Enter Ancestor for the datastore plugin
Then Validate "datastore2" plugin properties
Then Close the Plugin Properties page
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Validate OUT record count is equal to IN record count
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From Datastore To Datastore With Actual And Expected File for: "dsExpectedFile"

@DATASTORE_SOURCE_ENTITY @datastore_Required
Scenario: To verify data is getting transferred from Datastore to Datastore using Ancestor and Key Literal
Given Open Datafusion Project to configure pipeline
Then Select plugin: "Datastore" from the plugins list as: "Source"
And Navigate to the properties page of plugin: "Datastore"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter kind for datastore plugin
Then Enter Ancestor for the datastore plugin
Then Select dropdown plugin property: "keyType" with option value: "Key literal"
Then Enter input plugin property: "keyAlias" with value: "fieldName"
Then Click on the Get Schema button
Then Validate "Datastore" plugin properties
Then Close the Plugin Properties page
And Select Sink plugin: "Datastore" from the plugins list
Then Connect plugins: "Datastore" and "Datastore2" to establish connection
Then Navigate to the properties page of plugin: "Datastore2"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "refName"
Then Select dropdown plugin property: "keyType" with option value: "Key literal"
Then Enter input plugin property: "keyAlias" with value: "fieldName"
Then Enter kind for datastore plugin
Then Enter Ancestor for the datastore plugin
Then Validate "datastore2" plugin properties
Then Close the Plugin Properties page
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Validate OUT record count is equal to IN record count
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From Datastore To Datastore With Actual And Expected File for: "dsExpectedFile"

@DATASTORE_SOURCE_ENTITY @datastore_Required
Scenario: To verify data is getting transferred from Datastore to Datastore using Ancestor and Custom Key
Given Open Datafusion Project to configure pipeline
Then Select plugin: "Datastore" from the plugins list as: "Source"
And Navigate to the properties page of plugin: "Datastore"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter kind for datastore plugin
Then Enter Ancestor for the datastore plugin
Then Select dropdown plugin property: "keyType" with option value: "Key literal"
Then Enter input plugin property: "keyAlias" with value: "fieldName"
Then Click on the Get Schema button
Then Validate "Datastore" plugin properties
Then Close the Plugin Properties page
And Select Sink plugin: "Datastore" from the plugins list
Then Connect plugins: "Datastore" and "Datastore2" to establish connection
Then Navigate to the properties page of plugin: "Datastore2"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "refName"
Then Select dropdown plugin property: "keyType" with option value: "Custom name"
Then Enter input plugin property: "keyAlias" with value: "fieldName"
Then Enter kind for datastore plugin
Then Validate "datastore2" plugin properties
Then Close the Plugin Properties page
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Validate OUT record count is equal to IN record count
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From Datastore To Datastore With Actual And Expected File for: "dsExpectedFile"
@@ -23,6 +23,7 @@
import io.cdap.e2e.utils.BigQueryClient;
import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cdap.e2e.utils.StorageClient;
import io.cdap.plugin.utils.DataStoreClient;
import io.cdap.plugin.utils.PubSubClient;
import io.cdap.plugin.utils.SpannerClient;
import io.cucumber.java.After;
@@ -69,6 +70,8 @@ public class TestSetupHooks {
public static String spannerTargetTable = StringUtils.EMPTY;
public static boolean firstSpannerTestFlag = true;
public static String datasetName = PluginPropertyUtils.pluginProp("dataset");
public static String kindName = StringUtils.EMPTY;
public static String targetKind = StringUtils.EMPTY;
public static String spannerExistingTargetTable = StringUtils.EMPTY;

@Before(order = 1)
@@ -1298,6 +1301,33 @@ public static void createBucketWithLifeCycle() throws IOException, URISyntaxException {
gcsTargetBucketName = createGCSBucketLifeCycle();
BeforeActions.scenario.write("GCS target bucket name - " + gcsTargetBucketName); }

@Before(order = 1, value = "@DATASTORE_SOURCE_ENTITY")
public static void createEntityInCloudDataStore() throws IOException, URISyntaxException {
kindName = "cdf-test-" + UUID.randomUUID().toString().substring(0, 8);
String entityName = DataStoreClient.createKind(kindName);
PluginPropertyUtils.addPluginProp("kindName", entityName);
BeforeActions.scenario.write("Kind name - " + entityName + " created successfully");
}

@After(order = 1, value = "@DATASTORE_SOURCE_ENTITY")
public static void deleteEntityInCloudDataStore() throws IOException, URISyntaxException {
DataStoreClient.deleteEntity(kindName);
BeforeActions.scenario.write("Kind name - " + kindName + " deleted successfully");
}

@Before(order = 2, value = "@DATASTORE_TARGET_ENTITY")
public static void setTempTargetKindName() {
targetKind = "cdf-target-test-" + UUID.randomUUID().toString().substring(0, 8);
PluginPropertyUtils.addPluginProp("targetKind", targetKind);
BeforeActions.scenario.write("Target kind name - " + targetKind);
}

@After(order = 1, value = "@DATASTORE_TARGET_ENTITY")
public static void deleteTargetEntityInCloudDataStore() throws IOException, URISyntaxException {
DataStoreClient.deleteEntity(targetKind);
BeforeActions.scenario.write("Target Kind name - " + targetKind + " deleted successfully");
}

@Before(order = 1, value = "@BQEXECUTE_SOURCE_TEST")
public static void createBQEcxecuteSourceBQTable() throws IOException, InterruptedException {
bqSourceTable = "E2E_SOURCE_" + UUID.randomUUID().toString().replaceAll("-", "_");
@@ -1361,5 +1391,4 @@ public static void makeExistingTargetSpannerDBAndTableName() {
e.printStackTrace();
}
}

}
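The `@Before`/`@After` hooks above give each scenario an isolated, collision-free kind by appending a short random suffix to a fixed prefix, then delete that kind after the run. A minimal standalone sketch of the naming scheme (pure JDK, no Datastore calls; the class and method names here are illustrative, not part of the framework):

```java
import java.util.UUID;

public class KindNameSketch {
    // Mirrors the hook's scheme: a fixed prefix plus the first 8 hex
    // characters of a random UUID, giving per-scenario isolation so
    // parallel or retried runs never collide on the same kind.
    static String newKindName(String prefix) {
        return prefix + "-" + UUID.randomUUID().toString().substring(0, 8);
    }

    public static void main(String[] args) {
        // e.g. cdf-test-3fa85f64 (suffix varies per run)
        System.out.println(newKindName("cdf-test"));
    }
}
```

An 8-character UUID prefix keeps names short enough for the UI while making accidental reuse across concurrent CI jobs very unlikely.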
@@ -0,0 +1,48 @@
/*
* Copyright © 2024 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/

package io.cdap.plugin.datastore.actions;

import io.cdap.e2e.utils.ElementHelper;
import io.cdap.e2e.utils.SeleniumHelper;
import io.cdap.plugin.datastore.locators.DataStoreLocators;
import io.cdap.plugin.utils.DataStoreClient;

/**
* DataStore Plugin related actions.
*/
public class DataStoreActions {
static {
SeleniumHelper.getPropertiesLocators(DataStoreLocators.class);
}

/**
* Enters the specified kind name into the appropriate field in the user interface.
*
* @param kindName the name of the kind to be entered
*/
public static void enterKind(String kindName) {
Review comment (Collaborator): Add a small Javadoc for each method.
Reply (Author): Added.

ElementHelper.sendKeys(DataStoreLocators.kind, kindName);
}

/**
* Enters the key literal of the current entity into the appropriate field in the user interface
* as the ancestor.
*/
public static void enterAncestor() {
Review comment (Collaborator): Why are we creating a method to send text? Can't we use existing framework steps?
Reply (Author): Per Ganesh's comments, we are using locators with data-testid wherever we can, so these two new fields use data-testid locators.

ElementHelper.sendKeys(DataStoreLocators.ancestor, DataStoreClient.getKeyLiteral());
}
}
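`enterAncestor` fills the ancestor field with `DataStoreClient.getKeyLiteral()`, whose implementation is not shown in this diff. As an illustration only, a Datastore-style key literal for an ancestor can be built as below; the exact format the plugin expects, and this helper itself, are assumptions, not the project's code:

```java
public class KeyLiteralSketch {
    // Hypothetical helper: builds a GQL-style key literal from a kind plus
    // either a string name (quoted) or a numeric id (unquoted).
    static String keyLiteral(String kind, Object identifier) {
        if (identifier instanceof Number) {
            return "Key(" + kind + ", " + identifier + ")";
        }
        return "Key(" + kind + ", '" + identifier + "')";
    }

    public static void main(String[] args) {
        System.out.println(keyLiteral("Task", "sampleTask")); // Key(Task, 'sampleTask')
        System.out.println(keyLiteral("Task", 42L));          // Key(Task, 42)
    }
}
```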
@@ -0,0 +1,4 @@
/**
* Package contains the actions for the DataStore features.
*/
package io.cdap.plugin.datastore.actions;
@@ -0,0 +1,32 @@
/*
* Copyright © 2024 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/

package io.cdap.plugin.datastore.locators;

import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.FindBy;
import org.openqa.selenium.support.How;

/**
* DataStore Plugin related step design.
*/
public class DataStoreLocators {
@FindBy(how = How.XPATH, using = "//input[@data-testid='kind']")
public static WebElement kind;

@FindBy(how = How.XPATH, using = "//input[@data-testid='ancestor']")
public static WebElement ancestor;
}
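Both locators key off stable `data-testid` attributes rather than brittle positional XPaths, which is why the review settled on this approach. If more fields are added later, the selector pattern could be generated in one place; this helper is a sketch of that idea, not part of the framework:

```java
public class TestIdLocatorSketch {
    // Hypothetical helper mirroring the @FindBy XPaths above: builds the
    // "//input[@data-testid='...']" selector string for a given test id.
    static String inputByTestId(String testId) {
        return "//input[@data-testid='" + testId + "']";
    }

    public static void main(String[] args) {
        System.out.println(inputByTestId("kind"));     // //input[@data-testid='kind']
        System.out.println(inputByTestId("ancestor")); // //input[@data-testid='ancestor']
    }
}
```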
@@ -0,0 +1,4 @@
/**
* Package contains the locators for the DataStore features.
*/
package io.cdap.plugin.datastore.locators;
37 changes: 37 additions & 0 deletions src/e2e-test/java/io/cdap/plugin/datastore/runner/TestRunner.java
@@ -0,0 +1,37 @@
/*
* Copyright © 2024 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.datastore.runner;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
* Test Runner to execute Datastore cases.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
features = {"src/e2e-test/features"},
glue = {"io.cdap.plugin.datastore.stepsdesign", "io.cdap.plugin.common.stepsdesign",
"stepsdesign"},
tags = {"@DataStore"},
monochrome = true,
plugin = {"pretty", "html:target/cucumber-html-report/datastore",
"json:target/cucumber-reports/cucumber-datastore.json",
"junit:target/cucumber-reports/cucumber-datastore.xml"}
)
public class TestRunner {
}
@@ -0,0 +1,38 @@
/*
* Copyright © 2024 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.datastore.runner;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
* Test Runner to execute only required DataStore cases.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
features = {"src/e2e-test/features"},
glue = {"io.cdap.plugin.datastore.stepsdesign", "io.cdap.plugin.common.stepsdesign",
"stepsdesign"},
tags = {"@datastore_Required"},
monochrome = true,
plugin = {"pretty", "html:target/cucumber-html-report/datastore-required",
"json:target/cucumber-reports/cucumber-datastore-required.json",
"junit:target/cucumber-reports/cucumber-datastore-required.xml"}
)

public class TestRunnerRequired {
}
@@ -0,0 +1,4 @@
/**
* Package contains the DataStore runners.
*/
package io.cdap.plugin.datastore.runner;