NameSpaceAdmin feature design and Runtime scenarios addition.
rahuldash171 committed Oct 30, 2023
1 parent 6bc4382 commit 9f9eafd
Showing 9 changed files with 432 additions and 0 deletions.
@@ -0,0 +1,115 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#
@Namespaceadmin
Feature: NameSpaceAdmin - Validate namespace admin design-time scenarios

@Namespaceadmin
Scenario: Verify that the user can click the NameSpace Admin tab and is navigated to the page successfully
Given Open Datafusion Project to configure pipeline
When Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Verify that the user is navigated to nameSpace admin page successfully

@Namespaceadmin
Scenario: Validate that the user can open the compute profile page and select a provisioner
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Click on create profile button for "default" Namespace
Then Select a provisioner: "existingDataProc" for the compute profile
Then Click on close button of compute profile properties page

@Namespaceadmin
Scenario: Validate user is able to create new namespace preferences and able to delete the added namespace preferences successfully
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Click "preferences" tab from Configuration page for "default" Namespace
Then Click on edit namespace preferences to set namespace preferences
Then Set nameSpace preferences with key: "keyValue" and value: "nameSpacePreferences1"
Then Click on the Save & Close preferences button
Then Click on edit namespace preferences to set namespace preferences
Then Delete the preferences
Then Click on the Save & Close preferences button

Scenario: Validate user is able to add multiple namespace preferences inside namespace admin successfully
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Click "preferences" tab from Configuration page for "default" Namespace
Then Click on edit namespace preferences to set namespace preferences
Then Set nameSpace preferences with key: "keyValue" and value: "nameSpacePreferences2"
Then Click on the Save & Close preferences button
Then Click on edit namespace preferences to set namespace preferences
Then Delete the preferences
Then Delete the preferences
Then Click on the Save & Close preferences button

Scenario: Validate user is able to reset the namespace preferences added inside namespace admin successfully
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Click "preferences" tab from Configuration page for "default" Namespace
Then Click on edit namespace preferences to set namespace preferences
Then Set nameSpace preferences with key: "keyValue" and value: "nameSpacePreferences1"
Then Reset the preferences
Then Verify the reset is successful for added preferences

Scenario: Verify the validation error message with an invalid cluster name
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Click on create profile button for "default" Namespace
Then Select a provisioner: "existingDataProc" for the compute profile
Then Enter input plugin property: "profileLabel" with value: "validProfile"
Then Enter textarea plugin property: "profileDescription" with value: "validDescription"
Then Enter input plugin property: "clusterName" with value: "invalidClusterName"
Then Click on: "Create" button in the properties
Then Verify that the compute profile is displaying an error message: "errorInvalidClusterName" on the footer

Scenario: Verify the validation error message with an invalid profile name
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Click on create profile button for "default" Namespace
Then Select a provisioner: "existingDataProc" for the compute profile
Then Enter input plugin property: "profileLabel" with value: "invalidProfile"
Then Enter textarea plugin property: "profileDescription" with value: "validDescription"
Then Enter input plugin property: "clusterName" with value: "validClusterName"
Then Click on: "Create" button in the properties
Then Verify that the compute profile is displaying an error message: "errorInvalidProfileName" on the footer

Scenario: Verify the validation error message with an invalid namespace name
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on Namespace dropdown button
Then Click on the Add Namespace tab
Then Enter the New Namespace Name with value: "invalidNamespaceName"
Then Enter the Namespace Description with value: "validNamespaceDescription"
Then Click on: "Finish" button in the properties
Then Verify the failed error message: "errorInvalidNamespace" displayed on dialog box

Scenario: Validate user is able to create new namespace from hamburger menu and switch to newly created namespace
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on Namespace dropdown button
Then Click on the Add Namespace tab
Then Enter the New Namespace Name with value: "validNamespaceName"
Then Enter the Namespace Description with value: "validNamespaceDescription"
Then Click on: "Finish" button in the properties
Then Switch to the newly created Namespace
Then Click on the Hamburger bar on the left panel
Then Verify if the switch is successful by checking the current "validNamespaceName" value
@@ -0,0 +1,105 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#
@Namespaceadmin
Feature: NameSpaceAdmin - Validate namespace admin runtime scenarios

@BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: Verify that the user can run a pipeline successfully using the namespace preferences created
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Click "preferences" tab from Configuration page for "default" Namespace
Then Click on edit namespace preferences to set namespace preferences
Then Set nameSpace preferences with key: "keyValue" and value: "nameSpacePreferences2"
Then Click on the Save & Close preferences button
Then Click on the Hamburger bar on the left panel
Then Select navigation item: "studio" from the Hamburger menu list
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetprojectId"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqSourceTable"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetprojectId"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
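The scenario above sets `projectId` and `datasetProjectId` as macros rather than literal values, relying on the namespace preferences saved earlier to supply them when the pipeline runs. A minimal illustrative sketch of that substitution idea (not CDAP's actual macro engine; the class and method names here are hypothetical):

```java
import java.util.Map;

public class MacroResolutionDemo {

    // Illustrative only: resolves ${key} placeholders against a map of
    // namespace preferences, the way the pipeline above expects its
    // projectId macro to be filled in at run time.
    static String resolve(String value, Map<String, String> prefs) {
        for (Map.Entry<String, String> e : prefs.entrySet()) {
            value = value.replace("${" + e.getKey() + "}", e.getValue());
        }
        return value;
    }

    public static void main(String[] args) {
        Map<String, String> prefs = Map.of("projectId", "my-gcp-project");
        System.out.println(resolve("${projectId}", prefs)); // my-gcp-project
    }
}
```

If the preference is missing, the placeholder is left untouched in this sketch, whereas a real run would fail at macro evaluation.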

@BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: Verify that the user can create a test connection inside NameSpace Admin and use it in a selected plugin property
Given Open Datafusion Project to configure pipeline
Then Click on the Hamburger bar on the left panel
Then Click on NameSpace Admin link from the menu
Then Click "connections" tab from Configuration page for "default" Namespace
Then Click on the Add Connection button
Then Add connection type as "bqConnection" and provide a "ConnectionName"
Then Click on the Test Connection button
Then Click on the Create button
Then Click on the Hamburger bar on the left panel
Then Select navigation item: "studio" from the Hamburger menu list
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
Then Click plugin property: "switch-useConnection"
Then Click on the Browse Connections button
Then Select connection: "ConnectionName"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Click on the Browse button inside plugin properties
Then Click SELECT button inside connection data row with name: "dataset"
Then Wait till connection data loading completes with a timeout of 60 seconds
Then Enter input plugin property: "table" with value: "bqSourceTable"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Click plugin property: "useConnection"
Then Click on the Browse Connections button
Then Select connection: "ConnectionName"
Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
Then Click on the Browse button inside plugin properties
Then Click SELECT button inside connection data row with name: "dataset"
Then Wait till connection data loading completes with a timeout of 60 seconds
Then Verify input plugin property: "dataset" contains value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
Then Click plugin property: "truncateTable"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"


@@ -0,0 +1,36 @@
/*
* Copyright © 2023 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/

package io.cdap.cdap.namespaceadmin;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
* Test Runner to execute namespace admin related test cases.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
  features = {"src/e2e-test/features"},
  glue = {"io.cdap.cdap.stepsdesign", "stepsdesign"},
  tags = {"@Namespaceadmin"},
  plugin = {"pretty", "html:target/cucumber-html-report/namespaceadmin",
    "json:target/cucumber-reports/cucumber-namespaceadmin.json",
    "junit:target/cucumber-reports/cucumber-namespaceadmin.xml"}
)
public class TestRunner {
}
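Note that Cucumber matches tags verbatim and case-sensitively, which is why the runner's filter must use exactly the `@Namespaceadmin` spelling that the feature files declare; a filter of `@NameSpaceadmin` would silently match zero scenarios. A tiny illustrative model of that matching rule (not Cucumber's API):

```java
import java.util.List;

public class TagFilterDemo {

    // Illustrative only: a scenario runs when its tag list contains the
    // filter tag verbatim. Tag comparison is case-sensitive, so
    // "@NameSpaceadmin" does NOT match a scenario tagged "@Namespaceadmin".
    static boolean runsScenario(List<String> scenarioTags, String filterTag) {
        return scenarioTags.contains(filterTag);
    }

    public static void main(String[] args) {
        List<String> tags = List.of("@Namespaceadmin");
        System.out.println(runsScenario(tags, "@Namespaceadmin")); // true
        System.out.println(runsScenario(tags, "@NameSpaceadmin")); // false
    }
}
```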
@@ -0,0 +1,20 @@
/*
* Copyright © 2023 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/

/**
* Package contains the runners for nameSpace admin features.
*/
package io.cdap.cdap.namespaceadmin;
@@ -0,0 +1,94 @@
/*
* Copyright © 2023 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.cdap.stepsdesign;

import com.google.cloud.bigquery.BigQueryException;
import io.cdap.e2e.utils.BigQueryClient;
import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cucumber.java.After;
import io.cucumber.java.Before;
import java.io.IOException;
import java.util.UUID;
import org.apache.commons.lang3.StringUtils;
import org.junit.Assert;
import stepsdesign.BeforeActions;

/**
* GCP test hooks.
*/
public class TestSetupHooks {

  public static String bqTargetTable = StringUtils.EMPTY;
  public static String bqSourceTable = StringUtils.EMPTY;
  public static String datasetName = PluginPropertyUtils.pluginProp("dataset");

  @Before(order = 1, value = "@BQ_SINK_TEST")
  public static void setTempTargetBQTableName() {
    bqTargetTable = "E2E_TARGET_" + UUID.randomUUID().toString().replaceAll("-", "_");
    PluginPropertyUtils.addPluginProp("bqTargetTable", bqTargetTable);
    BeforeActions.scenario.write("BQ Target table name - " + bqTargetTable);
  }

  @After(order = 1, value = "@BQ_SINK_TEST")
  public static void deleteTempTargetBQTable() throws IOException, InterruptedException {
    try {
      BigQueryClient.dropBqQuery(bqTargetTable);
      PluginPropertyUtils.removePluginProp("bqTargetTable");
      BeforeActions.scenario.write("BQ Target table - " + bqTargetTable + " deleted successfully");
      bqTargetTable = StringUtils.EMPTY;
    } catch (BigQueryException e) {
      if (e.getMessage().contains("Not found: Table")) {
        BeforeActions.scenario.write("BQ Target Table " + bqTargetTable + " does not exist");
      } else {
        Assert.fail(e.getMessage());
      }
    }
  }

  /**
   * Creates a BigQuery table with 3 columns (Id - Int, Value - Int, UID - String) containing random test data.
   * Sample row:
   *   Id | Value | UID
   *   22 | 968   | 245308db-6088-4db2-a933-f0eea650846a
   */
  @Before(order = 1, value = "@BQ_SOURCE_TEST")
  public static void createTempSourceBQTable() throws IOException, InterruptedException {
    bqSourceTable = "E2E_SOURCE_" + UUID.randomUUID().toString().replaceAll("-", "_");
    StringBuilder records = new StringBuilder(StringUtils.EMPTY);
    for (int index = 2; index <= 25; index++) {
      records.append(" (").append(index).append(", ").append((int) (Math.random() * 1000 + 1)).append(", '")
        .append(UUID.randomUUID()).append("'), ");
    }
    BigQueryClient.getSoleQueryResult("create table `" + datasetName + "." + bqSourceTable + "` as " +
      "SELECT * FROM UNNEST([ " +
      " STRUCT(1 AS Id, " + ((int) (Math.random() * 1000 + 1)) + " as Value, " +
      "'" + UUID.randomUUID() + "' as UID), " +
      records +
      " (26, " + ((int) (Math.random() * 1000 + 1)) + ", " +
      "'" + UUID.randomUUID() + "') " +
      "])");
    PluginPropertyUtils.addPluginProp("bqSourceTable", bqSourceTable);
    BeforeActions.scenario.write("BQ source Table " + bqSourceTable + " created successfully");
  }

  @After(order = 1, value = "@BQ_SOURCE_TEST")
  public static void deleteTempSourceBQTable() throws IOException, InterruptedException {
    BigQueryClient.dropBqQuery(bqSourceTable);
    PluginPropertyUtils.removePluginProp("bqSourceTable");
    BeforeActions.scenario.write("BQ source Table " + bqSourceTable + " deleted successfully");
    bqSourceTable = StringUtils.EMPTY;
  }
}
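The hooks above derive unique table names by appending a random UUID with its dashes replaced by underscores, since BigQuery table names may only contain letters, digits, and underscores. A standalone sketch of that naming scheme (the class and method names here are illustrative, not part of the test framework):

```java
import java.util.UUID;

public class TableNameDemo {

    // Mirrors the naming scheme in TestSetupHooks: a fixed prefix plus a
    // random UUID with "-" replaced by "_", so each test run gets a fresh,
    // BigQuery-safe table name.
    static String uniqueTableName(String prefix) {
        return prefix + UUID.randomUUID().toString().replaceAll("-", "_");
    }

    public static void main(String[] args) {
        System.out.println(uniqueTableName("E2E_TARGET_"));
    }
}
```

Because the suffix is random per invocation, parallel or repeated test runs do not collide on table names, at the cost of requiring the @After hooks above to clean the tables up.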
@@ -0,0 +1,19 @@
/*
* Copyright © 2023 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
/**
* Package contains the stepDesign for the common features.
*/
package io.cdap.cdap.stepsdesign;
