[wip][ui] Fix model mesh tests and update runtime images #2060

Open · wants to merge 10 commits into base: master
2 changes: 1 addition & 1 deletion ods_ci/tests/Resources/CLI/DSProjects/DSProjects.resource
@@ -16,6 +16,6 @@ Delete All DS Projects With Name Like
Log List of DS Projects to be deleted: @{dsp_list} console=yes
FOR ${dsp_name} IN @{dsp_list}
${return_code}= Run And Return Rc
... oc delete project ${dsp_name}
... oc delete project ${dsp_name} --force
Should Be Equal As Integers ${return_code} 0 msg=Error deleting DS Project ${dsp_name}
END
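For reference, a minimal sketch of the same forced cleanup outside the loop, together with a hypothetical wait for the namespace to finish terminating before its name is reused (the wait step and the literal project name are assumptions, not part of this change):

${return_code}=    Run And Return Rc    oc delete project my-ds-project --force
Should Be Equal As Integers    ${return_code}    0    msg=Error deleting DS Project my-ds-project
# The namespace can still be in Terminating state after the command returns, so a
# hypothetical follow-up wait avoids clashes when the same project title is recreated:
${return_code}=    Run And Return Rc    oc wait --for=delete namespace/my-ds-project --timeout=300s
Should Be Equal As Integers    ${return_code}    0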
1 change: 1 addition & 0 deletions ods_ci/tests/Resources/CLI/ModelServing/modelmesh.resource
@@ -33,6 +33,7 @@ Verify Openvino Deployment
[Arguments] ${runtime_name} ${project_name}=${PRJ_TITLE} ${num_replicas}=1
${pod_selector}= Set Variable name=modelmesh-serving-${runtime_name}
@{ovms} = Oc Get kind=Pod namespace=${project_name} label_selector=${pod_selector}
Log message=${ovms} console=True
${containerNames} = Create List rest-proxy oauth-proxy ovms ovms-adapter mm
${pass}= Run Keyword And Return Status Verify Deployment ${ovms} ${num_replicas} 5 ${containerNames}
IF not ${pass}
@@ -27,8 +27,6 @@ spec:
value: /tmp/transformers_cache
- name: RUNTIME_LOCAL_MODELS_DIR
value: /mnt/models
- name: TRANSFORMERS_CACHE
value: /tmp/transformers_cache
- name: RUNTIME_GRPC_ENABLED
value: "false"
- name: RUNTIME_HTTP_ENABLED
@@ -24,8 +24,9 @@ ${SERVING_MODEL_SERVERS_SIDE_MENU}= xpath=//span[text()='Models and model ser
${TOKEN_AUTH_CHECKBOX_XP}= xpath://input[@id="alt-form-checkbox-auth"]
${ADD_SERVICE_ACCOUNT_BUTTON}= xpath://button[text()='Add a service account']
${SERVICE_ACCOUNT_INPUT}= xpath://input[@data-testid='service-account-form-name']
${REPLICAS_COUNT_XP}= xpath=//input[@id='model-server-replicas']
${REPLICAS_COUNT_XP}= xpath=//input[@aria-label='model server replicas number input']
${PROJECT_SELECTOR_XP}= xpath://main[contains(@id, 'dashboard-page-main')]//*[@data-testid="project-selector-toggle"]
${DEPLOY_MULTI_MODEL_BTN}= //button[contains(@data-testid,"add-server-button")]


*** Keywords ***
@@ -39,8 +40,11 @@ Create Model Server
${existing_server}= Run Keyword And Return Status Wait Until Page Contains Element //button[.="${server_name}"]
IF ${existing_server} Run Keyword And Return
... Log Model Server '${server_name}' already exists, reusing server console=True
ELSE
SeleniumLibrary.Click Button //button[@data-testid="multi-serving-select-button"]
END
SeleniumLibrary.Click Button Add model server
SeleniumLibrary.Wait Until Page Contains Element ${DEPLOY_MULTI_MODEL_BTN}
SeleniumLibrary.Click Button ${DEPLOY_MULTI_MODEL_BTN}
SeleniumLibrary.Wait Until Page Contains Element //span[.="Add model server"]
Set Model Server Name ${server_name}
Set Replicas Number With Buttons ${no_replicas}
@@ -221,7 +225,7 @@ Get Model Serving Access Token via UI
SeleniumLibrary.Wait Until Page Contains Element xpath://td[@data-label="Tokens"]/button
SeleniumLibrary.Click Element xpath://td[@data-label="Tokens"]/button
${token}= SeleniumLibrary.Get Element Attribute
... xpath://div[.="${service_account_name} "]/../../td[@data-label="Token Secret"]//span/input value
... xpath://div[.="${service_account_name}"]/../../td[@data-label="Token Secret"]//span/input value
END
RETURN ${token}

@@ -226,7 +226,7 @@
[Arguments] ${model_name}
# TODO: Open model serving home page if needed?
# Click on Inference Endpoints link
${endpoint_link}= Set Variable //a[@data-testid="metrics-link-test-model" and text()="${model_name}"]/ancestor::tr//td//button[@data-testid="internal-external-service-button"]
${endpoint_link}= Set Variable //a[contains(@data-testid, 'metrics-link')][text() = '${model_name}']/ancestor::tr//td//button[@data-testid="internal-external-service-button"]

[Robocop warning] Line is too long (184/120)
SeleniumLibrary.Wait Until Page Contains Element ${endpoint_link}
SeleniumLibrary.Click Button ${endpoint_link}
# Get the external URL
@@ -515,10 +515,27 @@
Set Model Server Runtime
[Documentation] Opens the Serving runtime dropdown in the deploy model modal window for models
... and selects the given runtime
[Arguments] ${runtime}=Caikit TGIS
Page Should Contain Element ${KSERVE_RUNTIME_DROPDOWN}
Click Element ${KSERVE_RUNTIME_DROPDOWN}
Click Element //span[contains(text(),"${runtime}")]
[Arguments] ${runtime}=Caikit TGIS ${retries}=1
TRY
${is_enabled}= Run Keyword And Return Status
... Element Should Be Enabled xpath://button[@id="serving-runtime-template-selection"]

IF ${is_enabled}
FOR ${retry_idx} IN RANGE 0 1+${retries}
Click Element xpath://button[@id="serving-runtime-template-selection"]
Page Should Contain Element xpath://span[contains(., "${runtime}")]
${selected}= Run Keyword And Return Status
... Click Element xpath://span[contains(., "${runtime}")]
IF ${selected}==${TRUE} BREAK

[Robocop notice] 'INLINE IF' condition can be simplified
END
ELSE
Element Should Be Disabled id:serving-runtime-template-selection
${text}= Get Text xpath://button[@id="serving-runtime-template-selection"]/span
Should Contain    ${text}    ${runtime}
END
EXCEPT
Log framework ${runtime} does not appear to be supported by the chosen model server
END
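For context, a hedged sketch of how the reworked keyword might be called with the new optional retries argument (these call sites and runtime names are illustrative, not taken from this diff):

# Retry the dropdown selection up to 3 times for a runtime option that renders slowly:
Set Model Server Runtime    runtime=Caikit TGIS    retries=3
# When the dropdown is disabled (runtime preselected for the project), the keyword
# only asserts that the preselected text contains the requested runtime:
Set Model Server Runtime    runtime=OpenVINO Model Server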

Get Kserve Inference Host Via UI
[Documentation] Fetches the host of the model's URL from the Data Science Project UI
@@ -69,26 +69,27 @@
[Tags] Sanity
... ODS-2268
Open Data Science Projects Home Page
Create Data Science Project title=${PRJ_TITLE}-2268 description=${PRJ_DESCRIPTION}
Recreate S3 Data Connection project_title=${PRJ_TITLE}-2268 dc_name=model-serving-connection
${namespace}= Set Variable ${PRJ_TITLE}-2268
Create Data Science Project title=${namespace} description=${PRJ_DESCRIPTION}
Recreate S3 Data Connection project_title=${namespace} dc_name=model-serving-connection
... aws_access_key=${S3.AWS_ACCESS_KEY_ID} aws_secret_access=${S3.AWS_SECRET_ACCESS_KEY}
... aws_bucket_name=ods-ci-s3
Create Model Server token=${FALSE} server_name=${RUNTIME_NAME} existing_server=${TRUE}
Serve Model project_name=${PRJ_TITLE}-2268 model_name=${MODEL_NAME} framework=tensorflow
Create Model Server token=${FALSE} server_name=${RUNTIME_NAME} existing_server=${FALSE}
Serve Model project_name=${namespace} model_name=${MODEL_NAME} framework=tensorflow
... existing_data_connection=${TRUE} data_connection_name=model-serving-connection
... model_path=inception_resnet_v2.pb
${runtime_pod_name}= Replace String Using Regexp string=${RUNTIME_NAME} pattern=\\s replace_with=-
${runtime_pod_name}= Convert To Lower Case ${runtime_pod_name}
Wait Until Keyword Succeeds 5 min 10 sec Verify Openvino Deployment runtime_name=${RUNTIME_POD_NAME}
Wait Until Keyword Succeeds 5 min 10 sec Verify Serving Service
Wait Until Keyword Succeeds 5 min 10 sec Verify Openvino Deployment runtime_name=${RUNTIME_POD_NAME} project_name=${namespace}

[Robocop warning] Line is too long (141/120)
[Robocop warning] Variable '${RUNTIME_POD_NAME}' has inconsistent naming. First used as '${runtime_pod_name}'
Wait Until Keyword Succeeds 5 min 10 sec Verify Serving Service project_name=${namespace}
Verify Model Status ${MODEL_NAME} success
Set Suite Variable ${MODEL_CREATED} ${TRUE}
${url}= Get Model Route Via UI ${MODEL_NAME}
${status_code} ${response_text}= Send Random Inference Request endpoint=${url} name=input
${status_code} ${response_text}= Send Random Inference Request endpoint=${url} name=input:0

[Robocop notice] Variable '${response_text}' is assigned but not used
... shape={"B": 1, "H": 299, "W": 299, "C": 3} no_requests=1
Should Be Equal As Strings ${status_code} 200
[Teardown] Run Keywords Run Keyword If Test Failed Get Modelmesh Events And Logs
... server_name=${RUNTIME_NAME} project_title=${PRJ_TITLE}-2869
... server_name=${RUNTIME_NAME} project_title=${namespace}
... AND
... Model Serving Test Teardown

@@ -102,7 +103,7 @@
Recreate S3 Data Connection project_title=${PRJ_TITLE} dc_name=model-serving-connection
... aws_access_key=${S3.AWS_ACCESS_KEY_ID} aws_secret_access=${S3.AWS_SECRET_ACCESS_KEY}
... aws_bucket_name=ods-ci-s3
Create Model Server token=${TRUE} server_name=${SECURED_RUNTIME} existing_server=${TRUE}
Create Model Server token=${TRUE} server_name=${SECURED_RUNTIME} existing_server=${FALSE}
Serve Model project_name=${PRJ_TITLE} model_name=${SECURED_MODEL} model_server=${SECURED_RUNTIME}
... existing_data_connection=${TRUE} data_connection_name=model-serving-connection existing_model=${TRUE}
... framework=onnx model_path=mnist-8.onnx
@@ -113,7 +114,7 @@
Verify Model Status ${SECURED_MODEL} success
Set Suite Variable ${MODEL_CREATED} ${TRUE}
[Teardown] Run Keywords Run Keyword If Test Failed Get Modelmesh Events And Logs
... server_name=${RUNTIME_NAME} project_title=${PRJ_TITLE}-2869
... server_name=${RUNTIME_NAME} project_title=${PRJ_TITLE}
... AND
... Model Serving Test Teardown

@@ -127,7 +128,7 @@
Recreate S3 Data Connection project_title=${SECOND_PROJECT} dc_name=model-serving-connection
... aws_access_key=${S3.AWS_ACCESS_KEY_ID} aws_secret_access=${S3.AWS_SECRET_ACCESS_KEY}
... aws_bucket_name=ods-ci-s3
Create Model Server token=${TRUE} server_name=${SECURED_RUNTIME} existing_server=${TRUE}
Create Model Server token=${TRUE} server_name=${SECURED_RUNTIME} existing_server=${FALSE}
Serve Model project_name=${SECOND_PROJECT} model_name=${SECURED_MODEL} model_server=${SECURED_RUNTIME}
... existing_data_connection=${TRUE} data_connection_name=model-serving-connection existing_model=${TRUE}
... framework=onnx model_path=mnist-8.onnx
@@ -139,7 +140,7 @@
${out}= Get Model Inference ${SECURED_MODEL} ${INFERENCE_INPUT} token_auth=${FALSE}
Should Contain ${out} <button type="submit" class="btn btn-lg btn-primary">Log in with OpenShift</button>
[Teardown] Run Keywords Run Keyword If Test Failed Get Modelmesh Events And Logs
... server_name=${RUNTIME_NAME} project_title=${PRJ_TITLE}-2869
... server_name=${RUNTIME_NAME} project_title=${SECOND_PROJECT}
... AND
... Model Serving Test Teardown

@@ -157,37 +158,38 @@
[Tags] Tier1
... RHOAIENG-2869
Open Data Science Projects Home Page
Create Data Science Project title=${PRJ_TITLE}-2869 description=${PRJ_DESCRIPTION}
Recreate S3 Data Connection project_title=${PRJ_TITLE}-2869 dc_name=model-serving-connection
${namespace}= Set Variable ${PRJ_TITLE}-2869
Create Data Science Project title=${namespace} description=${PRJ_DESCRIPTION}
Recreate S3 Data Connection project_title=${namespace} dc_name=model-serving-connection
... aws_access_key=${S3.AWS_ACCESS_KEY_ID} aws_secret_access=${S3.AWS_SECRET_ACCESS_KEY}
... aws_bucket_name=ods-ci-s3
Create Model Server token=${FALSE} server_name=${RUNTIME_NAME} existing_server=${TRUE}
Serve Model project_name=${PRJ_TITLE}-2869 model_name=${MODEL_NAME} framework=tensorflow
Serve Model project_name=${namespace} model_name=${MODEL_NAME} framework=tensorflow
... existing_data_connection=${TRUE} data_connection_name=model-serving-connection
... model_path=inception_resnet_v2.pb
${runtime_pod_name}= Replace String Using Regexp string=${RUNTIME_NAME} pattern=\\s replace_with=-
${runtime_pod_name}= Convert To Lower Case ${runtime_pod_name}
Wait Until Keyword Succeeds 5 min 10 sec Verify Openvino Deployment runtime_name=${RUNTIME_POD_NAME}
Wait Until Keyword Succeeds 5 min 10 sec Verify Serving Service
Wait Until Keyword Succeeds 5 min 10 sec Verify Openvino Deployment runtime_name=${RUNTIME_POD_NAME} project_name=${namespace}

[Robocop warning] Line is too long (141/120)
[Robocop warning] Variable '${RUNTIME_POD_NAME}' has inconsistent naming. First used as '${runtime_pod_name}'
Wait Until Keyword Succeeds 5 min 10 sec Verify Serving Service project_name=${namespace}
Verify Model Status ${MODEL_NAME} success
Set Suite Variable ${MODEL_CREATED} ${TRUE}
${url}= Get Model Route Via UI ${MODEL_NAME}
${status_code} ${response_text}= Send Random Inference Request endpoint=${url} name=input
... shape={"B": 1, "H": 299, "W": 299, "C": 3} no_requests=1
Should Be Equal As Strings ${status_code} 200
Serve Model project_name=${PRJ_TITLE}-2869 model_name=${MODEL_NAME} framework=openvino_ir
Serve Model project_name=${namespace} model_name=${MODEL_NAME} framework=openvino_ir
... existing_data_connection=${TRUE} data_connection_name=model-serving-connection
... model_path=openvino-example-model existing_model=${TRUE}
${runtime_pod_name}= Replace String Using Regexp string=${RUNTIME_NAME} pattern=\\s replace_with=-
${runtime_pod_name}= Convert To Lower Case ${runtime_pod_name}
Wait Until Keyword Succeeds 5 min 10 sec Verify Openvino Deployment runtime_name=${runtime_pod_name}
... project_name=${PRJ_TITLE}-2869
Wait Until Keyword Succeeds 5 min 10 sec Verify Serving Service ${PRJ_TITLE}-2869
... project_name=${namespace}
Wait Until Keyword Succeeds 5 min 10 sec Verify Serving Service ${namespace}
Verify Model Status ${MODEL_NAME} success
Run Keyword And Continue On Failure Verify Model Inference ${MODEL_NAME} ${INFERENCE_INPUT_OPENVINO}
... ${EXPECTED_INFERENCE_OUTPUT_OPENVINO} token_auth=${FALSE}
[Teardown] Run Keywords Run Keyword If Test Failed Get Modelmesh Events And Logs
... server_name=${RUNTIME_NAME} project_title=${PRJ_TITLE}-2869
... server_name=${RUNTIME_NAME} project_title=${namespace}
... AND
... Model Serving Test Teardown

@@ -218,7 +220,7 @@
Recreate S3 Data Connection project_title=${new_project} dc_name=model-serving-connection
... aws_access_key=${S3.AWS_ACCESS_KEY_ID} aws_secret_access=${S3.AWS_SECRET_ACCESS_KEY}
... aws_bucket_name=ods-ci-s3
Create Model Server token=${FALSE} server_name=${server_name} existing_server=${TRUE}
Create Model Server token=${FALSE} server_name=${server_name} existing_server=${FALSE}
Serve Model project_name=${new_project} model_name=${model_name} framework=openvino_ir
... existing_data_connection=${TRUE} data_connection_name=model-serving-connection
... model_path=openvino-example-model existing_model=${TRUE}
@@ -110,6 +110,47 @@
... n_times=10 namespace=${test_namespace}
[Teardown] Clean Up Test Project test_ns=${test_namespace}
... isvc_names=${models_names} wait_prj_deletion=${FALSE}
...
[Robocop warning] Trailing whitespace at the end of line (marked as fixed)
Verify User Can Deploy Raw and Serverless Models In The Same Namespace # robocop: off=too-long-test-case,too-many-calls-in-test-case,line-too-long
[Documentation] Checks if user can deploy and query multiple models in the same namespace
[Tags] Sanity ODS-2371
[Setup] Set Project And Runtime namespace=${TEST_NS}-multisame protocol=http
${test_namespace}= Set Variable ${TEST_NS}-multisame
${model_one_name}= Set Variable bloom-560m-caikit
${model_two_name}= Set Variable flan-t5-small-caikit
${models_names}= Create List ${model_one_name} ${model_two_name}
Compile Inference Service YAML isvc_name=${model_one_name}
... sa_name=${DEFAULT_BUCKET_SA_NAME}
... model_storage_uri=${BLOOM_STORAGE_URI}
... kserve_mode=Serverless
Deploy Model Via CLI isvc_filepath=${INFERENCESERVICE_FILLED_FILEPATH}
... namespace=${test_namespace}
Compile Inference Service YAML isvc_name=${model_two_name}
... sa_name=${DEFAULT_BUCKET_SA_NAME}
... model_storage_uri=${FLAN_STORAGE_URI}
... kserve_mode=RawDeployment
Deploy Model Via CLI isvc_filepath=${INFERENCESERVICE_FILLED_FILEPATH}
... namespace=${test_namespace}
Wait For Model KServe Deployment To Be Ready label_selector=serving.kserve.io/inferenceservice=${model_one_name}
... namespace=${test_namespace} runtime=${CAIKIT_TGIS_RUNTIME_NAME}
Wait For Model KServe Deployment To Be Ready label_selector=serving.kserve.io/inferenceservice=${model_two_name}
... namespace=${test_namespace} runtime=${CAIKIT_TGIS_RUNTIME_NAME}
Query Model Multiple Times model_name=${model_one_name}
... n_times=5 namespace=${test_namespace}
... protocol=http
Query Model Multiple Times model_name=${model_two_name}
... n_times=10 namespace=${test_namespace}
... protocol=http
... port_forwarding=${TRUE}
Query Model Multiple Times model_name=${model_one_name}
... n_times=5 namespace=${test_namespace}
... protocol=http
Query Model Multiple Times model_name=${model_two_name}
... n_times=10 namespace=${test_namespace}
... protocol=http
... port_forwarding=${TRUE}
[Teardown] Clean Up Test Project test_ns=${test_namespace}
... isvc_names=${models_names} wait_prj_deletion=${FALSE}
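As an illustrative follow-up check (not part of the PR), and assuming the compiled InferenceService YAML selects the mode through the standard serving.kserve.io/deploymentMode annotation, the resulting modes could be confirmed from the CLI like this:

${isvc_yaml}=    Run    oc get isvc ${model_two_name} -n ${test_namespace} -o yaml
Should Contain    ${isvc_yaml}    deploymentMode: RawDeployment
${isvc_yaml}=    Run    oc get isvc ${model_one_name} -n ${test_namespace} -o yaml
# The Serverless model should not carry the RawDeployment mode annotation:
Should Not Contain    ${isvc_yaml}    deploymentMode: RawDeployment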

Verify User Can Deploy Multiple Models In Different Namespaces # robocop: off=too-long-test-case,too-many-calls-in-test-case,line-too-long
[Documentation] Checks if user can deploy and query multiple models in different namespaces
@@ -61,7 +61,7 @@ Verify User Can Deploy Multiple Models In The Same Namespace Using The UI # r
Wait For Model KServe Deployment To Be Ready label_selector=serving.kserve.io/inferenceservice=${model_one_name}
... namespace=${test_namespace} runtime=${CAIKIT_TGIS_RUNTIME_NAME}
Deploy Kserve Model Via UI ${model_two_name} serving_runtime=Caikit TGIS data_connection=kserve-connection
... path=flan-t5-small/${model_two_name}
... path=flan-t5-small/${model_two_name} existing_server=${TRUE}
Wait For Model KServe Deployment To Be Ready label_selector=serving.kserve.io/inferenceservice=${model_two_name}
... namespace=${test_namespace} runtime=${CAIKIT_TGIS_RUNTIME_NAME}
Query Model Multiple Times inference_type=all-tokens model_name=${model_one_name}
@@ -405,7 +405,7 @@ Verify User With Edit Permission Can Deploy Query And Delete A LLM # robocop:
${test_namespace}= Set Variable ${TEST_NS}-edit-permission
${flan_model_name}= Set Variable flan-t5-small-caikit
Move To Tab Permissions
Assign Edit Permissions To User ${TEST_USER_3.USERNAME}
Assign Contributor Permissions To User ${TEST_USER_3.USERNAME}
Move To Tab Overview
Logout From RHODS Dashboard
Login To RHODS Dashboard ${TEST_USER_3.USERNAME} ${TEST_USER_3.PASSWORD} ${TEST_USER_3.AUTH_TYPE}