
[QA] Component library tests #872

Merged · 24 commits · Nov 26, 2024
ea31d69
component library tests
l0uden Nov 13, 2024
c656536
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 13, 2024
40a293e
failed artifacts and slack notifications
l0uden Nov 14, 2024
3283d74
branch in notification
l0uden Nov 14, 2024
3cff909
delete screenshot
l0uden Nov 14, 2024
8fe3812
add screenshot
l0uden Nov 14, 2024
ab843c8
fix screenshot url
l0uden Nov 14, 2024
f9de666
Merge branch 'main' of https://github.com/mckinsey/vizro into qa/comp…
l0uden Nov 14, 2024
d74645f
changelog
l0uden Nov 14, 2024
9dd267a
review changes
l0uden Nov 19, 2024
790f273
Merge branch 'main' of https://github.com/mckinsey/vizro into qa/comp…
l0uden Nov 19, 2024
a37ecb2
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 19, 2024
3d9f148
review changes round 2
l0uden Nov 20, 2024
f328e1e
Merge branch 'qa/component_library_tests' of https://github.com/mckin…
l0uden Nov 20, 2024
868df89
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 20, 2024
921c9bb
review changes round 3
l0uden Nov 20, 2024
2b48abc
Merge branch 'qa/component_library_tests' of https://github.com/mckin…
l0uden Nov 20, 2024
bbe76f4
Merge branch 'main' of https://github.com/mckinsey/vizro into qa/comp…
l0uden Nov 20, 2024
9734fa2
test new artifacts and notifications failure logic
l0uden Nov 25, 2024
21d4f6d
fixed filename
l0uden Nov 25, 2024
ac64a8d
passed secret as env
l0uden Nov 25, 2024
677a64e
add path for file copying
l0uden Nov 25, 2024
52b87ba
return correct file name
l0uden Nov 25, 2024
335fd21
Merge branch 'main' of https://github.com/mckinsey/vizro into qa/comp…
l0uden Nov 25, 2024
67 changes: 67 additions & 0 deletions .github/workflows/test-e2e-component-library-vizro-core.yml
@@ -0,0 +1,67 @@
name: e2e component library tests

defaults:
run:
working-directory: vizro-core

on:
push:
branches: [main]
pull_request:
branches:
- main

env:
PYTHONUNBUFFERED: 1
FORCE_COLOR: 1
PYTHON_VERSION: "3.12"

jobs:
test-e2e-component-library-vizro-core:
name: test-e2e-component-library-vizro-core

runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v4

- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}

- name: Install Hatch
run: pip install hatch

- name: Show dependency tree
run: hatch run all.py${{ env.PYTHON_VERSION }}:pip tree

- name: Run e2e component library tests
run: hatch run all.py${{ env.PYTHON_VERSION }}:test-e2e-component-library

- name: Copy failed screenshots
if: failure()
run: |
mkdir /home/runner/work/vizro/vizro/vizro-core/failed_screenshots/
cp *.png failed_screenshots

- name: Archive production artifacts
uses: actions/upload-artifact@v4
if: failure()
with:
name: Failed screenshots
path: |
/home/runner/work/vizro/vizro/vizro-core/failed_screenshots/*.png

- name: Send custom JSON data to Slack
id: slack
uses: slackapi/[email protected]
if: failure()
with:
payload: |
{
"text": "Vizro component tests build result: ${{ job.status }}\nBranch: ${{ github.head_ref }}\n${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}"
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
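The Slack step above interpolates GitHub Actions expressions into a JSON payload. A minimal Python sketch of what the rendered payload looks like (the values below are illustrative stand-ins for the `${{ ... }}` expressions, not real run data):

```python
import json

# Stand-ins for the GitHub Actions expressions in the workflow:
job_status = "failure"                      # ${{ job.status }}
branch = "qa/component_library_tests"       # ${{ github.head_ref }}
run_url = "https://github.com/mckinsey/vizro/actions/runs/123456"  # hypothetical run URL

# The payload mirrors the workflow's `payload:` block: a single "text"
# field with status, branch, and a link to the failed run.
payload = {"text": f"Vizro component tests build result: {job_status}\nBranch: {branch}\n{run_url}"}
print(json.dumps(payload))
```

Since the step only runs on `if: failure()`, the status in practice is always `failure`; the run URL lets the team jump straight to the uploaded screenshot artifacts.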
4 changes: 0 additions & 4 deletions .github/workflows/vizro-qa-tests-trigger.yml
@@ -21,7 +21,6 @@ jobs:
include:
- label: integration tests
- label: vizro-ai ui tests
- label: component library tests
steps:
- name: Passed fork step
run: echo "Success!"
@@ -36,7 +35,6 @@
include:
- label: integration tests
- label: vizro-ai ui tests
- label: component library tests
steps:
- uses: actions/checkout@v4
- name: Tests trigger
@@ -48,8 +46,6 @@
export INPUT_WORKFLOW_FILE_NAME=${{ secrets.VIZRO_QA_INTEGRATION_TESTS_WORKFLOW }}
elif [ "${{ matrix.label }}" == "vizro-ai ui tests" ]; then
export INPUT_WORKFLOW_FILE_NAME=${{ secrets.VIZRO_QA_VIZRO_AI_UI_TESTS_WORKFLOW }}
elif [ "${{ matrix.label }}" == "component library tests" ]; then
export INPUT_WORKFLOW_FILE_NAME=${{ secrets.VIZRO_QA_VIZRO_COMPONENT_LIBRARY_TESTS_WORKFLOW }}
fi
export INPUT_GITHUB_TOKEN=${{ secrets.VIZRO_SVC_PAT }}
export INPUT_REF=main # because we should send existent branch to dispatch workflow
@@ -0,0 +1,48 @@
<!--
A new scriv changelog fragment.

Uncomment the section that is right (remove the HTML comment wrapper).
-->

<!--
### Highlights ✨

- A bullet item for the Highlights ✨ category with a link to the relevant PR at the end of your entry, e.g. Enable feature XXX. ([#1](https://github.com/mckinsey/vizro/pull/1))

-->
<!--
### Removed

- A bullet item for the Removed category with a link to the relevant PR at the end of your entry, e.g. Enable feature XXX. ([#1](https://github.com/mckinsey/vizro/pull/1))

-->
<!--
### Added

- A bullet item for the Added category with a link to the relevant PR at the end of your entry, e.g. Enable feature XXX. ([#1](https://github.com/mckinsey/vizro/pull/1))

-->
<!--
### Changed

- A bullet item for the Changed category with a link to the relevant PR at the end of your entry, e.g. Enable feature XXX. ([#1](https://github.com/mckinsey/vizro/pull/1))

-->
<!--
### Deprecated

- A bullet item for the Deprecated category with a link to the relevant PR at the end of your entry, e.g. Enable feature XXX. ([#1](https://github.com/mckinsey/vizro/pull/1))

-->
<!--
### Fixed

- A bullet item for the Fixed category with a link to the relevant PR at the end of your entry, e.g. Enable feature XXX. ([#1](https://github.com/mckinsey/vizro/pull/1))

-->
<!--
### Security

- A bullet item for the Security category with a link to the relevant PR at the end of your entry, e.g. Enable feature XXX. ([#1](https://github.com/mckinsey/vizro/pull/1))

-->
6 changes: 5 additions & 1 deletion vizro-core/hatch.toml
@@ -30,7 +30,10 @@ dependencies = [
"openpyxl",
"jupyter",
"pre-commit",
"PyGithub"
"PyGithub",
"imutils",
"opencv-python",
"pyhamcrest"
]
installer = "uv"

@@ -55,6 +58,7 @@ schema-check = ["python schemas/generate.py --check"]
# fix this, but we don't actually use `hatch run test` anywhere right now.
# See comments added in https://github.com/mckinsey/vizro/pull/444.
test = "pytest tests --headless {args}"
test-e2e-component-library = "pytest tests/e2e/test_component_library.py --headless {args}"
test-integration = "pytest tests/integration --headless {args}"
test-js = "./tools/run_jest.sh {args}"
test-unit = "pytest tests/unit {args}"
8 changes: 7 additions & 1 deletion vizro-core/pyproject.toml
@@ -79,7 +79,13 @@ filterwarnings = [
# Ignore warning when using the fig.layout.title inside examples:
"ignore:Using the `title` argument in your Plotly chart function may cause misalignment:UserWarning",
# Ignore warning for Pydantic v1 API and Python 3.13:
"ignore:Failing to pass a value to the 'type_params' parameter of 'typing.ForwardRef._evaluate' is deprecated:DeprecationWarning"
"ignore:Failing to pass a value to the 'type_params' parameter of 'typing.ForwardRef._evaluate' is deprecated:DeprecationWarning",
# Ignore deprecation warning until this is solved: https://github.com/plotly/dash/issues/2590
# The `features` examples do add_type, which ideally we would clean up afterwards to restore vizro.models to
# its previous state. Since we don't currently do this, `hatch run test` fails.
# This is difficult to fix fully by un-importing vizro.models though, since we use `import vizro.models as vm` - see
# https://stackoverflow.com/questions/437589/how-do-i-unload-reload-a-python-module.
"ignore:HTTPResponse.getheader():DeprecationWarning"
]
norecursedirs = ["tests/tests_utils", "tests/js"]
pythonpath = ["tests/tests_utils"]
6 changes: 6 additions & 0 deletions vizro-core/tests/e2e/conftest.py
@@ -0,0 +1,6 @@
import pytest


@pytest.fixture
def get_test_name(request):
return request.node.name
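The `get_test_name` fixture returns `request.node.name`, the name of the currently running test, which the screenshot helpers then use to derive per-test file names. A minimal stand-alone sketch of that naming logic (the `Fake*` classes are stand-ins for pytest's real `request` object, used here only so the snippet runs without pytest):

```python
# Hypothetical stand-ins for pytest's fixture `request` object:
class FakeNode:
    def __init__(self, name):
        self.name = name  # e.g. the collected test function's name

class FakeRequest:
    def __init__(self, name):
        self.node = FakeNode(name)

def get_test_name(request):
    # Same body as the conftest fixture: the current test's name.
    return request.node.name

name = get_test_name(FakeRequest("test_kpi_card_component_library"))

# File names derived the same way assert_image_equal does:
branch_png = f"{name}_branch.png"                   # fresh screenshot from this run
base_png = f"{name.replace('test', 'base')}.png"    # stored baseline image
print(branch_png, base_png)
```

So a test named `test_kpi_card_component_library` is compared against a committed baseline `base_kpi_card_component_library.png`.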
(Binary file — base screenshot image — cannot be displayed.)
88 changes: 88 additions & 0 deletions vizro-core/tests/e2e/test_component_library.py
@@ -0,0 +1,88 @@
import dash_bootstrap_components as dbc
import pandas as pd
from dash import Dash, html
from e2e_asserts import assert_image_equal

from vizro.figures.library import kpi_card, kpi_card_reference

df_kpi = pd.DataFrame(
{
"Actual": [100, 200, 700],
"Reference": [100, 300, 500],
"Category": ["A", "B", "C"],
}
)

example_cards = [
kpi_card(data_frame=df_kpi, value_column="Actual", title="KPI with value"),
kpi_card(
data_frame=df_kpi,
value_column="Actual",
title="KPI with aggregation",
agg_func="median",
),
kpi_card(
data_frame=df_kpi,
value_column="Actual",
title="KPI formatted",
value_format="${value:.2f}",
),
kpi_card(
data_frame=df_kpi,
value_column="Actual",
title="KPI with icon",
icon="shopping_cart",
),
]

example_reference_cards = [
kpi_card_reference(
data_frame=df_kpi,
value_column="Actual",
reference_column="Reference",
title="KPI ref. (pos)",
),
kpi_card_reference(
data_frame=df_kpi,
value_column="Actual",
reference_column="Reference",
agg_func="median",
title="KPI ref. (neg)",
),
kpi_card_reference(
data_frame=df_kpi,
value_column="Actual",
reference_column="Reference",
title="KPI ref. formatted",
value_format="{value}€",
reference_format="{delta}€ vs. last year ({reference}€)",
),
kpi_card_reference(
data_frame=df_kpi,
value_column="Actual",
reference_column="Reference",
title="KPI ref. with icon",
icon="shopping_cart",
),
]


def test_kpi_card_component_library(dash_duo, get_test_name):
app = Dash(__name__, external_stylesheets=[dbc.themes.BOOTSTRAP])
app.layout = dbc.Container(
[
html.H1(children="KPI Cards"),
dbc.Stack(
children=[
dbc.Row([dbc.Col(kpi_card) for kpi_card in example_cards]),
dbc.Row([dbc.Col(kpi_card) for kpi_card in example_reference_cards]),
],
gap=4,
),
]
)
dash_duo.start_server(app)
dash_duo.wait_for_page(timeout=20)
dash_duo.wait_for_element("div[class='card-kpi card']")
assert_image_equal(dash_duo.driver, get_test_name)
assert dash_duo.get_logs() == [], "browser console should contain no error"
7 changes: 0 additions & 7 deletions vizro-core/tests/integration/test_examples.py
@@ -1,4 +1,3 @@
# ruff: noqa: F403, F405
import os
import runpy
from pathlib import Path
@@ -40,12 +39,6 @@ def dashboard(request, monkeypatch):
examples_path = Path(__file__).parents[2] / "examples"


# Ignore deprecation warning until this is solved: https://github.com/plotly/dash/issues/2590
# The `features` examples do add_type, which ideally we would clean up afterwards to restore vizro.models to
# its previous state. Since we don't currently do this, `hatch run test` fails.
# This is difficult to fix fully by un-importing vizro.models though, since we use `import vizro.models as vm` - see
# https://stackoverflow.com/questions/437589/how-do-i-unload-reload-a-python-module.
@pytest.mark.filterwarnings("ignore:HTTPResponse.getheader():DeprecationWarning")
# Ignore as it doesn't affect the test run
@pytest.mark.filterwarnings("ignore::pytest.PytestUnhandledThreadExceptionWarning")
@pytest.mark.filterwarnings("ignore:unclosed file:ResourceWarning")
50 changes: 50 additions & 0 deletions vizro-core/tests/tests_utils/e2e_asserts.py
@@ -0,0 +1,50 @@
import shutil
from pathlib import Path

import cv2
import imutils
from hamcrest import assert_that, equal_to


def _compare_images(original_image, new_image):
"""Comparison process."""
difference = cv2.subtract(original_image, new_image)
blue, green, red = cv2.split(difference)
assert_that(cv2.countNonZero(blue), equal_to(0), reason="Blue channel is different")
assert_that(cv2.countNonZero(green), equal_to(0), reason="Green channel is different")
assert_that(cv2.countNonZero(red), equal_to(0), reason="Red channel is different")


def _create_image_difference(original, new):
"""Creates new image with diff of images comparison."""
diff = original.copy()
cv2.absdiff(original, new, diff)
gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
for i in range(0, 3):
dilated = cv2.dilate(gray.copy(), None, iterations=i + 1)
(t_var, thresh) = cv2.threshold(dilated, 3, 255, cv2.THRESH_BINARY)
cnts = cv2.findContours(thresh, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
cnts = imutils.grab_contours(cnts)
for contour in cnts:
(x, y, width, height) = cv2.boundingRect(contour)
cv2.rectangle(new, (x, y), (x + width, y + height), (0, 255, 0), 2)
return new


def assert_image_equal(browserdriver, test_image_name):
"""Comparison logic and diff files creation."""
base_image_name = f"{test_image_name.replace('test', 'base')}.png"
browserdriver.save_screenshot(f"{test_image_name}_branch.png")
original = cv2.imread(f"tests/e2e/screenshots/{base_image_name}")
new = cv2.imread(f"{test_image_name}_branch.png")
try:
_compare_images(original, new)
Path(f"{test_image_name}_branch.png").unlink()
except AssertionError as exp:
shutil.copy(f"{test_image_name}_branch.png", base_image_name)
diff = _create_image_difference(original=new, new=original)
cv2.imwrite(f"{test_image_name}_diff_main.png", diff)
raise AssertionError("pictures are not the same") from exp
except cv2.error as exp:
shutil.copy(f"{test_image_name}_branch.png", base_image_name)
raise cv2.error("pictures have different sizes") from exp
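`_compare_images` relies on `cv2.subtract`, which performs saturating subtraction: negative results clamp to 0, so identical images yield an all-zero difference in every channel, and any brighter pixel in the first image leaves a non-zero residue. A small NumPy sketch of that channel-wise check (NumPy used here as a stand-in so the snippet runs without OpenCV):

```python
import numpy as np

# Two identical 2x2 BGR images: saturating subtraction is all zeros.
img_a = np.full((2, 2, 3), 120, dtype=np.uint8)
img_b = img_a.copy()

def saturating_subtract(x, y):
    # Emulates cv2.subtract on uint8: negatives clamp to 0.
    return np.clip(x.astype(np.int16) - y.astype(np.int16), 0, 255).astype(np.uint8)

diff = saturating_subtract(img_a, img_b)
blue, green, red = diff[:, :, 0], diff[:, :, 1], diff[:, :, 2]
assert not blue.any() and not green.any() and not red.any()

# Brighten one pixel's red channel in img_b: the reverse subtraction
# now shows exactly one non-zero red-channel pixel.
img_b[0, 0, 2] = 130
diff = saturating_subtract(img_b, img_a)
print(int(np.count_nonzero(diff[:, :, 2])))
```

Note the saturation means a single one-way `cv2.subtract(original, new)` only registers pixels where `original` is brighter; the test still fails on any per-channel non-zero count, which is the behavior `_compare_images` asserts on.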
l0uden marked this conversation as resolved.
Show resolved Hide resolved