Updated failure messages in JUnits #79

Merged
merged 4 commits on Aug 15, 2024
25 changes: 13 additions & 12 deletions .github/workflows/ci.yml
@@ -10,7 +10,7 @@ on:

permissions:
contents: write # Grant write permissions for contents
-checks: write # Grant write permissions for checks
+checks: write # Grant write permissions for checks, only effective on push
pull-requests: write # Explicitly grant write permissions for pull requests

jobs:
@@ -34,56 +34,57 @@ jobs:
run: chmod +x scripts/run_commit_tests.sh

- name: Run Shell Script to Generate Input File
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
run: |
./scripts/run_commit_tests.sh

- name: Run JUnit Report Generation Script
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
run: |
python scripts/into_junit.py /tmp/SHARED.UNITS > junit.xml

- name: Convert JUnit XML to Standard HTML Report
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
run: |
junit2html junit.xml junit-standard-report.html

- name: Convert JUnit XML to Matrix HTML Report
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
run: |
junit2html --report-matrix junit.xml junit-matrix-report.html

- name: Upload JUnit XML Report
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
uses: actions/upload-artifact@v3
with:
name: junit-report
path: junit.xml

- name: Upload Standard HTML Report
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
uses: actions/upload-artifact@v3
with:
name: junit-standard-html-report
path: junit-standard-report.html

- name: Upload Matrix HTML Report
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
uses: actions/upload-artifact@v3
with:
name: junit-matrix-html-report
path: junit-matrix-report.html

- name: Display JUnit Test Results
+if: github.event_name == 'push' # Only run this step on pushes to main
uses: dorny/test-reporter@v1
with:
name: 'JUnit Results'
path: 'junit.xml'
reporter: 'java-junit'
-fail-on-error: false # Do not fail the job if tests fail
+fail-on-error: false

- name: Download Previous JUnit Results
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
uses: actions/download-artifact@v3
with:
name: junit-report
@@ -98,7 +99,7 @@ jobs:
reportgenerator -reports:"previous-junit.xml;junit.xml" -targetdir:"./comparison-report" -reporttypes:"HtmlSummary;HtmlChart"

- name: Upload JUnit Comparison Report
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
uses: actions/upload-artifact@v3
with:
name: junit-comparison-html-report
@@ -124,7 +125,7 @@ jobs:
allure generate --clean --output ./allure-report ./allure-results

- name: Upload Allure Report as Artifact
-continue-on-error: true # extra: Continue even if this step fails
+continue-on-error: true
uses: actions/upload-artifact@v3
with:
name: allure-html-report
48 changes: 48 additions & 0 deletions hyperon-wam.vpj
@@ -107,6 +107,54 @@
Excludes=".git/;*.metta.html;*.bak;build/;.*/;*~*/"
L="1"/>
</Folder>
<Folder Name="reports">
<F N="reports/.gitignore"/>
<F N="reports/NewResults.md"/>
<F N="reports/PASS_FAIL.md"/>
<F N="reports/README.md"/>
<F N="reports/SHARED.UNITS.PREV.md"/>
<F N="reports/TEST_LINKS.md"/>
</Folder>
<Folder Name="scripts">
<F N="scripts/1-VSpaceTest.metta"/>
<F N="scripts/2-VSpaceTest.metta"/>
<F N="scripts/3-Learn-Rules.metta"/>
<F N="scripts/4-VSpaceTest.metta"/>
<F N="scripts/5-Learn-Flybase.metta"/>
<F N="scripts/6-Learn-Flybase-Full.metta"/>
<F N="scripts/7-FlybaseResults.metta"/>
<F N="scripts/8-FlybaseQuestions.metta"/>
<F N="scripts/add_lines.sh"/>
<F N="scripts/chado_to_tsv.sh"/>
<F N="scripts/colorize_output.sh"/>
<F N="scripts/convert_to_metta.sh"/>
<F N="scripts/ensure_venv"/>
<F N="scripts/envvars_mettalog.sh"/>
<F N="scripts/flybase_setup.sh"/>
<F N="scripts/html_pass_fail.sh"/>
<F N="scripts/into_junit.py"/>
<F N="scripts/metta-jupyter-kernel"/>
<F N="scripts/metta-jupyter-kernel-debug"/>
<F N="scripts/mettalog-docker"/>
<F N="scripts/mettalog-python"/>
<F N="scripts/mettalog.cmd"/>
<F N="scripts/old_test_in_metta2.sh"/>
<F N="scripts/open-metta-file.reg"/>
<F N="scripts/pass_fail_totals.sh"/>
<F N="scripts/rebuild-fast.sh"/>
<F N="scripts/Rebuild-hyperon.cmd"/>
<F N="scripts/rebuild-hyperon.sh"/>
<F N="scripts/rebuild-min.sh"/>
<F N="scripts/rebuild-nonminal.sh"/>
<F N="scripts/retest.sh"/>
<F N="scripts/run_commit_tests.sh"/>
<F N="scripts/send_keys_debug.sh"/>
<F N="scripts/start_jupyter.sh"/>
<F N="scripts/subtest.sh"/>
<F N="scripts/test_in_metta.sh"/>
<F N="scripts/test_in_metta1.sh"/>
<F N="scripts/total_loonits.sh"/>
</Folder>
<Folder Name="src">
<F
N="src/*.*"
9 changes: 6 additions & 3 deletions scripts/into_junit.py
@@ -7,17 +7,20 @@ def create_testcase_element(testclass, testname, stdout, identifier, got, expect
# Create the testcase XML element with the class and test name attributes
testcase = ET.Element("testcase", classname=testclass, name=testname)

+test_res = f"Assertion: {stdout}\nExpected: {expected}\nActual: {got}"
+sys_out_text = f"<![CDATA[\n<a href=\"{url}\">Test Report</a>\n\n{test_res}\n]]>"

if status == "PASS":
# If the test passed, add system-out with a clickable link and details
system_out = ET.SubElement(testcase, "system-out")
-system_out.text = f"<![CDATA[\n<a href=\"{url}\">Test Report</a>\n\nAssertion: {stdout}\nExpected: {expected}\nActual: {got}\n]]>"
+system_out.text = sys_out_text
else: # status == "FAIL"
# If the test failed, add a failure element with details and a clickable link
failure_message = f"Test failed: Expected '{expected}' but got '{got}'"
failure = ET.SubElement(testcase, "failure", message=failure_message, type="AssertionError")
-failure.text = f"<![CDATA[\nAssertionError: {failure_message}\n]]>"
+failure.text = f"AssertionError: {failure_message}"
system_out = ET.SubElement(testcase, "system-out")
-system_out.text = f"<![CDATA[\n<a href=\"{url}\">Test Report</a>\n\nAssertion: {stdout}\nExpected: {expected}\nActual: {got}\n]]>"
+system_out.text = sys_out_text

return testcase
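
For context, the refactored helper can be exercised roughly as shown below. This is a minimal sketch, not part of the PR: it assumes the full signature is create_testcase_element(testclass, testname, stdout, identifier, got, expected, url, status) (the hunk header above is truncated), and that scripts/into_junit.py can be imported as a module without triggering its command-line entry point. Every argument value is a made-up placeholder.

import xml.etree.ElementTree as ET

from into_junit import create_testcase_element  # assumes scripts/ is on the import path

# Build one failing testcase; the URL and values stand in for real report data.
failing_case = create_testcase_element(
    testclass="WHOLE-TESTS",
    testname="example-test",
    stdout="(assertEqual 1 2)",
    identifier="example-test-id",
    got="1",
    expected="2",
    url="https://example.invalid/report.html",
    status="FAIL",
)

# Wrap the case in a minimal testsuite and print the serialized XML, which
# should carry the plain AssertionError text in <failure> and the report link
# plus assertion details in <system-out>.
suite = ET.Element("testsuite", name="example", tests="1", failures="1")
suite.append(failing_case)
print(ET.tostring(suite, encoding="unicode"))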
