
report improvements that can be considered #105

Open
nagkumar opened this issue Jun 28, 2023 · 7 comments

Comments

@nagkumar

  1. The downloaded CSV could contain full details of the test case, including column headers such as "test attempt" and "time taken".
  2. It is hard to understand what exactly the error is. Whether it is throughput, latency, or something else, a hyperlink to dig deeper into the errors would help. Also, if there is an error, why is nothing in the graph coloured red? The points are still shown as blue.
  3. The HTML report could include a timestamp in its header.
@nagkumar (Author)

  4. These HTML files are too large; as a result, browsers and IDEs are unable to open them and the graphs are not shown.

@nagkumar (Author)

Also, the "download as CSV" option (perhaps "performance timing report" would be a better name) is showing 100 iterations.

@noconnor (Owner)

noconnor commented Jun 28, 2023

Downloaded CSV can contain full details of the test case including column header to say test attempt and time taken

In the html report, the CSV download just contains the latency percentile distributions.
There will always be 100 rows, representing the 1st percentile up to the 100th percentile.
The percentile distribution tells you the percentage of requests at or below a certain latency (i.e. if you hover over the graph you will see something like "95% of requests completed in 545ms or less").
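To make the CSV semantics concrete, here is a minimal, self-contained sketch of how a percentile row can be derived from raw latencies. This is not junitperf's actual implementation, just an illustration of "the k-th row holds the latency at or below which k% of requests completed" (using the nearest-rank method; the sample data is made up):

```java
import java.util.Arrays;

public class PercentileSketch {

    // Returns the latency such that pct% of the samples are at or below it
    // (nearest-rank method; pct is 1..100).
    static long percentile(long[] latenciesMs, int pct) {
        long[] sorted = latenciesMs.clone();
        Arrays.sort(sorted);
        // 1-based rank of the sample that covers pct% of all samples
        int rank = (int) Math.ceil(pct / 100.0 * sorted.length);
        return sorted[Math.max(rank, 1) - 1];
    }

    public static void main(String[] args) {
        long[] latencies = {120, 95, 540, 545, 110, 130, 200, 88, 310, 101};
        // Row 95 of the CSV would then correspond to the tooltip text:
        System.out.println("95% of requests completed in "
                + percentile(latencies, 95) + "ms or less");
    }
}
```

Running `main` on this sample prints "95% of requests completed in 545ms or less", mirroring the tooltip example above.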

Size of these html files are too huge, that way browsers and IDEs are unable to show or open

If the generated HTML reports are too large, you can use one of the other default reporters (i.e. the Console reporter or the CSV reporter).

Alternatively you can provide a custom class that implements the ReportGenerator interface and customise the reports.

You can also provide a customised version of report.template in your src/main/resources directory; this template will then be used when generating HTML reports.
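As a rough illustration of the custom-reporter route, the sketch below wires a tiny "one line per test" reporter. Note the hedging: the real interface lives in the junitperf library and its method signatures may differ, so a simplified local stand-in interface is declared here purely so the example compiles on its own; the test name and latency value are made up.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified LOCAL stand-in for junitperf's ReportGenerator interface;
// the real interface (and its method signatures) may differ from this sketch.
interface ReportGenerator {
    void generateReport(Map<String, Long> latenciesByTest);
}

// A tiny reporter that emits one line per test instead of a large HTML file.
class OneLineReportGenerator implements ReportGenerator {
    String report = "";

    @Override
    public void generateReport(Map<String, Long> latenciesByTest) {
        StringBuilder sb = new StringBuilder();
        latenciesByTest.forEach((test, p99) ->
                sb.append(test).append(": p99=").append(p99).append("ms\n"));
        report = sb.toString();
        System.out.print(report);
    }
}

public class CustomReporterSketch {
    public static void main(String[] args) {
        Map<String, Long> results = new LinkedHashMap<>();
        results.put("loginTest", 545L); // hypothetical measurement
        new OneLineReportGenerator().generateReport(results);
    }
}
```

With the real library you would implement the actual ReportGenerator interface instead of the local stand-in and register the reporter with your test configuration.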

Unable to understand what exactly is the error

To dig deeper into errors, you can enable trace logging for the EvaluationTask class (or for the whole com.github.noconnor.junitperf package). This will tell you exactly which errors are being asserted.
How you enable this logging will depend on the logging framework you use.
This logging is not enabled by default as it may be excessively noisy for long running perf tests that expect some level of error.
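How you enable this depends on your logging framework; with Logback, for example, it might look like the fragment below. The logger names follow the comment above, but the fully qualified class name of EvaluationTask is an assumption, and the appender setup is just a plain console example:

```xml
<!-- logback.xml: enable trace logging for junitperf error assertions -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- Either just the EvaluationTask class (fully qualified name assumed)... -->
  <logger name="com.github.noconnor.junitperf.statements.EvaluationTask" level="TRACE"/>
  <!-- ...or the whole package, as suggested above -->
  <logger name="com.github.noconnor.junitperf" level="TRACE"/>

  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```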

@nagkumar (Author)

nagkumar commented Jun 28, 2023

The "Latency" label on the y-axis could state the unit (ms), even though the tooltip already gives it as ms (milliseconds).

@nagkumar (Author)

Size of these HTML files is too huge; browsers and IDEs are unable to open them and graphs are not shown.

Did you change anything to reduce the HTML file size? Today, with x.34, I see the same tests create just a 1.69 MB HTML file. Just confirming, to make sure my previous observation was not wrong.

@noconnor (Owner)

I didn't change anything to do with report generation other than adding skipped tests.

Would you have overridden the src/main/resources/report.template by any chance?

@nagkumar (Author)

I have not made any changes to report.template. I am unable to reproduce the 25 MB file now; let me observe further to find out when such a huge size appears.
