JUnit4: statistics.errorCount is always equal to statistics.evaluationCount (2000) #116
Comments
If the error count equals the evaluation count, it suggests that each call to the test method is throwing an exception. For debugging you can enable trace logging for the junitperf package.
Another point I didn't mention above: the same performance metrics for the same test method logic give different results depending on whether DeltaSpike and CDI are involved, as below:
1. Pure JUnit4 test class -> the test passes: its log and report are generated successfully and show no error count.
2. The same test case with the same performance metrics, but run with the DeltaSpike CdiTestRunner -> the test fails: its log and generated report show error count = evaluation count.
Why then this difference in the evaluation between these code snippets, for the same test method evaluating the same performance metrics?
You would need to enable TRACE level logging for the junitperf package. If you can get that logging, particularly for the EvaluationTask class, we'll be able to see why you are getting 100% errors when you run with the CdiTestRunner.
OK, that is what I did.
The log on the console shows the test failed, with the output below:
Kindly advise.
You need to set the logging level to TRACE (not DEBUG) for the junitperf package.
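For anyone following along: the library logs through SLF4J, so the exact mechanics depend on your logging backend. A minimal sketch, assuming logback-classic is the binding on the test classpath and that the library's loggers live under the com.github.noconnor.junitperf package (verify the package name against the version you are using):

```java
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import org.slf4j.LoggerFactory;

public class TraceLoggingSetup {

    public static void enableJunitPerfTrace() {
        // Raise the whole junitperf package (which includes EvaluationTask) to TRACE.
        Logger junitPerfLogger =
                (Logger) LoggerFactory.getLogger("com.github.noconnor.junitperf");
        junitPerfLogger.setLevel(Level.TRACE);
    }
}
```

The equivalent declarative configuration would be a logger entry for that package at TRACE level in your logging backend's configuration file.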
Sorry about my misunderstanding. Based on the real test case I have, as follows:
And the trace output (snippet) showing the exceptions thrown:
And the test fails with this output:
This seems to be the issue you are having:
And it appears to be happening in an @After method. Does the test pass when you comment out the @Rule?
Yes, without the @Rule the test passes, and this is the log trace in this case:
And regarding @After: it already exists in BusinessComponentTestBase, as below:
So when the rule is active, your @Before and @After methods will run for every evaluation. Is there any code in your before or after that would fail if it was executed multiple times? You may have to run the test in debug mode and see what's happening in your before/after methods, because from the trace log it looks like your after method is throwing an exception.
It also might be worth mentioning that the test will be executed on a new thread. This Stack Overflow entry suggests the error you are seeing might be threading related: https://stackoverflow.com/a/46538520
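The threading point can be illustrated without CDI at all: scope contexts are typically tracked per thread (for example via a ThreadLocal), and JUnitPerf runs the timed evaluations on its own worker thread(s), so state bound to the thread that set the test up is simply not visible there. A minimal, self-contained sketch of that effect (names are illustrative, not part of either library):

```java
public class ThreadLocalVisibilityDemo {

    // Scope contexts are commonly tracked per thread, e.g. via a ThreadLocal.
    private static final ThreadLocal<String> CONTEXT = new ThreadLocal<>();

    public static void main(String[] args) throws InterruptedException {
        CONTEXT.set("request-context"); // bound on the "setup" thread

        System.out.println("setup thread sees: " + CONTEXT.get()); // request-context

        Thread worker = new Thread(() ->
                // The worker thread has its own ThreadLocal slot, so this prints null,
                // which is analogous to WELD-001303 (no active context on this thread).
                System.out.println("worker thread sees: " + CONTEXT.get()));
        worker.start();
        worker.join();
    }
}
```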
What is in @Before and @After is supposed to be executed only one time, not repeated like @Test.
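For reference, in plain JUnit 4 the @Before/@After methods run around every invocation of the test method, and because the JUnitPerf rule repeatedly evaluates the wrapped statement, they typically run once per evaluation; only static @BeforeClass/@AfterClass methods run a single time per class. A minimal sketch of that lifecycle (class and method names are illustrative):

```java
import org.junit.After;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;

public class LifecycleExampleTest {

    @BeforeClass
    public static void oneTimeSetUp() {
        // Runs once for the whole class: put expensive, non-repeatable setup here.
    }

    @Before
    public void setUp() {
        // Runs before every test invocation (and therefore before every
        // JUnitPerf evaluation): must be safe to execute repeatedly.
    }

    @Test
    public void someTest() {
    }

    @After
    public void tearDown() {
        // Runs after every invocation: must also be idempotent.
    }

    @AfterClass
    public static void oneTimeTearDown() {
        // Runs once after all tests in the class.
    }
}
```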
I have tried this way:
As you can see, the problem is with the CDI (Contexts and Dependency Injection) context in the project's test environment (it is the Apache DeltaSpike framework). It needs to make sure that the RequestScoped context is activated and initialized correctly during test execution, so that I can prevent the ContextNotActiveException from being thrown: 'WELD-001303: No active contexts for scope type javax.enterprise.context.RequestScoped'. However, the context was initiated successfully, so there may be a compatibility gap between JUnitPerf and CDI, where org.apache.deltaspike.testcontrol.api.junit.CdiTestRunner is responsible for creating the test task within a container of managed beans. Note: the test case in this form passes without the JUnitPerf rule. Kindly advise.
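One pattern sometimes used with DeltaSpike in this situation is to start the request context explicitly around the work being measured, so that it is activated on whichever thread actually executes the evaluation. A rough sketch, assuming DeltaSpike's CDI control API (deltaspike-cdictrl) is available; the helper class and runWithRequestContext method are illustrative, not part of JUnitPerf or DeltaSpike:

```java
import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;

import org.apache.deltaspike.cdise.api.ContextControl;

public class RequestScopedInTestSketch {

    @Inject
    private ContextControl contextControl;

    public void runWithRequestContext(Runnable work) {
        // Start the RequestScoped context on the current (worker) thread...
        contextControl.startContext(RequestScoped.class);
        try {
            work.run();
        } finally {
            // ...and stop it again so repeated evaluations start clean.
            contextControl.stopContext(RequestScoped.class);
        }
    }
}
```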
Again, this post seems to suggest the issue you are having is threading related. JUnitPerf tests will be executed on a thread separate from the thread that initialises your test (and initialises your injected dependencies). That Stack Overflow answer links to a JIRA ticket discussing this error, and that JIRA contains a PR that updates the Weld docs discussing propagating contexts across threads: https://docs.jboss.org/weld/reference/latest/en-US/html_single/#_propagating_built_in_contexts You will need to consult the Weld documentation to fix this issue.
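For completeness, the Weld-level approach described in that documentation uses the bound contexts to activate a request context on another thread with explicitly managed backing storage. A rough sketch of that pattern, assuming Weld is the CDI implementation (as the WELD-001303 message suggests); treat it as an outline of the documented idea rather than a drop-in fix:

```java
import java.util.HashMap;
import java.util.Map;

import javax.inject.Inject;

import org.jboss.weld.context.bound.BoundRequestContext;

public class BoundRequestContextSketch {

    @Inject
    private BoundRequestContext requestContext;

    public void runInRequestContext(Runnable work) {
        // Backing storage for the request-scoped beans created on this thread.
        Map<String, Object> storage = new HashMap<>();
        requestContext.associate(storage);
        requestContext.activate();
        try {
            work.run();
        } finally {
            requestContext.invalidate();
            requestContext.deactivate();
            requestContext.dissociate(storage);
        }
    }
}
```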
Ah, I just didn't go through it.
I converted the following JUnit test case (snippet of the code) into a performance test using your JUnitPerf library (version 1.34), as follows:
```java
public class RawDataSet2PamWritePerformanceTest {

    // Specifies that the performance test should be run using the JUnitPerfRule class.
    @Rule
    public JUnitPerfRule perfRule = new JUnitPerfRule(new ConsoleReportGenerator());

    @Test
    // ...
```
What is in bold is the exact function that I want to test. The use-case requirement for the performance test is:
I want to be able to create 2000 iterations of these envelopes in the database within 1000 milliseconds, using one thread.
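For reference, that target (2000 executions within 1000 ms on a single thread) would normally be expressed through the rule's test annotations on the @Test method, roughly like the sketch below. The threshold values simply restate the requirement above, and the method body is a placeholder, so treat this as an illustration rather than the exact configuration used here:

```java
import org.junit.Rule;
import org.junit.Test;

import com.github.noconnor.junitperf.JUnitPerfRule;
import com.github.noconnor.junitperf.JUnitPerfTest;
import com.github.noconnor.junitperf.JUnitPerfTestRequirement;
import com.github.noconnor.junitperf.reporting.providers.ConsoleReportGenerator;

public class RawDataSet2PamWritePerformanceSketchTest {

    @Rule
    public JUnitPerfRule perfRule = new JUnitPerfRule(new ConsoleReportGenerator());

    @Test
    @JUnitPerfTest(threads = 1, durationMs = 1_000)           // one thread for 1000 ms
    @JUnitPerfTestRequirement(executionsPerSec = 2_000,       // ~2000 iterations per second
                              allowedErrorPercentage = 0.0f)  // no failed evaluations allowed
    public void writeEnvelopes() {
        // envelope-writing logic under test goes here (omitted)
    }
}
```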
But what I keep getting while debugging the performance run is that statistics.errorCount is always equal to statistics.evaluationCount = 2000.
It seems like no evaluation is applied, but I cannot figure out why.
Could you help me understand what could be the reason, or what is missing in my configuration, that causes this failure:
java.lang.AssertionError: Error threshold not achieved