Feature/add functionality to support async measurement tracking (#44)
* Refactoring EvaluationContext to add an isAsync identifier

* Adding Test context to support async metrics measurement

* Adding no-op stats collector for async stats evaluation

* Adding JUnitPerfAsyncRule

* Adding example Async test

* Updating readme instructions

* Stepping version to 1.12.0
noconnor authored Mar 2, 2019
1 parent 7ed46bf commit 3448870
Showing 20 changed files with 677 additions and 60 deletions.
119 changes: 117 additions & 2 deletions README.md
This library interface was heavily influenced by the interface in the deprecated
[Contiperf library](https://github.com/lucaspouzac/contiperf) developed by [Lucas Pouzac](https://github.com/lucaspouzac)

<br />

## Contents

[Install Instructions](#install-instructions)
[Usage Instructions](#usage-instructions)

[Test Configuration Options](#test-configuration-options)

[Reports](#reports)

[Statistics](#statistics)

[Build Instructions](#build-instructions)

<br />

## Install Instructions

`JUnitPerf` is available in [maven central](http://bit.ly/2idQDvA)
To add the latest version of `JUnitPerf` to your gradle project, add the following dependency:

`compile 'com.github.noconnor:junitperf:+'`

<br />

## Usage Instructions

This section contains usage details for the `JUnitPerf` library.

[Synchronous Usage](#synchronous-usage)

[Asynchronous Usage](#asynchronous-usage)

<br />

### Synchronous Usage

This section contains details for usage of the `JUnitPerf` library in *synchronous* mode.
To see example test cases browse to the [src/test/examples/](src/test/examples/) folder.

Add the JUnitPerf Rule to your test class
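
A minimal sketch (the collapsed diff hides the original snippet here; the no-arg constructor is confirmed by the `JUnitPerfRule` source further down this commit):

```
@Rule
public JUnitPerfRule perfTestRule = new JUnitPerfRule();
```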

The latency is a measurement of the time taken to execute one loop (not including statistics measurement calculations).

More information on statistic calculations can be found [here](#statistics)

<br />

### Asynchronous Usage

This section contains details for usage of the `JUnitPerf` library in *asynchronous* mode.
To see an example test case, browse to [ExampleAsyncTests](src/test/examples/ExampleAsyncTests.java).

Add the async JUnitPerf Rule to your test class

```
@Rule
public JUnitPerfAsyncRule rule = new JUnitPerfAsyncRule();
```
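
Like the synchronous rule, `JUnitPerfAsyncRule` also accepts custom report generators and/or a custom statistics calculator (see its constructors further down this commit), for example:

```
@Rule
public JUnitPerfAsyncRule rule = new JUnitPerfAsyncRule(new HtmlReportGenerator());
```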

Next, add the `JUnitPerfTest` annotation to the unit test you would like to convert into a performance test:

```
@Test
@JUnitPerfTest(durationMs = 125_000, warmUpMs = 10_000, maxExecutionsPerSecond = 1000)
public void whenExecuting1Kqps_thenApiShouldNotCrash() {
  TestContext context = rule.newContext();
  threadPool.submit(() -> {
    // ... EXECUTE ASYNC TASK ...
    // ... THEN NOTIFY THE FRAMEWORK OF SUCCESS/FAILURE ...
    context.success();
    // OR
    context.fail();
  });
}
```
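
Note that `threadPool` above is not part of the framework; it stands in for whatever mechanism your code under test uses to execute asynchronous work. A minimal sketch, assuming a plain `java.util.concurrent` executor:

```
// Illustrative only: any async execution mechanism can be used here
private final ExecutorService threadPool = Executors.newFixedThreadPool(10);
```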

In the example above, the unit test `whenExecuting1Kqps_thenApiShouldNotCrash` will be executed in a loop for
125 seconds (125,000 ms).

Async tasks will be rate limited to 1,000 task submissions per second.

The `TestContext` instance is used to capture latency and error stats over the duration of the test. A timer is started when
`rule.newContext()` is called, and stopped when either `context.success()` or `context.fail()` is called.

No statistical data will be captured during the warm-up period (10 seconds, i.e. 10,000 ms), leaving roughly 115 of the 125 seconds as the measured window.

**NOTE: it is highly recommended to set `maxExecutionsPerSecond` when running async tests, to prevent flooding the async client code with task submissions.**


Optionally, add the performance test requirement annotation (`JUnitPerfTestRequirement`).
The specified requirements will be applied to the statistics gathered during the performance test execution.
If the thresholds are not met, the test will fail.


```
@Test
@JUnitPerfTest(durationMs = 125_000, warmUpMs = 10_000, maxExecutionsPerSecond = 1000)
@JUnitPerfTestRequirement(percentiles = "90:7,95:7,98:7,99:8", executionsPerSec = 1000, allowedErrorPercentage = 0.10)
public void whenExecuting1Kqps_thenApiShouldNotCrash() {
  TestContext context = rule.newContext();
  threadPool.submit(() -> {
    // ... EXECUTE ASYNC TASK ...
    // ... THEN NOTIFY THE FRAMEWORK OF SUCCESS/FAILURE ...
    context.success();
    // OR
    context.fail();
  });
}
```


In the example above, the `JUnitPerfTestRequirement` annotation will apply a number of threshold constraints to the performance test.

The test's calculated throughput (executions per second) will be compared to the `executionsPerSec` requirement.
If the test throughput is *less* than the target throughput then the test will fail.

This example also requires that the execution error rate be no more than 10% (`allowedErrorPercentage = 0.10`).
An error is an uncaught exception thrown during unit test execution.
If the specified `allowedErrorPercentage` is exceeded, the test will fail.

Finally, the example sets latency thresholds on the 90th, 95th, 98th and 99th percentiles (e.g. if the
99th percentile latency is *greater* than 8 ms, the test will fail).
The latency is a measurement of the time taken to execute one loop (not including statistics measurement calculations).

More information on statistic calculations can be found [here](#statistics)

<br />

## Test Configuration Options

`@JUnitPerfTest` has the following configuration parameters:

| Property | Definition | Default value |
| --- | --- | --- |

`@JUnitPerfTestRequirement` has the following configuration parameters:

| Property | Definition | Default value |
| --- | --- | --- |
| maxLatency | Expected maximum latency in ms, if maximum latency is above this value, test will fail | disabled |
| meanLatency | Expected mean latency in ms, if mean latency is above this value, test will fail | disabled |

<br />

## Reports

[HTML Reports](#html-reports)

[Console Reporting](#console-reporting)

[CSV Reporting](#csv-reporting)

[Custom Reporting](#custom-reporting)

[Multiple Reports](#multiple-reports)

<br />

#### HTML Reports

```
public JUnitPerfRule perfTestRule = new JUnitPerfRule(new HtmlReportGenerator("/
```
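
The collapsed diff cuts the example short; a full construction would look something like this (the report path is illustrative):

```
@Rule
public JUnitPerfRule perfTestRule = new JUnitPerfRule(new HtmlReportGenerator("/tmp/perf/report.html"));
```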
HTML reports are generated using the [jtwig library](http://jtwig.org/). The jtwig report template can be found under `src/main/resources/templates/report.twig`.
It is possible to override this template by placing a customised `templates/report.twig` file on the classpath ahead of the default template.

<br />

#### Console Reporting

Example output:

```
15:55:06.583 [main] INFO c.g.n.j.r.p.ConsoleReportGenerator -
```

<br />

#### CSV Reporting

It is also possible to use the built-in CSV reporter.
```
unittest1,10000,50,101,500000.0,1.430,6.430,1:0.0;2:0.0;3:0.0;4:0.0;5:0.0; ... ;
```
NOTE: the percentileData is formatted as ```percentile1:latency;percentile2:latency; ...```
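
As an illustration of the format, a small parser for the `percentileData` cell might look like this (class and method names are ours, not part of the library):

```
import java.util.HashMap;
import java.util.Map;

class PercentileDataParser {
  // Parses e.g. "1:0.0;2:0.1;99:6.4" into a percentile -> latency(ms) map
  static Map<Integer, Double> parse(String percentileData) {
    Map<Integer, Double> result = new HashMap<>();
    for (String entry : percentileData.split(";")) {
      String[] parts = entry.trim().split(":");
      if (parts.length == 2) { // skip empty or elided entries
        result.put(Integer.parseInt(parts[0]), Double.parseDouble(parts[1]));
      }
    }
    return result;
  }
}
```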


<br />

#### Custom Reporting

If further customisation is required, a custom implementation of the `ReportGenerator` interface can be passed to the `JUnitPerfRule` constructor:

```
public JUnitPerfRule perfTestRule = new JUnitPerfRule(new CustomReportGeneratorImpl());
```
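
A skeleton of such an implementation, assuming `generateReport` is the method the rule invokes (that is the call site visible in the `JUnitPerfRule` source below; check the interface for any additional members):

```
import com.github.noconnor.junitperf.data.EvaluationContext;
import com.github.noconnor.junitperf.reporting.ReportGenerator;

import java.util.Set;

public class CustomReportGeneratorImpl implements ReportGenerator {

  // Invoked with the evaluation contexts gathered for a test class
  @Override
  public void generateReport(Set<EvaluationContext> contexts) {
    contexts.forEach(context -> System.out.println("Perf test finished: " + context));
  }
}
```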

<br />

#### Multiple Reports

It is possible to set *more* than one reporter. This can be done at rule construction time:
```
public JUnitPerfRule perfTestRule = new JUnitPerfRule(new CsvReportGenerator(), new HtmlReportGenerator());
```
With this configuration an HTML report **AND** a CSV report will be generated.

<br />

## Statistics

A custom implementation of the `StatisticsCalculator` interface can be passed to the `JUnitPerfRule` constructor:

```
public JUnitPerfRule perfTestRule = new JUnitPerfRule(new CustomStatisticsCalculatorImpl());
```
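
`CustomStatisticsCalculatorImpl` is a placeholder for your own `StatisticsCalculator` implementation. Passing the library default explicitly is equivalent to the no-arg constructor and shows the same wiring:

```
@Rule
public JUnitPerfRule perfTestRule = new JUnitPerfRule(new DescriptiveStatisticsCalculator());
```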

<br />

## Build Instructions

2 changes: 1 addition & 1 deletion build.gradle
```
apply plugin: 'jacoco'
group = 'com.github.noconnor'
sourceCompatibility = 1.8
// http://semver.org/
-version = '1.11.0'
+version = '1.12.0'

repositories {
mavenCentral()
```
63 changes: 63 additions & 0 deletions src/main/java/com/github/noconnor/junitperf/JUnitPerfAsyncRule.java
```
package com.github.noconnor.junitperf;

import com.github.noconnor.junitperf.data.EvaluationContext;
import com.github.noconnor.junitperf.data.NoOpTestContext;
import com.github.noconnor.junitperf.data.TestContext;
import com.github.noconnor.junitperf.reporting.ReportGenerator;
import com.github.noconnor.junitperf.reporting.providers.HtmlReportGenerator;
import com.github.noconnor.junitperf.statistics.StatisticsCalculator;
import com.github.noconnor.junitperf.statistics.providers.DescriptiveStatisticsCalculator;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;

import static java.lang.System.currentTimeMillis;
import static java.lang.System.nanoTime;
import static java.util.Objects.nonNull;

@SuppressWarnings("WeakerAccess")
public class JUnitPerfAsyncRule extends JUnitPerfRule {

private long measurementsStartTimeMs;

public JUnitPerfAsyncRule() {
this(new DescriptiveStatisticsCalculator(), new HtmlReportGenerator());
}

public JUnitPerfAsyncRule(ReportGenerator... reportGenerator) {
this(new DescriptiveStatisticsCalculator(), reportGenerator);
}

public JUnitPerfAsyncRule(StatisticsCalculator statisticsCalculator) {
this(statisticsCalculator, new HtmlReportGenerator());
}

public JUnitPerfAsyncRule(StatisticsCalculator statisticsCalculator, ReportGenerator... reportGenerator) {
super(statisticsCalculator, reportGenerator);
}

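// Hands out a recording TestContext once the warm-up period has elapsed;
// during warm-up a NoOpTestContext is returned so those samples are discarded.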
public TestContext newContext() {
return hasMeasurementPeriodStarted() ? new TestContext(statisticsCalculator) : NoOpTestContext.INSTANCE;
}

@Override
public Statement apply(Statement base, Description description) {
setMeasurementsStartTime(description.getAnnotation(JUnitPerfTest.class));
return super.apply(base, description);
}

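// The 'true' flag marks the evaluation context as asynchronous (the isAsync identifier introduced by this commit).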
@Override
EvaluationContext createEvaluationContext(Description description) {
return new EvaluationContext(description.getMethodName(), nanoTime(), true);
}

private void setMeasurementsStartTime(JUnitPerfTest perfTestAnnotation) {
if (nonNull(perfTestAnnotation)) {
measurementsStartTimeMs = currentTimeMillis() + perfTestAnnotation.warmUpMs();
}
}

private boolean hasMeasurementPeriodStarted() {
return currentTimeMillis() >= measurementsStartTimeMs;
}

}
```
29 changes: 14 additions & 15 deletions src/main/java/com/github/noconnor/junitperf/JUnitPerfRule.java
```

import static com.google.common.collect.Maps.newHashMap;
import static com.google.common.collect.Sets.newHashSet;
import static java.lang.System.nanoTime;
import static java.util.Objects.nonNull;

@Slf4j
@SuppressWarnings("WeakerAccess")
public class JUnitPerfRule implements TestRule {

static final Map<Class, Set<EvaluationContext>> ACTIVE_CONTEXTS = newHashMap();


private final Set<ReportGenerator> reporters;

StatisticsCalculator statisticsCalculator;
PerformanceEvaluationStatementBuilder perEvalBuilder;

public JUnitPerfRule() {
this(new DescriptiveStatisticsCalculator(), new HtmlReportGenerator());
}

public JUnitPerfRule(ReportGenerator... reportGenerator) {
this(new DescriptiveStatisticsCalculator(), reportGenerator);
}

public JUnitPerfRule(StatisticsCalculator statisticsCalculator) {
this(statisticsCalculator, new HtmlReportGenerator());
}

public JUnitPerfRule(StatisticsCalculator statisticsCalculator, ReportGenerator... reportGenerator) {
this.perEvalBuilder = PerformanceEvaluationStatement.builder();
this.statisticsCalculator = statisticsCalculator;
this.reporters = newHashSet(reportGenerator); // restored from the collapsed diff lines
}
@Override
public Statement apply(Statement base, Description description) {
Statement activeStatement = base;

JUnitPerfTest perfTestAnnotation = description.getAnnotation(JUnitPerfTest.class);
JUnitPerfTestRequirement requirementsAnnotation = description.getAnnotation(JUnitPerfTestRequirement.class);

if (nonNull(perfTestAnnotation)) {
EvaluationContext context = createEvaluationContext(description);
context.loadConfiguration(perfTestAnnotation);
context.loadRequirements(requirementsAnnotation);

// Group test contexts by test class
ACTIVE_CONTEXTS.putIfAbsent(description.getTestClass(), newHashSet());
ACTIVE_CONTEXTS.get(description.getTestClass()).add(context);

activeStatement = perEvalBuilder.baseStatement(base)
.statistics(statisticsCalculator)
.context(context)
.listener(complete -> updateReport(description.getTestClass())) // assumed wiring: the only caller of updateReport
.build();
}
return activeStatement;
}

EvaluationContext createEvaluationContext(Description description) {
return new EvaluationContext(description.getMethodName(), nanoTime());
}

private synchronized void updateReport(Class<?> testClass) {
reporters.forEach(r -> {
r.generateReport(ACTIVE_CONTEXTS.get(testClass));
});

}


}
```