[CI] Enhance comments in "managed" github issues v2
Unfortunately, the simple first take in
#628 didn't work, since it looks
like the GitHub API doesn't allow us to access the workflow runs while
the workflow is still running.

To work around this, this change introduces a new "GitHub Issue Updater"
workflow which runs after the "Nightly CI" and "Weekly CI" workflows
complete.

This comes with a complication: the reporter runs once for all jobs in
the matrix, so it needs to figure out which failure should be reported
to which issue in which repository.

For details about the implementation, please see the following comment,
which is also available in the jbang script:

// Unfortunately it's not possible to pass information from a triggering
// workflow to the triggered workflow (in this case from the Nightly/Weekly CI
// to the GitHub Issue Updater). To work around this, we parse the logs of
// the jobs of the workflow that triggered this workflow; in these logs we
// can find information like the "issue-number" and "issue-repo" inputs.
// But we still need to somehow group the jobs corresponding to the detected
// issue numbers. To do so, we first parse the logs of the "Set distribution"
// job, which is the first job of each configuration. This job logs the
// issue-number and issue-repo inputs, which we use to get the GitHub issue
// and map it to the job name prefix shared by the jobs of the same
// configuration.
//
// We then check the status of the jobs of the triggered workflow, and if
// any of them failed, we check whether the job name starts with one of the
// job name prefixes we found earlier. If it does, we add the job to the
// list of failed jobs for the corresponding issue.
//
// Finally, we process the list of failed jobs for each issue, and if the
// issue is still open, we add a comment with the list of failed jobs and
// their filtered logs.
//
// Mandrel integration tests are treated specially: they have a fixed issue
// repository, so we can get the issue number directly from the logs of the
// job, and we don't need to group the jobs by issue number, since the
// structure of the workflow is simpler.
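
To make the grouping concrete, the sketch below shows how the "Set
distribution" logs could be scanned for these inputs and mapped to a
job-name prefix. It uses the hub4j github-api types the script already
imports; the IssueGrouping class and mapIssuesToJobPrefixes method are
illustrative names only, not part of the actual script:

```
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

import org.kohsuke.github.GHIssue;
import org.kohsuke.github.GHWorkflowJob;
import org.kohsuke.github.GHWorkflowRun;
import org.kohsuke.github.GitHub;

class IssueGrouping {

    // Map each configuration's GitHub issue to the job name prefix shared by
    // the jobs of that configuration, e.g. "Q main M 22 latest".
    static Map<GHIssue, String> mapIssuesToJobPrefixes(GitHub github, GHWorkflowRun run) throws IOException {
        Map<GHIssue, String> issues = new HashMap<>();
        for (GHWorkflowJob job : run.listJobs()) {
            if (!job.getName().contains("Set distribution")) {
                continue; // only the first job of each configuration logs the inputs
            }
            // Download the job logs, keeping only the lines of interest.
            String logs = job.downloadLogs(is -> {
                try (BufferedReader reader = new BufferedReader(new InputStreamReader(is))) {
                    return reader.lines()
                            .filter(l -> l.contains("issue-number") || l.contains("issue-repo"))
                            .collect(Collectors.joining("\n"));
                }
            });
            Matcher issueNumber = Pattern.compile(" issue-number: (\\d+)").matcher(logs);
            Matcher issueRepo = Pattern.compile(" issue-repo: (.*)").matcher(logs);
            if (issueNumber.find() && issueRepo.find()) {
                GHIssue issue = github.getRepository(issueRepo.group(1))
                        .getIssue(Integer.parseInt(issueNumber.group(1)));
                // "Q main M 22 latest / Set distribution" -> "Q main M 22 latest"
                issues.put(issue, job.getName().split(" / ")[0]);
            }
        }
        return issues;
    }
}
```

Failed jobs are then attributed to an issue by checking whether their
name starts with the recorded prefix.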

The resulting comment in the GitHub issue will look like this:

The build is still failing!

* [Q main M 22 latest / Q IT Data5](https://github.com/graalvm/mandrel/actions/runs/7108970473/job/19353697276)
  * Step: Build with Maven
    Filtered Logs:
```
2023-12-06T02:12:48.3045067Z Error: Class initialization of io.vertx.pgclient.impl.codec.DataTypeCodec failed. Use the option
2023-12-06T02:17:23.8202565Z Error: Class initialization of io.vertx.pgclient.impl.codec.DataTypeCodec failed. Use the option
2023-12-06T02:20:25.5812579Z [INFO] Quarkus - Integration Tests - JPA - PostgreSQL ..... FAILURE [04:01 min]
2023-12-06T02:20:25.5815363Z [INFO] Quarkus - Integration Tests - Hibernate Reactive - PostgreSQL FAILURE [01:09 min]
2023-12-06T02:20:25.5818516Z [INFO] Quarkus - Integration Tests - Reactive Pg Client ... FAILURE [ 58.794 s]
```

* [Q main M 22 latest / Q IT Data7](https://github.com/graalvm/mandrel/actions/runs/7108970473/job/19353697645)
  * Step: Build with Maven
    Filtered Logs:
```
2023-12-06T02:21:35.5488777Z Error: Class initialization of io.vertx.pgclient.impl.codec.DataTypeCodec failed. Use the option
2023-12-06T02:23:24.7112818Z Error: Class initialization of io.vertx.pgclient.impl.codec.DataTypeCodec failed. Use the option
2023-12-06T02:34:32.8654383Z [INFO] Quarkus - Integration Tests - Hibernate Reactive with Panache FAILURE [01:29 min]
2023-12-06T02:34:32.8655349Z [INFO] Quarkus - Integration Tests - Hibernate Reactive with Panache and Kotlin FAILURE [01:49 min]
```

* [Q main M 22 latest / Q IT Security3](https://github.com/graalvm/mandrel/actions/runs/7108970473/job/19353698616)
  * Step: Build with Maven
    Filtered Logs:
```
2023-12-06T02:19:21.8703399Z Error: Class initialization of io.vertx.pgclient.impl.codec.DataTypeCodec failed. Use the option
2023-12-06T02:26:58.5434481Z [INFO] Quarkus - Integration Tests - Security WebAuthn .... FAILURE [01:36 min]
```

* [Q main M 22 latest / Q IT Misc2](https://github.com/graalvm/mandrel/actions/runs/7108970473/job/19353699213)
  * Step: Build with Maven
    Filtered Logs:
```
2023-12-06T02:11:17.0198258Z Error: Class initialization of com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl failed. Use the option
2023-12-06T02:21:35.8962084Z [INFO] Quarkus - Integration Tests - Test Extension - Tests FAILURE [ 32.070 s]
```

* [Q main M 22 latest / Q IT AWT, ImageIO and Java2D](https://github.com/graalvm/mandrel/actions/runs/7108970473/job/19353700284)
  * Step: Build with Maven
    Filtered Logs:
```
2023-12-06T02:05:32.5303045Z Error: Error loading a referenced type: com.oracle.svm.hosted.substitute.DeletedElementException: Unsupported method jdk.internal.loader.NativeLibrary.findEntry0(long, String) is reachable
2023-12-06T02:07:51.7787718Z [INFO] Quarkus - Integration Tests - AWT .................. FAILURE [01:24 min]
```

Link to failing CI run: https://github.com/graalvm/mandrel/actions/runs/7108970473
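
The "Filtered Logs" blocks above are obtained by keeping only the
interesting lines of each job's log, using marker strings such as
"FAILURE [" (Maven module summaries) and "Z Error:" (native-image
errors). A minimal sketch of that line filter, with an illustrative
class name, follows:

```
import java.util.List;
import java.util.stream.Collectors;

class LogLineFilter {

    // Keep only the lines that contain one of the given marker strings.
    static String keepMatchingLines(List<String> logLines, String... markers) {
        return logLines.stream()
                .filter(line -> {
                    for (String marker : markers) {
                        if (line.contains(marker)) {
                            return true;
                        }
                    }
                    return false;
                })
                .collect(Collectors.joining(System.lineSeparator()));
    }
}
```

Calling it as keepMatchingLines(lines, "FAILURE [", "Z Error:") yields
output like the blocks above; in the script itself the equivalent
filtering happens while streaming the downloaded job logs through a
BufferedReader.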
zakkak committed Dec 8, 2023
1 parent d86c2e2 commit 99bcf75
Showing 4 changed files with 239 additions and 285 deletions.
285 changes: 193 additions & 92 deletions .github/quarkus-ecosystem-issue.java
@@ -29,9 +29,11 @@

import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.function.Predicate;
import java.util.HashMap;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import java.io.BufferedReader;
import java.io.InputStreamReader;

@@ -42,131 +44,200 @@ class Report implements Runnable {
@Option(names = "token", description = "Github token to use when calling the Github API")
private String token;

@Deprecated
@Option(names = "status", description = "The status of the CI run")
private String status;

@Option(names = "issueRepo", description = "The repository where the issue resides (i.e. quarkusio/quarkus)")
private String issueRepo;

@Option(names = "issueNumber", description = "The issue to update")
private Integer issueNumber;

@Option(names = "thisRepo", description = "The repository for which we are reporting the CI status")
private String thisRepo;

@Option(names = "runId", description = "The ID of the Github Action run for which we are reporting the CI status")
private String runId;


@Option(names = "--dry-run", description = "Whether to actually update the issue or not")
private boolean dryRun;

@Override
public void run() {
try {
final GitHub github = new GitHubBuilder().withOAuthToken(token).build();
final GHRepository issueRepository = github.getRepository(issueRepo);
final GHRepository workflowRepository = github.getRepository(thisRepo);
GHWorkflowRun workflowRun = workflowRepository.getWorkflowRun(Long.parseLong(runId));
Conclusion status = workflowRun.getConclusion();

System.out.println(String.format("The CI build had status %s.", status));


if (status.equals(Conclusion.CANCELLED) || status.equals(Conclusion.SKIPPED)) {
System.out.println("Job status is `cancelled` or `skipped` - exiting");
System.exit(0);
}

final HashMap<GHIssue, String> issues = new HashMap<>();
final HashMap<GHIssue, List<GHWorkflowJob>> failedMandrelJobs = new HashMap<>();

final GHIssue issue = issueRepository.getIssue(issueNumber);
if (issue == null) {
System.out.println(String.format("Unable to find the issue %s in project %s", issueNumber, issueRepo));
System.exit(-1);
} else {
System.out.println(String.format("Report issue found: %s - %s", issue.getTitle(), issue.getHtmlUrl().toString()));
System.out.println(String.format("The issue is currently %s", issue.getState().toString()));
}
// Get the GitHub issue number and repository from the logs
//
// Unfortunately it's not possible to pass information from a triggering
// workflow to the triggered workflow (in this case from the Nightly/Weekly CI
// to the GitHub Issue Updater). To work around this, we parse the logs of
// the jobs of the workflow that triggered this workflow; in these logs we
// can find information like the "issue-number" and "issue-repo" inputs.
// But we still need to somehow group the jobs corresponding to the detected
// issue numbers. To do so, we first parse the logs of the "Set distribution"
// job, which is the first job of each configuration. This job logs the
// issue-number and issue-repo inputs, which we use to get the GitHub issue
// and map it to the job name prefix shared by the jobs of the same
// configuration.
//
// We then check the status of the jobs of the triggered workflow, and if
// any of them failed, we check whether the job name starts with one of the
// job name prefixes we found earlier. If it does, we add the job to the
// list of failed jobs for the corresponding issue.
//
// Finally, we process the list of failed jobs for each issue, and if the
// issue is still open, we add a comment with the list of failed jobs and
// their filtered logs.
//
// Mandrel integration tests are treated specially: they have a fixed issue
// repository, so we can get the issue number directly from the logs of the
// job, and we don't need to group the jobs by issue number, since the
// structure of the workflow is simpler.
PagedIterable<GHWorkflowJob> listJobs = workflowRun.listJobs();
listJobs.forEach(job -> {
// Each configuration starts with the Set distribution job
if (job.getName().contains("Set distribution")) {
String fullContent = getJobsLogs(job, "issue-number", "issue-repo");
if (!fullContent.isEmpty()) {
// Get the issue number and repository for mandrel issues
Matcher issueNumberMatcher = Pattern.compile(" issue-number: (\\d+)").matcher(fullContent);
Matcher issueRepoMatcher = Pattern.compile(" issue-repo: (.*)").matcher(fullContent);
if (issueNumberMatcher.find() && issueRepoMatcher.find()) {
int issueNumber = Integer.parseInt(issueNumberMatcher.group(1));
String issueRepo = issueRepoMatcher.group(1);

if (status.equals(Conclusion.SUCCESS)) {
if (issue != null && isOpen(issue)) {
String comment = String.format("Build fixed:\n* Link to latest CI run: https://github.com/%s/actions/runs/%s", thisRepo, runId);
if (!dryRun) {
// close issue with a comment
issue.comment(comment);
issue.close();
System.out.println(String.format("Found issue https://github.com/%s/issues/%s in logs for job %s", issueRepo, issueNumber, job.getName()));
try {
GHRepository issueRepository = github.getRepository(issueRepo);
GHIssue issue = issueRepository.getIssue(issueNumber);
if (issue == null) {
System.out.println(String.format("Unable to find the issue %s in project %s", issueNumber, issueRepo));
System.exit(-1);
} else {
System.out.println(String.format("Report issue found: %s - %s", issue.getTitle(), issue.getHtmlUrl().toString()));
System.out.println(String.format("The issue is currently %s", issue.getState().toString()));
Object oldIssue = issues.put(issue, job.getName().split(" / ")[0]);
if (oldIssue != null) {
System.out.println("WARNING: The issue has already been seen, please check the workflow configuration");
};
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
System.out.println(String.format("Comment added on issue %s\n%s\n, the issue has also been closed", issue.getHtmlUrl().toString(), comment));
} else {
System.out.println("Nothing to do - the build passed and the issue is already closed");
}
} else {

/*
* If the issue contains a line like:
*
* Filter: Q main G 22 latest
*
* then we will only report on the jobs that contain "Q main G 22" in their name, e.g. "Q main G 22 latest / Q IT Misc2".
* This is useful when the github action contains multiple reusable jobs and we want to use a different issue for each of them.
*/
final String filter;
String body = issue.getBody();
if (body != null) {
String regex = "^Job Filter: (.*)$";
Pattern pattern = Pattern.compile(regex);
Matcher matcher = pattern.matcher(body);
if (matcher.find()) {
filter = matcher.group(1);
} else {
filter = "";
}
} else {
filter = "";
}

Predicate<? super GHWorkflowJob> predicate;
if (filter != "") {
System.out.println(String.format("Getting logs from failed jobs with names containing: %s", filter));
predicate = job -> job.getConclusion().equals(Conclusion.FAILURE) && job.getName().contains(filter);
} else {
System.out.println("Getting logs from all failed jobs");
predicate = job -> job.getConclusion().equals(Conclusion.FAILURE);
}

StringBuilder sb = new StringBuilder("Failed jobs:\n");
workflowRun.listJobs().toList().stream().filter(predicate).forEach(job -> {
sb.append(String.format("* [%s](%s)\n", job.getName(), job.getHtmlUrl()));
job.getSteps().stream().filter(step -> step.getConclusion().equals(Conclusion.FAILURE)).forEach(step -> {
sb.append(String.format(" * Step: %s\n", step.getName()));
});
String fullContent = "";
try {
fullContent = job.downloadLogs(getLogArchiveInputStreamFunction());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} else if (job.getConclusion().equals(Conclusion.FAILURE) && (job.getName().contains("Q IT") || job.getName().contains("Mandrel build"))) {
for (GHIssue issue: issues.keySet()) {
if (job.getName().startsWith(issues.get(issue))) {
List<GHWorkflowJob> failedJobsList = failedMandrelJobs.get(issue);
if (failedJobsList == null) {
failedJobsList = new java.util.ArrayList<>();
failedMandrelJobs.put(issue, failedJobsList);
}
System.out.println(String.format("Adding job %s to the list of failed jobs for issue %s", job.getName(), issue.getHtmlUrl().toString()));
failedJobsList.add(job);
}
}
} else if (job.getName().contains("Q Mandrel IT")) {
String fullContent = getJobsLogs(job, "mandrel-it-issue-number", "FAILURE [", "Z Error:");
if (!fullContent.isEmpty()) {
sb.append(" Filtered logs:\n");
sb.append(String.format("```\n%s```\n", fullContent));
// Get the issue number for mandrel-integration-tests issues
Matcher mandrelIssueNumberMatcher = Pattern.compile(" mandrel-it-issue-number: (\\d+)").matcher(fullContent);
if (mandrelIssueNumberMatcher.find()) {
int mandrelIssueNumber = Integer.parseInt(mandrelIssueNumberMatcher.group(1));
System.out.println(String.format("Found issue https://github.com/karm/mandrel-integration-tests/issues/%s in logs for job %s", mandrelIssueNumber, job.getName()));
try {
GHRepository issueRepository = github.getRepository("karm/mandrel-integration-tests");
final GHIssue issue = issueRepository.getIssue(mandrelIssueNumber);
if (issue == null) {
System.out.println(String.format("Unable to find the issue %s in project %s", mandrelIssueNumber, "karm/mandrel-integration-tests"));
System.exit(-1);
} else {
System.out.println(String.format("Report issue found: %s - %s", issue.getTitle(), issue.getHtmlUrl().toString()));
System.out.println(String.format("The issue is currently %s", issue.getState().toString()));
if (job.getConclusion().equals(Conclusion.SUCCESS)) {
if (isOpen(issue)) {
String comment = String.format("Build fixed:\n* Link to latest CI run: https://github.com/%s/actions/runs/%s", thisRepo, runId);
if (!dryRun) {
// close issue with a comment
issue.comment(comment);
issue.close();
}
System.out.println(String.format("Comment added on issue %s\n%s\n, the issue has also been closed", issue.getHtmlUrl().toString(), comment));
} else {
System.out.println("Nothing to do - the build passed and the issue is already closed");
}
} else if (job.getConclusion().equals(Conclusion.FAILURE)) {
StringBuilder sb = new StringBuilder();
if (isOpen(issue)) {
sb.append("The build is still failing!\n\n");
} else {
sb.append("Unfortunately, the build failed!\n\n");
if (!dryRun) {
issue.reopen();
}
System.out.println("The issue has been re-opened");
}
sb.append(String.format("Filtered Logs:\n```\n%s\n```\n\n", fullContent.lines().filter(x -> !x.contains("mandrel-it-issue-number")).collect(Collectors.joining("\n"))));
sb.append(String.format("Link to failing CI run: %s", job.getHtmlUrl()));
String comment = sb.toString();
if (!dryRun) {
issue.comment(comment);
}
System.out.println(String.format("\nComment added on issue %s\n\n%s\n", issue.getHtmlUrl().toString(), comment));
}
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}

}
}
});
}
});

if (isOpen(issue)) {
String comment = String.format("The build is still failing!\n\n%s\nLink to latest CI run: https://github.com/%s/actions/runs/%s", sb.toString(), thisRepo, runId);
if (!dryRun) {
issue.comment(comment);
// Process the failed jobs
for (GHIssue issue: issues.keySet()) {
List<GHWorkflowJob> failedJobs = failedMandrelJobs.get(issue);
if (failedJobs == null || failedJobs.isEmpty()) {
if (isOpen(issue)) {
String comment = String.format("Build fixed:\n* Link to latest CI run: https://github.com/%s/actions/runs/%s", thisRepo, runId);
if (!dryRun) {
// close issue with a comment
issue.comment(comment);
issue.close();
}
System.out.println(String.format("Comment added on issue %s\n%s\n, the issue has also been closed", issue.getHtmlUrl().toString(), comment));
} else {
System.out.println("Nothing to do - the build passed and the issue is already closed");
}
System.out.println(String.format("Comment added on issue %s\n%s", issue.getHtmlUrl().toString(), comment));
} else {
String comment = String.format("Unfortunately, the build failed!\n\n%s\nLink to latest CI run: https://github.com/%s/actions/runs/%s", sb.toString(), thisRepo, runId);
StringBuilder sb = new StringBuilder();
if (isOpen(issue)) {
sb.append("The build is still failing!\n\n");
} else {
sb.append("Unfortunately, the build failed!\n\n");
if (!dryRun) {
issue.reopen();
}
System.out.println("The issue has been re-opened");
}
for (GHWorkflowJob job: failedJobs) {
processFailedJob(sb, job);
}
sb.append(String.format("Link to failing CI run: https://github.com/%s/actions/runs/%s", thisRepo, runId));
String comment = sb.toString();
if (!dryRun) {
issue.reopen();
issue.comment(comment);
}
System.out.println(String.format("Comment added on issue %s\n%s, the issue has been re-opened", issue.getHtmlUrl().toString(), comment));
System.out.println(String.format("\nComment added on issue %s\n\n%s\n", issue.getHtmlUrl().toString(), comment));
}
}
}
@@ -175,15 +246,45 @@ public void run() {
}
}

private static InputStreamFunction<String> getLogArchiveInputStreamFunction() {
private void processFailedJob(StringBuilder sb, GHWorkflowJob job) {
sb.append(String.format("* [%s](%s)\n", job.getName(), job.getHtmlUrl()));
GHWorkflowJob.Step step = job.getSteps().stream().filter(s -> s.getConclusion().equals(Conclusion.FAILURE)).findFirst().get();
sb.append(String.format(" * Step: %s\n", step.getName()));
String fullContent = getJobsLogs(job, "FAILURE [", "Z Error:");
if (!fullContent.isEmpty()) {
sb.append(String.format(" Filtered Logs:\n```\n%s```\n\n", fullContent));
}
}

private String getJobsLogs(GHWorkflowJob job, String... filters) {
String fullContent = "";
try {
System.out.println(String.format("\nGetting logs for job %s", job.getName()));
fullContent = job.downloadLogs(getLogArchiveInputStreamFunction(filters));
} catch (IOException e) {
System.out.println(String.format("Unable to get logs for job %s", job.getName()));
e.printStackTrace();
}
return fullContent;
}

private static InputStreamFunction<String> getLogArchiveInputStreamFunction(String... filters) {
return (is) -> {
StringBuilder stringBuilder = new StringBuilder();
try (BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(is))) {
String line;
while ((line = bufferedReader.readLine()) != null) {
if (line.contains("FAILURE [") || line.contains("Error:")) {
if (filters.length == 0) {
stringBuilder.append(line);
stringBuilder.append(System.lineSeparator());
} else {
for (String filter : filters) {
if (line.contains(filter)) {
stringBuilder.append(line);
stringBuilder.append(System.lineSeparator());
break;
}
}
}
}
}
