Support linking downstream pipelines to upstream pipelines in CI Visibility #405

Merged
Changes from 3 commits
1 change: 1 addition & 0 deletions DEVELOPMENT.md
@@ -36,6 +36,7 @@ To spin up a development environment for the *jenkins-datadog* plugin repository

1. Set the `JENKINS_PLUGIN` environment variable to point to the directory where this repository is cloned/forked.
1. Set the `JENKINS_PLUGIN_DATADOG_API_KEY` environment variable to your API key.
1. Set the `JENKINS_PLUGIN_DATADOG_CI_INSTANCE_NAME` environment variable to a name you would like your instance to have (this makes it easier to identify your pipeline executions by applying the `@ci.provider.instance:your-instance-name` filter in the Datadog UI).
1. Optionally set the `GITHUB_SSH_KEY` and `GITHUB_SSH_KEY_PASSPHRASE` environment variables with the key and passphrase that can be used to access GitHub. This allows GitHub credentials to be created automatically in Jenkins.
1. Run `mvn clean package -DskipTests` and `docker-compose -p datadog-jenkins-plugin -f docker/docker-compose.yaml up` from the directory where this repository is cloned/forked (if the `docker-compose` command fails with a `path ... not found` error, try updating it to the latest version).
- NOTE: This spins up the Jenkins Docker image and auto-mounts the target folder of this repository (the location where the binary is built).
3 changes: 3 additions & 0 deletions docker/controller-node/Dockerfile
@@ -15,6 +15,9 @@ USER jenkins
COPY plugin-dependencies.txt /var/jenkins_home/plugin-dependencies.txt
RUN jenkins-plugin-cli --latest-specified --plugin-file /var/jenkins_home/plugin-dependencies.txt

# This is not a dependency of the Datadog plugin; it is only needed to test upstream/downstream pipeline linking
RUN jenkins-plugin-cli --latest-specified --plugins pipeline-build-step

COPY jobs /var/jenkins_home/sample-jobs

COPY 10-create-admin-user.groovy /usr/share/jenkins/ref/init.groovy.d/10-create-admin-user.groovy
12 changes: 12 additions & 0 deletions docker/controller-node/jobs/test-pipeline-downstream.cps
@@ -0,0 +1,12 @@
pipeline {
agent any
stages {
stage('Build') {
steps {
sh '''
echo test downstream
'''
}
}
}
}
13 changes: 13 additions & 0 deletions docker/controller-node/jobs/test-pipeline-upstream.cps
@@ -0,0 +1,13 @@
pipeline {
agent any
stages {
stage('Build') {
steps {
sh '''
echo test
'''
build job: 'test-pipeline-downstream'
}
}
}
}
2 changes: 2 additions & 0 deletions docker/docker-compose.yaml
@@ -19,6 +19,8 @@ services:
DATADOG_JENKINS_PLUGIN_TARGET_HOST: datadog # `dogstatsd` or `datadog` based on the container you wish to use
DATADOG_JENKINS_PLUGIN_TARGET_LOG_COLLECTION_PORT: 10518
DATADOG_JENKINS_PLUGIN_TARGET_API_KEY: $JENKINS_PLUGIN_DATADOG_API_KEY
DATADOG_JENKINS_PLUGIN_ENABLE_CI_VISIBILITY: true
DATADOG_JENKINS_PLUGIN_CI_VISIBILITY_CI_INSTANCE_NAME: $JENKINS_PLUGIN_DATADOG_CI_INSTANCE_NAME
volumes:
- jenkins_shared:/var/jenkins_home/shared
- $JENKINS_PLUGIN/target/datadog.hpi:/var/jenkins_home/plugins/datadog.hpi
DatadogBuildListener.java
@@ -114,10 +114,16 @@ public void onInitialize(Run run) {
return;
}

final TraceSpan buildSpan = new TraceSpan("jenkins.build", TimeUnit.MILLISECONDS.toNanos(buildData.getStartTime(0L)));
BuildSpanManager.get().put(buildData.getBuildTag(""), buildSpan);
TraceSpan.TraceSpanContext buildSpanContext = new TraceSpan.TraceSpanContext();
BuildSpanManager.get().put(buildData.getBuildTag(""), buildSpanContext);

final BuildSpanAction buildSpanAction = new BuildSpanAction(buildSpan.context());
TraceSpan.TraceSpanContext upstreamBuildSpanContext = null;
String upstreamBuildTag = buildData.getUpstreamBuildTag("");
if (upstreamBuildTag != null) {
upstreamBuildSpanContext = BuildSpanManager.get().get(upstreamBuildTag);
}

final BuildSpanAction buildSpanAction = new BuildSpanAction(buildSpanContext, upstreamBuildSpanContext);
run.addAction(buildSpanAction);

run.addAction(new GitCommitAction());
@@ -404,8 +410,6 @@ public void onFinalized(Run run) {
traceWriter.submitBuild(buildData, run);
logger.fine("End DatadogBuildListener#onFinalized");

BuildSpanManager.get().remove(buildData.getBuildTag(""));

} catch (InterruptedException e) {
Thread.currentThread().interrupt();
DatadogUtilities.severe(logger, e, "Interrupted while processing build finalization");
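To make the interleaved hunks above easier to follow, here is a condensed sketch of the linking flow that `onInitialize()` now implements. Class names and method signatures are taken from the diff; the wrapper class and helper method are illustrative only, not part of the change.

import hudson.model.Run;
import org.datadog.jenkins.plugins.datadog.traces.BuildSpanAction;
import org.datadog.jenkins.plugins.datadog.traces.BuildSpanManager;
import org.datadog.jenkins.plugins.datadog.traces.message.TraceSpan;

// Illustrative sketch: how the listener registers the current build's span
// context and links it to the upstream build, if one triggered it.
class UpstreamLinkingSketch {
    static void registerAndLink(Run<?, ?> run, String buildTag, String upstreamBuildTag) {
        // Register this build's span context under its build tag so that any
        // downstream build it triggers can look the context up later.
        TraceSpan.TraceSpanContext buildSpanContext = new TraceSpan.TraceSpanContext();
        BuildSpanManager.get().put(buildTag, buildSpanContext);

        // If this build was itself triggered by an upstream build, fetch the
        // context that the upstream build registered when it started.
        TraceSpan.TraceSpanContext upstreamContext =
                upstreamBuildTag != null ? BuildSpanManager.get().get(upstreamBuildTag) : null;

        // Both contexts travel with the run via the two-argument BuildSpanAction.
        run.addAction(new BuildSpanAction(buildSpanContext, upstreamContext));
    }
}

Note that `onFinalized()` no longer removes the entry from `BuildSpanManager`: the context has to outlive the upstream build so that downstream builds started later can still resolve it. Eviction is instead handled by the bounded storage added to `BuildSpanManager` further down.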
BuildData.java
@@ -73,7 +73,6 @@ of this software and associated documentation files (the "Software"), to deal
import org.datadog.jenkins.plugins.datadog.DatadogUtilities;
import org.datadog.jenkins.plugins.datadog.traces.BuildConfigurationParser;
import org.datadog.jenkins.plugins.datadog.traces.BuildSpanAction;
import org.datadog.jenkins.plugins.datadog.traces.BuildSpanManager;
import org.datadog.jenkins.plugins.datadog.traces.message.TraceSpan;
import org.datadog.jenkins.plugins.datadog.util.TagsUtil;
import org.datadog.jenkins.plugins.datadog.util.git.GitUtils;
@@ -84,7 +83,6 @@ public class BuildData implements Serializable {
private static final long serialVersionUID = 1L;

private static transient final Logger LOGGER = Logger.getLogger(BuildData.class.getName());

private String buildNumber;
private String buildId;
private String buildUrl;
@@ -94,6 +92,8 @@ public class BuildData implements Serializable {
private String jobName;
private Map<String, String> buildConfigurations;
private String buildTag;
@Nullable
private String upstreamBuildTag;
private String jenkinsUrl;
private String executorNumber;
private String javaHome;
@@ -149,8 +149,11 @@ public class BuildData implements Serializable {
* The backend needs version to determine the relative order of these multiple events.
*/
private Integer version;
private String traceId;
private String spanId;
private Long traceId;
private Long spanId;

private String upstreamPipelineUrl;
private Long upstreamPipelineTraceId;

public BuildData(Run<?, ?> run, @Nullable TaskListener listener) throws IOException, InterruptedException {
if (run == null) {
@@ -168,6 +171,10 @@ public BuildData(Run<?, ?> run, @Nullable TaskListener listener) throws IOExcept
this.buildUrl = buildSpanAction.getBuildUrl();
}
this.version = buildSpanAction.getAndIncrementVersion();

TraceSpan.TraceSpanContext buildSpanContext = buildSpanAction.getBuildSpanContext();
this.traceId = buildSpanContext.getTraceId();
this.spanId = buildSpanContext.getSpanId();
}

// Populate instance using environment variables.
@@ -264,11 +271,37 @@ public BuildData(Run<?, ?> run, @Nullable TaskListener listener) throws IOExcept
// Build parameters
populateBuildParameters(run);

// Set Tracing IDs
final TraceSpan buildSpan = BuildSpanManager.get().get(getBuildTag(""));
if(buildSpan !=null) {
this.traceId = Long.toUnsignedString(buildSpan.context().getTraceId());
this.spanId = Long.toUnsignedString(buildSpan.context().getSpanId());
populateUpstreamPipelineData(run, envVars);
}

private void populateUpstreamPipelineData(Run<?, ?> run, EnvVars envVars) {
CauseAction causeAction = run.getAction(CauseAction.class);
if (causeAction == null) {
return;
}
Cause.UpstreamCause upstreamCause = causeAction.findCause(Cause.UpstreamCause.class);
if (upstreamCause == null) {
return;
}

String hudsonUrl = envVars.get("HUDSON_URL");
String upstreamUrl = upstreamCause.getUpstreamUrl();
int upstreamBuild = upstreamCause.getUpstreamBuild();
if (hudsonUrl != null && upstreamUrl != null) {
upstreamPipelineUrl = hudsonUrl + upstreamUrl + upstreamBuild + "/";
}

String upstreamProject = upstreamCause.getUpstreamProject();
if (upstreamProject != null) {
upstreamBuildTag = "jenkins-" + upstreamProject.replace('/', '-') + "-" + upstreamBuild;

BuildSpanAction buildSpanAction = run.getAction(BuildSpanAction.class);
if (buildSpanAction != null) {
TraceSpan.TraceSpanContext upstreamSpanContext = buildSpanAction.getUpstreamSpanContext();
if (upstreamSpanContext != null) {
upstreamPipelineTraceId = upstreamSpanContext.getTraceId();
}
}
}
}

@@ -670,10 +703,23 @@ public Integer getVersion() {
return version;
}

public Long getTraceId() {
return traceId;
}

public Long getSpanId() {
return spanId;
}

public String getBuildTag(String value) {
return defaultIfNull(buildTag, value);
}

@Nullable
public String getUpstreamBuildTag(String value) {
return defaultIfNull(upstreamBuildTag, value);
}

public String getJenkinsUrl(String value) {
return defaultIfNull(jenkinsUrl, value);
}
@@ -816,6 +862,14 @@ private String getUserEmailByUserId(String userId) {
}
}

public String getUpstreamPipelineUrl() {
return upstreamPipelineUrl;
}

public Long getUpstreamPipelineTraceId() {
return upstreamPipelineTraceId;
}

public JSONObject addLogAttributes(){

JSONObject payload = new JSONObject();
Expand Down Expand Up @@ -883,11 +937,11 @@ public JSONObject addLogAttributes(){
payload.put("hostname", this.hostname);

if(traceId != null){
payload.put("dd.trace_id", this.traceId);
payload.put("dd.trace_id", Long.toUnsignedString(traceId));
}

if(spanId != null) {
payload.put("dd.span_id", this.spanId);
payload.put("dd.span_id", Long.toUnsignedString(spanId));
}
return payload;
} catch (Exception e){
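The linking relies on the tag built in `populateUpstreamPipelineData()` matching the tag under which the upstream build registered its span context: Jenkins' standard `BUILD_TAG` format, `jenkins-${JOB_NAME}-${BUILD_NUMBER}`, with slashes in folder paths replaced by dashes. A small sketch with hypothetical example values, compiling against the plugin classes shown in the diff:

import org.datadog.jenkins.plugins.datadog.traces.BuildSpanManager;
import org.datadog.jenkins.plugins.datadog.traces.message.TraceSpan;

class UpstreamTagSketch {
    public static void main(String[] args) {
        // Hypothetical values for an upstream job that lives inside a folder.
        String upstreamProject = "folder/test-pipeline-upstream";
        int upstreamBuild = 7;

        // Same derivation as in populateUpstreamPipelineData(): slashes become
        // dashes so the result matches the upstream build's BUILD_TAG.
        String upstreamBuildTag = "jenkins-" + upstreamProject.replace('/', '-') + "-" + upstreamBuild;
        System.out.println(upstreamBuildTag); // jenkins-folder-test-pipeline-upstream-7

        // The tag is the lookup key for the span context stored by the upstream
        // build; a null result simply means no upstream context is known.
        TraceSpan.TraceSpanContext upstreamContext = BuildSpanManager.get().get(upstreamBuildTag);
        System.out.println(upstreamContext != null ? "upstream context found" : "no upstream context");
    }
}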
BuildSpanAction.java
@@ -7,6 +7,7 @@
import com.thoughtworks.xstream.io.HierarchicalStreamWriter;
import java.util.Objects;
import java.util.concurrent.atomic.AtomicInteger;
import javax.annotation.Nullable;
import org.datadog.jenkins.plugins.datadog.model.DatadogPluginAction;
import org.datadog.jenkins.plugins.datadog.traces.message.TraceSpan;
import org.datadog.jenkins.plugins.datadog.util.conversion.DatadogActionConverter;
@@ -20,16 +21,19 @@ public class BuildSpanAction extends DatadogPluginAction {
private static final long serialVersionUID = 1L;

private final TraceSpan.TraceSpanContext buildSpanContext;
private final TraceSpan.TraceSpanContext upstreamSpanContext;
private final AtomicInteger version;
private volatile String buildUrl;

public BuildSpanAction(final TraceSpan.TraceSpanContext buildSpanContext){
this.buildSpanContext = buildSpanContext;
this.version = new AtomicInteger(0);
public BuildSpanAction(final TraceSpan.TraceSpanContext buildSpanContext, @Nullable final TraceSpan.TraceSpanContext upstreamSpanContext) {
this.buildSpanContext = buildSpanContext;
this.upstreamSpanContext = upstreamSpanContext;
this.version = new AtomicInteger(0);
}

public BuildSpanAction(TraceSpan.TraceSpanContext buildSpanContext, int version, String buildUrl) {
public BuildSpanAction(TraceSpan.TraceSpanContext buildSpanContext, TraceSpan.TraceSpanContext upstreamSpanContext, int version, String buildUrl) {
this.buildSpanContext = buildSpanContext;
this.upstreamSpanContext = upstreamSpanContext;
this.version = new AtomicInteger(version);
this.buildUrl = buildUrl;
}
@@ -38,6 +42,10 @@ public TraceSpan.TraceSpanContext getBuildSpanContext() {
return buildSpanContext;
}

public TraceSpan.TraceSpanContext getUpstreamSpanContext() {
return upstreamSpanContext;
}

public String getBuildUrl() {
return buildUrl;
}
@@ -74,7 +82,7 @@ public String toString() {

public static final class ConverterImpl extends DatadogActionConverter<BuildSpanAction> {
public ConverterImpl(XStream xs) {
super(new ConverterV1());
super(new ConverterV1(), new ConverterV2());
}
}

@@ -110,7 +118,50 @@ public BuildSpanAction unmarshal(HierarchicalStreamReader reader, UnmarshallingC
reader.moveUp();
}

return new BuildSpanAction(spanContext, version, buildUrl);
return new BuildSpanAction(spanContext, null, version, buildUrl);
}
}

public static final class ConverterV2 extends VersionedConverter<BuildSpanAction> {

private static final int VERSION = 2;

public ConverterV2() {
super(VERSION);
}

@Override
public void marshal(BuildSpanAction action, HierarchicalStreamWriter writer, MarshallingContext context) {
writeField("version", action.version.get(), writer, context);
writeField("spanContext", action.buildSpanContext, writer, context);
if (action.upstreamSpanContext != null) {
writeField("upstreamSpanContext", action.upstreamSpanContext, writer, context);
}
if (action.buildUrl != null) {
writeField("buildUrl", action.buildUrl, writer, context);
}
}

@Override
public BuildSpanAction unmarshal(HierarchicalStreamReader reader, UnmarshallingContext context) {
int version = readField(reader, context, int.class);
TraceSpan.TraceSpanContext spanContext = readField(reader, context, TraceSpan.TraceSpanContext.class);

String buildUrl = null;
TraceSpan.TraceSpanContext upstreamSpanContext = null;
while (reader.hasMoreChildren()) {
reader.moveDown();
String fieldName = reader.getNodeName();
if ("buildUrl".equals(fieldName)) {
buildUrl = (String) context.convertAnother(null, String.class);
}
if ("upstreamSpanContext".equals(fieldName)) {
upstreamSpanContext = readField(reader, context, TraceSpan.TraceSpanContext.class);
}
reader.moveUp();
}

return new BuildSpanAction(spanContext, upstreamSpanContext, version, buildUrl);
}
}
}
BuildSpanManager.java
@@ -1,34 +1,50 @@
package org.datadog.jenkins.plugins.datadog.traces;

import org.datadog.jenkins.plugins.datadog.traces.message.TraceSpan;

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.logging.Logger;
import org.datadog.jenkins.plugins.datadog.traces.message.TraceSpan;

/**
* Used to propagate the build Span between onStart() and onComplete() methods.
* This mechanism is needed because the Span object cannot be serialized in a Jenkins Action.
* Used to store trace data after the build has finished.
* The data is needed to link upstream build to a downstream build.
*/
public class BuildSpanManager {

private static final Logger LOGGER = Logger.getLogger(BuildSpanManager.class.getName());

private static final BuildSpanManager INSTANCE = new BuildSpanManager();
private final Map<String, TraceSpan> traceSpanByBuildTag = new HashMap<>();
private final Map<String, TraceSpan.TraceSpanContext> contextByTag = new ConcurrentHashMap<>();
private final BlockingQueue<String> tags = new ArrayBlockingQueue<>(getCapacity());

public static BuildSpanManager get() {
return INSTANCE;
}

public TraceSpan put(final String tag, final TraceSpan span) {
return traceSpanByBuildTag.put(tag, span);
public void put(final String tag, final TraceSpan.TraceSpanContext context) {
while (!tags.offer(tag)) {
// drop the oldest tag if the storage is full
contextByTag.remove(tags.poll());
}
contextByTag.put(tag, context);
}

public TraceSpan get(final String tag) {
return traceSpanByBuildTag.get(tag);
public TraceSpan.TraceSpanContext get(final String tag) {
return contextByTag.get(tag);
}

public TraceSpan remove(final String tag){
return traceSpanByBuildTag.remove(tag);
private static int getCapacity() {
String maxSize = System.getenv("DD_JENKINS_SPAN_CONTEXT_STORAGE_MAX_SIZE");
if (maxSize != null) {
try {
return Integer.parseInt(maxSize);
} catch (NumberFormatException e) {
LOGGER.warning("Invalid value for DD_JENKINS_SPAN_CONTEXT_STORAGE_MAX_SIZE: " + maxSize);
}
}
return 1024;
Collaborator: Could we have this in a constant?

Suggested change:
return 1024;
return DEFAULT_SPAN_CONTEXT_STORAGE_MAX_SIZE;

Collaborator (Author): Introduced a constant.

}


}
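Regarding the review thread above: the follow-up commit is not included in this three-commit view, but extracting the fallback into a constant inside `BuildSpanManager` would presumably look roughly like the following sketch (constant name taken from the reviewer's suggestion, not necessarily the merged code):

private static final int DEFAULT_SPAN_CONTEXT_STORAGE_MAX_SIZE = 1024;

private static int getCapacity() {
    // Allow the storage size to be tuned via an environment variable,
    // falling back to the default when it is unset or malformed.
    String maxSize = System.getenv("DD_JENKINS_SPAN_CONTEXT_STORAGE_MAX_SIZE");
    if (maxSize != null) {
        try {
            return Integer.parseInt(maxSize);
        } catch (NumberFormatException e) {
            LOGGER.warning("Invalid value for DD_JENKINS_SPAN_CONTEXT_STORAGE_MAX_SIZE: " + maxSize);
        }
    }
    return DEFAULT_SPAN_CONTEXT_STORAGE_MAX_SIZE;
}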