diff --git a/content/en/data_streams/java_manual_instrumentation.md b/content/en/data_streams/manual_instrumentation.md
similarity index 54%
rename from content/en/data_streams/java_manual_instrumentation.md
rename to content/en/data_streams/manual_instrumentation.md
index f4ae79384658d..814021e129d16 100644
--- a/content/en/data_streams/java_manual_instrumentation.md
+++ b/content/en/data_streams/manual_instrumentation.md
@@ -1,6 +1,8 @@
---
-title: Set up Data Streams Monitoring for Java through Manual Instrumentation
+title: Set up Data Streams Monitoring through Manual Instrumentation
private: true
+aliases:
+ - /data_streams/java_manual_instrumentation
further_reading:
- link: '/integrations/kafka/'
tag: 'Documentation'
@@ -10,9 +12,12 @@ further_reading:
text: 'Service Catalog'
---
-
-Manual instrumentation is available for Java. For a list of technologies supported in Java today, see
-Data Streams Monitoring Setup.
-If you're interested in manual instrumentation for additional languages, reach out to support@datadoghq.com.
+Manual instrumentation is available for Java, Node.js, and Python.
+If you're interested in manual instrumentation for additional languages, reach out to support@datadoghq.com.
-Data Streams Monitoring propagates context through message headers. If you use a message queue technology that is not yet supported by DSM, a technology without headers (such as Kinesis), or lambdas, use manual instrumentation to set up Data Streams Monitoring.
+Data Streams Monitoring (DSM) propagates context through message headers. Use manual instrumentation to set up DSM if you are using:
+- a message queue technology that is not supported by DSM
+- a message queue technology without headers, such as Kinesis
+- Lambdas
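Because DSM context normally travels in message headers, a headerless transport means the application has to carry the context itself. As a minimal sketch under that assumption (plain Python with illustrative names only, not a Datadog API), a producer can embed a headers map inside the record body, and the consumer can recover it before calling the consume checkpoint:

```python
import json

# Hypothetical helpers for a headerless transport such as Kinesis:
# the context headers ride inside the record body itself.
def wrap_record(payload, headers):
    # producer side: bundle the payload and the context headers into one record
    return json.dumps({"_dd_headers": headers, "payload": payload})

def unwrap_record(raw):
    # consumer side: split the record back into payload and headers
    record = json.loads(raw)
    return record["payload"], record["_dd_headers"]

raw = wrap_record({"order_id": 42}, {"dd-pathway-ctx": "encoded-context"})
payload, headers = unwrap_record(raw)
```

The recovered `headers` map can then be passed to the consume checkpoint the same way native headers would be.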
### Manual instrumentation installation
@@ -24,19 +29,20 @@ kinesis, kafka, rabbitmq, sqs, sns
{{< /code-block >}}
3. Call the Data Streams Monitoring checkpoints when messages are produced and when they are consumed, as shown in the example code below:
-
- {{< code-block lang="shell" >}}
+{{< tabs >}}
+{{% tab "Java" %}}
+{{< code-block lang="java" >}}
import datadog.trace.api.experimental.*;
Carrier headersAdapter = new Carrier(headers);
-# Before calling database PUT
+// Before calling database PUT
DataStreamsCheckpointer.get().setProduceCheckpoint("", "", headersAdapter);
-# After calling database GET
+// After calling database GET
DataStreamsCheckpointer.get().setConsumeCheckpoint("", "", headersAdapter);
-# Replace Headers with whatever you're using to pass the context
+// Replace headers with whatever you're using to pass the context
private class Carrier implements DataStreamsContextCarrier {
private Headers headers;
@@ -54,7 +60,43 @@ private class Carrier implements DataStreamsContextCarrier {
}
}
{{< /code-block >}}
+{{% /tab %}}
+{{% tab "Node.js" %}}
+{{< code-block lang="javascript" >}}
+const tracer = require('dd-trace').init({})
+
+// before calling produce
+const headers = {}
+tracer.dataStreamsCheckpointer.setProduceCheckpoint(
+  "", "", headers
+)
+
+// after calling consume
+tracer.dataStreamsCheckpointer.setConsumeCheckpoint(
+  "", "", headers
+)
+
+{{< /code-block >}}
+{{% /tab %}}
+{{% tab "Python" %}}
+{{< code-block lang="python" >}}
+from ddtrace.data_streams import set_consume_checkpoint
+from ddtrace.data_streams import set_produce_checkpoint
+
+# before calling produce
+headers = {}
+set_produce_checkpoint(
+    "", "", headers.setdefault
+)
+# after calling consume
+set_consume_checkpoint(
+    "", "", headers.get
+)
+
+{{< /code-block >}}
+{{% /tab %}}
+{{< /tabs >}}
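Note the asymmetry in the Python example above: the produce checkpoint takes a setter callable (`headers.setdefault`) while the consume checkpoint takes a getter (`headers.get`). A minimal stand-alone sketch of that callback pattern (plain Python, no ddtrace; the function and header names are illustrative assumptions):

```python
# Sketch of the getter/setter callback pattern the Python API relies on:
# the library writes its propagation header through the setter on produce
# and reads it back through the getter on consume.
headers = {}

def fake_inject(setter):
    # stand-in for what the tracer does at the produce checkpoint
    setter("dd-pathway-ctx", "encoded-context")

def fake_extract(getter):
    # stand-in for what the tracer does at the consume checkpoint
    return getter("dd-pathway-ctx")

fake_inject(headers.setdefault)
ctx = fake_extract(headers.get)
```

Passing the bound methods rather than the dict itself lets the same checkpoint functions work with any storage that can expose a get and a set callable.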
## Further Reading
{{< partial name="whats-next/whats-next.html" >}}