DOCS-9341: adding languages for manual instrumentation (#25941)
* adding languages for manual instrumentation
* formatting
Showing 1 changed file with 49 additions and 7 deletions.
@@ -1,6 +1,8 @@
---
title: Set up Data Streams Monitoring for Java through Manual Instrumentation
private: true
aliases:
  - /data_streams/java_manual_instrumentation
further_reading:
  - link: '/integrations/kafka/'
    tag: 'Documentation'
@@ -10,9 +12,12 @@ further_reading:
    text: 'Service Catalog'
---

-<div class="alert alert-info">Manual instrumentation is available for Java. For a list of technologies supported in Java today, see <a href="/data_streams/#setup">Data Streams Monitoring Setup</a>.<br /><br />If you're interested in manual instrumentation for additional languages, reach out to [email protected].</div>
+<div class="alert alert-info">Manual instrumentation is available for Java, Node.js, and Python.<br /><br />If you're interested in manual instrumentation for additional languages, reach out to [email protected].</div>

-Data Streams Monitoring propagates context through message headers. If you use a message queue technology that is not yet supported by DSM, a technology without headers (such as Kinesis), or lambdas, use manual instrumentation to set up Data Streams Monitoring.
+Data Streams Monitoring (DSM) propagates context through message headers. Use manual instrumentation to set up DSM if you are using:
+- a message queue technology that is not supported by DSM
+- a message queue technology without headers, such as Kinesis, or
+- Lambdas
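For the headerless case in that list, the pattern is to supply the "headers" yourself: keep the DSM context in a map you control, let the checkpoint call write into it, and ship the map inside the message body. Below is a minimal Java sketch of that idea, not part of the original page: the `DataStreamsCheckpointer` call is the one used in the examples further down, the `"kinesis"` type comes from the supported-types list in step 2, and the map-backed adapter, `<stream-name>` placeholder, and payload-embedding step are illustrative assumptions.

```java
import datadog.trace.api.experimental.DataStreamsCheckpointer;
import datadog.trace.api.experimental.DataStreamsContextCarrier;

import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class HeaderlessProduceSketch {
    public static void main(String[] args) {
        // The transport has no native headers, so hold the DSM context in a plain map.
        final Map<String, Object> dsmContext = new HashMap<>();

        // Map-backed adapter; assumes DataStreamsContextCarrier exposes entries() and
        // set(key, value), as the Carrier class in the Java example below suggests.
        DataStreamsContextCarrier carrier = new DataStreamsContextCarrier() {
            @Override
            public Set<Map.Entry<String, Object>> entries() {
                return dsmContext.entrySet();
            }

            @Override
            public void set(String key, String value) {
                dsmContext.put(key, value);
            }
        };

        // Setting the produce checkpoint lets DSM inject its context into dsmContext.
        DataStreamsCheckpointer.get().setProduceCheckpoint("kinesis", "<stream-name>", carrier);

        // Serialize dsmContext into the record payload (for example, a JSON envelope)
        // so the consumer can rebuild it and pass it to setConsumeCheckpoint.
    }
}
```

The consumer-side counterpart is sketched after the tabbed examples below.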

### Manual instrumentation installation

@@ -24,19 +29,20 @@ kinesis, kafka, rabbitmq, sqs, sns
{{< /code-block >}}

3. Call the Data Streams Monitoring checkpoints when messages are produced and when they are consumed, as shown in the example code below:

-{{< code-block lang="shell" >}}
+{{< tabs >}}
+{{% tab "Java" %}}
+{{< code-block lang="java" >}}
import datadog.trace.api.experimental.*;

Carrier headersAdapter = new Carrier(headers);

-# Before calling database PUT
+// Before calling database PUT
DataStreamsCheckpointer.get().setProduceCheckpoint("<database-type>", "<topic-name>", headersAdapter);

-# After calling database GET
+// After calling database GET
DataStreamsCheckpointer.get().setConsumeCheckpoint("<database-type>", "<topic-name>", headersAdapter);

-# Replace Headers with whatever you're using to pass the context
+// Replace headers with whatever you're using to pass the context
private class Carrier implements DataStreamsContextCarrier {
    private Headers headers;

@@ -54,7 +60,43 @@ private class Carrier implements DataStreamsContextCarrier {
  }
}
{{< /code-block >}}
+{{% /tab %}}
+{{% tab "Node.js" %}}
+{{< code-block lang="javascript" >}}
+const tracer = require('dd-trace').init({})
+
+// before calling produce
+const headers = {}
+tracer.dataStreamsCheckpointer.setProduceCheckpoint(
+  "<datastream-type>", "<queue-name>", headers
+)
+
+// after calling consume
+tracer.dataStreamsCheckpointer.setConsumeCheckpoint(
+  "<datastream-type>", "<queue-name>", headers
+)
+
+{{< /code-block >}}
+{{% /tab %}}
+{{% tab "Python" %}}
+{{< code-block lang="python" >}}
+from ddtrace.data_streams import set_consume_checkpoint
+from ddtrace.data_streams import set_produce_checkpoint
+
+# before calling produce
+headers = {}
+set_produce_checkpoint(
+    "<datastream-type>", "<datastream-name>", headers.setdefault
+)
+
+# after calling consume
+set_consume_checkpoint(
+    "<datastream-type>", "<datastream-name>", headers.get
+)
+
+{{< /code-block >}}
+{{% /tab %}}
+{{< /tabs >}}
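The tabs above show the checkpoint calls in isolation; for a headerless transport, the consumer also has to recover the context that traveled inside the message before it sets its checkpoint. A companion Java sketch, again assuming a map-backed `DataStreamsContextCarrier` and a hypothetical `extractDsmContext` helper standing in for however the producer embedded the map in the payload:

```java
import datadog.trace.api.experimental.DataStreamsCheckpointer;
import datadog.trace.api.experimental.DataStreamsContextCarrier;

import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Map-backed carrier, mirroring the Carrier adapter in the Java tab (assumed interface).
class MapCarrier implements DataStreamsContextCarrier {
    private final Map<String, Object> headers;

    MapCarrier(Map<String, Object> headers) {
        this.headers = headers;
    }

    @Override
    public Set<Map.Entry<String, Object>> entries() {
        return headers.entrySet();
    }

    @Override
    public void set(String key, String value) {
        headers.put(key, value);
    }
}

class HeaderlessConsumeSketch {
    void onRecord(byte[] payload) {
        // Rebuild the context map the producer shipped with the message.
        Map<String, Object> dsmContext = extractDsmContext(payload);

        // Set the consume checkpoint before processing, as in the examples above.
        DataStreamsCheckpointer.get()
            .setConsumeCheckpoint("kinesis", "<stream-name>", new MapCarrier(dsmContext));

        // ... process the record ...
    }

    // Hypothetical helper: deserialize the envelope however the producer wrote it.
    private Map<String, Object> extractDsmContext(byte[] payload) {
        return new HashMap<>();
    }
}
```

Whatever the embedding format, the values DSM writes through `set()` on the producer side must reach the consumer unchanged; that round trip is what the carrier exists for.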
## Further Reading

{{< partial name="whats-next/whats-next.html" >}}