Commit c8fed8d

update__remote_config_and_tech_support.md (#22727)
* update__remote_config_and_tech_support.md

Updating setup docs with remote configuration and additional tech support

* Update nodejs.md

Updating Node.js with RabbitMQ packages

* Update python.md

Updating Python with RabbitMQ package

* Update _index.md

Shortened explanation of config at runtime

* Update _index.md

Updating metrics

* Update _index.md

* Update _index.md

* Update java.md

Updating config at runtime in Java docs

* Update _index.md

* Update java.md

* Update nodejs.md

* Update python.md

* Update java.md

* Update nodejs.md

* Update python.md

* Add screenshot of enabling DSM from the Service Catalog page

---------

Co-authored-by: Esther Kim <[email protected]>
Nancyzhu278 and estherk15 authored May 6, 2024
1 parent d572933 commit c8fed8d
Showing 5 changed files with 21 additions and 6 deletions.
11 changes: 7 additions & 4 deletions content/en/data_streams/_index.md
@@ -50,11 +50,10 @@ To get started, follow the installation instructions to configure services with
| Runtime | Supported technologies |
|---|----|
| Java/Scala | Kafka (self-hosted, Amazon MSK, Confluent Cloud / Platform), RabbitMQ, HTTP, gRPC, Amazon SQS |
-| Python | Kafka (self-hosted, Amazon MSK, Confluent Cloud / Platform), Amazon SQS |
+| Python | Kafka (self-hosted, Amazon MSK, Confluent Cloud / Platform), RabbitMQ, Amazon SQS |
| .NET | Kafka (self-hosted, Amazon MSK, Confluent Cloud / Platform), RabbitMQ, Amazon SQS |
-| Node.js | Kafka (self-hosted, Amazon MSK, Confluent Cloud / Platform), Amazon SQS |
+| Node.js | Kafka (self-hosted, Amazon MSK, Confluent Cloud / Platform), RabbitMQ, Amazon SQS |
| Go | All (with [manual instrumentation][1]) |


## Explore Data Streams Monitoring

@@ -64,8 +63,10 @@ Once Data Streams Monitoring is configured, you can measure the time it usually

| Metric Name | Notable Tags | Description |
|---|---|-----|
-| data_streams.latency | `start`, `end`, `env` | End to end latency of a pathway from a specified source to destination service |
+| data_streams.latency | `start`, `end`, `env` | End-to-end latency of a pathway from a specified source to destination service. |
| data_streams.kafka.lag_seconds | `consumer_group`, `partition`, `topic`, `env` | Lag in seconds between producer and consumer. Requires Java Agent v1.9.0 or later. |
+| data_streams.payload_size | `consumer_group`, `topic`, `datacenter`, `env`, and [the second primary tag][7] | Incoming and outgoing throughput in bytes. |


You can also graph and visualize these metrics on any dashboard or notebook:
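
For instance (an illustrative sketch; the service names below are placeholders, not values from this commit), a dashboard timeseries widget could chart end-to-end latency between two services with a query like:

```
avg:data_streams.latency{start:orders-service,end:shipping-service,env:production}
```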

@@ -115,3 +116,5 @@ Datadog automatically links the infrastructure powering your services and relate
[3]: /getting_started/tagging/unified_service_tagging
[4]: /integrations/kafka/
[5]: /integrations/amazon_sqs/
+[6]: /tracing/trace_collection/runtime_config/
+[7]: /tracing/guide/setting_primary_tags_to_scope/?tab=helm#add-a-second-primary-tag-in-datadog
7 changes: 7 additions & 0 deletions content/en/data_streams/java.md
@@ -37,6 +37,12 @@ As an alternative, you can set the `-Ddd.data.streams.enabled=true` system prope
```bash
java -javaagent:/path/to/dd-java-agent.jar -Ddd.data.streams.enabled=true -jar path/to/your/app.jar
```
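
As a minimal sketch, the same setting can also come from the environment (this assumes `DD_DATA_STREAMS_ENABLED` is the environment-variable form of `dd.data.streams.enabled`, following the tracer's usual property-to-variable mapping):

```bash
# Enable Data Streams Monitoring via an environment variable instead of a JVM flag.
# DD_DATA_STREAMS_ENABLED is assumed to map to dd.data.streams.enabled.
export DD_DATA_STREAMS_ENABLED=true
java -javaagent:/path/to/dd-java-agent.jar -jar path/to/your/app.jar
```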

+### One-Click Installation
+To set up Data Streams Monitoring from the Datadog UI without restarting your service, use [Configuration at Runtime][5]. Navigate to the APM Service Page and select `Enable DSM`.

+{{< img src="data_streams/enable_dsm_service_catalog.png" alt="Enable Data Streams Monitoring from the Dependencies section of the APM Service Page" >}}

### Supported libraries
Data Streams Monitoring supports the [confluent-kafka library][3].

@@ -51,3 +57,4 @@ Data Streams Monitoring uses one [message attribute][4] to track a message's pat
[2]: /tracing/trace_collection/dd_libraries/java/
[3]: https://pypi.org/project/confluent-kafka/
[4]: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-message-metadata.html
+[5]: /agent/remote_config/?tab=configurationyamlfile#enabling-remote-configuration
5 changes: 4 additions & 1 deletion content/en/data_streams/nodejs.md
@@ -21,6 +21,7 @@ To start with Data Streams Monitoring, you need recent versions of the Datadog A
* [Node.js Tracer][2]
* Kafka: v2.39.0, v3.26.0, v4.5.0, or later
* Amazon SQS: v4.21.0
+* RabbitMQ: v3.48.0, v4.27.0, v5.3.0, or later

### Installation

@@ -33,7 +34,7 @@ environment:
```
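
For example, a minimal sketch of launching a service with the tracer preloaded (assuming `DD_DATA_STREAMS_ENABLED` toggles Data Streams Monitoring, mirroring the other tracers; `app.js` is a placeholder entry point):

```bash
# Preload dd-trace and enable Data Streams Monitoring for a Node.js service.
# DD_DATA_STREAMS_ENABLED is assumed to be the DSM toggle; app.js is a placeholder.
DD_DATA_STREAMS_ENABLED=true node --require dd-trace/init app.js
```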
### Supported libraries
-Data Streams Monitoring supports the [confluent-kafka library][3].
+Data Streams Monitoring supports the [confluent-kafka library][3], the [amqplib package][5], and the [rhea package][6].
### Monitoring SQS pipelines
Data Streams Monitoring uses one [message attribute][4] to track a message's path through an SQS queue. As Amazon SQS has a maximum limit of 10 message attributes allowed per message, all messages streamed through the data pipelines must have 9 or fewer message attributes set, allowing the remaining attribute for Data Streams Monitoring.
@@ -46,3 +47,5 @@ Data Streams Monitoring uses one [message attribute][4] to track a message's pat
[2]: /tracing/trace_collection/dd_libraries/nodejs
[3]: https://pypi.org/project/confluent-kafka/
[4]: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-message-metadata.html
+[5]: https://www.npmjs.com/package/amqplib
+[6]: https://www.npmjs.com/package/rhea
4 changes: 3 additions & 1 deletion content/en/data_streams/python.md
@@ -21,6 +21,7 @@ To start with Data Streams Monitoring, you need recent versions of the Datadog A
* [Python Tracer][2]
* Kafka: v1.16.0 or later
* Amazon SQS and Amazon Kinesis: v1.20.0
+* RabbitMQ: v2.6.0 or later

### Installation

@@ -33,7 +34,7 @@ environment:
```
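
For example, a minimal sketch of launching a service under `ddtrace-run` (assuming `DD_DATA_STREAMS_ENABLED` toggles Data Streams Monitoring, mirroring the other tracers; `app.py` is a placeholder entry point):

```bash
# Run a Python service under ddtrace-run with Data Streams Monitoring enabled.
# DD_DATA_STREAMS_ENABLED is assumed to be the DSM toggle; app.py is a placeholder.
DD_DATA_STREAMS_ENABLED=true ddtrace-run python app.py
```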
### Libraries Supported
-Data Streams Monitoring supports the [confluent-kafka library][3].
+Data Streams Monitoring supports the [confluent-kafka library][3] and the [kombu package][5].
### Monitoring SQS Pipelines
Data Streams Monitoring uses one [message attribute][4] to track a message's path through an SQS queue. As Amazon SQS has a maximum limit of 10 message attributes allowed per message, all messages streamed through the data pipelines must have 9 or fewer message attributes set, allowing the remaining attribute for Data Streams Monitoring.
@@ -49,3 +50,4 @@ There are no message attributes in Kinesis to propagate context and track a mess
[2]: /tracing/trace_collection/dd_libraries/python
[3]: https://pypi.org/project/confluent-kafka/
[4]: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-message-metadata.html
+[5]: https://pypi.org/project/kombu/
Binary image file `data_streams/enable_dsm_service_catalog.png` (screenshot of enabling DSM from the Service Catalog page); no diff preview is available for binary files.
