Commit

Apply suggestions from code review
maycmlee authored May 9, 2024
1 parent c6d90c8 commit ed0a383
Showing 3 changed files with 7 additions and 3 deletions.
content/en/agent/logs/_index.md (4 changes: 3 additions & 1 deletion)
@@ -24,7 +24,9 @@ Log collection requires the Datadog Agent v6.0+. Older versions of the Agent do

## Activate log collection

-Collecting logs is **not enabled** by default in the Datadog Agent. If you are running the Agent in a Kubernetes or Docker environment, see the dedicated [Kubernetes Log Collection][2] or [Docker Log Collection][3] documentation. If you want to send logs via other vendors' collectors/forwarders, or you want to pre-process your logging data within your own environment before shipping, see [Observability Pipelines][13].
+Collecting logs is **not enabled** by default in the Datadog Agent. If you are running the Agent in a Kubernetes or Docker environment, see the dedicated [Kubernetes Log Collection][2] or [Docker Log Collection][3] documentation.
+
+If you want to send logs using another vendor's collector or forwarder, or you want to preprocess your log data within your environment before shipping, see [Observability Pipelines][13].

To enable log collection with an Agent running on your host, change `logs_enabled: false` to `logs_enabled: true` in the Agent's [main configuration file][4] (`datadog.yaml`).
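For reference, the corresponding setting looks like the following minimal sketch of `datadog.yaml` (all other settings omitted; `logs_enabled` is the key named above, the comments are ours):

```yaml
## datadog.yaml -- the Agent's main configuration file.
## Log collection is disabled by default; set this flag to true to enable it,
## then restart the Agent for the change to take effect.
logs_enabled: true
```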

content/en/logs/guide/best-practices-for-log-management.md (2 changes: 1 addition & 1 deletion)
@@ -33,7 +33,7 @@ This guide also goes through how to monitor your log usage by:
- [Alerting on indexed logs when the volume passes a specified threshold](#alert-when-an-indexed-log-volume-passes-a-specified-threshold)
- [Setting up exclusion filters on high-volume logs](#set-up-exclusion-filters-on-high-volume-logs)

-If you want to transform, redact sensitive data from, or more to your logs before they leave your environment, see how to [aggregate, process, and transform your logging data with Observability Pipelines][29].
+If you want to transform your logs or redact sensitive data in your logs before they leave your environment, see how to [aggregate, process, and transform your log data with Observability Pipelines][29].

## Log account configuration

content/en/sensitive_data_scanner/_index.md (4 changes: 3 additions & 1 deletion)
@@ -28,7 +28,9 @@ Sensitive data, such as credit card numbers, bank routing numbers, and API keys

Sensitive Data Scanner is a stream-based, pattern matching service used to identify, tag, and optionally redact or hash sensitive data. Security and compliance teams can implement Sensitive Data Scanner as a new line of defense, helping prevent against sensitive data leaks and limiting non-compliance risks.

-To use Sensitive Data Scanner, set up a scanning group to define what data to scan and then set up scanning rules to determine what sensitive information to match within the data. If you want to redact your sensitive data within your own environment before shipping to your downstream destinations, see how to [redact sensitive data with Observability Pipelines][14].
+To use Sensitive Data Scanner, set up a scanning group to define what data to scan and then set up scanning rules to determine what sensitive information to match within the data.
+
+If you want to redact your sensitive data in your environment before shipping to your downstream destinations, see how to [redact sensitive data with Observability Pipelines][14].

This document walks you through the following:

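To make the scanning-group and scanning-rule model above concrete: a scanning group selects which data to scan, and each rule amounts to a named regular expression plus an action (redact, hash, or tag). The sketch below is hypothetical; the field names are illustrative only, not Datadog's actual configuration schema (rules are managed in the Datadog UI):

```yaml
## Hypothetical illustration only -- not Datadog's real schema.
## A scanning group defines what data to scan; each rule pairs a
## pattern with an action to take on matched sensitive data.
scanning_group: "Payment service logs"
scanning_rules:
  - name: "Visa card number"
    pattern: "4[0-9]{12}(?:[0-9]{3})?"   # 13- or 16-digit Visa numbers
    action: "redact"
    replacement: "[REDACTED CARD]"
```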
