OD5829: Minor fix to APM bundled metrics (#1040)
* DOCS-2512: Org metrics updates (#989)

* DOCS-2512: Org metrics updates

* Added minimum to def

* Added two more metrics

* sf.org.datapointsTotalCountByToken

* Typo

* sf.org.grossDpmContentBytesReceived

* Arranged alphabetically

* sf.org.log.grossContentBytesReceived WIP

* Finalizing batch, reordering logs

* Updates

* Revert "DOCS-2512: Org metrics updates (#989)" (#990)

This reverts commit 239a202.

* Removed old metrics (#1005)

See https://signalfuse.atlassian.net/browse/DOCS-5040.

* Updating Org metrics (#1007)

* Putting updated metrics back

See DOCS-5040.

* Removed dimension

* Add new GCP metrics and deprecate old ones (#1010)

sf.org.num.gcpStackdriverClientCallCount* are being deprecated and will be removed.
Those are being replaced with sf.org.num.gcpServiceClientCallCount* metrics that
present more accurate data.
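
For any charts or detectors built on the deprecated metrics, only the metric name needs to be swapped. A minimal SignalFlow sketch of the migration (illustrative only; the aggregation, base metric names, and the exact `*` variants you use are assumptions, not part of this change):

```
# Deprecated: Stackdriver client-call count (being removed).
# data('sf.org.num.gcpStackdriverClientCallCount').sum(by=['orgId']).publish()

# Replacement: service client-call count, which presents more accurate data.
data('sf.org.num.gcpServiceClientCallCount').sum(by=['orgId']).publish()
```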

* DOCS-5114-removing-metrics (#1011)

* DOCS-5114-fix (#1013)

* Update metrics - 4680 - batch C.yaml (#1021)

* Update metrics - 4680 - batch C.yaml

* Update metrics.yaml

WIP

* Update metrics.yaml

* Update metrics - batch D.yaml (#1023)

* Update metrics - batch D.yaml

* Update metrics.yaml

* Final metrics from batch

* OD5392-removed-reserved (#1025)

Removed "reserved" note.

* OD5829-update

As requested.

* Update metrics.yaml

---------

Co-authored-by: akonarska <[email protected]>
aurbiztondo-splunk and akonarska authored Nov 10, 2023
1 parent 0d0c1d8 commit 6466edf
Showing 1 changed file with 46 additions and 27 deletions.
73 changes: 46 additions & 27 deletions signalfx-org-metrics/metrics.yaml
@@ -1128,19 +1128,19 @@ sf.org.numApmApiCalls:
title: sf.org.numApmApiCalls

sf.org.numApmBundledMetrics:
brief: APM Bundled Metrics limit for your org
brief: Number of APM Bundled Metrics for your org
description: |
APM Bundled Metrics limit for your org.
Number of APM Bundled Metrics for your org.
* Dimension(s): `orgId`
* Data resolution: 10 minutes
metric_type: gauge
title: sf.org.numApmBundledMetrics

sf.org.numApmBundledMetricsByToken:
brief: APM Bundled Metrics limit for your org, for a token
brief: Number of APM Bundled Metrics for your org, for a token
description: |
APM Bundled Metrics limit for your org for a specific token.
Number of APM Bundled Metrics for your org for a specific token.
* Dimension(s): `orgId`, `tokenId`
* Data resolution: 10 minutes
@@ -1178,6 +1178,16 @@ sf.org.numBackfillCallsByToken:
* Data resolution: 1 second
metric_type: counter
title: sf.org.numBackfillCallsByToken

sf.org.numComputationsStarted:
brief: Rate at which you're starting new jobs
description: |
Number of computations started, use it to know the rate at which you're starting new jobs.
* Dimension(s): `orgId`
* Data resolution: 10 seconds
metric_type: counter
title: sf.org.numComputationsStarted

sf.org.numBadDimensionMetricTimeSeriesCreateCalls:
brief: Number of bad calls to create MTS due to an error with dimensions
@@ -1325,8 +1335,9 @@ sf.org.numDatapointsDroppedBatchSizeByToken:
brief: Number of data points dropped because a single request contained more than 100,000 data points, by token.
description: |
Number of data points dropped because a single request contained more than 100,000 data points, per token. In this scenario, Observability Cloud drops data points because it perceives sending more than 100,000 data points in a single request as excessive.
* Dimension(s): `orgId`, `tokenId`
* Data resolution: 40 seconds
* Dimension(s): `orgId`, `tokenId`
* Data resolution: 40 seconds
metric_type: counter
title: sf.org.numDatapointsDroppedBatchSizeByToken
@@ -1453,28 +1464,31 @@ sf.org.numDimensionObjectsCreated:
brief: Number of dimensions created
description: |
Total number of dimensions created.
* Dimension(s): `orgId`
* Data resolution: 10 seconds
* Dimension(s): `orgId`
* Data resolution: 10 seconds
metric_type: gauge
title: sf.org.numDimensionObjectsCreated

sf.org.numDimensionObjectsCreatedByToken:
brief: Number of dimensions created by token
description: |
Total number of dimensions created by token.
* Dimension(s): `orgId`, `tokenId`
* Data resolution: 40 seconds
* Dimension(s): `orgId`, `tokenId`
* Data resolution: 40 seconds
metric_type: gauge
title: sf.org.numDimensionObjectsCreatedByToken

sf.org.numEventSearches:
brief: Number of event searches
description: |
Number of event searches.
* Dimension(s): `orgId`
* Data resolution: 10 seconds
* Dimension(s): `orgId`
* Data resolution: 10 seconds
metric_type: counter
title: sf.org.numEventSearches
@@ -1483,9 +1497,9 @@ sf.org.numEventSearchesThrottled:
brief: Number of event searches that were throttled
description: |
Number of event searches that were throttled.
* Dimension(s): `orgId`
* Data resolution: 10 seconds
* Dimension(s): `orgId`
* Data resolution: 10 seconds
metric_type: counter
title: sf.org.numEventSearchesThrottled

@@ -1577,18 +1591,20 @@ sf.org.numHostMetaDataEventsDroppedThrottle:
brief: Number of host metadata events dropped due to throttling
description: |
Number of host metadata events dropped because of throttling.
* Dimension(s): `orgId`
* Data resolution: 40 seconds
* Dimension(s): `orgId`
* Data resolution: 40 seconds
metric_type: counter
title: sf.org.numHostMetaDataEventsDroppedThrottle

sf.org.numHostMetaDataEventsDroppedThrottleByToken:
brief: Number of host metadata events dropped due to throttling by token
description: |
Number of host metadata events dropped because of throttling, per token.
* Dimension(s): `orgId`, `tokenId`
* Data resolution: 40 seconds
* Dimension(s): `orgId, tokenId`
* Data resolution: 40 seconds
metric_type: counter
title: sf.org.numHostMetaDataEventsDroppedThrottleByToken
@@ -1710,9 +1726,10 @@ sf.org.numLogsReceivedByToken:
brief: Number of logs received by token
description: |
Number of logs received by token.
* Dimension(s): `orgId`, `tokenId`
* Dimension(s): `orgId, tokenId`
* Data resolution: 10 seconds
metric_type: counter
title: sf.org.numLogsReceivedByToken

@@ -1785,9 +1802,10 @@ sf.org.numMetricObjectsCreatedByToken:
brief: Number of metric objects created by token
description: |
Number of metric objects created by token.
* Dimension(s): `tokenId`, `orgId`
* Data resolution: 40 seconds
* Dimension(s): `tokenId`, `orgId`
* Data resolution: 40 seconds
metric_type: gauge
title: sf.org.numMetricObjectsCreatedByToken

@@ -1816,8 +1834,9 @@ sf.org.numProcessDataEventsDroppedThrottleByToken:
brief: Number of data ingest process events dropped due to throttling by token
description: |
Per token, number of data ingest process events dropped due to throttling, because you exceeded your system limits.
* Dimension(s): `orgId`, `tokenId`
* Data resolution: 40 seconds
* Dimension(s): `orgId, tokenIn`
* Data resolution: 40 seconds
metric_type: counter
title: sf.org.numProcessDataEventsDroppedThrottleByToken
@@ -1869,7 +1888,7 @@ sf.org.numResourceMetricsbyToken:
* Dimension(s): `orgId`, `resourceType`, `tokenId`
* Data resolution: 10 minutes
metric_type: gauge
title: sf.org.numResourceMetricsbyToken

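For reference, the metrics updated in this diff can be charted like any other org metric. A hedged SignalFlow sketch (standard `data()`/`publish()` usage is assumed; the aggregations and labels are illustrative, not part of this change):

```
# Current count of APM Bundled Metrics for the org (gauge, 10-minute resolution).
data('sf.org.numApmBundledMetrics').max(by=['orgId']).publish(label='APM bundled metrics')

# Rate at which new jobs are started (counter, 10-second resolution).
data('sf.org.numComputationsStarted').sum(by=['orgId']).publish(label='Computations started')
```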
