
Commit

Add Open Telemetry in Python backend
piotrm-nvidia committed Jun 19, 2024
1 parent 0966bf2 commit f51b272
Showing 21 changed files with 866 additions and 98 deletions.
4 changes: 3 additions & 1 deletion CHANGELOG.md
@@ -16,7 +16,9 @@ limitations under the License.

# Changelog

## Unreleased
## 0.5.7 (2024-06-19)

- New: Open Telemetry propagation support for tracing

[//]: <> (put here on external component update with short summary what change or link to changelog)

2 changes: 1 addition & 1 deletion README.md
@@ -28,7 +28,7 @@ The distinct capabilities of PyTriton are summarized in the feature matrix:
| ------- | ----------- |
| Native Python support | You can create any [Python function](https://triton-inference-server.github.io/pytriton/latest/inference_callables/) and expose it as an HTTP/gRPC API. |
| Framework-agnostic | You can run any Python code with any framework of your choice, such as: PyTorch, TensorFlow, or JAX. |
| Performance optimization | You can benefit from [dynamic batching](https://triton-inference-server.github.io/pytriton/latest/inference_callables/decorators/#batch), response cache, model pipelining, [clusters](https://triton-inference-server.github.io/pytriton/latest/guides/deploying_in_clusters/), and GPU/CPU inference. |
| Performance optimization | You can benefit from [dynamic batching](https://triton-inference-server.github.io/pytriton/latest/inference_callables/decorators/#batch), response cache, model pipelining, [clusters](https://triton-inference-server.github.io/pytriton/latest/guides/deploying_in_clusters/), performance [tracing](https://triton-inference-server.github.io/pytriton/latest/guides/distributed_tracing/), and GPU/CPU inference. |
| Decorators | You can use batching [decorators](https://triton-inference-server.github.io/pytriton/latest/inference_callables/decorators/) to handle batching and other pre-processing tasks for your inference function. |
| Easy [installation](https://triton-inference-server.github.io/pytriton/latest/installation/) and setup | You can use a simple and familiar interface based on Flask/FastAPI for easy installation and [setup](https://triton-inference-server.github.io/pytriton/latest/binding_models/). |
| [Model clients](https://triton-inference-server.github.io/pytriton/latest/clients) | You can access high-level model clients for HTTP/gRPC requests with configurable options and both synchronous and [asynchronous](https://triton-inference-server.github.io/pytriton/latest/clients/#asynciomodelclient) API. |
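To make the "Native Python support" and "Decorators" rows above concrete, here is a minimal serving sketch based on the PyTriton quickstart pattern. The model name, tensor names, and the doubling logic are illustrative placeholders, not part of this commit; consult the linked documentation for the authoritative examples.

```python
import numpy as np

from pytriton.decorators import batch
from pytriton.model_config import ModelConfig, Tensor
from pytriton.triton import Triton


@batch
def infer_fn(**inputs: np.ndarray):
    # Illustrative inference callable: double the single input batch.
    (data,) = inputs.values()
    return [data * 2.0]


with Triton() as triton:
    # Bind the Python function as a Triton model reachable over HTTP/gRPC.
    triton.bind(
        model_name="Doubler",  # hypothetical model name used for illustration
        infer_func=infer_fn,
        inputs=[Tensor(name="INPUT_1", dtype=np.float32, shape=(-1,))],
        outputs=[Tensor(name="OUTPUT_1", dtype=np.float32, shape=(-1,))],
        config=ModelConfig(max_batch_size=128),
    )
    triton.serve()  # blocks and serves requests until interrupted
```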
2 changes: 2 additions & 0 deletions docs/README.md
@@ -202,6 +202,8 @@ used to profile models served through PyTriton. We have prepared an example of
using the Perf Analyzer to profile the BART PyTorch model. The example code can be found
in [examples/perf_analyzer](../examples/perf_analyzer).

Open Telemetry is a set of APIs, libraries, agents, and instrumentation to provide observability for cloud-native software. We have prepared a
[guide](guides/distributed_tracing.md) on how to use Open Telemetry with PyTriton.
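The guide covers the PyTriton-specific configuration; as a complement, the sketch below shows only the client side of W3C trace-context propagation using the OpenTelemetry Python SDK and Triton's standard HTTP inference endpoint. The endpoint address, the `Doubler` model, and the `INPUT_1` tensor name carry over from the illustrative example above and are assumptions, as is the requirement that tracing with context propagation has been enabled on the server as described in the guide.

```python
import requests
from opentelemetry import trace
from opentelemetry.propagate import inject
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Set up a tracer that prints spans to the console; swap the exporter for an
# OTLP one to see client spans next to Triton's spans in Jaeger.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("pytriton.client.example")

with tracer.start_as_current_span("call-doubler"):
    headers: dict = {}
    inject(headers)  # adds the W3C `traceparent` header for the current span
    # Plain KServe v2 HTTP request; a server with propagation enabled joins this trace.
    response = requests.post(
        "http://localhost:8000/v2/models/Doubler/infer",  # assumed endpoint and model
        headers=headers,
        json={
            "inputs": [
                {"name": "INPUT_1", "shape": [1, 1], "datatype": "FP32", "data": [2.0]}
            ]
        },
    )
    print(response.json())
```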

## What next?

Expand Down
Binary file added docs/assets/jaeger_traces_list.png
Binary file added docs/assets/jaeger_traces_list_only_triton.png
Binary file added docs/assets/jaeger_traces_list_propagation.png
Binary file removed docs/guides/assets/jaeger_trace_details.png
Binary file removed docs/guides/assets/jaeger_traces_list.png
