Update tutorial docs to include confluent code examples (#1131)
* Remove link to spectacularfailure github profile

* Update missing docs for confluent in tutorial

* Add separate index page for kafka with links to confluent and aiokafka
kumaranvpl authored Jan 12, 2024
1 parent 5cac95b commit 7535b24
Showing 121 changed files with 1,499 additions and 145 deletions.
40 changes: 22 additions & 18 deletions docs/docs/SUMMARY.md
@@ -45,24 +45,25 @@ search:
- [Config Management](getting-started/config/index.md)
- [Task Scheduling](scheduling.md)
- [FastStream Project Template](getting-started/template/index.md)
- [Kafka](kafka/index.md)
- [Subscription](kafka/Subscriber/index.md)
- [Batch Subscriber](kafka/Subscriber/batch_subscriber.md)
- [Publishing](kafka/Publisher/index.md)
- [Batch Publishing](kafka/Publisher/batch_publisher.md)
- [Publish With Key](kafka/Publisher/using_a_key.md)
- [Acknowledgement](kafka/ack.md)
- [Message Information](kafka/message.md)
- [Security Configuration](kafka/security.md)
- [Confluent](confluent/index.md)
- [Subscription](confluent/Subscriber/index.md)
- [Batch Subscriber](confluent/Subscriber/batch_subscriber.md)
- [Publishing](confluent/Publisher/index.md)
- [Batch Publishing](confluent/Publisher/batch_publisher.md)
- [Publish With Key](confluent/Publisher/using_a_key.md)
- [Acknowledgement](confluent/ack.md)
- [Message Information](confluent/message.md)
- [Security Configuration](confluent/security.md)
- [Kafka](kafka/kafka.md)
- [AIOKafka](kafka/index.md)
- [Subscription](kafka/Subscriber/index.md)
- [Batch Subscriber](kafka/Subscriber/batch_subscriber.md)
- [Publishing](kafka/Publisher/index.md)
- [Batch Publishing](kafka/Publisher/batch_publisher.md)
- [Publish With Key](kafka/Publisher/using_a_key.md)
- [Acknowledgement](kafka/ack.md)
- [Message Information](kafka/message.md)
- [Security Configuration](kafka/security.md)
- [Confluent](confluent/index.md)
- [Subscription](confluent/Subscriber/index.md)
- [Batch Subscriber](confluent/Subscriber/batch_subscriber.md)
- [Publishing](confluent/Publisher/index.md)
- [Batch Publishing](confluent/Publisher/batch_publisher.md)
- [Publish With Key](confluent/Publisher/using_a_key.md)
- [Acknowledgement](confluent/ack.md)
- [Message Information](confluent/message.md)
- [Security Configuration](confluent/security.md)
- [RabbitMQ](rabbit/index.md)
- [Subscription](rabbit/examples/index.md)
- [Direct](rabbit/examples/direct.md)
@@ -342,6 +343,9 @@ search:
- [TopicPartition](api/faststream/confluent/client/TopicPartition.md)
- [check_msg_error](api/faststream/confluent/client/check_msg_error.md)
- [create_topics](api/faststream/confluent/client/create_topics.md)
- fastapi
- [Context](api/faststream/confluent/fastapi/Context.md)
- [KafkaRouter](api/faststream/confluent/fastapi/KafkaRouter.md)
- handler
- [LogicHandler](api/faststream/confluent/handler/LogicHandler.md)
- message
14 changes: 5 additions & 9 deletions docs/docs/en/confluent/index.md
@@ -8,24 +8,20 @@ search:
boost: 10
---

# Kafka Routing
# Confluent Kafka Routing

## Kafka Overview
## Confluent's Python Client for Apache Kafka

### What is Kafka?
The Confluent Kafka Python library is developed by Confluent, the company founded by the creators of Apache Kafka. It offers a high-level Kafka producer and consumer API that integrates well with the Kafka ecosystem. The Confluent library provides a comprehensive set of features, including support for Avro serialization, schema registry integration, and various configurations to fine-tune performance.

[Kafka](https://kafka.apache.org/){.external-link target="_blank"} is an open-source distributed streaming platform developed by the Apache Software Foundation. It is designed to handle high-throughput, fault-tolerant, real-time data streaming. Kafka is widely used for building real-time data pipelines and streaming applications.

### Confluent's Python Client for Apache Kafka

The Confluent Kafka Python library is developed by Confluent, the company founded by the creators of Apache Kafka. It provides a high-level Kafka producer and consumer API that integrates well with the Kafka ecosystem. The Confluent library offers a comprehensive set of features, including support for Avro serialization, schema registry integration, and various configurations to fine-tune performance. As it is developed by Confluent, it enjoys strong support from the core team behind Kafka. This often translates to better compatibility with the latest Kafka releases and a more robust feature set.
Developed by Confluent, this library enjoys strong support from the core team behind Kafka. This often translates to better compatibility with the latest Kafka releases and a more robust feature set.

!!! note ""
If you prefer the `aiokafka` library instead, then please refer to [aiokafka's KafkaBroker](../kafka/index.md)

### FastStream Confluent KafkaBroker

The FastStream Confluent KafkaBroker is a key component of the FastStream framework that enables seamless integration with Apache Kafka using [confluent kafka](https://github.com/confluentinc/confluent-kafka-python) python library. With the KafkaBroker, developers can easily connect to Kafka brokers, produce messages to Kafka topics, and consume messages from Kafka topics within their FastStream applications.
The FastStream Confluent KafkaBroker is a key component of the FastStream framework that enables seamless integration with Apache Kafka using the [confluent-kafka-python](https://github.com/confluentinc/confluent-kafka-python){.external-link target="_blank"} library. With the KafkaBroker, developers can easily connect to Kafka brokers, produce messages to Kafka topics, and consume messages from Kafka topics within their FastStream applications.

### Establishing a Connection

33 changes: 3 additions & 30 deletions docs/docs/en/kafka/index.md
@@ -8,43 +8,16 @@ search:
boost: 10
---

# Kafka Routing
# AIOKafka Routing

## Kafka Overview

### What is Kafka?

[Kafka](https://kafka.apache.org/){.external-link target="_blank"} is an open-source distributed streaming platform developed by the Apache Software Foundation. It is designed to handle high-throughput, fault-tolerant, real-time data streaming. Kafka is widely used for building real-time data pipelines and streaming applications.

### Key Kafka Concepts

#### 1. Publish-Subscribe Model

Kafka is built around the publish-subscribe messaging model. In this model, data is published to topics, and multiple consumers can subscribe to these topics to receive the data. This decouples the producers of data from the consumers, allowing for flexibility and scalability.

#### 2. Topics

A **topic** in Kafka is a logical channel or category to which messages are published by producers and from which messages are consumed by consumers. Topics are used to organize and categorize data streams. Each topic can have multiple **partitions**, which enable Kafka to distribute data and provide parallelism for both producers and consumers.

### AIOKafka library
## AIOKafka library

The `aiokafka` library is an asynchronous Kafka client for Python, built on top of the `asyncio` framework. It is designed to work seamlessly with asynchronous code, making it suitable for applications with high concurrency requirements.

!!! note ""
If you prefer the `confluent-kafka-python` library instead, then please refer to [Confluent's KafkaBroker](../confluent/index.md)

## Kafka Topics

### Understanding Kafka Topics

Topics are fundamental to Kafka and serve as the central point of data distribution. Here are some key points about topics:

- Topics allow you to logically group and categorize messages.
- Each message sent to Kafka is associated with a specific topic.
- Topics can have one or more partitions to enable parallel processing and scaling.
- Consumers subscribe to topics to receive messages.

### FastStream KafkaBroker
## FastStream KafkaBroker

The FastStream KafkaBroker is a key component of the FastStream framework that enables seamless integration with Apache Kafka using the [aiokafka](https://github.com/aio-libs/aiokafka) library. With the KafkaBroker, developers can easily connect to Kafka brokers, produce messages to Kafka topics, and consume messages from Kafka topics within their FastStream applications.

59 changes: 59 additions & 0 deletions docs/docs/en/kafka/kafka.md
@@ -0,0 +1,59 @@
---
# 0.5 - API
# 2 - Release
# 3 - Contributing
# 5 - Template Page
# 10 - Default
search:
boost: 10
---

# Kafka Routing

## Kafka Overview

### What is Kafka?

[Kafka](https://kafka.apache.org/){.external-link target="_blank"} is an open-source distributed streaming platform developed by the Apache Software Foundation. It is designed to handle high-throughput, fault-tolerant, real-time data streaming. Kafka is widely used for building real-time data pipelines and streaming applications.

### Key Kafka Concepts

#### 1. Publish-Subscribe Model

Kafka is built around the publish-subscribe messaging model. In this model, data is published to topics, and multiple consumers can subscribe to these topics to receive the data. This decouples the producers of data from the consumers, allowing for flexibility and scalability.

#### 2. Topics

A **topic** in Kafka is a logical channel or category to which messages are published by producers and from which messages are consumed by consumers. Topics are used to organize and categorize data streams. Each topic can have multiple **partitions**, which enable Kafka to distribute data and provide parallelism for both producers and consumers.

## Kafka Topics

### Understanding Kafka Topics

Topics are fundamental to Kafka and serve as the central point of data distribution. Here are some key points about topics:

- Topics allow you to logically group and categorize messages.
- Each message sent to Kafka is associated with a specific topic.
- Topics can have one or more partitions to enable parallel processing and scaling.
- Consumers subscribe to topics to receive messages.

## Library support

`FastStream` provides two different `KafkaBroker`s based on the following libraries:

- [Confluent Kafka](https://github.com/confluentinc/confluent-kafka-python){.external-link target="_blank"}
- [aiokafka](https://github.com/aio-libs/aiokafka){.external-link target="_blank"}

### Confluent's Python Client for Apache Kafka

The Confluent Kafka Python library is developed by Confluent, the company founded by the creators of Apache Kafka. It offers a high-level Kafka producer and consumer API that integrates well with the Kafka ecosystem. The Confluent library provides a comprehensive set of features, including support for Avro serialization, schema registry integration, and various configurations to fine-tune performance.

Developed by Confluent, this library enjoys strong support from the core team behind Kafka. This often translates to better compatibility with the latest Kafka releases and a more robust feature set.

Check out [Confluent's KafkaBroker](../confluent/index.md).

### AIOKafka library

The `aiokafka` library is an asynchronous Kafka client for Python, built on top of the `asyncio` framework. It is designed to work seamlessly with asynchronous code, making it suitable for applications with high concurrency requirements.

Check out [aiokafka's KafkaBroker](../kafka/index.md).
31 changes: 29 additions & 2 deletions docs/docs/en/release.md
@@ -12,6 +12,33 @@ hide:
---

# Release Notes
## 0.4.0rc0

### What's Changed

This is a **preview version** of the 0.4.0 release, introducing support for a Confluent-based Kafka broker.

Here's a simplified code example demonstrating how to establish a connection to Kafka using FastStream's KafkaBroker module:
```python
from faststream import FastStream
from faststream.confluent import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)

@broker.subscriber("in-topic")
@broker.publisher("out-topic")
async def handle_msg(user: str, user_id: int) -> str:
return f"User: {user_id} - {user} registered"
```

#### Changes

* Add support for confluent python lib by [@kumaranvpl](https://github.com/kumaranvpl){.external-link target="_blank"} in [#1042](https://github.com/airtai/faststream/pull/1042){.external-link target="_blank"}


**Full Changelog**: [#0.3.13...0.4.0rc0](https://github.com/airtai/faststream/compare/0.3.13...0.4.0rc0){.external-link target="_blank"}

## 0.3.13

### What's Changed
@@ -249,11 +276,11 @@ Bug fixes:

Documentation:

* docs: fix misspelled FastDepends reference in README.md by [@spectacularfailure](https://github.com/spectacularfailure){.external-link target="_blank"} in [#1013](https://github.com/airtai/faststream/pull/1013){.external-link target="_blank"}
* docs: fix misspelled FastDepends reference in README.md by @spectacularfailure in [#1013](https://github.com/airtai/faststream/pull/1013){.external-link target="_blank"}

### New Contributors

* [@spectacularfailure](https://github.com/spectacularfailure){.external-link target="_blank"} made their first contribution in [#1013](https://github.com/airtai/faststream/pull/1013){.external-link target="_blank"}
* @spectacularfailure made their first contribution in [#1013](https://github.com/airtai/faststream/pull/1013){.external-link target="_blank"}

**Full Changelog**: [#0.3.0...0.3.1](https://github.com/airtai/faststream/compare/0.3.0...0.3.1){.external-link target="_blank"}

37 changes: 19 additions & 18 deletions docs/docs/navigation_template.txt
@@ -45,24 +45,25 @@ search:
- [Config Management](getting-started/config/index.md)
- [Task Scheduling](scheduling.md)
- [FastStream Project Template](getting-started/template/index.md)
- [Kafka](kafka/index.md)
- [Subscription](kafka/Subscriber/index.md)
- [Batch Subscriber](kafka/Subscriber/batch_subscriber.md)
- [Publishing](kafka/Publisher/index.md)
- [Batch Publishing](kafka/Publisher/batch_publisher.md)
- [Publish With Key](kafka/Publisher/using_a_key.md)
- [Acknowledgement](kafka/ack.md)
- [Message Information](kafka/message.md)
- [Security Configuration](kafka/security.md)
- [Confluent](confluent/index.md)
- [Subscription](confluent/Subscriber/index.md)
- [Batch Subscriber](confluent/Subscriber/batch_subscriber.md)
- [Publishing](confluent/Publisher/index.md)
- [Batch Publishing](confluent/Publisher/batch_publisher.md)
- [Publish With Key](confluent/Publisher/using_a_key.md)
- [Acknowledgement](confluent/ack.md)
- [Message Information](confluent/message.md)
- [Security Configuration](confluent/security.md)
- [Kafka](kafka/kafka.md)
- [AIOKafka](kafka/index.md)
- [Subscription](kafka/Subscriber/index.md)
- [Batch Subscriber](kafka/Subscriber/batch_subscriber.md)
- [Publishing](kafka/Publisher/index.md)
- [Batch Publishing](kafka/Publisher/batch_publisher.md)
- [Publish With Key](kafka/Publisher/using_a_key.md)
- [Acknowledgement](kafka/ack.md)
- [Message Information](kafka/message.md)
- [Security Configuration](kafka/security.md)
- [Confluent](confluent/index.md)
- [Subscription](confluent/Subscriber/index.md)
- [Batch Subscriber](confluent/Subscriber/batch_subscriber.md)
- [Publishing](confluent/Publisher/index.md)
- [Batch Publishing](confluent/Publisher/batch_publisher.md)
- [Publish With Key](confluent/Publisher/using_a_key.md)
- [Acknowledgement](confluent/ack.md)
- [Message Information](confluent/message.md)
- [Security Configuration](confluent/security.md)
- [RabbitMQ](rabbit/index.md)
- [Subscription](rabbit/examples/index.md)
- [Direct](rabbit/examples/direct.md)
16 changes: 16 additions & 0 deletions docs/docs_src/getting_started/cli/confluent_context.py
@@ -0,0 +1,16 @@
from faststream import FastStream, ContextRepo
from faststream.confluent import KafkaBroker
from pydantic_settings import BaseSettings

broker = KafkaBroker()

app = FastStream(broker)

class Settings(BaseSettings):
host: str = "localhost:9092"

@app.on_startup
async def setup(env: str, context: ContextRepo):
settings = Settings(_env_file=env)
await broker.connect(settings.host)
context.set_global("settings", settings)
Empty file.
18 changes: 18 additions & 0 deletions docs/docs_src/getting_started/context/confluent/annotated.py
@@ -0,0 +1,18 @@
from typing import Annotated

from faststream import Context, FastStream
from faststream.confluent import KafkaBroker
from faststream.confluent.message import KafkaMessage

Message = Annotated[KafkaMessage, Context()]

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)


@broker.subscriber("test")
async def base_handler(
body: str,
message: Message, # get access to raw message
):
...
13 changes: 13 additions & 0 deletions docs/docs_src/getting_started/context/confluent/base.py
@@ -0,0 +1,13 @@
from faststream import Context, FastStream
from faststream.confluent import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)


@broker.subscriber("test")
async def base_handler(
body: str,
message=Context(), # get access to raw message
):
...
18 changes: 18 additions & 0 deletions docs/docs_src/getting_started/context/confluent/cast.py
@@ -0,0 +1,18 @@
from faststream import Context, FastStream, context
from faststream.confluent import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)
context.set_global("secret", "1")

@broker.subscriber("test-topic")
async def handle(
secret: int = Context(),
):
assert secret == "1"

@broker.subscriber("test-topic2")
async def handle_int(
secret: int = Context(cast=True),
):
assert secret == 1
@@ -0,0 +1,18 @@
from faststream import FastStream, ContextRepo, Context
from faststream.confluent import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)


@broker.subscriber("test-topic")
async def handle(
msg: str,
secret_str: str = Context(),
):
assert secret_str == "my-perfect-secret" # pragma: allowlist secret


@app.on_startup
async def set_global(context: ContextRepo):
context.set_global("secret_str", "my-perfect-secret")
@@ -0,0 +1,24 @@
from faststream import Context, FastStream, apply_types
from faststream.confluent import KafkaBroker
from faststream.confluent.annotations import ContextRepo, KafkaMessage

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)


@broker.subscriber("test-topic")
async def handle(
msg: str,
message: KafkaMessage,
context: ContextRepo,
):
with context.scope("correlation_id", message.correlation_id):
call()


@apply_types
def call(
message: KafkaMessage,
correlation_id=Context(),
):
assert correlation_id == message.correlation_id