
Implement Batching configuration for publishing #14

Open
medikent opened this issue Oct 16, 2020 · 0 comments
medikent commented Oct 16, 2020

The default batching behavior indicates a publishing delay for small message volumes:

The cost of batching is latency for individual messages, which are queued in memory until their corresponding batch is filled and ready to be sent over the network. To minimize latency, batching should be turned off.

This latency is problematic for point-to-point messaging in event-driven architectures and similar use cases. We should expose a batching configuration so callers can tune or disable batching when publishing.

See Google's PubSub Batching docs

Pub/Sub was designed for high throughput and high message volumes, and latency under low load is a widely reported issue. See this StackOverflow post and this issue in the NodeJS pubsub library.
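
As a rough sketch of the knobs such a configuration would need to expose, here is how batching is tuned in the Node.js @google-cloud/pubsub client; the topic name and the specific values are placeholders, and the option names in this project may differ:

```ts
// Illustrative only: shows the batching settings the Node.js Pub/Sub client
// exposes. Values here are placeholders, not proposed defaults.
import { PubSub } from "@google-cloud/pubsub";

const pubsub = new PubSub();

// Low-latency publisher: flush as soon as a single message is queued
// (or after at most 10 ms), effectively disabling batching.
const lowLatencyTopic = pubsub.topic("my-topic", {
  batching: {
    maxMessages: 1,
    maxMilliseconds: 10,
  },
});

// High-throughput publisher: let batches accumulate before each send.
const highThroughputTopic = pubsub.topic("my-topic", {
  batching: {
    maxMessages: 100,
    maxBytes: 1024 * 1024, // 1 MiB
    maxMilliseconds: 100,
  },
});

async function publishExample(): Promise<void> {
  // With the low-latency settings, this resolves without waiting for a batch to fill.
  const messageId = await lowLatencyTopic.publishMessage({
    data: Buffer.from("hello"),
  });
  console.log(`Published message ${messageId}`);
}
```

Whatever shape the configuration takes here, it should at minimum cover the three dimensions above (message count, byte size, and maximum delay) so users can trade latency against throughput per use case.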
