This repository has been archived by the owner on Jul 21, 2023. It is now read-only.

Can we stream with RTMP? #393

Open
RobertoMinelli opened this issue Nov 17, 2021 · 3 comments

Comments

@RobertoMinelli

Hello!

Is there a way to live stream to Midspace using the Real-Time Messaging Protocol (RTMP), as one can for YouTube?

Best,
Roberto

@EdNutting
Member

Hi Roberto,

Midspace is built on top of AWS's MediaLive live-streaming infrastructure, which can be configured to ingest an RTMP feed as an input to a stream. We also provide a UI for specifying the output of a Midspace live-stream (over RTMP) to destinations such as YouTube.
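For context, a MediaLive RTMP push input is created from a small request payload. Below is a minimal sketch of building that payload; the input name and security group ID are hypothetical, the resulting dict would be passed to boto3's `medialive.create_input`, and none of this is Midspace's actual configuration code:

```python
# Hypothetical sketch of a MediaLive RTMP_PUSH input request.
# The name and security group ID are placeholders, not Midspace values.

def build_rtmp_input_params(name: str, security_group_id: str) -> dict:
    """Build a create_input request payload for an RTMP push input.

    MediaLive responds with ingest URLs that an encoder (e.g. OBS or a
    hardware video mixer) can push the RTMP feed to.
    """
    return {
        "Name": name,
        "Type": "RTMP_PUSH",
        # Two destinations: a standard-class MediaLive channel expects
        # redundant (primary + backup) ingest endpoints.
        "Destinations": [
            {"StreamName": f"{name}/primary"},
            {"StreamName": f"{name}/backup"},
        ],
        "InputSecurityGroups": [security_group_id],
    }

params = build_rtmp_input_params("conference-rtmp-input", "1234567")
```

The payload would then be sent with `boto3.client("medialive").create_input(**params)`; an input security group restricting the allowed push sources must already exist.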

Best,
Ed

@RobertoMinelli
Author

Hi Ed,

Thank you very much for your message!

For our next conference, we would like to use a professional video service (e.g., multiple cameras, video mixing). For this purpose, our use case would be to configure Midspace to ingest an RTMP feed as an input.

As a follow-up: our group has experience using Midspace in a fully online setting (i.e., remote or pre-recorded speakers, remote audience). The conference we're organizing will be hybrid (i.e., both on-site and remote speakers, both on-site and remote audience). Could the AWS MediaLive live-streaming infrastructure support such an event? In other words, is the latency low enough that a remote attendee could interact with the live stream, e.g., by asking questions?

Best,
Roberto

@EdNutting
Member

Hi Roberto,

Sorry it took me a while to get back to you.

RTMP-based streaming comes with a delay between source and audience. Some of this delay comes from configuration (frame buffer size, etc.) and some from content distribution. A lag of at least 3 seconds is unavoidable with RTMP. In our current configuration the maximum lag is up to 45 seconds, with most users seeing around 20 seconds. Multi-camera compositing and encoding would also introduce some additional delay on the physical side.
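As a rough illustration, the lag can be thought of as a sum of per-stage delays. The per-stage figures below are assumptions chosen to add up to the typical ~20-second case mentioned above; only the 3 s / ~20 s / 45 s totals come from the discussion:

```python
# Back-of-the-envelope latency budget for the RTMP path described above.
# Component figures are illustrative assumptions, not measurements.

def glass_to_glass_latency(encode_buffer_s: float,
                           ingest_s: float,
                           packaging_s: float,
                           cdn_and_player_buffer_s: float) -> float:
    """Sum the per-stage delays into a total source-to-viewer lag."""
    return encode_buffer_s + ingest_s + packaging_s + cdn_and_player_buffer_s

# Typical case: most of the ~20 s sits in segment packaging and the
# player's buffer, not in the RTMP ingest itself.
typical = glass_to_glass_latency(
    encode_buffer_s=2.0,          # frame buffer at the encoder
    ingest_s=1.0,                 # RTMP push to the ingest endpoint
    packaging_s=12.0,             # e.g. two 6 s HLS segments
    cdn_and_player_buffer_s=5.0,  # CDN edge + player buffering
)
```

Shrinking any one stage (e.g. shorter segments) reduces the total, which is why the observed lag varies so much between viewers and configurations.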

Midspace is moving towards using WebRTC input for everything - including physical cameras in the room - and then compositing that in the cloud for distribution via AWS IVS (Interactive Video Service) streaming. This has significantly less latency: WebRTC is sub-500 ms and IVS is sub-5-second (it's designed for interactive live-streaming, including Q&A).

It is likely to be possible to incorporate an RTMP input into the IVS half of this, but we do not yet have numbers from a physical test to determine what the latency would be - theoretically (according to AWS) it would still be sub-5-second. IVS does not offer a better guarantee and is best-in-class for this kind of interactive streaming. When designing a hybrid conference, it is important to plan to accommodate these few seconds of lag.
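Putting the quoted numbers side by side, a quick sketch; the 10-second "interactive" cutoff is an assumption for live Q&A, not a figure from IVS or Midspace:

```python
# Worst-case published latencies for the two pipelines discussed above.
# The 10 s interactivity threshold is an illustrative assumption.
PIPELINES = {
    "rtmp_medialive_hls": 20.0,           # typical lag reported above
    "webrtc_compositing_ivs": 0.5 + 5.0,  # sub-500 ms WebRTC + sub-5 s IVS
}

def interactive_pipelines(max_lag_s: float = 10.0) -> list:
    """Return the pipelines whose end-to-end lag permits live Q&A."""
    return [name for name, lag in PIPELINES.items() if lag <= max_lag_s]
```

Under that cutoff, only the WebRTC+IVS path qualifies for back-and-forth interaction, which matches the motivation for the new architecture.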

With that said, we're not ready to release the new video architecture so we can't provide more detail at this stage. We expect to have a solid hybrid solution (based on the above WebRTC+RTMP+IVS) for events like you've described by the end of Q1 2022.

Best,
Ed
