RTMP vs RTSP/RTP: Which to choose for an interactive livestream?


You make a lot of assumptions in your answer.

WebRTC: As I'm trying to give a bigger audience the possibility to interact with each other, WebRTC is not suitable, because as far as I know it is not designed for a bigger audience.

That's simply not true. WebRTC doesn't know or care how you structure your applications server-side. There are plenty of off-the-shelf services for handling large group calls and low latency video distribution via WebRTC.

You should also know that for the media streams, WebRTC is RTP under the hood.
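To make the "RTP under the hood" point concrete, here is a minimal Python sketch (my own addition, not part of the original answer) that parses the fixed 12-byte RTP header defined in RFC 3550. The sample field values are made up purely for illustration:

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550, section 5.1)."""
    if len(packet) < 12:
        raise ValueError("packet too short for an RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,          # always 2 for RTP
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,   # identifies the codec payload format
        "sequence_number": seq,      # lets the receiver detect loss/reordering
        "timestamp": ts,             # media clock, not wall-clock time
        "ssrc": ssrc,                # identifies the stream source
    }

# Illustrative header: version 2, marker set, payload type 96, seq 1000
sample = struct.pack("!BBHII", 0x80, 0x80 | 96, 1000, 160, 0xDECAFBAD)
hdr = parse_rtp_header(sample)
print(hdr["version"], hdr["payload_type"], hdr["sequence_number"])  # → 2 96 1000
```

Whether media arrives via a WebRTC peer connection or plain RTSP/RTP, it is this same packet format on the wire, which is why the transport choice matters less than the surrounding architecture.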

It cannot be TCP, because that could not ensure real-time latency, could it?

Of course it can. TCP adds some overhead, but it's minimal, and nothing prevents you from using it in a real-time scenario.

UDP is traditionally used for these sorts of scenarios, as reliability isn't required, but that doesn't mean TCP can't be used almost as performantly.
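As a quick illustration of that point, here is a self-contained Python loopback sketch (my own addition, not from the original answer) that sends the same frame-sized payload over both UDP and TCP. Both deliver it fine; the practical difference only shows up under packet loss, where TCP's retransmissions add latency while UDP leaves loss handling to the application:

```python
import socket

frame = b"\x00" * 1200  # a typical media-packet-sized payload

# UDP: connectionless datagrams, no retransmission
udp_rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_rx.bind(("127.0.0.1", 0))
udp_tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_tx.sendto(frame, udp_rx.getsockname())
data, _ = udp_rx.recvfrom(2048)
assert data == frame

# TCP: a byte stream over a persistent connection (what RTMP uses)
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
tcp_tx = socket.create_connection(srv.getsockname())
conn, _ = srv.accept()
tcp_tx.sendall(frame)
received = b""
while len(received) < len(frame):
    received += conn.recv(4096)
assert received == frame

for s in (udp_rx, udp_tx, tcp_tx, conn, srv):
    s.close()
print("both transports delivered the frame")
```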

RTMP

RTMP is a dead protocol for Flash. No browsers support it. Other clients only support it for legacy reasons. You shouldn't use it for anything new going forward.

Only that the famous mobile live-streaming app Periscope is using RTMP.

Well, that's not a reason to do much of anything.

  2. Which protocol delivers better results regarding end-to-end latency and session start-up time?

WebRTC

  3. Which one consumes more hardware resources?

That's not the right question to ask. Your overhead in almost any other part of the application is going to be far more than the transport overhead of the protocol used for distribution.

The real list of things you need to think about:

  • Client compatibility. What sort of clients must you support?
  • Do you really need low latency everywhere? Do you understand the tradeoffs you're making with that demand? Are you willing to destroy any sense of video quality and reliability for all your users if only a handful of them are going to be interactive?
  • What's your budget? Off-the-shelf solutions for distribution are much cheaper. If you can push off your stream to YouTube for non-interactive users, you can save yourself a ton of money. If you can't use existing infrastructure, be prepared to spend mountains of cash.
  • What are your actual latency requirements? Are you prepared to reduce the number of people that can use your application when these latency requirements cannot be met on crappier networks and mobile devices?
  • What are your quality requirements?
  • Where will you transcode video to a variety of bitrates?
  • Do your viewers need adaptive bitrate viewing?
  • Do you need to push streams to other platforms simultaneously?
  • Do you need to record the stream for on-demand viewing or seeking back in time?
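On the adaptive-bitrate item in the list above, the client-side core is simply picking the highest rendition the measured throughput can sustain, with a safety margin. A rough sketch; the bitrate ladder and safety factor here are illustrative assumptions, not recommendations:

```python
# Illustrative bitrate ladder (kbps); a real ladder comes from your transcoder config.
LADDER = [400, 800, 1500, 3000, 6000]

def pick_rendition(measured_kbps: float, safety: float = 0.8) -> int:
    """Choose the highest bitrate that fits within a safety margin of the
    measured throughput; fall back to the lowest rendition otherwise."""
    budget = measured_kbps * safety
    candidates = [b for b in LADDER if b <= budget]
    return max(candidates) if candidates else LADDER[0]

print(pick_rendition(2500))   # → 1500 (2000 kbps budget)
print(pick_rendition(300))    # → 400 (below the ladder: lowest rendition)
```

Real ABR players (HLS/DASH) layer buffer-occupancy heuristics on top of this, but the rendition-selection idea is the same.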

You might also find my post here helpful: https://stackoverflow.com/a/37475943/362536

In short, check your assumptions. Understand the tradeoffs. Make decisions based on real information, not sweeping generalizations.


Updated on July 11, 2020

Comments

  • Joey
    Joey almost 4 years

If you are trying to develop an interactive livestream application, you rely on ultra-low (real-time) latency, for example for a video conference or a remote laboratory.

The two protocols which should be suitable for these circumstances are:

• RTSP, with the data transmitted over RTP
    • RTMP

WebRTC: As I'm trying to give a bigger audience the possibility to interact with each other, WebRTC is not suitable, because as far as I know it is not designed for a bigger audience.

    My questions:

    1. Which one should I choose for this use-case? RTSP/RTP or RTMP?

    2. Which protocol delivers better results regarding end-to-end latency, session start-up time?

    3. Which one consumes more hardware resources?

4. RTMP seems to use a persistent TCP connection. But which protocol is used for the transmission? It cannot be TCP, because that could not ensure real-time latency, could it?

    5. What are in general the pros and cons for using either of the protocols?

    I did not find any comparison of these two protocols in scientific papers or books. Only that the famous mobile live-streaming app Periscope is using RTMP.

Other apps like Instagram or Facebook, for example, provide text-based interaction with the streamer. If developers want to build the next "killer application" based on interactive livestreams, I think this question is essential to answer.

  • Joey
    Joey almost 7 years
Thank you Brad! Two points: (1) regarding TCP: in a video conference, end-to-end latency for transmitting audio should be below 400 ms. How would TCP make sense here in retransmitting lost packets, when they are most likely outdated by the time they arrive? (2) regarding WebRTC: WebRTC is a P2P protocol, where clients communicate directly with each other. A server is only needed for connection establishment. So it cannot scale up to a big audience, unless, as I think you're pointing out, powerful servers are part of the communication?
  • Joey
    Joey almost 7 years
"Make decisions based on real information, not sweeping generalizations." With this question I was looking for real information, like results of benchmarks comparing the two protocols under the same conditions, so you could compare the performance regarding latency etc. in absolute numbers. Your answer was detailed, but unfortunately it also didn't include "real information".
  • Brad
    Brad almost 7 years
WebRTC doesn't have to be P2P, in a strict sense. The server can be (and very often is) one of those "peers". Regarding TCP retransmissions, sure, they can take some time, but first you need to actually figure out what "low latency" means to you, and what tradeoffs you're willing to make to get it. Almost never is it so important that you're willing to make huge quality and reliability sacrifices. Most folks find a balance at a few seconds of latency. What I'm trying to convey to you is that you're asking the wrong questions.
  • Brad
    Brad almost 7 years
    Which is faster: A car with a top speed of 120MPH, or a car with a top speed of 115 MPH? Neither, because you're stuck in traffic. Should have taken the train instead. While you're focusing on comparing two different protocols, the rest of your situation matters a lot more. We didn't even get into codecs and their parameters, which are going to change those latency calculations 10x more than your transport protocol is going to matter. Choose the transport protocol for its functionality.
  • Shayne
    Shayne over 6 years
TCP is lossless. So it doesn't make sense to speak of retransmitting lost packets, because as long as the connection holds, you aren't losing packets.
  • Brad
    Brad over 6 years
    @Shayne The discussion here related to retransmissions are those provided by TCP itself. When a packet is lost and a retransmission has to occur, there's added latency. When playing low latency streams, this means that the stream drops out for a bit while this happens.
  • Muhammad
    Muhammad about 2 years
    By "RTMP is a dead protocol" do you mean only for Flash or for all scenarios e.g. server to server streaming?
  • Brad
    Brad about 2 years
@Muhammad If I were building a greenfield application today, I would not use RTMP at all. The only reason for RTMP is to support legacy applications that don't support anything else. Even a simple HTTP PUT is better than RTMP and obviously has better client support. If you use RTMP, you have a proprietary, poorly documented protocol, you're locked in to certain codecs and tracks, you have zero chance of browser support, etc. Even in your server-to-server scenario, there are better solutions depending on your specific needs.
  • Muhammad
    Muhammad about 2 years
Thank you @Brad for the detailed answer. Currently we are building a streaming system where we receive multiple streams via RTMP on our streaming server, combine them into one stream via FFmpeg, then stream it back as a multitrack stream to our media server and play it in browsers as HLS. But the delay is too high, around 50 seconds. What part can we optimize to lower the delay? Thanks
  • Brad
    Brad about 2 years
@Muhammad HLS and DASH are inherently high latency, but they don't have to be as high as 50 seconds. HLS is more compatible than DASH due to Apple's anti-competitive policies. HLS is cheaper to host for large audiences because you can re-use any file-based CDN. WebRTC offers the lowest latency but requires special streaming services. Regular HTTP is possible, but requires a custom client and servers. WebSocket-based streaming offers no benefit over regular HTTP, and only adds complexity. So, everything has its tradeoff. Which you choose depends on your specific requirements.
  • Brad
    Brad about 2 years
    @Muhammad Please contact me at [email protected] if you're interested in hiring me as a consultant. I think first step would be to diagnose why you're getting 50-second latency in your HLS streaming.