How to implement HTTP Live Streaming server on Unix?

Solution 1

If you want to stream live content from your webcam: FMLE (Flash Media Live Encoder).

If you want to stream static content (a movie): ffmpeg and Xuggle.

Red5:

Your media server could be Red5 (open source and free), FMS, or Wowza. I have only used Red5, so I can't speak for the others. You can find Red5 here.

Your server can be anywhere, but you will need to open some ports: at least 1935 for RTMP and 5080 for the admin panel; you may also see 9999 in the list (check the docs). Red5 is a media server written in Java, so you will need a Java JDK >= 1.6.

Red5 1.0 RC can be found here. There are versions for Windows, OS X, and Linux. I used the tarball: extract it and run "red5.sh". You should then be able to access the admin panel at http://localhost:5080/ and see a demo video playing. If not, something is wrong and you can't go further until this is working.
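A minimal sketch of that Linux setup, assuming the tarball release (the exact archive and directory names depend on the version you download):

tar xzf red5-1.0.0-RC1.tar.gz   # example filename; adjust to your download
cd red5-1.0.0-RC1
./red5.sh                       # starts Red5 in the foreground
# then open http://localhost:5080/ to check the admin panel and the demo video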

Stream with ffmpeg:

You can find Xuggle here, and you can find more information about it here.

ffmpeg -re -i your_file.flv -acodec copy -vcodec copy -f flv rtmp://localhost_or_yourred5serverip/live/livestream

Keep in mind that if you want to stream on the web, only FLV and MP4 can be played in Flash Player (I think). Once it's streaming you should be able to see it in the "admin panel" here: connect to your server (rtmp://localhost/live/), go to the view tab, and enter "livestream". You can also use mplayer rtmp://localhost/live/livestream to watch the video.

Stream in a Flash player:

You can use Flowplayer (with the RTMP plugin) or JW Player.

Solution 2

There are several competing technologies, but today, if you want your files to be streamable on Apple devices (iPhones, iPads, etc.), then HLS is the way to go. Incidentally, it is also supported on Android and, via suitable players, in most browsers, so it's not a bad place to start. Note, however, that the approach described here is aimed at on-demand content rather than live streams, despite the name.

Unless you want live video, you really DON'T need Red5, Wowza, FMS, or anything like that. HLS is basically a set of short video segments (e.g. 5 minutes each) encoded at different bitrates, plus an m3u playlist that you give to your Flash- or HTML5-based player in the browser. The segment length and how you encode the segments are largely up to you.

This is the best article I've seen about how to pick resolutions, bitrates, segment sizes, etc: http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/Adaptive-Streaming-in-the-Field-73017.aspx

From there you just create a directory structure, e.g.

/data/video/video_id/original.mp4
/data/video/video_id/quality1/chunk1.mp4
/data/video/video_id/quality1/chunk2.mp4
/data/video/video_id/quality2/chunk1.mp4
etc.
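One way to produce the per-quality chunks for a layout like that (just a sketch, not the only option) is ffmpeg's hls muxer, which writes the segments and a per-quality playlist in one pass. The bitrate, resolution, and segment length below are placeholder values, and note that this muxer produces MPEG-TS segments (index0.ts, index1.ts, ...) rather than the .mp4 chunk names shown above:

# one rendition; repeat with different -b:v / -s values for quality2, quality3, ...
ffmpeg -i /data/video/video_id/original.mp4 \
  -c:v libx264 -b:v 800k -s 640x360 \
  -c:a aac -b:a 96k \
  -f hls -hls_time 10 -hls_list_size 0 \
  /data/video/video_id/quality1/index.m3u8
# older ffmpeg builds may need "-strict experimental" for the built-in aac encoder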

Then you need to generate an m3u8 playlist that references all the chunks and qualities; it's up to the player itself to handle switching between qualities and playing the next segment (something most modern players already do).
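For the top-level ("master") playlist that ties the qualities together, a small hand-written text file is enough; the bandwidth and resolution values and the file names below are only illustrative:

# write a master playlist pointing at the per-quality playlists (values are examples)
cat > /data/video/video_id/playlist.m3u8 <<'EOF'
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
quality1/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
quality2/index.m3u8
EOF
# hand playlist.m3u8 to the player; it picks a quality and switches as bandwidth changes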

I also highly recommend checking out https://developer.apple.com/streaming/ - Apple provides a bunch of free tools to prepare the videos and playlists for HTTP Live Streaming.

Solution 3

The easiest way to stream HLS is to use something like Wowza or FMIS (neither of which comes cheap). Wowza will take input (either a live stream or stored VOD content) and do the segmentation on the fly.

Author by

alex

Updated on June 05, 2022

Comments

  • alex
    alex almost 2 years

    I just realized that Apple requires HTTP Live Streaming in order to view videos in iPhone apps. I was not aware of this before... I am now trying to understand what this involves so I can decide whether I want to do the work and make the videos available over 3G, or limit video playback to users who are connected to Wi-Fi.

    I read the overview provided by Apple, and now understand that my server needs to segment and index my media files. I also understand that I don't have to host the content to be able to stream it (I can point to a video hosted somewhere else, right?).

    What's not clear to me at this point is what to implement on my server (Ubuntu Hardy) to do the actual segmenting and indexing on the fly (once again, I do not host the videos I want to serve).

    I found a link explaining how to install FFmpeg and x264, but I don't know if this is the best solution (since I have an Ubuntu server, I can't use the Apple Live Streaming tools, is that correct?). Also, I do not understand at what point my server knows that a video needs to be converted and starts the job...

    Any feedback that could help me understand exactly what to do on the server side to be able to stream videos in my iPhone app over 3G would be greatly appreciated! (Oh, and in case it makes any difference, my app back end is in Rails.)

  • alex
    alex over 12 years
    and here is a follow-up question: stackoverflow.com/questions/8497541/… in case you can help again! thanks.
  • Dai Bok
    Dai Bok almost 10 years
    Thank you Roman, an excellent link (Adaptive Streaming in the Field, by Jan Ozer) streamingmedia.com/Articles/Editorial/Featured-Articles/…
  • onmyway133
    onmyway133 over 9 years
    Wowza has a free trial, worth trying.