Upload live streaming video from iPhone like Ustream or Qik


Solution 1

There isn't a built-in way to do this, as far as I know. As you say, HTTP Live Streaming is for downloads to the iPhone.

The way I'm doing it is to set up an AVCaptureSession with a video data output whose delegate callback runs on every frame. The callback sends each frame over the network to the server, which has a custom setup to receive it.

Here's the flow: https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2

And here's some code:

// make the input device (the default video device is the back camera)
NSError *deviceError = nil;
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];
if (!inputDevice) {
    NSLog(@"couldn't create input device: %@", deviceError);
    return;
}

// make the output device; deliver sample buffers on a dedicated serial queue
// (the main queue also works, but a separate queue avoids stalling the UI)
AVCaptureVideoDataOutput *outputDevice = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
dispatch_queue_t sampleQueue = dispatch_queue_create("sampleBufferQueue", NULL);
[outputDevice setSampleBufferDelegate:self queue:sampleQueue];

// initialize the capture session and wire up the input and output
// (in real code, keep a strong reference to the session -- e.g. in a property --
// so it isn't deallocated while running)
AVCaptureSession *captureSession = [[[AVCaptureSession alloc] init] autorelease];
[captureSession addInput:inputDevice];
[captureSession addOutput:outputDevice];

// make a preview layer and add it so the camera's view is displayed on screen
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.bounds;
[view.layer addSublayer:previewLayer];

// go!
[captureSession startRunning];

Then the output device's delegate (here, self) has to implement the callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CGSize imageSize = CVImageBufferGetEncodedSize(imageBuffer);
    // the size is also available in the sample buffer's 'mediaSpecific' dict

    NSLog(@"frame captured at %.fx%.f", imageSize.width, imageSize.height);
}
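
To actually send each frame, I was doing an HTTP POST per frame. (As noted in the comments below, that turned out to be too slow for me; see the EDIT for the chunked approach.) Here is a minimal sketch of what the upload can look like; the JPEG conversion, placeholder endpoint URL, and use of NSURLSession are illustrative assumptions, not my exact code:

// Sketch only. Called from the capture callback above with
// CMSampleBufferGetImageBuffer(sampleBuffer).
- (void)uploadFrame:(CVImageBufferRef)imageBuffer
{
    // convert the pixel buffer to JPEG via Core Image
    // (in real code, create the CIContext once and reuse it)
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    NSData *jpegData = UIImageJPEGRepresentation([UIImage imageWithCGImage:cgImage], 0.5);
    CGImageRelease(cgImage);

    // POST the JPEG; "http://example.com/frames" is a placeholder endpoint
    NSURL *url = [NSURL URLWithString:@"http://example.com/frames"];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
    request.HTTPMethod = @"POST";
    [request setValue:@"image/jpeg" forHTTPHeaderField:@"Content-Type"];
    request.HTTPBody = jpegData;
    [[[NSURLSession sharedSession] dataTaskWithRequest:request] resume];
}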

EDIT/UPDATE

Several people have asked how to do this without sending the frames to the server one by one. The answer is complex...

Basically, in the didOutputSampleBuffer callback above, you append the sample buffers to an AVAssetWriter. I actually had three asset writers active at a time -- past, present, and future -- managed on different threads.

The past writer is in the process of closing the movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds, I set past=current; current=future and restart the sequence.

This then uploads video to the server in 5-second chunks. You can stitch the chunks together with ffmpeg if you want, or repackage them into MPEG-2 transport streams for HTTP Live Streaming. The video data itself is H.264-encoded by the asset writer, so the "transcoding" step is really just remuxing: the container format changes, but the video is not re-encoded.
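
For concreteness, here's a minimal sketch of one writer in that rotation. It assumes the real code keeps the active writer in a currentWriter property and fires the rotation from a 5-second timer; the file type, dimensions, and property names are illustrative:

// create one chunk writer -- the real setup rotates three of these
- (AVAssetWriter *)makeChunkWriterForURL:(NSURL *)fileURL
{
    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:fileURL
                                                     fileType:AVFileTypeMPEG4
                                                        error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey:  AVVideoCodecH264,  // hardware-accelerated H.264
                                AVVideoWidthKey:  @640,
                                AVVideoHeightKey: @480 };
    AVAssetWriterInput *videoInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:settings];
    videoInput.expectsMediaDataInRealTime = YES;  // don't stall the capture pipeline
    [writer addInput:videoInput];
    return writer;
}

// in captureOutput:didOutputSampleBuffer:fromConnection:, feed the current writer
AVAssetWriterInput *videoInput = self.currentWriter.inputs.firstObject;
if (self.currentWriter.status == AVAssetWriterStatusUnknown) {
    [self.currentWriter startWriting];
    [self.currentWriter startSessionAtSourceTime:
        CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
}
if (videoInput.isReadyForMoreMediaData) {
    [videoInput appendSampleBuffer:sampleBuffer];
}

// every 5 seconds, close out the current chunk and hand it off for upload
[videoInput markAsFinished];
[self.currentWriter finishWritingWithCompletionHandler:^{
    // upload the finished .mp4 chunk here, then delete the local file
}];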

Solution 2

I have found a library that can help with this:

HaishinKit Streaming Library

The library gives you options for streaming via RTMP or HLS.

Follow the library's setup steps and read all of its instructions carefully. Don't run the example code bundled with the library directly -- it has some errors. Instead, pull the required classes and the pod into your own demo app, as sketched below.

I've done this myself; with it you can stream the screen, the camera, and audio.
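
For reference, pulling the pod in is the usual CocoaPods setup. A minimal Podfile sketch -- the pod name comes from the library's README, while the platform version and target name here are placeholders:

# Podfile
platform :ios, '11.0'
use_frameworks!   # HaishinKit is a Swift framework

target 'YourDemoApp' do
  pod 'HaishinKit'
end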


Comments

  • 0pcl
    0pcl over 3 years

    How do you live stream video from an iPhone to a server, like Ustream or Qik? I know there's something called HTTP Live Streaming from Apple, but most resources I've found only talk about streaming video from a server to the iPhone.

    Is Apple's HTTP Live Streaming something I should use? Or something else? Thanks.

    • Hunter
      Hunter over 14 years
      They're not using HTTP Live Streaming. All of the recently approved apps are actually using a private API for capturing the screen. Almost inexplicably, Apple reversed the policy on this specific set of CoreGraphics calls and allowed these apps in. Expect a true API for this feature in a future iPhone OS release - these apps will be required to use that when it is available. In the meantime, these currently private calls are okay.
    • 0pcl
      0pcl over 14 years
      Hi, I found that we might need a media server like Wowza to allow RTSP streaming, but you can also do something similar over HTTP without RTSP. I'm a bit clueless on this topic now, actually, so correct me if I'm wrong. I understand that people use a private API for capturing the screen, but what does that have to do with streaming to the server? Thanks!
  • jab
    jab over 12 years
    I should add that I'm not doing it this way anymore, since frame-by-frame upload turned out to be too slow for me. But if you're looking for a way to edit frames as they come in from the device's camera, this is it.
  • Janak Nirmal
    Janak Nirmal over 12 years
    Can you please share code for the video-upload mechanism that isn't slow, as you mentioned? Any hints?
  • jab
    jab about 12 years
    Well, to speed up the data transfer, the video has to be compressed. So, two possibilities: 1) Compress it on the fly, requiring a codec library plus lots of CPU; or 2) Use the iPhone's built-in, hardware-accelerated mp4 compression -- but that only supports streaming to disk. I am streaming to disk, changing target files every few seconds and uploading the finished files. It's very tricky and complex, even without the workarounds for several Apple bugs I found. You can't easily use a single file as a pipe, because the frame index doesn't get written until the file is closed.
  • jab
    jab about 12 years
    @NoMoreWishes My list of solutions above is stated a different way in this answer.
  • Ramz
    Ramz over 11 years
    How can we implement live broadcasting from an iOS device to a server using the above answer?
  • sajwan
    sajwan about 11 years
    Hi, how did you achieve this without frame-by-frame upload?
  • Siriss
    Siriss almost 11 years
    I understand this is old, but I am stuck on the server side of this very topic. How did you configure your server to handle the stream of image frames?
  • jab
    jab almost 11 years
    @Siriss - I uploaded short MP4s instead, and concatenated them with ffmpeg. See the final paragraph of my edit above.
  • kashifmehmood
    kashifmehmood over 10 years
    You said HTTP Live Streaming... can it be done with an RTMP server like Wowza?
  • jab
    jab over 10 years
    @kashifmehmood I think you're talking about downloading (i.e., watching videos)? This question is about getting the videos from an iPhone to a server. What you do with your videos once they're on the server is a separate topic.
  • kashifmehmood
    kashifmehmood over 10 years
    @jab I was talking about broadcasting... I was just asking how you send those frames to a media server like Wowza over RTMP?
  • jab
    jab over 10 years
    @kashifmehmood HTTP Live Streaming is for downloads, so it's not a direct comparison. I don't know anything about Wowza or RTMP, sorry.
  • kashifmehmood
    kashifmehmood over 10 years
    @jab What I'm trying to ask is: you said the "callback sends each frame over the network to the server"... I just want to know how to send the frames to the server.
  • jab
    jab over 10 years
    @kashifmehmood I was using HTTP POST on each frame. It wasn't efficient. As I said above, I later switched to uploading 5-second videos, still by HTTP POST.
  • johk95
    johk95 over 10 years
    Could somebody please take a look at my question regarding live streaming directly from iPhone to iPhone? Thanks stackoverflow.com/questions/20894810/…
  • Amin Ariana
    Amin Ariana over 8 years
    Streaming media involves four steps: (1) encoding the data from hardware, (2) transferring the data to a server, (3) transcoding the data to the right downstream format, (4) downloading, decoding and playing the data. The question is about step 2. Your answer concerns step 3.
  • zr0gravity7
    zr0gravity7 over 2 years
    What kind of latency do you experience with this approach? Say from the instant the image is captured to when it is received by the app and when it is sent to the server?
  • Hardik Vyas
    Hardik Vyas over 2 years
    I have used this for screen recording and creating an m3u8 file. @zr0gravity7