streaming video FROM an iPhone

You could divide your recording into separate files with a length of, say, 10 seconds, then send them separately. If you use AVCaptureSession's beginConfiguration and commitConfiguration methods to batch your output changes, you shouldn't drop any frames between the files (a sketch of this follows the list below). This has several advantages over frame-by-frame upload:

  • The files can be used directly for HTTP live streaming without any server-side processing.
  • If the connection is fast enough, the gaps between data transfers allow the antennas to sleep in between, saving battery life.
  • Conversely, if the connection is slow and the upload lags behind the recording, managing the delayed upload of a set of files is much easier than managing a stream of bytes.
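
Below is a minimal Swift sketch of that idea. The class name, the 10-second rotation timer, and the choice of swapping AVCaptureMovieFileOutput instances are illustrative assumptions rather than something from the answer above; treat it as a starting point and verify on a device that no frames are dropped at the segment boundaries.

```swift
import AVFoundation

// Sketch of segmented recording: rotate to a new output file every ~10 seconds,
// batching the output swap inside beginConfiguration/commitConfiguration.
final class SegmentedRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {

    private let session = AVCaptureSession()
    private var currentOutput = AVCaptureMovieFileOutput()
    private var segmentIndex = 0
    private var rotationTimer: Timer?

    func start() throws {
        session.beginConfiguration()

        // Attach camera and microphone inputs.
        if let camera = AVCaptureDevice.default(for: .video) {
            let input = try AVCaptureDeviceInput(device: camera)
            if session.canAddInput(input) { session.addInput(input) }
        }
        if let mic = AVCaptureDevice.default(for: .audio) {
            let input = try AVCaptureDeviceInput(device: mic)
            if session.canAddInput(input) { session.addInput(input) }
        }

        if session.canAddOutput(currentOutput) { session.addOutput(currentOutput) }
        session.commitConfiguration()

        session.startRunning()
        currentOutput.startRecording(to: nextSegmentURL(), recordingDelegate: self)

        // Rotate to a fresh file every 10 seconds.
        rotationTimer = Timer.scheduledTimer(withTimeInterval: 10, repeats: true) { [weak self] _ in
            self?.rotateSegment()
        }
    }

    private func rotateSegment() {
        let finished = currentOutput
        let replacement = AVCaptureMovieFileOutput()

        // Finalize the current segment; the delegate callback fires once its file is written.
        finished.stopRecording()

        // Batch the output swap so the session applies it as a single change.
        session.beginConfiguration()
        session.removeOutput(finished)
        if session.canAddOutput(replacement) { session.addOutput(replacement) }
        session.commitConfiguration()

        currentOutput = replacement
        currentOutput.startRecording(to: nextSegmentURL(), recordingDelegate: self)
    }

    private func nextSegmentURL() -> URL {
        segmentIndex += 1
        return FileManager.default.temporaryDirectory
            .appendingPathComponent("segment-\(segmentIndex).mov")
    }

    // MARK: AVCaptureFileOutputRecordingDelegate

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // A complete segment file is now on disk; hand it to your uploader here.
    }
}
```

Swapping in a fresh AVCaptureMovieFileOutput for each segment avoids having to wait for the previous file to finish writing before starting the next one; each finished segment is a self-contained movie file, which is what makes the batched upload described in the last bullet straightforward.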

Comments

  • iHorse, almost 2 years ago

    I can get individual frames from the iPhone's cameras just fine. What I need is a way to package them up with sound for streaming to the server. Sending the files once I have them isn't much of an issue. It's the generation of the files for streaming that I'm having problems with. I've been trying to get FFMpeg to work without much luck.

    Anyone have any ideas on how I can pull this off? I would like a known working API or instructions on getting FFMpeg to compile properly in an iPhone app.