AVCapture capturing and getting framebuffer at 60 fps in iOS 7

Solution 1

I am getting samples at 60 fps on the iPhone 5 and 120 fps on the iPhone 5s, both when doing real time motion detection in captureOutput and when saving the frames to a video using AVAssetWriter.

You have to set the capture device to a format that supports 60 fps:

AVsession = [[AVCaptureSession alloc] init];

NSError *error = nil;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *capInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (capInput) [AVsession addInput:capInput];

// Find a format whose top frame rate is at least 60 fps and whose pixel format
// is full-range 4:2:0 bi-planar, then pin the frame duration to 1/60 s.
for (AVCaptureDeviceFormat *vFormat in [videoDevice formats])
{
    CMFormatDescriptionRef description = vFormat.formatDescription;
    float maxRate = ((AVFrameRateRange *)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;

    if (maxRate > 59 && CMFormatDescriptionGetMediaSubType(description) == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
    {
        if (YES == [videoDevice lockForConfiguration:NULL])
        {
            videoDevice.activeFormat = vFormat;
            // CMTimeMake(10, 600) is 10/600 s = 1/60 s per frame, i.e. 60 fps.
            [videoDevice setActiveVideoMinFrameDuration:CMTimeMake(10, 600)];
            [videoDevice setActiveVideoMaxFrameDuration:CMTimeMake(10, 600)];
            [videoDevice unlockForConfiguration];
            NSLog(@"formats  %@ %@ %@", vFormat.mediaType, vFormat.formatDescription, vFormat.videoSupportedFrameRateRanges);
        }
    }
}

prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: AVsession];
prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer: prevLayer];

AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
[videoOut setSampleBufferDelegate:self queue:videoQueue];

videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
videoOut.alwaysDiscardsLateVideoFrames = YES;

if (videoOut)
{
    [AVsession addOutput:videoOut];
    videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
}
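
For reference, here is a minimal sketch of the delegate callback for checking the rate you actually get; the frame-interval logging is illustrative and not part of the original answer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Presentation timestamps let you measure the real delivery rate.
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    static CMTime lastPts;
    static BOOL hasLast = NO;
    if (hasLast) {
        double delta = CMTimeGetSeconds(CMTimeSubtract(pts, lastPts));
        if (delta > 0) NSLog(@"instantaneous fps: %.1f", 1.0 / delta);
    }
    lastPts = pts;
    hasLast = YES;
    // Keep heavy processing off this queue, or frames will be dropped
    // (alwaysDiscardsLateVideoFrames is YES above).
}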

Two other comments if you want to write to a file using AVAssetWriter. Don't use a pixel buffer adaptor; just append the samples with

[videoWriterInput appendSampleBuffer:sampleBuffer];

Secondly, when setting up the asset writer, use

[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                   outputSettings:videoSettings 
                                 sourceFormatHint:formatDescription];

The sourceFormatHint makes a difference in writing speed.
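
Putting those two tips together, a hedged sketch of the writer setup might look like this (videoWriter, videoSettings and the surrounding state are placeholders; the format description is assumed to be taken from the first incoming sample buffer):

// Sketch: build the writer input with a sourceFormatHint taken from a sample buffer.
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
AVAssetWriterInput *videoWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings
                                     sourceFormatHint:formatDescription];
videoWriterInput.expectsMediaDataInRealTime = YES;
if ([videoWriter canAddInput:videoWriterInput]) [videoWriter addInput:videoWriterInput];

// Later, inside captureOutput:didOutputSampleBuffer:fromConnection:
if (videoWriterInput.readyForMoreMediaData) {
    [videoWriterInput appendSampleBuffer:sampleBuffer];
}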

Solution 2

I have developed the same function in Swift 2.0. I'm posting the code here for anyone who might need it:

// Set your desired frame rate
func setupCamera(maxFpsDesired: Double = 120) {
    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSessionPreset1920x1080
    let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    do{ let input = try AVCaptureDeviceInput(device: backCamera)
        captureSession.addInput(input) }
    catch { print("Error: can't access camera")
        return
    }
    do {
        var finalFormat = AVCaptureDeviceFormat()
        var maxFps: Double = 0
        for vFormat in backCamera!.formats {
            let ranges      = vFormat.videoSupportedFrameRateRanges as! [AVFrameRateRange]
            let frameRates  = ranges[0]
            /*
                 "frameRates.maxFrameRate >= maxFps" select the video format
                 desired with the highest resolution available, because
                 the camera formats are ordered; else
                 "frameRates.maxFrameRate > maxFps" select the first
                 format available with the desired fps 
            */
            if frameRates.maxFrameRate >= maxFps && frameRates.maxFrameRate <= maxFpsDesired {
                maxFps = frameRates.maxFrameRate
                finalFormat = vFormat as! AVCaptureDeviceFormat
            }
        }
        if maxFps != 0 {
           let timeValue = Int64(1200.0 / maxFps)
           let timeScale: Int32 = 1200
           try backCamera!.lockForConfiguration()
           backCamera!.activeFormat = finalFormat
           backCamera!.activeVideoMinFrameDuration = CMTimeMake(timeValue, timeScale)
           backCamera!.activeVideoMaxFrameDuration = CMTimeMake(timeValue, timeScale)
           backCamera!.focusMode = AVCaptureFocusMode.AutoFocus
           backCamera!.unlockForConfiguration()
        }
    }
    catch {
         print("Something was wrong")
    }
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.alwaysDiscardsLateVideoFrames = true
    videoOutput.videoSettings = NSDictionary(object: Int(kCVPixelFormatType_32BGRA),
        forKey: kCVPixelBufferPixelFormatTypeKey as String) as [NSObject : AnyObject]
    videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))
    if captureSession.canAddOutput(videoOutput){
        captureSession.addOutput(videoOutput) }
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.transform = CATransform3DMakeRotation(-1.5708, 0, 0, 1)
    previewLayer.frame = self.view.bounds
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    self.view.layer.addSublayer(previewLayer)
    captureSession.startRunning()
}

Solution 3

Had the same problem. Fixed it by calling this function after [AVCaptureSession addInput:cameraDeviceInput]. For some reason I could not change the frame rate on my iPad Pro before the capture session was started, so I only switch the video format after the device has been added to the capture session.

- (void)switchFormatWithDesiredFPS:(CGFloat)desiredFPS
{
    BOOL isRunning = _captureSession.isRunning;

    if (isRunning)  [_captureSession stopRunning];

    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceFormat *selectedFormat = nil;
    int32_t maxWidth = 0;
    AVFrameRateRange *frameRateRange = nil;

    // Pick the highest-resolution format whose supported range contains desiredFPS.
    for (AVCaptureDeviceFormat *format in [videoDevice formats]) {

        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {

            CMFormatDescriptionRef desc = format.formatDescription;
            CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(desc);
            int32_t width = dimensions.width;

            if (range.minFrameRate <= desiredFPS && desiredFPS <= range.maxFrameRate && width >= maxWidth) {

                selectedFormat = format;
                frameRateRange = range;
                maxWidth = width;
            }
        }
    }

    if (selectedFormat) {

        if ([videoDevice lockForConfiguration:nil]) {

            NSLog(@"selected format:%@", selectedFormat);
            videoDevice.activeFormat = selectedFormat;
            videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
            videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
            [videoDevice unlockForConfiguration];
        }
    }

    if (isRunning) [_captureSession startRunning];
}
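
As a usage sketch, assuming a _captureSession ivar, the ordering that worked is: add the input first, then switch the format (names other than switchFormatWithDesiredFPS: are illustrative):

_captureSession = [[AVCaptureSession alloc] init];

NSError *error = nil;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *cameraDeviceInput =
    [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (cameraDeviceInput) [_captureSession addInput:cameraDeviceInput];

// Only after the input has been added does the format switch stick:
[self switchFormatWithDesiredFPS:60.0];

[_captureSession startRunning];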

Comments

  • Alexander Taraymovich almost 2 years

    I'm developing an app which requires capturing the framebuffer at as high a frame rate as possible. I've already figured out how to force the iPhone to capture at 60 fps, but the

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    

    method is only being called 15 times a second, which means that the iPhone downgrades the capture output to 15 fps.

    Has anybody faced such a problem? Is there any way to increase the capture frame rate?

    Update, here is my code:

    camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if([camera isTorchModeSupported:AVCaptureTorchModeOn]) {
       [camera lockForConfiguration:nil];
       camera.torchMode=AVCaptureTorchModeOn;
       [camera unlockForConfiguration];
    }
    [self configureCameraForHighestFrameRate:camera];
    
    // Create an AVCaptureInput with the camera device
    NSError *error=nil;
    AVCaptureInput* cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    if (cameraInput == nil) {
       NSLog(@"Error to create camera capture:%@",error);
    }
    
    // Set the output
    AVCaptureVideoDataOutput* videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    
    // create a queue to run the capture on
    dispatch_queue_t captureQueue=dispatch_queue_create("captureQueue", NULL);
    
    // setup our delegate
    [videoOutput setSampleBufferDelegate:self queue:captureQueue];
    
    // configure the pixel format
    videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                                 nil];
    
    // Add the input and output
    [captureSession addInput:cameraInput];
    [captureSession addOutput:videoOutput];
    

    I took the configureCameraForHighestFrameRate method from here: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html
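
    For reference, Apple's configureCameraForHighestFrameRate: example from that page is essentially the following (quoted from memory, so treat it as a sketch rather than the authoritative listing):

    - (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
    {
        AVCaptureDeviceFormat *bestFormat = nil;
        AVFrameRateRange *bestFrameRateRange = nil;
        // Scan every format/range pair and keep the one with the highest max rate.
        for (AVCaptureDeviceFormat *format in [device formats]) {
            for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
                if (range.maxFrameRate > bestFrameRateRange.maxFrameRate) {
                    bestFormat = format;
                    bestFrameRateRange = range;
                }
            }
        }
        if (bestFormat) {
            if ([device lockForConfiguration:NULL]) {
                device.activeFormat = bestFormat;
                device.activeVideoMinFrameDuration = bestFrameRateRange.minFrameDuration;
                device.activeVideoMaxFrameDuration = bestFrameRateRange.minFrameDuration;
                [device unlockForConfiguration];
            }
        }
    }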

  • Alexander Taraymovich over 10 years
    I do exactly the same (except maybe the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange type), but the delegate method is still being called only 15 times per second. Could you provide the full source code where you initialize the video input and output?
  • Sten over 10 years
    I have added the video output code; it is pretty standard. You are not doing any heavy calculations in captureOutput that slow things down?
  • Alexander Taraymovich over 10 years
    My code looks the same. I figured out how to increase the frame rate up to 30 fps (I had tried setting the session preset before), but it's still not enough. I have some calculations, but I disabled them for testing purposes and I still get 30 fps.
  • Alexander Taraymovich over 10 years
    I debugged a little and discovered that [videoOutput connectionWithMediaType:AVMediaTypeVideo] returns a connection with a minFrameRateDuration of (1,30).
  • Sten over 10 years
    I suppose you have hardware that supports 60 fps (i.e. an iPhone 5, iPad Air or mini). Have you checked that you get maxFrameRate = 60 from the format description when you set it up?
  • Alexander Taraymovich over 10 years
    Yes, my device is an iPhone 5, and yes, maxFrameRate is 60.
  • Alexander Taraymovich over 10 years
    BTW, I checked the sampleBuffer and discovered that its width is 1920 and its height is 1080, but the format of the camera is 1280x720. Should I configure the videoOutput for my purposes?
  • Sten over 10 years
    You can't get 60 fps at 1920x1080, only at 1280x720. You must be setting videoDevice.activeFormat to the wrong format. I have added an NSLog to the code; add it and check that you only get one format as log output and that it has 60 fps.
  • Alexander Taraymovich over 10 years
    Here is my log output: 2013-12-03 08:42:03.761 [3000:60b] formats vide <CMVideoFormatDescription 0x14e59460 [0x38862ad0]> { mediaType:'vide' mediaSubType:'420v' mediaSpecific: { codecType: '420v' dimensions: 1280 x 720 } extensions: {(null)} } ( "<AVFrameRateRange: 0x14d473a0 1 - 60>" ) 2013-12-03 08:42:05.983 [3000:410b] SampleBuffer width - 1920 SampleBiffer height - 1080
  • Alexander Taraymovich over 10 years
    I added it to the question.
  • Alexander Taraymovich over 10 years
    This iOS programming makes me mad! I didn't suspect that so many things depend on the order of operations. I simply moved the [captureSession addInput:cameraInput]; line right before the format configuration and it works! Thank you for your help!
  • Form almost 10 years
    @AlexanderTaraymovich Damn, it does work! Talk about a flimsy API! AVFoundation is certainly powerful, but there are clearly some temporal coupling design issues there.
  • Thyselius over 8 years
    NICE! However, my iPhone 6 Plus reported the following frame rate range: <AVFrameRateRange: 0x127654170 5 - 240>, so you should make a small addition to your answer: if frameRates.maxFrameRate == 120 || frameRates.maxFrameRate == 240 {
  • Thyselius over 8 years
    Also change to CMTimeMake(5, 1200); I then got my camera running at 240.
  • Adda_25 over 8 years
    I edited my answer. CMTimeMake(5, 1200) gets your camera running at 240 because 1200 / 5 = 240. In CMTimeMake the first value is the numerator and the second the denominator. See stackoverflow.com/questions/12902410/…
  • Swifty McSwifterton about 4 years
    Worked perfectly for me. Thank you.