iOS: capturing images using AVFoundation
Solution 1
Add the following line
output.minFrameDuration = CMTimeMake(5, 1);
below the comment
// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.
but above the
[session startRunning];
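In context, the edited section of the setup code then reads as below. Note that CMTimeMake(5, 1) is a frame *duration* of 5 seconds, i.e. one frame every 5 seconds (use CMTimeMake(1, 15) for 15 fps):

// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.
output.minFrameDuration = CMTimeMake(5, 1); // one frame every 5 seconds

// Start the session running to start the flow of data
[session startRunning];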
Edit
Use the following code to preview the camera output.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
UIView *aView = self.view;
CGRect videoRect = CGRectMake(0.0, 0.0, 320.0, 150.0);
previewLayer.frame = videoRect; // The preview is shown in this rect of the view.
[aView.layer addSublayer:previewLayer];
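Optionally, if the preview should fill that rect without letterboxing, you can also set the layer's videoGravity (standard AVFoundation API):

previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;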
Edit 2: Ok fine..
minFrameDuration on the output is deprecated; Apple has provided a way to set the frame duration through AVCaptureConnection instead.
So now, use the following code to set the frame duration:
AVCaptureConnection *conn = [output connectionWithMediaType:AVMediaTypeVideo];

if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(5, 1);
if (conn.supportsVideoMaxFrameDuration)
    conn.videoMaxFrameDuration = CMTimeMake(5, 1);
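For completeness: on later iOS versions (7 and up) the connection-level properties were deprecated in turn, and the frame duration is set on the capture device itself. A minimal sketch, assuming `device` is the AVCaptureDevice from the setup code:

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // Both set to 5 seconds per frame, i.e. one frame every 5 seconds.
    device.activeVideoMinFrameDuration = CMTimeMake(5, 1);
    device.activeVideoMaxFrameDuration = CMTimeMake(5, 1);
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}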
Solution 2
Be careful: the callback from AVCaptureOutput is delivered on the dispatch queue you specified. You are performing UI updates from this callback, and that is wrong — UIKit calls must be made on the main queue. E.g.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"captureOutput: didOutputSampleBufferFromConnection");

    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    dispatch_async(dispatch_get_main_queue(), ^{
        //< Add your code here that uses the image >
        [self.imageView setImage:image];
        [self.view setNeedsDisplay];
    });
}
Solution 3
And here is a Swift version of the imageFromSampleBuffer function:
func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer!) -> UIImage {
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    CVPixelBufferLockBaseAddress(imageBuffer, 0)
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo: CGBitmapInfo = [.ByteOrder32Little, CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue)]
    let context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, bitmapInfo.rawValue)
    let quartzImage = CGBitmapContextCreateImage(context)
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0)
    let image = UIImage(CGImage: quartzImage!)
    return image
}
The above works for me with the following video settings:
videoDataOutput = AVCaptureVideoDataOutput()
videoDataOutput?.videoSettings = [kCVPixelBufferPixelFormatTypeKey:Int(kCVPixelFormatType_32BGRA)]
videoDataOutput?.setSampleBufferDelegate(self, queue: queue)
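A sketch of how this function might be driven from the delegate callback, throttled to one image every 5 seconds as the question asks (hypothetical glue code, written in the same Swift 2-era syntax as the function above; `imageView` and `lastCaptureTime` are assumed properties of the view controller):

var lastCaptureTime = NSDate.distantPast().timeIntervalSinceReferenceDate

func captureOutput(captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   fromConnection connection: AVCaptureConnection!) {
    // Only keep a frame if at least 5 seconds have passed since the last one.
    let now = NSDate().timeIntervalSinceReferenceDate
    guard now - lastCaptureTime >= 5 else { return }
    lastCaptureTime = now

    let image = imageFromSampleBuffer(sampleBuffer)

    // UI updates must happen on the main queue.
    dispatch_async(dispatch_get_main_queue()) {
        self.imageView.image = image
    }
}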
Oleg
Updated on June 22, 2020

Comments
-
Oleg, almost 4 years ago:
I'm capturing images using this code
#pragma mark - image capture

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"PANIC: no media input");
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    output.videoSettings = [NSDictionary dictionaryWithObject:
                                             [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.

    // Start the session running to start the flow of data
    [session startRunning];

    // Assign session to an ivar.
    [self setSession:session];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"captureOutput: didOutputSampleBufferFromConnection");

    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    //< Add your code here that uses the image >
    [self.imageView setImage:image];
    [self.view setNeedsDisplay];
}

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    NSLog(@"imageFromSampleBuffer: called");

    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

- (void)setSession:(AVCaptureSession *)session
{
    NSLog(@"setting session...");
    self.captureSession = session;
}
The capturing code works. But I need to change two things:
- show the video stream from the camera in my view;
- get an image from it at an interval (for example, every 5 seconds).
Help me please: how can this be done?
-
Oleg, over 12 years ago: Thanks. How do I get the video stream from the camera playing in my view? Should I add another AVCaptureOutput?
-
Oleg, over 12 years ago: And by the way, this method is deprecated.
-
Ilanchezhian, over 12 years ago: Use AVCaptureVideoPreviewLayer to play the preview. See my updated answer.
-
Oleg, over 12 years ago: OK, that's clear. But what to do with the deprecated setMinFrameDuration?
-
Ilanchezhian, over 12 years ago: Now the durations can be set with the help of AVCaptureConnection. See my Edit 2.
-
Jim True, about 11 years ago: You are a life saver! This was stopping my image view from being updated from the camera, thanks!
-
iosLearner, over 9 years ago: Hi @Aadhira, I understood how to set the fps. I have a small doubt: I want to calculate the actual fps and the drop rate. How can I do that?