Convert CMSampleBuffer to UIImage


Solution 1

The conversion is simple:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // Grab the pixel buffer from the sample buffer and wrap it in a CIImage.
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let ciImage = CIImage(cvPixelBuffer: imageBuffer)
    let image = self.convert(cmage: ciImage)
}

// Convert CIImage to UIImage
func convert(cmage: CIImage) -> UIImage {
     let context = CIContext(options: nil)
     let cgImage = context.createCGImage(cmage, from: cmage.extent)!
     let image = UIImage(cgImage: cgImage)
     return image
}
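
For completeness, here is a rough sketch of how such a delegate method might be wired to a capture session so that captureOutput(_:didOutput:from:) actually gets called. The class name, queue label, and setup details are illustrative, not part of the original answer:

import AVFoundation
import UIKit

final class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach the default camera as input.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Deliver sample buffers to this class on a background queue.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
        session.startRunning()
    }

    // captureOutput(_:didOutput:from:) and convert(cmage:) from above would live here.
}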

Solution 2

The solutions above can be improved further with the newer convenience initializers on UIImage. Below is a more modern approach that also corrects the image orientation. It skips the intermediate CGImage conversion, which improves runtime performance.

func orientation() -> UIImage.Orientation {
    let curDeviceOrientation = UIDevice.current.orientation
    var exifOrientation: UIImage.Orientation
    switch curDeviceOrientation {
        case UIDeviceOrientation.portraitUpsideDown:  // Device oriented vertically, Home button on the top
            exifOrientation = .left
        case UIDeviceOrientation.landscapeLeft:       // Device oriented horizontally, Home button on the right
            exifOrientation = .upMirrored
        case UIDeviceOrientation.landscapeRight:      // Device oriented horizontally, Home button on the left
            exifOrientation = .down
        case UIDeviceOrientation.portrait:            // Device oriented vertically, Home button on the bottom
            exifOrientation = .up
        default:
            exifOrientation = .up
    }
    return exifOrientation
}

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let ciimage = CIImage(cvPixelBuffer: imageBuffer)
    let image = UIImage(ciImage: ciimage, scale: 1.0, orientation: orientation())
}
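
Whichever solution you use, remember that this delegate callback is delivered on the queue passed to setSampleBufferDelegate(_:queue:), not the main queue, so displaying the result must hop back to the main thread. A minimal sketch, where imageView is a hypothetical UIImageView outlet:

DispatchQueue.main.async {
    // `imageView` is an assumed outlet, not part of the original answer.
    self.imageView.image = image
}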

Solution 3

It looks like the CMSampleBuffer is giving you RGBA data, from which you then directly construct a grayscale image. You either need to build a new buffer in which each pixel is reduced to a single gray value, e.g. gray = (pixel.red + pixel.green + pixel.blue) / 3, or you need to create a normal RGBA image from the data you received and then convert that to grayscale.

Your code, however, performs no conversion at all. It takes the raw pointer to the buffer with CVPixelBufferGetBaseAddress, regardless of what kind of data is in there, and then passes that same pointer when creating the image, which assumes the data is already grayscale.
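
As a concrete illustration of the second option (build a normal RGBA image first, then desaturate it), here is a rough sketch using Core Image's CIColorControls filter; the function name is made up for this example:

import CoreImage
import UIKit

func grayscaleImage(from imageBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: imageBuffer)
    // Setting saturation to 0 strips all color information, leaving a grayscale image.
    guard let filter = CIFilter(name: "CIColorControls") else { return nil }
    filter.setValue(ciImage, forKey: kCIInputImageKey)
    filter.setValue(0.0, forKey: kCIInputSaturationKey)
    guard let output = filter.outputImage else { return nil }
    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}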


Author: Rubaiyat Jahan Mumu

Updated on September 15, 2022

Comments

  • Rubaiyat Jahan Mumu over 1 year

    I am trying to convert the sampleBuffer to a UIImage and display it in an image view with a gray color space, but it displays as in the image below. I think there is a problem with the conversion. How can I convert the CMSampleBuffer?

    (Image: a gray version of a red-colored image)

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        print("buffered")
        let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        let width: Int = CVPixelBufferGetWidth(imageBuffer)
        let height: Int = CVPixelBufferGetHeight(imageBuffer)
        let bytesPerRow: Int = CVPixelBufferGetBytesPerRow(imageBuffer)
        let lumaBuffer = CVPixelBufferGetBaseAddress(imageBuffer)
    
        //let planeCount : Int = CVPixelBufferGetPlaneCount(imageBuffer)
        let grayColorSpace: CGColorSpace = CGColorSpaceCreateDeviceGray()
        let context: CGContext = CGContext(data: lumaBuffer, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow , space: grayColorSpace, bitmapInfo: CGImageAlphaInfo.none.rawValue)!
        let dstImageFilter: CGImage = context.makeImage()!
        let imageRect : CGRect = CGRect(x: 0, y: 0, width: width, height: height)
        context.draw(dstImageFilter, in: imageRect)
        let image = UIImage(cgImage: dstImageFilter)
        DispatchQueue.main.sync(execute: {() -> Void in
            self.imageTest.image = image
        })
    }
    
  • Léo Natan about 7 years
    I think it's also possible to specify the output pixel format of the capture output, so that the buffer arrives in the format the rest of the code expects (see the sketch after these comments).
  • Rubaiyat Jahan Mumu about 7 years
    If the data is already grayscale, then why are these stripes appearing? Why not just a normal gray image?
  • Rubaiyat Jahan Mumu about 7 years
    @Matic I tried the 1st method before this one, but it crashed while making the image from the context and gave this error: copy read only: vm_copy failed: status 1
  • Matic Oblak about 7 years
    @RubaiyatJahanMumu "If the data is already grayscale, then why are these stripes appearing?": It is hard to predict the result, but it might have something to do with using the alpha channel (which is always 255) as a color component. "it crashed while making image": You probably have an overflow. You need to examine the buffer you received carefully and match how you use it when generating the image. Maybe the buffer is still RGB, or maybe you are using some other type of buffer. Show me your results for bytesPerRow and width, and also what bitsPerComponent is.
  • Matic Oblak about 7 years
    @RubaiyatJahanMumu I would start by trying to create a normal image, then play around with how to create a grayscale version of it.
  • user924 about 6 years
    Thanks, but how do you convert a UIImage back to a CMSampleBuffer?
  • Dhaval H. Nena over 5 years
    Perfect solution! Thanks.
  • mikey about 5 years
    This is great, thank you! The only issue I have is the orientation of the image; it always defaults to landscape. Do you happen to know why that is and how to fix it?
  • Rubaiyat Jahan Mumu about 5 years
    @mikey As far as I remember, I rotated the image after getting it, or you can change the orientation of the image in .
  • mikey about 5 years
    Hi @mumu, I ended up rotating it after getting the image. Thank you for your answer!
  • FH- over 4 years
    How do you convert a UIImage to a CMSampleBuffer?
  • famfamfam over 2 years
    I got the wrong orientation when getting the image.
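
Regarding Léo Natan's comment about specifying the output format: a minimal sketch, assuming you want BGRA buffers from the video data output (the key and pixel format constant are the standard Core Video ones):

import AVFoundation

let videoOutput = AVCaptureVideoDataOutput()
// Ask the capture output for 32-bit BGRA pixel buffers so the downstream
// conversion code knows exactly what layout to expect.
videoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]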