iOS/Swift: Video Screen Capture


Here is a working copy of iOSSwiftSimpleAVCamera in Swift. It doesn't quite solve your problem, but it is a starting point for anyone else who finds this thread. Some of the error checking was removed from this code, so be wary; it will only work on an actual device, not in the simulator.
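
For example, a minimal availability guard you might restore before building the session could look like this (a sketch only, assuming AVFoundation is imported; it simply checks that a camera device exists, which is what fails in the simulator):

func cameraIsAvailable() -> Bool {
    // Sketch: devicesWithMediaType returns an empty list where no camera exists (e.g. the simulator).
    let devices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)
    return devices != nil && devices.count > 0
}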

AppDelegate

//
//  AppDelegate.swift
//  iOSSwiftSimpleAVCamera
//
//  Created by Bradley Griffith on 7/1/14.
//  Copyright (c) 2014 Bradley Griffith. All rights reserved.
//

import UIKit
import CoreData

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?


    func applicationDidFinishLaunching(application: UIApplication) {

    }

    func applicationWillResignActive(application: UIApplication) {
        // Sent when the application is about to move from active to inactive state. This can occur for certain types of temporary interruptions (such as an incoming phone call or SMS message) or when the user quits the application and it begins the transition to the background state.
        // Use this method to pause ongoing tasks, disable timers, and throttle down OpenGL ES frame rates. Games should use this method to pause the game.
    }

    func applicationDidEnterBackground(application: UIApplication) {
        // Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later.
        // If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits.
    }

    func applicationWillEnterForeground(application: UIApplication) {
        // Called as part of the transition from the background to the inactive state; here you can undo many of the changes made on entering the background.
    }

    func applicationDidBecomeActive(application: UIApplication) {
        // Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface.
    }

    func applicationWillTerminate(application: UIApplication) {
        // Called when the application is about to terminate. Save data if appropriate. See also applicationDidEnterBackground:.
        // Saves changes in the application's managed object context before the application terminates.
        self.saveContext()
    }

    func saveContext () {
        var error: NSError? = nil
        let managedObjectContext = self.managedObjectContext
        if managedObjectContext.hasChanges && !managedObjectContext.save(&error) {
            // Replace this implementation with code to handle the error appropriately.
            // abort() causes the application to generate a crash log and terminate. You should not use this function in a shipping application, although it may be useful during development.
            //println("Unresolved error \(error), \(error!.userInfo)")
            abort()
        }
    }

    // MARK: - Core Data stack

    // Returns the managed object context for the application.
    // If the context doesn't already exist, it is created and bound to the persistent store coordinator for the application.
    var managedObjectContext: NSManagedObjectContext {
        if _managedObjectContext == nil {
            let coordinator = self.persistentStoreCoordinator
            _managedObjectContext = NSManagedObjectContext()
            _managedObjectContext!.persistentStoreCoordinator = coordinator
        }
        return _managedObjectContext!
    }
    var _managedObjectContext: NSManagedObjectContext? = nil

    // Returns the managed object model for the application.
    // If the model doesn't already exist, it is created from the application's model.
    var managedObjectModel: NSManagedObjectModel {
        if _managedObjectModel == nil {
            let modelURL = NSBundle.mainBundle().URLForResource("iOSSwiftSimpleAVCamera", withExtension: "momd")
            _managedObjectModel = NSManagedObjectModel(contentsOfURL: modelURL!)
        }
        return _managedObjectModel!
    }
    var _managedObjectModel: NSManagedObjectModel? = nil

    // Returns the persistent store coordinator for the application.
    // If the coordinator doesn't already exist, it is created and the application's store added to it.
    var persistentStoreCoordinator: NSPersistentStoreCoordinator {
        if _persistentStoreCoordinator == nil {
            let storeURL = self.applicationDocumentsDirectory.URLByAppendingPathComponent("iOSSwiftSimpleAVCamera.sqlite")
            var error: NSError? = nil
            _persistentStoreCoordinator = NSPersistentStoreCoordinator(managedObjectModel: self.managedObjectModel)
            if _persistentStoreCoordinator!.addPersistentStoreWithType(NSSQLiteStoreType, configuration: nil, URL: storeURL, options: nil, error: &error) == nil {
                /*
                Replace this implementation with code to handle the error appropriately.

                abort() causes the application to generate a crash log and terminate. You should not use this function in a shipping application, although it may be useful during development.

                Typical reasons for an error here include:
                * The persistent store is not accessible;
                * The schema for the persistent store is incompatible with current managed object model.
                Check the error message to determine what the actual problem was.


                If the persistent store is not accessible, there is typically something wrong with the file path. Often, a file URL is pointing into the application's resources directory instead of a writeable directory.

                If you encounter schema incompatibility errors during development, you can reduce their frequency by:
                * Simply deleting the existing store:
                NSFileManager.defaultManager().removeItemAtURL(storeURL, error: nil)

                * Performing automatic lightweight migration by passing the following dictionary as the options parameter:
                [NSMigratePersistentStoresAutomaticallyOption: true, NSInferMappingModelAutomaticallyOption: true]

                Lightweight migration will only work for a limited set of schema changes; consult "Core Data Model Versioning and Data Migration Programming Guide" for details.

                */
                //println("Unresolved error \(error), \(error.userInfo)")
                abort()
            }
        }
        return _persistentStoreCoordinator!
    }
    var _persistentStoreCoordinator: NSPersistentStoreCoordinator? = nil

    // MARK: - Application's Documents directory

    // Returns the URL to the application's Documents directory.
    var applicationDocumentsDirectory: NSURL {
        let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
        return urls[urls.endIndex-1] as! NSURL
    }

}

CameraSessionController

//
//  CameraSessionController.swift
//  iOSSwiftSimpleAVCamera
//
//  Created by Bradley Griffith on 7/1/14.
//  Copyright (c) 2014 Bradley Griffith. All rights reserved.
//


import UIKit
import AVFoundation
import CoreMedia
import CoreImage

@objc protocol CameraSessionControllerDelegate {
    optional func cameraSessionDidOutputSampleBuffer(sampleBuffer: CMSampleBuffer!)
}

class CameraSessionController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    var session: AVCaptureSession!
    var sessionQueue: dispatch_queue_t!
    var videoDeviceInput: AVCaptureDeviceInput!
    var videoDeviceOutput: AVCaptureVideoDataOutput!
    var stillImageOutput: AVCaptureStillImageOutput!
    var runtimeErrorHandlingObserver: AnyObject?
    var sessionDelegate: CameraSessionControllerDelegate?


    /* Class Methods
    ------------------------------------------*/

    class func deviceWithMediaType(mediaType: NSString, position: AVCaptureDevicePosition) -> AVCaptureDevice {
        let devices: NSArray = AVCaptureDevice.devicesWithMediaType(mediaType as String)
        var captureDevice: AVCaptureDevice = devices.firstObject as! AVCaptureDevice

        for object:AnyObject in devices {
            let device = object as! AVCaptureDevice
            if (device.position == position) {
                captureDevice = device
                break
            }
        }

        return captureDevice
    }


    /* Lifecycle
    ------------------------------------------*/

    override init() {
        super.init()

        self.session = AVCaptureSession()

        self.authorizeCamera()

        self.sessionQueue = dispatch_queue_create("CameraSessionController Session", DISPATCH_QUEUE_SERIAL)

        dispatch_async(self.sessionQueue, {
            self.session.beginConfiguration()
            self.addVideoInput()
            self.addVideoOutput()
            self.addStillImageOutput()
            self.session.commitConfiguration()
        })
    }


    /* Instance Methods
    ------------------------------------------*/

    func authorizeCamera() {
        AVCaptureDevice.requestAccessForMediaType(AVMediaTypeVideo, completionHandler: {
            (granted: Bool) -> Void in
            // If permission hasn't been granted, notify the user.
            if !granted {
                dispatch_async(dispatch_get_main_queue(), {
                    UIAlertView(
                        title: "Could not use camera!",
                        message: "This application does not have permission to use the camera. Please update your privacy settings.",
                        delegate: self,
                        cancelButtonTitle: "OK").show()
                })
            }
        })
    }

    func addVideoInput() -> Bool {
        var success: Bool = false
        var error: NSError?

        let videoDevice: AVCaptureDevice = CameraSessionController.deviceWithMediaType(AVMediaTypeVideo, position: AVCaptureDevicePosition.Back)
        self.videoDeviceInput = AVCaptureDeviceInput.deviceInputWithDevice(videoDevice, error: &error) as! AVCaptureDeviceInput
        if error == nil {
            if self.session.canAddInput(self.videoDeviceInput) {
                self.session.addInput(self.videoDeviceInput)
                success = true
            }
        }

        return success
    }

    func addVideoOutput() {
        // Note: the original project's rgbOutputSettings pixel-format dictionary was removed here
        // (it did not compile cleanly); the output therefore uses the device's default pixel format.

        self.videoDeviceOutput = AVCaptureVideoDataOutput()

        self.videoDeviceOutput.alwaysDiscardsLateVideoFrames = true

        self.videoDeviceOutput.setSampleBufferDelegate(self, queue: self.sessionQueue)

        if self.session.canAddOutput(self.videoDeviceOutput) {
            self.session.addOutput(self.videoDeviceOutput)
        }
    }

    func addStillImageOutput() {
        self.stillImageOutput = AVCaptureStillImageOutput()
        self.stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

        if self.session.canAddOutput(self.stillImageOutput) {
            self.session.addOutput(self.stillImageOutput)
        }
    }

    func startCamera() {
        dispatch_async(self.sessionQueue, {
            weak var weakSelf: CameraSessionController? = self
            // Observe runtime errors posted by the session itself and restart it if one occurs.
            self.runtimeErrorHandlingObserver = NSNotificationCenter.defaultCenter().addObserverForName(AVCaptureSessionRuntimeErrorNotification, object: self.session, queue: nil, usingBlock: {
                (note: NSNotification!) -> Void in

                let strongSelf: CameraSessionController = weakSelf!

                dispatch_async(strongSelf.sessionQueue, {
                    strongSelf.session.startRunning()
                })
            })
            self.session.startRunning()
        })
    }

    func teardownCamera() {
        dispatch_async(self.sessionQueue, {
            self.session.stopRunning()
            NSNotificationCenter.defaultCenter().removeObserver(self.runtimeErrorHandlingObserver!)
        })
    }

    func focusAndExposeAtPoint(point: CGPoint) {
        dispatch_async(self.sessionQueue, {
            let device: AVCaptureDevice = self.videoDeviceInput.device
            var error: NSError?

            if device.lockForConfiguration(&error) {
                if device.focusPointOfInterestSupported && device.isFocusModeSupported(AVCaptureFocusMode.AutoFocus) {
                    device.focusPointOfInterest = point
                    device.focusMode = AVCaptureFocusMode.AutoFocus
                }

                if device.exposurePointOfInterestSupported && device.isExposureModeSupported(AVCaptureExposureMode.AutoExpose) {
                    device.exposurePointOfInterest = point
                    device.exposureMode = AVCaptureExposureMode.AutoExpose
                }

                device.unlockForConfiguration()
            }
            else {
                // TODO: Log the error.
            }
        })
    }

    func captureImage(completion: ((image: UIImage?, error: NSError?) -> Void)?) {
        // Nothing to do if there is no completion handler to report to, or no still-image output configured.
        if completion == nil || self.stillImageOutput == nil {
            return
        }

        dispatch_async(self.sessionQueue, {
            self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo), completionHandler: {
                (imageDataSampleBuffer: CMSampleBuffer?, error: NSError?) -> Void in

                if error != nil {
                    completion!(image: nil, error: error)
                }
                else if imageDataSampleBuffer != nil {
                    let imageData: NSData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                    let image: UIImage = UIImage(data: imageData)!
                    completion!(image: image, error: nil)
                }
            })
        })
    }


    /* AVCaptureVideoDataOutput Delegate
    ------------------------------------------*/

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        self.sessionDelegate?.cameraSessionDidOutputSampleBuffer?(sampleBuffer)
    }

}
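
For reference, a usage sketch of the still-image path from a view controller (this assumes the cameraSessionController property shown in the CameraViewController below; it is not part of the original project):

self.cameraSessionController.captureImage({ (image: UIImage?, error: NSError?) -> Void in
    if image != nil {
        // Any UI or photo-library work should hop back to the main queue.
        dispatch_async(dispatch_get_main_queue(), {
            UIImageWriteToSavedPhotosAlbum(image!, nil, nil, nil)
        })
    }
})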

CameraViewController

//
//  CameraViewController.swift
//  iOSSwiftSimpleAVCamera
//
//  Created by Bradley Griffith on 7/1/14.
//  Copyright (c) 2014 Bradley Griffith. All rights reserved.
//


import UIKit
import CoreMedia
import AVFoundation

class CameraViewController: UIViewController, CameraSessionControllerDelegate {

    var cameraSessionController: CameraSessionController!
    var previewLayer: AVCaptureVideoPreviewLayer!


    /* Lifecycle
    ------------------------------------------*/

    override func viewDidLoad() {
        super.viewDidLoad()

        self.cameraSessionController = CameraSessionController()
        self.cameraSessionController.sessionDelegate = self
        self.setupPreviewLayer()
    }

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated)

        self.cameraSessionController.startCamera()
    }

    override func viewWillDisappear(animated: Bool) {
        super.viewWillDisappear(animated)

        self.cameraSessionController.teardownCamera()
    }


    /* Instance Methods
    ------------------------------------------*/

    func setupPreviewLayer() {
        let minSize = min(self.view.bounds.size.width, self.view.bounds.size.height)
        let bounds: CGRect = CGRectMake(0.0, 0.0, minSize, minSize)
        self.previewLayer = AVCaptureVideoPreviewLayer(session: self.cameraSessionController.session)
        self.previewLayer.bounds = bounds
        self.previewLayer.position = CGPointMake(CGRectGetMidX(self.view.bounds), CGRectGetMidY(self.view.bounds))
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill

        self.view.layer.addSublayer(self.previewLayer)
    }

    func cameraSessionDidOutputSampleBuffer(sampleBuffer: CMSampleBuffer!) {
        // Any frame processing could be done here.
    }

}
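
As a sketch of what that frame processing might look like, the sample buffer can be wrapped in a CIImage (illustrative only, not part of the original project; it needs a CoreImage import, and depending on the SDK version CMSampleBufferGetImageBuffer may return an Unmanaged value that needs takeUnretainedValue()):

func cameraSessionDidOutputSampleBuffer(sampleBuffer: CMSampleBuffer!) {
    // Sketch: pull the pixel buffer out of the frame for Core Image processing.
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    if pixelBuffer != nil {
        let frame = CIImage(CVPixelBuffer: pixelBuffer)
        // ...apply CIFilters, run detection, or render the frame here.
    }
}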

Comments

  • μολὼν.λαβέ almost 2 years

    What is the screen device called in iOS/Swift?

    When I print the devices I get

    (
    "<AVCaptureFigVideoDevice: 0x134d0f210 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>",
    "<AVCaptureFigVideoDevice: 0x134e0af80 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>",
    "<AVCaptureFigAudioDevice: 0x174265440 [iPad Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>"
    )
    

    So where's the screen ID?

    There's just too much outdated Objective-C code out there, and Swift is a moving target. I'm looking for a Swift solution to capture video from my iPad screen and audio from the built-in microphone. The audio will be a separate question.

    Here is a screen grabber for OS X

    https://github.com/kennyledet/SwiftCap

    // AVCaptureSession holds inputs and outputs for real-time capture
    let mSession = AVCaptureSession()
    let mScreenCapOutput = AVCaptureMovieFileOutput()
    var mOutputPath = ""
    // Just capture main display for now
    let mMainDisplayId = CGMainDisplayID()
    

    but I cannot find in the documentation any display ID, like CGMainDisplayID, for an iPad...

    Here is a typical solution for a camera in Swift:

    https://github.com/bradley/iOSSwiftSimpleAVCamera

    but it has too many errors and doesn't compile against iOS 8.1 or 8.2, and it grabs video from the camera rather than the screen.

    func addVideoOutput() {
    var rgbOutputSettings: NSDictionary = NSDictionary(object: Int(CInt(kCIFormatRGBA8)), forKey: kCVPixelBufferPixelFormatTypeKey)
    
        self.videoDeviceOutput = AVCaptureVideoDataOutput()
    
        self.videoDeviceOutput.alwaysDiscardsLateVideoFrames = true
    
        self.videoDeviceOutput.setSampleBufferDelegate(self, queue: self.sessionQueue)
    
        if self.session.canAddOutput(self.videoDeviceOutput) {
            self.session.addOutput(self.videoDeviceOutput)
        }
    }
    

    https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW18

    Apple gives an Objective-C solution like this:

    /*
     * Create video connection
     */
    AVCaptureDeviceInput *videoIn = [[AVCaptureDeviceInput alloc] initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack] error:nil];
    if ([_captureSession canAddInput:videoIn])
        [_captureSession addInput:videoIn];
    
    AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
    [videoOut setAlwaysDiscardsLateVideoFrames:YES];
    [videoOut setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]}];
    dispatch_queue_t videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
    [videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];
    
    if ([_captureSession canAddOutput:videoOut])
        [_captureSession addOutput:videoOut];
    _videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
    self.videoOrientation = _videoConnection.videoOrientation;
    
    if([self.session canSetSessionPreset:AVCaptureSessionPreset640x480])
        [self.session setSessionPreset:AVCaptureSessionPreset640x480]; // Lower video resolution to decrease recorded movie size
    
    return YES;
    }
    

    This should be easy.....???
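
    For comparison, a rough Swift translation of that Objective-C snippet might look like the following (an untested sketch: captureSession stands in for whatever session the class owns, and the pixel-format key may need a different cast depending on the Swift version):

        // Rough, untested Swift translation of Apple's Objective-C snippet above.
        var error: NSError?
        let videoDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        let videoIn = AVCaptureDeviceInput.deviceInputWithDevice(videoDevice, error: &error) as! AVCaptureDeviceInput
        if captureSession.canAddInput(videoIn) {
            captureSession.addInput(videoIn)
        }

        let videoOut = AVCaptureVideoDataOutput()
        videoOut.alwaysDiscardsLateVideoFrames = true
        videoOut.videoSettings = [kCVPixelBufferPixelFormatTypeKey as NSString: Int(kCVPixelFormatType_32BGRA)]
        let videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL)
        videoOut.setSampleBufferDelegate(self, queue: videoCaptureQueue)

        if captureSession.canAddOutput(videoOut) {
            captureSession.addOutput(videoOut)
        }

        if captureSession.canSetSessionPreset(AVCaptureSessionPreset640x480) {
            captureSession.sessionPreset = AVCaptureSessionPreset640x480 // lower resolution to shrink recordings
        }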

    • rizzes almost 9 years
      Good question. I agree it should be easy. Any update on it?
    • μολὼν.λαβέ almost 9 years
      No update yet. I gave up on converting the Objective-C, as I have other pressing code to write.
    • rizzes almost 9 years
      Darn. Thanks for responding. I'll let you know if I find something.
    • nsij22 almost 9 years
      @rizzes @G Alexander Any update now?
    • rizzes almost 9 years
      @nsij22 -- Nope. Taking lots of screenshots seems like the best option thus far.
    • nsij22 almost 9 years
      @rizzes I got a copy of iOSSwiftSimpleAVCamera working by making a few changes to the code. I will upload it in a bit for you.
    • rizzes almost 9 years
      @nsij22 Awesome! Let me know where you uploaded it (and if you need me to do anything)!
    • μολὼν.λαβέ almost 9 years
      @rizzes I went the screenshot way as well; it's very easy to do. I ran out of time to mess around with Obj-C, since this was a feature request, not mandatory. @nsij22 I'd be interested to see the code as well.
    • nsij22 almost 9 years
      @GAlexander I posted it as an answer for you guys.
  • μολὼν.λαβέ almost 9 years
    Great effort and work. Does it capture everything, like CALayers and web UI? I found that when taking snapshots, if the screen area contains a YouTube video, the video does not show up in the screen capture.
  • μολὼν.λαβέ almost 9 years
    iOS 9: it looks like someone at Apple has been watching my bitching about what is missing from Apple's SDKs. From developer.apple.com/ios/pre-release: "ReplayKit: Games can leverage ReplayKit to record video of their content, and allow the user to quickly edit the video within the app and share it online."
  • charles.cc.hsu over 8 years
    @GAlexander Can iOSSwiftSimpleAVCamera record a movie playing in AVPlayer? I found iPhone Screen Capture, but it seems it cannot record that.
  • μολὼν.λαβέ over 8 years
    Short answer: probably not without major headaches. That code looks like it is just doing raw UIImage dumps of the screen at some frames-per-second and piping the images into a video codec, similar to ffmpeg (a rough sketch of that approach is below). If you already have a video stream going to AVPlayer, just intercept/reroute/dupe the stream and send it to a new output device.
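
    A minimal sketch of that screenshot-dump approach (illustrative only; note that the view-hierarchy snapshot API skips protected video layers, which matches the YouTube limitation mentioned above):

        // Sketch: render the key window into an image context at screen scale.
        func snapshotScreen() -> UIImage {
            let window = UIApplication.sharedApplication().keyWindow!
            UIGraphicsBeginImageContextWithOptions(window.bounds.size, false, UIScreen.mainScreen().scale)
            window.drawViewHierarchyInRect(window.bounds, afterScreenUpdates: false)
            let image = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            return image
        }

    Each snapshot can then be appended at a fixed frame rate through an AVAssetWriterInputPixelBufferAdaptor to produce the movie; on iOS 9 and later, the ReplayKit framework mentioned above records at the system level instead.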