How to create a video from its frames on iPhone


You can refer to the following links; I hope they help:

  1. http://www.iphonedevsdk.com/forum/iphone-sdk-development-advanced-discussion/77999-make-video-nsarray-uiimages.html

  2. Using FFMPEG library with iPhone SDK for video encoding

  3. Iphone SDK,Create a Video from UIImage

  4. iOS Video Editing - Is it possible to merge (side by side not one after other) two video files into one using iOS 4 AVFoundation classes?

Author by

Mehul Mistri

Updated on June 03, 2022

Comments

  • Mehul Mistri, about 2 years ago

    I did some R&D and succeeded in extracting frames, as images, from a video file played in an MPMoviePlayerController.

    This code grabs the frames and saves the images into one array:

    // Grabs one thumbnail per whole second of playback.
    for (int i = 1; i <= moviePlayerController.duration; i++)
    {
        UIImage *img = [moviePlayerController thumbnailImageAtTime:i timeOption:MPMovieTimeOptionNearestKeyFrame];
        [arrImages addObject:img];
    }
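
    Note that thumbnailImageAtTime: only samples the movie at one-second steps, so the array above does not contain every frame. If the actual frames are needed, AVAssetImageGenerator can step through the movie at a chosen interval; a rough sketch (videoURL and the 30 fps value are assumptions, not taken from the code above):

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    generator.requestedTimeToleranceBefore = kCMTimeZero; // exact frames,
    generator.requestedTimeToleranceAfter  = kCMTimeZero; // not nearest keyframes

    int32_t fps = 30; // assumed; read the real rate from the video track if possible
    for (CMTime t = kCMTimeZero;
         CMTimeCompare(t, asset.duration) < 0;
         t = CMTimeAdd(t, CMTimeMake(1, fps)))
    {
        CGImageRef frame = [generator copyCGImageAtTime:t actualTime:NULL error:NULL];
        if (frame) {
            [arrImages addObject:[UIImage imageWithCGImage:frame]];
            CGImageRelease(frame);
        }
    }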
    

    Now the question is: after editing some of those images (for example, adding emoticons, or applying filters such as movie reel or black and white), how can we create the video again and store it in the Documents directory with the same frame rate and without losing video quality?
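
    To keep the original frame rate, it can be read from the movie's video track before writing; a small sketch (assuming videoURL points at the source movie):

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    float fps = track.nominalFrameRate; // e.g. 29.97 for typical camera footage

    That value can then be used as the timescale when building each frame's CMTime in the writer below.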

    After changing some images, I used the following code to save the video again:

    - (void)writeImagesAsMovie:(NSString *)path
    {
        // Output frame rate; match this to the source movie's rate.
        static const int32_t kRecordingFPS = 30;

        NSError *error = nil;
        UIImage *first = [arrImages objectAtIndex:0];
        CGSize frameSize = first.size;
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                               fileType:AVFileTypeQuickTimeMovie
                                                                  error:&error];
        NSParameterAssert(videoWriter);

        // Use the source frame size instead of hard-coded 640x480 so the
        // images are not rescaled and quality is preserved.
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:(int)frameSize.width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:(int)frameSize.height], AVVideoHeightKey,
                                       nil];
        AVAssetWriterInput *writerInput = [[AVAssetWriterInput
                                            assetWriterInputWithMediaType:AVMediaTypeVideo
                                            outputSettings:videoSettings] retain];

        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                         assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                         sourcePixelBufferAttributes:nil];

        NSParameterAssert(writerInput);
        NSParameterAssert([videoWriter canAddInput:writerInput]);
        [videoWriter addInput:writerInput];

        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        int frameCount = 0;
        CVPixelBufferRef buffer = NULL;
        for (UIImage *img in arrImages)
        {
            buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];

            // Wait until the input is ready instead of silently dropping frames.
            while (!adaptor.assetWriterInput.readyForMoreMediaData)
                [NSThread sleepForTimeInterval:0.05];

            CMTime frameTime = CMTimeMake(frameCount, kRecordingFPS);
            [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

            // Release the buffer on every iteration, not only when it was appended.
            if (buffer)
                CVBufferRelease(buffer);

            frameCount++;
        }

        [writerInput markAsFinished];
        [videoWriter finishWriting];
        [writerInput release];
        [videoWriter release];
    }
    
    
    - (CVPixelBufferRef)newPixelBufferFromCGImage:(CGImageRef)image andFrameSize:(CGSize)frameSize
    {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                              (size_t)frameSize.width,
                                              (size_t)frameSize.height,
                                              kCVPixelFormatType_32ARGB,
                                              (CFDictionaryRef)options,
                                              &pxbuffer);
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
        NSParameterAssert(pxdata != NULL);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        // Use the buffer's own bytes-per-row, which may be padded beyond 4 * width.
        CGContextRef context = CGBitmapContextCreate(pxdata,
                                                     frameSize.width, frameSize.height, 8,
                                                     CVPixelBufferGetBytesPerRow(pxbuffer),
                                                     rgbColorSpace,
                                                     kCGImageAlphaNoneSkipFirst);
        NSParameterAssert(context);
        // Draw into the full buffer so the image is scaled to frameSize if needed.
        CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

        // Caller owns the returned buffer and must CVBufferRelease() it.
        return pxbuffer;
    }
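
    For completeness, a possible call site that writes into the Documents directory (the file name frames.mov is only an example):

    NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *moviePath = [[dirs objectAtIndex:0] stringByAppendingPathComponent:@"frames.mov"];

    // AVAssetWriter refuses to overwrite, so delete any previous output first.
    [[NSFileManager defaultManager] removeItemAtPath:moviePath error:NULL];
    [self writeImagesAsMovie:moviePath];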
    

    I am new to this topic, so please help me solve this question.