Swift: Merge audio and video files into one video

Solution 1

Improved version of Govind's answer with some additional features:

  1. Merges the video's own audio with the external audio (the initial answer dropped the video's sound)
  2. Flips the video horizontally if needed (I personally use it when the user captures with the front camera; by the way, Instagram flips it too)
  3. Applies preferredTransform correctly, which fixes videos that were saved rotated (external videos captured by another device or generated by another app)
  4. Removes some unused VideoComposition code
  5. Adds a completion handler so the method can be called from a different class
  6. Updated to Swift 4

Step 1.

import UIKit
import AVFoundation
import AVKit
import AssetsLibrary

Step 2.

/// Merges video and sound while keeping sound of the video too
///
/// - Parameters:
///   - videoUrl: URL to video file
///   - audioUrl: URL to audio file
///   - shouldFlipHorizontally: pass true if the video was recorded with the front camera, otherwise pass false
///   - completion: completion of saving: error or url with final video
func mergeVideoAndAudio(videoUrl: URL,
                        audioUrl: URL,
                        shouldFlipHorizontally: Bool = false,
                        completion: @escaping (_ error: Error?, _ url: URL?) -> Void) {

    let mixComposition = AVMutableComposition()
    var mutableCompositionVideoTrack = [AVMutableCompositionTrack]()
    var mutableCompositionAudioTrack = [AVMutableCompositionTrack]()
    var mutableCompositionAudioOfVideoTrack = [AVMutableCompositionTrack]()

    //start merge

    let aVideoAsset = AVAsset(url: videoUrl)
    let aAudioAsset = AVAsset(url: audioUrl)

    // Note: addMutableTrack(withMediaType:preferredTrackID:) returns an optional in Swift 4,
    // so the composition tracks are force-unwrapped here for brevity
    let compositionAddVideo = mixComposition.addMutableTrack(withMediaType: AVMediaType.video,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)!

    let compositionAddAudio = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)!

    let compositionAddAudioOfVideo = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio,
                                                                    preferredTrackID: kCMPersistentTrackID_Invalid)!

    let aVideoAssetTrack: AVAssetTrack = aVideoAsset.tracks(withMediaType: AVMediaType.video)[0]
    let aAudioOfVideoAssetTrack: AVAssetTrack? = aVideoAsset.tracks(withMediaType: AVMediaType.audio).first
    let aAudioAssetTrack: AVAssetTrack = aAudioAsset.tracks(withMediaType: AVMediaType.audio)[0]

    // By default, keep the video track's original transform
    compositionAddVideo.preferredTransform = aVideoAssetTrack.preferredTransform

    if shouldFlipHorizontally {
        // Flip video horizontally
        var frontalTransform: CGAffineTransform = CGAffineTransform(scaleX: -1.0, y: 1.0)
        frontalTransform = frontalTransform.translatedBy(x: -aVideoAssetTrack.naturalSize.width, y: 0.0)
        frontalTransform = frontalTransform.translatedBy(x: 0.0, y: -aVideoAssetTrack.naturalSize.width)
        compositionAddVideo.preferredTransform = frontalTransform
    }

    mutableCompositionVideoTrack.append(compositionAddVideo)
    mutableCompositionAudioTrack.append(compositionAddAudio)
    mutableCompositionAudioOfVideoTrack.append(compositionAddAudioOfVideo)

    do {
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero,
                                                                            aVideoAssetTrack.timeRange.duration),
                                                            of: aVideoAssetTrack,
                                                            at: kCMTimeZero)

        // In my case the audio file is longer than the video file, so I use the videoAsset
        // duration instead of the audioAsset duration
        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero,
                                                                            aVideoAssetTrack.timeRange.duration),
                                                            of: aAudioAssetTrack,
                                                            at: kCMTimeZero)

        // adding audio (of the video if exists) asset to the final composition
        if let aAudioOfVideoAssetTrack = aAudioOfVideoAssetTrack {
            try mutableCompositionAudioOfVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero,
                                                                                       aVideoAssetTrack.timeRange.duration),
                                                                       of: aAudioOfVideoAssetTrack,
                                                                       at: kCMTimeZero)
        }
    } catch {
        print(error.localizedDescription)
    }

    // Exporting
    let savePathUrl: URL = URL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")
    do { // delete old video
        try FileManager.default.removeItem(at: savePathUrl)
    } catch { print(error.localizedDescription) }

    let assetExport: AVAssetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileType.mp4
    assetExport.outputURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true

    assetExport.exportAsynchronously { () -> Void in
        switch assetExport.status {
        case AVAssetExportSessionStatus.completed:
            print("success")
            completion(nil, savePathUrl)
        case AVAssetExportSessionStatus.failed:
            print("failed \(assetExport.error?.localizedDescription ?? "error nil")")
            completion(assetExport.error, nil)
        case AVAssetExportSessionStatus.cancelled:
            print("cancelled \(assetExport.error?.localizedDescription ?? "error nil")")
            completion(assetExport.error, nil)
        default:
            print("complete")
            completion(assetExport.error, nil)
        }
    }

}
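
For reference, a call site could look like the sketch below. The bundled file names and the way the URLs are obtained are placeholder assumptions, not part of the original answer.

// Hypothetical call site: resource names are placeholders
guard let videoUrl = Bundle.main.url(forResource: "SampleVideo", withExtension: "mp4"),
      let audioUrl = Bundle.main.url(forResource: "SampleAudio", withExtension: "mp3") else { return }

mergeVideoAndAudio(videoUrl: videoUrl, audioUrl: audioUrl, shouldFlipHorizontally: false) { error, url in
    if let error = error {
        print("merge failed: \(error.localizedDescription)")
    } else if let url = url {
        print("merged video saved at \(url)")
    }
}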

Again thanks to @Govind's answer! It helped me a lot!

Hope this update helps someone too:)
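
If you also want the exported file to end up in the user's photo library (the AssetsLibrary import above and the commented-out ALAssetsLibrary calls in the answers below are the old, deprecated way to do this), a minimal sketch with the Photos framework could look like this. It assumes the photo library usage description keys are set in Info.plist, and the helper name is made up.

import Photos

// Hypothetical helper: save the exported file (e.g. the url from the completion above) to Photos
func saveVideoToPhotoLibrary(url: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
        }, completionHandler: { saved, error in
            print(saved ? "saved to Photos" : "saving failed: \(String(describing: error))")
        })
    }
}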

Solution 2

I ran into the same error as in the question above because of a wrong savePathUrl; the destination URL should be like in the code below and include the new video's name (see also the note at the end of this answer).

I was looking for code to merge audio and video files into one video but couldn't find it anywhere, so after spending hours reading the Apple docs I wrote this code.

NOTE: This is tested and 100% working code for me.

Step 1: Import these modules in your view controller.

import UIKit
import AVFoundation
import AVKit
import AssetsLibrary

Step 2: Add this function to your code.

func mergeFilesWithUrl(videoUrl:NSURL, audioUrl:NSURL)
{
    let mixComposition : AVMutableComposition = AVMutableComposition()
    var mutableCompositionVideoTrack : [AVMutableCompositionTrack] = []
    var mutableCompositionAudioTrack : [AVMutableCompositionTrack] = []
    let totalVideoCompositionInstruction : AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()


    //start merge

    let aVideoAsset : AVAsset = AVAsset(URL: videoUrl)
    let aAudioAsset : AVAsset = AVAsset(URL: audioUrl)

    mutableCompositionVideoTrack.append(mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid))
    mutableCompositionAudioTrack.append( mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid))

    let aVideoAssetTrack : AVAssetTrack = aVideoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    let aAudioAssetTrack : AVAssetTrack = aAudioAsset.tracksWithMediaType(AVMediaTypeAudio)[0]



    do{
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), ofTrack: aVideoAssetTrack, atTime: kCMTimeZero)

        // In my case the audio file is longer than the video file, so I use the videoAsset
        // duration instead of the audioAsset duration

        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), ofTrack: aAudioAssetTrack, atTime: kCMTimeZero)

        //Use this instead of the line above if your audio and video files have the same duration
        //try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aAudioAssetTrack.timeRange.duration), ofTrack: aAudioAssetTrack, atTime: kCMTimeZero)

    }catch{

    }

    totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,aVideoAssetTrack.timeRange.duration )

    let mutableVideoComposition : AVMutableVideoComposition = AVMutableVideoComposition()
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30)

    mutableVideoComposition.renderSize = CGSizeMake(1280,720)

    //        playerItem = AVPlayerItem(asset: mixComposition)
    //        player = AVPlayer(playerItem: playerItem!)
    //
    //
    //        AVPlayerVC.player = player



    //Find your video at this URL
    let savePathUrl : NSURL = NSURL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")

    let assetExport: AVAssetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeMPEG4
    assetExport.outputURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true

    assetExport.exportAsynchronouslyWithCompletionHandler { () -> Void in
        switch assetExport.status {

        case AVAssetExportSessionStatus.Completed:

            //Uncomment this if you want to save your video to the photo library

            //let assetsLib = ALAssetsLibrary()
            //assetsLib.writeVideoAtPathToSavedPhotosAlbum(savePathUrl, completionBlock: nil)

            print("success")
        case  AVAssetExportSessionStatus.Failed:
            print("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            print("cancelled \(assetExport.error)")
        default:
            print("complete")
        }
    }


}

Step 3: Call the function wherever you want, like this:

let videoUrl : NSURL =  NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("SampleVideo", ofType: "mp4")!)
let audioUrl : NSURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("SampleAudio", ofType: "mp3")!)

mergeFilesWithUrl(videoUrl, audioUrl: audioUrl)

Hope this helps you and saves your time.
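
Since the error in the question came down to a wrong savePathUrl, it is worth double-checking the destination before exporting. Below is a small sketch in the same Swift 2 syntax as above (the file name "newVideo.mp4" is just an example): build the URL inside the Documents directory and remove any previous export, because AVAssetExportSession will not overwrite an existing file.

// Sketch: build the destination URL and clear any previous export first
let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0]
let savePathUrl = NSURL(fileURLWithPath: documentsPath + "/newVideo.mp4")
if NSFileManager.defaultManager().fileExistsAtPath(savePathUrl.path!) {
    do {
        try NSFileManager.defaultManager().removeItemAtURL(savePathUrl)
    } catch {
        print(error)
    }
}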

Solution 3

Swift 4.2 / 5

func mergeVideoWithAudio(videoUrl: URL, audioUrl: URL, success: @escaping ((URL) -> Void), failure: @escaping ((Error?) -> Void)) {


    let mixComposition: AVMutableComposition = AVMutableComposition()
    var mutableCompositionVideoTrack: [AVMutableCompositionTrack] = []
    var mutableCompositionAudioTrack: [AVMutableCompositionTrack] = []
    let totalVideoCompositionInstruction : AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()

    let aVideoAsset: AVAsset = AVAsset(url: videoUrl)
    let aAudioAsset: AVAsset = AVAsset(url: audioUrl)

    if let videoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid), let audioTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid) {
        mutableCompositionVideoTrack.append(videoTrack)
        mutableCompositionAudioTrack.append(audioTrack)

        if let aVideoAssetTrack: AVAssetTrack = aVideoAsset.tracks(withMediaType: .video).first, let aAudioAssetTrack: AVAssetTrack = aAudioAsset.tracks(withMediaType: .audio).first {
            do {
                try mutableCompositionVideoTrack.first?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: aVideoAssetTrack.timeRange.duration), of: aVideoAssetTrack, at: CMTime.zero)
                try mutableCompositionAudioTrack.first?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: aVideoAssetTrack.timeRange.duration), of: aAudioAssetTrack, at: CMTime.zero)
                videoTrack.preferredTransform = aVideoAssetTrack.preferredTransform
            } catch {
                print(error)
            }

            totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(start: CMTime.zero, duration: aVideoAssetTrack.timeRange.duration)
        }
    }

    let mutableVideoComposition: AVMutableVideoComposition = AVMutableVideoComposition()
    mutableVideoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
    mutableVideoComposition.renderSize = CGSize(width: 480, height: 640)

    if let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first {
        let outputURL = URL(fileURLWithPath: documentsPath).appendingPathComponent("\("fileName").m4v")

        do {
            if FileManager.default.fileExists(atPath: outputURL.path) {

                try FileManager.default.removeItem(at: outputURL)
            }
        } catch { }

        if let exportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) {
            exportSession.outputURL = outputURL
            exportSession.outputFileType = AVFileType.mp4
            exportSession.shouldOptimizeForNetworkUse = true

            /// try to export the file and handle the status cases
            exportSession.exportAsynchronously(completionHandler: {
                switch exportSession.status {
                case .failed:
                    if let _error = exportSession.error {
                        failure(_error)
                    }

                case .cancelled:
                    if let _error = exportSession.error {
                        failure(_error)
                    }

                default:
                    print("finished")
                    success(outputURL)
                }
            })
        } else {
            failure(nil)
        }
    }
}
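
A call might look like the following sketch; the file paths are placeholders for your own video and audio files, and the closures simply log the result.

// Hypothetical call site: the paths are placeholders for your own files
let videoUrl = URL(fileURLWithPath: NSTemporaryDirectory() + "video.mp4")
let audioUrl = URL(fileURLWithPath: NSTemporaryDirectory() + "audio.mp3")

mergeVideoWithAudio(videoUrl: videoUrl, audioUrl: audioUrl, success: { outputURL in
    print("merged video written to \(outputURL)")
}, failure: { error in
    print("merge failed: \(String(describing: error))")
})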

Solution 4

Swift 3 version with URL and the new syntax.

func mergeFilesWithUrl(videoUrl:URL, audioUrl:URL)
{
    let mixComposition : AVMutableComposition = AVMutableComposition()
    var mutableCompositionVideoTrack : [AVMutableCompositionTrack] = []
    var mutableCompositionAudioTrack : [AVMutableCompositionTrack] = []
    let totalVideoCompositionInstruction : AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()


    //start merge

    let aVideoAsset : AVAsset = AVAsset(url: videoUrl)
    let aAudioAsset : AVAsset = AVAsset(url: audioUrl)

    mutableCompositionVideoTrack.append(mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid))
    mutableCompositionAudioTrack.append( mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid))

    let aVideoAssetTrack : AVAssetTrack = aVideoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
    let aAudioAssetTrack : AVAssetTrack = aAudioAsset.tracks(withMediaType: AVMediaTypeAudio)[0]



    do{
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), of: aVideoAssetTrack, at: kCMTimeZero)

        // In my case the audio file is longer than the video file, so I use the videoAsset
        // duration instead of the audioAsset duration

        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), of: aAudioAssetTrack, at: kCMTimeZero)

        //Use this instead of the line above if your audio and video files have the same duration
        //try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aAudioAssetTrack.timeRange.duration), of: aAudioAssetTrack, at: kCMTimeZero)

    }catch{

    }

    totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,aVideoAssetTrack.timeRange.duration )

    let mutableVideoComposition : AVMutableVideoComposition = AVMutableVideoComposition()
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30)

    mutableVideoComposition.renderSize = CGSize(width: 1280, height: 720)
    //        playerItem = AVPlayerItem(asset: mixComposition)
    //        player = AVPlayer(playerItem: playerItem!)
    //
    //
    //        AVPlayerVC.player = player

    //Find your video at this URL
    let savePathUrl : URL = URL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")

    let assetExport: AVAssetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeMPEG4
    assetExport.outputURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true

    assetExport.exportAsynchronously { () -> Void in
        switch assetExport.status {

        case AVAssetExportSessionStatus.completed:

            //Uncomment this if you want to save your video to the photo library

            //let assetsLib = ALAssetsLibrary()
            //assetsLib.writeVideoAtPathToSavedPhotosAlbum(savePathUrl, completionBlock: nil)

            print("success")
        case  AVAssetExportSessionStatus.failed:
            print("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.cancelled:
            print("cancelled \(assetExport.error)")
        default:
            print("complete")
        }
    }
}
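
The commented-out player lines inside the function hint at previewing the result. If you want to preview the merged composition in the app before (or instead of) exporting it, a rough sketch, assuming it is placed inside the function before the export and runs in a view controller, could be:

// Sketch: preview the composition with AVPlayerViewController (AVKit) before exporting
let playerItem = AVPlayerItem(asset: mixComposition)
let player = AVPlayer(playerItem: playerItem)
let playerVC = AVPlayerViewController()
playerVC.player = player
present(playerVC, animated: true) {
    player.play()
}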

Solution 5

Swift 5 version (also loops the audio if the video is longer than the audio): just pass the audio and video URLs. I have tried this with a local video and a remote audio URL.

func mergeVideoWithAudio(videoUrl: URL,
                                audioUrl: URL,
                                success: @escaping ((URL) -> Void),
                                failure: @escaping ((Error?) -> Void)) {

       let mixComposition: AVMutableComposition = AVMutableComposition()
       var mutableCompositionVideoTrack: [AVMutableCompositionTrack] = []
       var mutableCompositionAudioTrack: [AVMutableCompositionTrack] = []
       let totalVideoCompositionInstruction: AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()

       let aVideoAsset: AVAsset = AVAsset(url: videoUrl)
       let aAudioAsset: AVAsset = AVAsset(url: audioUrl)

       if let videoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid), let audioTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid) {
           mutableCompositionVideoTrack.append( videoTrack )
           mutableCompositionAudioTrack.append( audioTrack )

           if let aVideoAssetTrack: AVAssetTrack = aVideoAsset.tracks(withMediaType: .video).first, let aAudioAssetTrack: AVAssetTrack = aAudioAsset.tracks(withMediaType: .audio).first {
               do {
                   try mutableCompositionVideoTrack.first?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: aVideoAssetTrack.timeRange.duration), of: aVideoAssetTrack, at: CMTime.zero)

                   let videoDuration = aVideoAsset.duration
                   if CMTimeCompare(videoDuration, aAudioAsset.duration) == -1 {
                       try mutableCompositionAudioTrack.first?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: aVideoAssetTrack.timeRange.duration), of: aAudioAssetTrack, at: CMTime.zero)
                   } else if CMTimeCompare(videoDuration, aAudioAsset.duration) == 1 {
                       var currentTime = CMTime.zero
                       while true {
                           var audioDuration = aAudioAsset.duration
                           let totalDuration = CMTimeAdd(currentTime, audioDuration)
                            if CMTimeCompare(totalDuration, videoDuration) == 1 {
                                // Clip the last repetition so the audio ends together with the video
                                audioDuration = CMTimeSubtract(videoDuration, currentTime)
                            }
                            try mutableCompositionAudioTrack.first?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: audioDuration), of: aAudioAssetTrack, at: currentTime)

                           currentTime = CMTimeAdd(currentTime, audioDuration)
                           if CMTimeCompare(currentTime, videoDuration) == 1 || CMTimeCompare(currentTime, videoDuration) == 0 {
                               break
                           }
                       }
                   }
                   videoTrack.preferredTransform = aVideoAssetTrack.preferredTransform

               } catch {
                   print(error)
               }

               totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(start: CMTime.zero, duration: aVideoAssetTrack.timeRange.duration)
           }
       }

       let mutableVideoComposition: AVMutableVideoComposition = AVMutableVideoComposition()
       mutableVideoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
       mutableVideoComposition.renderSize = CGSize(width: 480, height: 640)

       if let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first {
           let outputURL = URL(fileURLWithPath: documentsPath).appendingPathComponent("\("fileName").m4v")

           do {
               if FileManager.default.fileExists(atPath: outputURL.path) {

                   try FileManager.default.removeItem(at: outputURL)
               }
           } catch { }

           if let exportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) {
               exportSession.outputURL = outputURL
               exportSession.outputFileType = AVFileType.mp4
               exportSession.shouldOptimizeForNetworkUse = true

               // try to export the file and handle the status cases
               exportSession.exportAsynchronously(completionHandler: {
                   switch exportSession.status {
                   case .failed:
                       if let error = exportSession.error {
                           failure(error)
                       }

                   case .cancelled:
                       if let error = exportSession.error {
                           failure(error)
                       }

                   default:
                       print("finished")
                       success(outputURL)
                   }
               })
           } else {
               failure(nil)
           }
       }
   }
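
To make the audio-looping logic above concrete, here is a tiny standalone illustration of the arithmetic with made-up durations (25 s of video, 10 s of audio): the audio is inserted in full at 0 s and 10 s, and the last repetition is clipped to 5 s so it ends together with the video.

import AVFoundation

// Standalone illustration of the looping arithmetic; the durations are made up
let videoDuration = CMTime(seconds: 25, preferredTimescale: 600)
let fullAudioDuration = CMTime(seconds: 10, preferredTimescale: 600)

var currentTime = CMTime.zero
while CMTimeCompare(currentTime, videoDuration) == -1 {
    var insertDuration = fullAudioDuration
    if CMTimeCompare(CMTimeAdd(currentTime, insertDuration), videoDuration) == 1 {
        insertDuration = CMTimeSubtract(videoDuration, currentTime) // clip the last piece
    }
    print("insert \(insertDuration.seconds) s of audio at \(currentTime.seconds) s")
    currentTime = CMTimeAdd(currentTime, insertDuration)
}
// Prints: 10.0 s at 0.0 s, 10.0 s at 10.0 s, 5.0 s at 20.0 s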

Comments

  • Kei Maejima
    Kei Maejima almost 2 years

    I wrote a program in Swift. I want to merge a video with an audio file, but got this error:

    "failed Error Domain=AVFoundationErrorDomain Code=-11838 "Operation Stopped" UserInfo=0x17da4230 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The operation is not supported for this media.}"

    Code:

    func mergeAudio(audioURL: NSURL, moviePathUrl: NSURL, savePathUrl: NSURL) {
        var composition = AVMutableComposition()
        let trackVideo:AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
        let trackAudio:AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
        let option = NSDictionary(object: true, forKey: "AVURLAssetPreferPreciseDurationAndTimingKey")
        let sourceAsset = AVURLAsset(URL: moviePathUrl, options: option as [NSObject : AnyObject])
        let audioAsset = AVURLAsset(URL: audioURL, options: option as [NSObject : AnyObject])
    
        let tracks = sourceAsset.tracksWithMediaType(AVMediaTypeVideo)
        let audios = audioAsset.tracksWithMediaType(AVMediaTypeAudio)
    
        if tracks.count > 0 {
            let assetTrack:AVAssetTrack = tracks[0] as! AVAssetTrack
            let assetTrackAudio:AVAssetTrack = audios[0] as! AVAssetTrack
    
            let audioDuration:CMTime = assetTrackAudio.timeRange.duration
            let audioSeconds:Float64 = CMTimeGetSeconds(assetTrackAudio.timeRange.duration)
    
            trackVideo.insertTimeRange(CMTimeRangeMake(kCMTimeZero,audioDuration), ofTrack: assetTrack, atTime: kCMTimeZero, error: nil)
            trackAudio.insertTimeRange(CMTimeRangeMake(kCMTimeZero,audioDuration), ofTrack: assetTrackAudio, atTime: kCMTimeZero, error: nil)
        }
    
        var assetExport: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough)
        assetExport.outputFileType = AVFileTypeMPEG4
        assetExport.outputURL = savePathUrl
        self.tmpMovieURL = savePathUrl
        assetExport.shouldOptimizeForNetworkUse = true
        assetExport.exportAsynchronouslyWithCompletionHandler { () -> Void in
            switch assetExport.status {
            case AVAssetExportSessionStatus.Completed:
                let assetsLib = ALAssetsLibrary()
                assetsLib.writeVideoAtPathToSavedPhotosAlbum(savePathUrl, completionBlock: nil)
                println("success")
            case  AVAssetExportSessionStatus.Failed:
                println("failed \(assetExport.error)")
            case AVAssetExportSessionStatus.Cancelled:
                println("cancelled \(assetExport.error)")
            default:
                println("complete")
            }
        }
    
    }
    

    My guess is that a media type like MPEG-4 is wrong. Where is the problem? What am I missing?

    • DShah
      DShah over 8 years
      Have you solved your problem?
    • Kei Maejima
      Kei Maejima over 8 years
      Not yet. Still searching for a solution.
    • user3344977
      user3344977 over 8 years
      Did you ever figure this out?
  • Dumitru Rogojinaru
    Dumitru Rogojinaru over 7 years
    It is not working if the recorded sound is in AVFileTypeAppleM4A format, any suggestions?
  • parth
    parth almost 7 years
    The video's original audio is removed when I add my custom audio. I want to keep the original video's sound and also add my custom sound. Can you help me?
  • Govind Prajapati
    Govind Prajapati almost 7 years
    Sorry friend, all I have is above code and don't have time to explore more.
  • Faruk
    Faruk almost 7 years
    @parth Hey dude, I found a solution. Check this stackoverflow.com/questions/45313680/…
  • omarojo
    omarojo about 6 years
    What changes should I make if I want the original audio from the video to be entirely replaced with the external audio file?
  • Tung Fam
    Tung Fam about 6 years
    @omarojo I'm not sure if it's convenient to provide an answer in comments for such a big question. Feel free to create a new question and I'll reply there. Send me a link in the comments once you've created it. I'll try to help.
  • omarojo
    omarojo about 6 years
    I actually figured it out. I just commented out the lines in your code that include the video's original audio.
  • Tung Fam
    Tung Fam about 6 years
    @omarojo that's great! Glad you figured it out:) Don't forget to upvote the answer so other people can notice its helpfulness.
  • Shalin Shah
    Shalin Shah almost 6 years
    this works perfectly! thanks for the updated function + completion handler
  • Ahmadreza
    Ahmadreza over 5 years
    How to stop downloading process?
  • Tung Fam
    Tung Fam over 5 years
    @Alfi which downloading process are you referring to? In the steps described, there is no downloading process.
  • Ahmadreza
    Ahmadreza over 5 years
    @Tung Fam the video is downloaded from a web URL anyway! How can I stop it if the user wants to stop the whole operation?
  • Tung Fam
    Tung Fam over 5 years
    @Alfi I don't think it's related to this question/answer. Moreover, I don't know how you download the video; in my case I take it from a folder on the phone. As general advice, try to google for the specific problem you have, since this Q&A is not related to it.
  • rommex
    rommex almost 5 years
    The three lines that pertain to 'mutableVideoComposition' are useless and should be deleted
  • rommex
    rommex almost 5 years
    @DumitruRogojinaru did you find a solution about M4A format, man?
  • Paresh. P
    Paresh. P almost 5 years
    Can I merge two WAV files?
  • Paresh. P
    Paresh. P almost 5 years
    Is there any way to merge two WAV files? I am getting a crash at assetExport?.exportAsynchronously(completionHandler: { [weak self] in }
  • S.S.D
    S.S.D about 3 years
    Super Helpful! Thanks a lot.