Extract/Record Audio from HLS stream (video) while playing on iOS


Solution 1

I found a solution for this and am using it in my app. I wanted to post it earlier but didn't get the time.

To work with HLS you should have some understanding of what these streams actually are. For that, please see the documentation on Apple's website: HLS Apple

Here are the steps I am following.

  1. First get the m3u8 and parse it. You can parse it using this helpful kit, M3U8Kit. It gives you an M3U8MediaPlaylist, or an M3U8MasterPlaylist if the URL points to a master playlist; a master playlist can be parsed further to get its M3U8MediaPlaylist.

- (void)parseM3u8
{
    NSString *plainString = [self.url m3u8PlanString];
    BOOL isMasterPlaylist = [plainString isMasterPlaylist];

    NSError *error;
    NSURL *baseURL;

    if (isMasterPlaylist)
    {
        // Parse the master playlist and pick the first variant stream.
        M3U8MasterPlaylist *masterList = [[M3U8MasterPlaylist alloc] initWithContentOfURL:self.url error:&error];
        self.masterPlaylist = masterList;

        M3U8ExtXStreamInfList *xStreamInfList = masterList.xStreamList;
        M3U8ExtXStreamInf *streamInfo = [xStreamInfList extXStreamInfAtIndex:0];

        // Derive the base URL from the variant's URI (a dailymotion.com URL in my case).
        NSString *URI = streamInfo.URI;
        NSRange range = [URI rangeOfString:@"dailymotion.com"];
        NSString *baseURLString = [URI substringToIndex:(range.location + range.length)];
        baseURL = [NSURL URLWithString:baseURLString];

        // Load the media playlist referenced by the variant.
        plainString = [[NSURL URLWithString:URI] m3u8PlanString];
    }

    M3U8MediaPlaylist *mediaPlaylist = [[M3U8MediaPlaylist alloc] initWithContent:plainString baseURL:baseURL];
    self.mediaPlaylist = mediaPlaylist;

    M3U8SegmentInfoList *segmentInfoList = mediaPlaylist.segmentList;

    NSMutableArray *segmentUrls = [[NSMutableArray alloc] init];

    // Build the absolute URL of every .ts segment in the media playlist.
    for (int i = 0; i < segmentInfoList.count; i++)
    {
        M3U8SegmentInfo *segmentInfo = [segmentInfoList segmentInfoAtIndex:i];

        NSString *segmentURI = segmentInfo.URI;
        NSURL *mediaURL = [baseURL URLByAppendingPathComponent:segmentURI];

        [segmentUrls addObject:mediaURL];
        if (!self.segmentDuration)
            self.segmentDuration = segmentInfo.duration;
    }

    self.segmentFilesURLs = segmentUrls;
}

As you can see, parsing the m3u8 gives you the links to the .ts segment files.
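
For reference, a media playlist is just a plain-text file that lists those segment URIs. An illustrative (made-up) example:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXT-X-ENDLIST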

  2. Now download all the .ts files into a local folder (a download sketch follows below).
  3. Merge these .ts files into one mp4 file and export it. You can do that using this wonderful C library, TS2MP4,

and then you can delete the .ts files or keep them if you need them.
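
The download step is plain HTTP. Here is a minimal sketch using NSURLSession that works off the segmentFilesURLs array built in parseM3u8 above; the method name, destination folder and completion block are just illustrative:

- (void)downloadSegmentsToFolder:(NSURL *)folderURL completion:(void (^)(void))completion
{
    // A dispatch group tracks when every segment download has finished.
    dispatch_group_t group = dispatch_group_create();
    NSURLSession *session = [NSURLSession sharedSession];

    for (NSUInteger i = 0; i < self.segmentFilesURLs.count; i++)
    {
        NSURL *remoteURL = self.segmentFilesURLs[i];
        // Name the local files by index so they can be merged in playlist order later.
        NSURL *localURL = [folderURL URLByAppendingPathComponent:
                              [NSString stringWithFormat:@"segment%lu.ts", (unsigned long)i]];

        dispatch_group_enter(group);
        NSURLSessionDownloadTask *task =
            [session downloadTaskWithURL:remoteURL
                       completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
                if (!error && location) {
                    // Move the temporary download to its final location.
                    [[NSFileManager defaultManager] moveItemAtURL:location toURL:localURL error:nil];
                }
                dispatch_group_leave(group);
            }];
        [task resume];
    }

    // Call back once all downloads have completed (or failed).
    dispatch_group_notify(group, dispatch_get_main_queue(), completion);
}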

Solution 2

This is not a good approach. What you can do is parse the M3U8 link, then try to download the segment files (.ts). If you can get these files, you can merge them to generate an mp4 file.


Comments

  • Sajad Khan almost 2 years

    I am playing HLS streams using AVPlayer, and I also need to record these streams when the user presses a record button. The approach I am using is to record the audio and video separately, then merge these files at the end to make the final video. It works with remote mp4 files.

    But now, for HLS (.m3u8) streams, I am able to record the video using AVAssetWriter, but I am having problems with the audio recording.

    I am using an MTAudioProcessingTap to process the raw audio data and write it to a file. I followed this article. I am able to record remote mp4 audio, but it's not working with HLS streams. Initially I wasn't able to extract the audio tracks from the stream using AVAssetTrack *audioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];

    But I was able to extract the audioTracks using KVO to initialize the MTAudioProcessingTap.
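
    (For reference, the KVO registration that drives the callback below is the standard player-status observation. A minimal sketch, assuming the player property is called avPlayer:)

    // Register for status changes somewhere after creating the player,
    // e.g. right after setting up the AVPlayerItem.
    [self.avPlayer addObserver:self
                    forKeyPath:@"status"
                       options:NSKeyValueObservingOptionNew
                       context:nil];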

    - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
    {
        AVPlayer *player = (AVPlayer *)object;

        if (player.status == AVPlayerStatusReadyToPlay)
        {
            NSLog(@"Ready to play");
            self.previousAudioTrackID = 0;

            __weak typeof(self) weakself = self;

            // Periodically check the player item's tracks for the audio track.
            timeObserverForTrack = [player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1, 100) queue:nil usingBlock:^(CMTime time)
            {
                @try {
                    for (AVPlayerItemTrack *track in [weakself.avPlayer.currentItem tracks]) {
                        if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio])
                            weakself.currentAudioPlayerItemTrack = track;
                    }

                    AVAssetTrack *audioAssetTrack = weakself.currentAudioPlayerItemTrack.assetTrack;

                    weakself.currentAudioTrackID = audioAssetTrack.trackID;

                    // Only act when the audio track actually changes.
                    if (weakself.previousAudioTrackID != weakself.currentAudioTrackID) {

                        NSLog(@":::::::::::::::::::::::::: Audio track changed : %d", weakself.currentAudioTrackID);
                        weakself.previousAudioTrackID = weakself.currentAudioTrackID;
                        weakself.audioTrack = audioAssetTrack;
                        // Use this audio track to initialize the MTAudioProcessingTap.
                    }
                }
                @catch (NSException *exception) {
                    NSLog(@"Exception Trap ::::: Audio tracks not found!");
                }
            }];
        }
    }

    

    I am also keeping track of the trackID to check whether the track has changed.

    This is how I initialize the MTAudioProcessingTap.

    - (void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
    {
        // Configure an MTAudioProcessingTap to handle things.
        MTAudioProcessingTapRef tap;
        MTAudioProcessingTapCallbacks callbacks;
        callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
        callbacks.clientInfo = (__bridge void *)(self);
        callbacks.init = init;
        callbacks.prepare = prepare;
        callbacks.process = process;
        callbacks.unprepare = unprepare;
        callbacks.finalize = finalize;

        OSStatus err = MTAudioProcessingTapCreate(
            kCFAllocatorDefault,
            &callbacks,
            kMTAudioProcessingTapCreationFlag_PostEffects,
            &tap
        );

        if (err) {
            NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
            NSError *error = [NSError errorWithDomain:NSOSStatusErrorDomain
                                                 code:err
                                             userInfo:nil];
            NSLog(@"Error: %@", [error description]);
            return;
        }

        // Create an AudioMix and assign it to our currently playing "item",
        // which is just the stream itself.
        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters
            audioMixInputParametersWithTrack:audioTrack];

        inputParams.audioTapProcessor = tap;
        audioMix.inputParameters = @[inputParams];
        _audioPlayer.currentItem.audioMix = audioMix;
    }
    

    But now, with this audio track, the MTAudioProcessingTap callbacks "prepare" and "process" are never called.
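
    (For context, these are the shapes of the C callbacks passed to MTAudioProcessingTapCreate above; the bodies are illustrative stubs, not the actual recording code:)

    void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
    {
        // Hand the owning object through to the other callbacks.
        *tapStorageOut = clientInfo;
    }

    void finalize(MTAudioProcessingTapRef tap) {}

    void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
                 const AudioStreamBasicDescription *processingFormat)
    {
        // The output audio file would be configured here using processingFormat.
    }

    void unprepare(MTAudioProcessingTapRef tap) {}

    void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                 MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                 CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
    {
        // Pull the source audio; the buffers could then be written out to a file.
        MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                           flagsOut, NULL, numberFramesOut);
    }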

    Is the problem with the audioTrack I am getting through KVO?

    I would really appreciate it if someone could help me with this, or tell me whether I am using the right approach to record HLS streams.

  • Carlos F over 8 years
    This is helpful if the m3u8 is a static list of .ts segments. However, if you want to record a live stream, this will not work as-is, because you have to manage duplicates when you periodically hit the server. I need this pretty desperately, even if it means hiring a consultant.
  • Sajad Khan over 8 years
    Yes Carlos F, you are right about that: you have to manage the duplicates, and it also gets messy because the player hits the server first and you only parse the playlist later. But I did manage to record live streams with this approach as well.
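
    One way to handle the duplicates is to re-parse the playlist on a timer and skip segment URIs that have already been downloaded. A simplified sketch (seenSegmentURIs and downloadSegmentAtURL: are illustrative names):

    - (void)refreshLivePlaylist
    {
        [self parseM3u8];
        for (NSURL *segmentURL in self.segmentFilesURLs) {
            // Only queue segments we have not seen in a previous refresh.
            if (![self.seenSegmentURIs containsObject:segmentURL.absoluteString]) {
                [self.seenSegmentURIs addObject:segmentURL.absoluteString];
                [self downloadSegmentAtURL:segmentURL];
            }
        }
    }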
  • Amrit Tiwari over 5 years
    @SajadKhan Can you elaborate on the answer in more detail? The answer above is not clear. I would be very thankful if you could put together a demo project for it.