iOS: AVPlayer - getting a snapshot of the current frame of a video
Solution 1
AVPlayerItemVideoOutput works fine for me from an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime: and simply call copyPixelBufferForItemTime:? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to do that conversion. This answer is mostly cribbed from here.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;

@end

@implementation ViewController

- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
    // Ask the output for 32BGRA buffers; attach it before playback starts.
    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    [self.playerItem addOutput:self.playerOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.frame;
    [self.view.layer addSublayer:playerLayer];

    [self.player play];
}

- (IBAction)grabFrame {
    // copyPixelBufferForItemTime: follows the Create/Copy rule, so the
    // caller owns the returned buffer and must release it.
    CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
    NSLog(@"The image: %@", buffer);
    if (buffer) {
        CVBufferRelease(buffer);
    }
}

- (void)viewDidLoad {
    [super viewDidLoad];

    NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
    [asset loadValuesAsynchronouslyForKeys:@[ @"tracks" ] completionHandler:^{
        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setupPlayerWithLoadedAsset:asset];
            });
        } else {
            NSLog(@"%@ Failed to load the tracks.", self);
        }
    }];
}

@end
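To display the grabbed frame, the CVPixelBuffer still has to be converted to a UIImage. A minimal sketch of one common conversion via Core Image (the helper name image(from:) is my own; it assumes a BGRA buffer like the one the output above produces):

```swift
import UIKit
import CoreImage

// Sketch: wrap the pixel buffer in a CIImage, render it through a
// CIContext into a CGImage, and wrap that in a UIImage.
func image(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Creating a CIContext is relatively expensive, so in real code you would cache one rather than build it per frame.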
Solution 2
AVAssetImageGenerator is the best way to snapshot a video; this method asynchronously returns a UIImage:
import AVFoundation
import UIKit

// ...
var player: AVPlayer? // = ...

func screenshot(handler: @escaping (UIImage) -> Void) {
    guard let player = player,
          let asset = player.currentItem?.asset else {
        return
    }
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let times = [NSValue(time: player.currentTime())]
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let image = image {
            handler(UIImage(cgImage: image))
        }
    }
}
(It's Swift 4.2)
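A possible call site, pausing first so the current frame is stable (a sketch: imageView is a hypothetical UIImageView, and since the generator's completion handler runs off the main queue, UIKit work is dispatched back to it):

```swift
player?.pause()
screenshot { image in
    // generateCGImagesAsynchronously calls back on a background queue.
    DispatchQueue.main.async {
        imageView.image = image  // imageView: an assumed UIImageView outlet
    }
}
```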
frangulyan
Updated on June 03, 2022

Comments
-
frangulyan almost 2 years ago:
I have spent the whole day and gone through a lot of SO answers, Apple references, documentation, etc., but with no success. I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.
My video is an m3u8 file located on the internet; it plays normally in the AVPlayerLayer without any problems.
What have I tried:
- AVAssetImageGenerator. It is not working: the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to an answer here, AVAssetImageGenerator doesn't work for streaming videos.
- Taking a snapshot of the player view. I first tried renderInContext: on the AVPlayerLayer, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should also be able to render the special layers, but no luck: I still got the UI snapshot with a blank black area where the video is shown.
- AVPlayerItemVideoOutput. I have added a video output for my AVPlayerItem; however, whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.
- AVAssetReader. I was thinking of trying it but decided not to lose time after finding a related question here.
So isn't there any way to get a snapshot of something that I am seeing right now on the screen anyway? I can't believe this.
-
frangulyan over 7 years ago: I will try it now and let you know. A short question: should I set up the AVPlayerItemVideoOutput object right from the beginning? My code plays a video without the output added, and then, whenever I need a snapshot, I quickly create an AVPlayerItemVideoOutput object, add it to the player item, and try to read the pixel buffer. I also tried adding the output a bit earlier, whenever my special snapshot gesture had started the touches but was not yet recognized. Is this important?
-
bluevoid over 7 years ago: I think you must set up the AVPlayerItemVideoOutput from the beginning, probably before you start playback.
-
frangulyan over 7 years ago: Thanks for your solution, I just checked and it works! The trick was to add the AVPlayerItemVideoOutput before starting playback, as you said. It seems a bit inefficient to keep a video output attached the whole time just for one screenshot somewhere in the future, which in most cases will not even be taken, but at least it works!
-
bluevoid over 7 years ago: You're welcome. I guess you're right: attaching an ARGB AVPlayerItemVideoOutput to what may very well be a YUV flow could be expensive. I'd never thought of that.
-
Raghuram over 7 years ago: @Axel Guilmin This captures only the AVPlayer. What if I want to take a screenshot of both the AVPlayer and a UIView?
-
Axel Guilmin over 7 years ago: I don't think my answer would be the right approach to capture a UIView. I did not test it, but this answer seems better: stackoverflow.com/a/4334902/1327557
-
Raghuram over 7 years ago: @Axel Guilmin Thank you for your reply. This is my problem: stackoverflow.com/questions/42085479/…
-
schmittsfn about 7 years ago: Did this solution work for you with FairPlay-protected HLS? I've tried copyPixelBufferForItemTime and it works great with unprotected streams, but once you use FairPlay it returns NULL. stackoverflow.com/questions/42839831/…
-
mfaani almost 5 years ago: @Anessence Were you able to find an answer that captures from live streams?