How can I capture an image while AVPlayer is playing an m3u8 stream?

I'm playing an m3u8 file with AVPlayer, and I want to capture an image with this code:

AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:self.player.currentItem.asset];
gen.appliesPreferredTrackTransform = YES;
NSError *error = nil;
CMTime actualTime;
CMTime now = self.player.currentTime;
[gen setRequestedTimeToleranceAfter:kCMTimeZero];
[gen setRequestedTimeToleranceBefore:kCMTimeZero];
CGImageRef image = [gen copyCGImageAtTime:now actualTime:&actualTime error:&error];
UIImage *thumb = [[UIImage alloc] initWithCGImage:image];
NSLog(@"%f , %f",CMTimeGetSeconds(now),CMTimeGetSeconds(actualTime));

NSLog(@"%@",error);
if (image) {
    CFRelease(image);
}

But it doesn't work. The error is:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x7fadf25f59f0 {NSUnderlyingError=0x7fadf25f1670 "The operation couldn’t be completed. (OSStatus error -12782.)", NSLocalizedFailureReason=An unknown error occurred (-12782), NSLocalizedDescription=The operation could not be completed}

How can I fix this? Thank you very much.

AVAssetImageGenerator probably requires a local asset. You may have better luck adding an AVPlayerItemVideoOutput to the AVPlayer, seeking to the position you want, and calling copyPixelBufferForItemTime:itemTimeForDisplay: on the video output.
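
For reference, here is a minimal Swift sketch of that approach, assuming a 32BGRA pixel output; the class name and setup are illustrative, not a fixed recipe:

import AVFoundation
import CoreImage
import UIKit

// Minimal sketch: attach an AVPlayerItemVideoOutput to the player item,
// then copy a pixel buffer whenever a frame is wanted.
final class StreamSnapshotter {
    let player: AVPlayer
    private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes:
        [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)])

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        item.add(videoOutput)   // must be attached before frames are requested
        player = AVPlayer(playerItem: item)
    }

    // Returns the frame for the current playback time, or nil if none is available yet.
    func snapshot() -> UIImage? {
        let time = player.currentTime()
        guard let buffer = videoOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) else { return nil }
        let ciImage = CIImage(cvPixelBuffer: buffer)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}

Unlike AVAssetImageGenerator, this works for HLS because the frame comes from the item that is actually being decoded for playback.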

I had the same problem and solved it with the code below:

Properties

@property (strong, nonatomic) AVPlayer *player;
@property (strong, nonatomic) AVPlayerItem *playerItem;
@property (strong, nonatomic) AVPlayerItemVideoOutput *videoOutput;

Initialization

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
// Attach the video output to the item; without this, copyPixelBufferForItemTime: returns NULL.
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:@{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)}];
[self.playerItem addOutput:self.videoOutput];
self.player = [AVPlayer playerWithPlayerItem:_playerItem];

Getting the image

CMTime currentTime = _player.currentItem.currentTime;
CVPixelBufferRef buffer = [_videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
if (buffer) {
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    UIImage *image = [UIImage imageWithCIImage:ciImage];
    CVBufferRelease(buffer); // the copy... method returns a retained buffer, so release it here
    // Use image^^
}

To capture an image from an AVPlayer that is playing an HLS video:

private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])

private let jpegCompressionQuality: CGFloat = 0.7

private func imageFromCurrentPlayerContext() {
    guard let player = player else { return }
    let currentTime: CMTime = player.currentTime()

    // Returns nil if the output is not attached to the player item or no frame is available yet.
    guard let buffer: CVPixelBuffer = videoOutput.copyPixelBuffer(forItemTime: currentTime, itemTimeForDisplay: nil) else { return }
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let context = CIContext()

    // Render through a CIContext so the image is backed by a CGImage and can be encoded.
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
    let image = UIImage(cgImage: cgImage)

    // UIImageJPEGRepresentation was replaced by jpegData(compressionQuality:) in iOS 12 / Swift 4.2.
    guard let jpegData = image.jpegData(compressionQuality: jpegCompressionQuality) else { return }
    // use jpegData (save it, upload it, ...)
}
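
One assumption in the snippet above is that videoOutput has already been added to the item being played; otherwise copyPixelBuffer(forItemTime:itemTimeForDisplay:) returns nil. A sketch of that setup (the player property and URL handling are placeholders):

private var player: AVPlayer?

private func preparePlayer(with url: URL) {
    let item = AVPlayerItem(url: url)
    item.add(videoOutput)   // required before imageFromCurrentPlayerContext() can copy a frame
    player = AVPlayer(playerItem: item)
    player?.play()
}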