I've spent a whole day on this, going through lots of SO answers, Apple references, documentation, and so on, with no luck.
I want a simple thing: I'm playing a video with AVPlayer, and I want to pause it and get the current frame as a UIImage. That's it.
My video is an m3u8 file located on the internet, and it plays normally in an AVPlayerLayer without any problems.
What I have tried:

1. AVAssetImageGenerator. It doesn't work: the method copyCGImageAtTime:actualTime:error: returns a nil image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos.
2. renderInContext: on the AVPlayerLayer. Then I realized it doesn't render this kind of "special" layer. Next I found drawViewHierarchyInRect:afterScreenUpdates:, a new method introduced in iOS 7 that should be able to render special layers too, but unfortunately I still got a UI snapshot with a blank black area where the video is shown (a sketch of this attempt follows the list).
3. AVPlayerItemVideoOutput. I added a video output to my AVPlayerItem, but whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is the streaming video again, and I'm not alone with this problem.
4. AVAssetReader. I was going to try it, but after finding a related question here I decided not to waste my time.

So is there any way to get a snapshot of what I currently see on the screen? I just can't believe it.
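For reference, the drawViewHierarchyInRect:afterScreenUpdates: attempt from the second item looked roughly like this (a sketch in Swift; playerView is a hypothetical view whose layer hierarchy hosts the AVPlayerLayer):

import UIKit

func snapshotPlayerView(_ playerView: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(playerView.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    // drawHierarchy renders the whole view hierarchy into the current
    // context, but the AVPlayerLayer area still comes back black.
    _ = playerView.drawHierarchy(in: playerView.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}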
AVAssetImageGenerator is the best way to take a snapshot of a video; this method returns a UIImage asynchronously:
import AVFoundation
import UIKit

// ...

var player: AVPlayer? // ...

func screenshot(handler: @escaping (UIImage) -> Void) {
    guard let player = player,
          let asset = player.currentItem?.asset else {
        return
    }
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    // Apply the track's preferred transform so the snapshot is oriented correctly.
    imageGenerator.appliesPreferredTrackTransform = true
    let times = [NSValue(time: player.currentTime())]
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let image = image {
            handler(UIImage(cgImage: image))
        }
    }
}
(This is Swift 4.2.)
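A call site might look like this (a sketch; imageView is a hypothetical image view that receives the snapshot):

player?.pause()
screenshot { image in
    // The generator's completion handler may run on a background queue,
    // so hop back to the main queue before touching UIKit.
    DispatchQueue.main.async {
        imageView.image = image
    }
}

Note that by default AVAssetImageGenerator is allowed to return a frame near the requested time; if you need the exact current frame, set requestedTimeToleranceBefore and requestedTimeToleranceAfter to .zero, at the cost of slower generation.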
AVPlayerItemVideoOutput works fine for me with an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime and just call copyPixelBufferForItemTime? This code produces a CVPixelBuffer rather than a UIImage, but there are answers that describe how to do that conversion.
This answer is mostly cribbed from here:
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
@interface ViewController ()
@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;
@end
@implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.playerItem addOutput:self.playerOutput];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.view.frame;
[self.view.layer addSublayer:playerLayer];
[self.player play];
}
- (IBAction)grabFrame {
CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
NSLog(@"The image: %@", buffer);
}
- (void)viewDidLoad {
[super viewDidLoad];
NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
NSError* error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
if (status == AVKeyValueStatusLoaded)
{
dispatch_async(dispatch_get_main_queue(), ^{
[self setupPlayerWithLoadedAsset:asset];
});
}
else
{
NSLog(@"%@ Failed to load the tracks.", self);
}
}];
}
@end
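If you do want the display-driven pattern instead of grabbing a single frame on demand, the usual approach (a sketch, not part of the answer above) is to poll the output from a CADisplayLink and only copy a buffer when hasNewPixelBuffer reports one is available:

import AVFoundation
import UIKit

final class FrameGrabber {
    private let output: AVPlayerItemVideoOutput
    private var displayLink: CADisplayLink?

    init(output: AVPlayerItemVideoOutput) {
        self.output = output
    }

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(tick))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func tick(_ link: CADisplayLink) {
        // Ask for the item time of the upcoming vsync and only copy
        // a pixel buffer when a new one is available for that time.
        let hostTime = link.timestamp + link.duration
        let itemTime = output.itemTime(forHostTime: hostTime)
        guard output.hasNewPixelBuffer(forItemTime: itemTime),
              let buffer = output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) else {
            return
        }
        // ... use `buffer` here ...
        _ = buffer
    }
}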
A Swift version of Rhythmic Fistman's solution, still working as expected with Xcode 15 and iOS 17:
private var playerOutput: AVPlayerItemVideoOutput?
(...)
self.playerOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
(...)
self.playerItem?.add(self.playerOutput!)
(...)
if let playerItem = self?.playerItem,
   let imageBuffer = self?.playerOutput?.copyPixelBuffer(forItemTime: playerItem.currentTime(), itemTimeForDisplay: nil),
   let image = UIImage(pixelBuffer: imageBuffer) {
    // do what you want with the `image`
} else {
    print("Failed to grab frame")
}
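Note that UIKit has no built-in UIImage(pixelBuffer:) initializer; the snippet above assumes a small convenience extension. A minimal sketch going through Core Image could look like this:

import UIKit
import CoreImage

extension UIImage {
    // Hypothetical convenience initializer assumed by the snippet above:
    // wraps the pixel buffer in a CIImage and renders it to a CGImage.
    convenience init?(pixelBuffer: CVPixelBuffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        self.init(cgImage: cgImage)
    }
}

CIContext is relatively expensive to create, so if you grab frames repeatedly you'd typically keep a single shared context around rather than building one per frame.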