I'm working on an app that has a collection view whose cells can contain video. Right now I'm using AVPlayer and AVPlayerLayer to display the video. Unfortunately, the scrolling performance is terrible. It seems that AVPlayer, AVPlayerItem, and AVPlayerLayer do a lot of their work on the main thread. They constantly take out locks, wait on semaphores, and so on, which blocks the main thread and causes severe frame drops.
Is there any way to tell AVPlayer to stop doing so much work on the main thread? Nothing I've tried so far has solved the problem.
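For reference, the cells are set up roughly like this (a simplified sketch rather than my exact code; names are hypothetical):
import AVFoundation
import UIKit

// One AVPlayerLayer per cell, with all of the player setup on the main thread,
// which is exactly where the stalls show up.
final class VideoCell: UICollectionViewCell {
    private let playerLayer = AVPlayerLayer()

    override init(frame: CGRect) {
        super.init(frame: frame)
        playerLayer.videoGravity = .resizeAspectFill
        contentView.layer.addSublayer(playerLayer)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = contentView.bounds
    }

    func configure(with url: URL) {
        // AVPlayerItem / AVPlayer creation and attachment happen here, on the main thread.
        let player = AVPlayer(playerItem: AVPlayerItem(url: url))
        playerLayer.player = player
        player.play()
    }
}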
I also tried building a simple video player using AVSampleBufferDisplayLayer. With that I can make sure everything happens off the main thread, and I can hit ~60fps while scrolling and playing video. Unfortunately, that approach is much lower level, and it doesn't provide things like audio playback and time scrubbing out of the box. Is there any way to get comparable performance with AVPlayer? I'd much rather use it.
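For context, the AVSampleBufferDisplayLayer player I'm referring to has roughly this shape (a bare-bones sketch, not my actual code: it decodes with an AVAssetReader and enqueues sample buffers off the main thread; pacing via a control timebase and audio output are omitted):
import AVFoundation
import UIKit

// A cell-sized view whose backing layer is an AVSampleBufferDisplayLayer.
final class SampleBufferVideoView: UIView {
    override class var layerClass: AnyClass { AVSampleBufferDisplayLayer.self }
    var displayLayer: AVSampleBufferDisplayLayer { layer as! AVSampleBufferDisplayLayer }
}

/// Decode `url` with AVAssetReader and feed the display layer from `queue`.
/// Frame pacing (control timebase) and audio are deliberately left out of this sketch.
func startPlayback(of url: URL, on displayLayer: AVSampleBufferDisplayLayer, queue: DispatchQueue) {
    let asset = AVURLAsset(url: url)
    asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
        guard let track = asset.tracks(withMediaType: .video).first,
              let reader = try? AVAssetReader(asset: asset) else { return }
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        ])
        reader.add(output)
        reader.startReading()
        // The layer pulls decoded sample buffers on the supplied (non-main) queue.
        displayLayer.requestMediaDataWhenReady(on: queue) {
            while displayLayer.isReadyForMoreMediaData {
                guard let buffer = output.copyNextSampleBuffer() else {
                    displayLayer.stopRequestingMediaData()
                    return
                }
                displayLayer.enqueue(buffer)
            }
        }
    }
}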
Edit: After looking into this further, it seems it may not be possible to achieve good scrolling performance with AVPlayer. Creating an AVPlayer and associating it with an AVPlayerItem instance kicks off a bunch of work that trampolines onto the main thread, where it waits on semaphores and tries to acquire a pile of locks. The amount of time this stalls the main thread increases dramatically as the number of videos in the scroll view grows.
AVPlayer dealloc also seems to be a huge problem. Dealloc'ing an AVPlayer also tries to synchronize a bunch of things, and again this gets really bad as you create more players.
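One common mitigation (a sketch of the idea only; it doesn't remove the creation-time work) is to drop the last strong reference to a player inside a block running on a background queue, so the synchronous teardown happens there instead of on the main thread:
import AVFoundation

/// Drop the final strong reference to `player` on a background queue so that its
/// dealloc (and the synchronization it performs) happens off the main thread.
/// Sketch only; the caller must also nil out its own reference to the player.
func releaseOffMain(_ player: AVPlayer?) {
    guard let player = player else { return }
    DispatchQueue.global(qos: .utility).async {
        player.pause()
        // `player` is released when this closure is destroyed after running,
        // which normally happens on the utility queue's thread.
    }
}
You would call this from prepareForReuse() or collectionView(_:didEndDisplaying:forItemAt:) with the player you're about to discard, and then nil out your own reference.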
This is pretty frustrating, and it makes AVPlayer almost unusable for what I'm trying to do. Blocking the main thread like this is such an amateurish thing to do that it's hard to believe Apple's engineers would make this kind of mistake. Anyway, hopefully they fix it soon.
Build as much of your AVPlayerItem as you can on a background queue (some operations have to happen on the main thread, but you can do the setup work and wait for the video's properties to load on a background queue - read the docs really carefully). This involves a voodoo dance with KVO and is really not fun.
The hiccup happens while AVPlayer is waiting for the AVPlayerItem's status to become AVPlayerItemStatusReadyToPlay. To reduce the length of that hiccup, you want to get the AVPlayerItem as close to AVPlayerItemStatusReadyToPlay as you can on a background thread before assigning it to the AVPlayer.
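A minimal sketch of that last step (hypothetical names, not the code from this answer): the item's status only becomes ready once it is attached to a player, so observe it and defer play() and showing the layer until then.
import AVFoundation

final class PlayerAttacher {
    private let player = AVPlayer()
    private var statusObservation: NSKeyValueObservation?

    func attach(_ preloadedItem: AVPlayerItem, whenReady onReady: @escaping () -> Void) {
        // The item stays .unknown until it becomes the player's current item.
        statusObservation = preloadedItem.observe(\.status, options: [.new]) { item, _ in
            guard item.status == .readyToPlay else { return }
            DispatchQueue.main.async { onReady() }   // e.g. attach the layer and call play()
        }
        // Hand the item to the player off the main thread as well.
        DispatchQueue.global(qos: .userInitiated).async {
            self.player.replaceCurrentItem(with: preloadedItem)
        }
    }
}
Whether replaceCurrentItem(with:) is called on a background queue is a judgment call; one of the answers further down reports that doing so helps.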
It's been a while since I actually implemented this, but IIRC the main thread blocks are caused by the underlying AVURLAsset's properties being lazily loaded; if you don't load them yourself, they get busy-loaded on the main thread when the AVPlayer wants to play.
Check out the AVAsset documentation, especially the parts around AVAsynchronousKeyValueLoading. I think we needed to load the values for duration and tracks before using the asset with an AVPlayer in order to minimize the main thread blocks. It's possible we also had to walk through each of the tracks and do AVAsynchronousKeyValueLoading on each of the segments, but I don't remember 100%.
Not sure whether this will help, but here's some code I use to load videos on a background queue that definitely helps with main thread blocking (apologies if it doesn't compile 1:1; I abstracted it out of a larger code base I'm working on):
func loadSource() {
    self.status = .Unknown

    let operation = NSBlockOperation()
    operation.addExecutionBlock { () -> Void in
        // create the asset
        let asset = AVURLAsset(URL: self.mediaUrl, options: nil)
        // load values for track keys
        let keys = ["tracks", "duration"]
        asset.loadValuesAsynchronouslyForKeys(keys, completionHandler: { () -> Void in
            // Loop through and check to make sure keys loaded
            var keyStatusError: NSError?
            for key in keys {
                var error: NSError?
                let keyStatus: AVKeyValueStatus = asset.statusOfValueForKey(key, error: &error)
                if keyStatus == .Failed {
                    let userInfo = [NSUnderlyingErrorKey : key]
                    keyStatusError = NSError(domain: MovieSourceErrorDomain, code: MovieSourceAssetFailedToLoadKeyValueErrorCode, userInfo: userInfo)
                    println("Failed to load key: \(key), error: \(error)")
                }
                else if keyStatus != .Loaded {
                    println("Warning: Ignoring key status: \(keyStatus), for key: \(key), error: \(error)")
                }
            }
            if keyStatusError == nil {
                if operation.cancelled == false {
                    let composition = self.createCompositionFromAsset(asset)
                    // register notifications
                    let playerItem = AVPlayerItem(asset: composition)
                    self.registerNotificationsForItem(playerItem)
                    self.playerItem = playerItem
                    // create the player
                    let player = AVPlayer(playerItem: playerItem)
                    self.player = player
                }
            }
            else {
                println("Failed to load asset: \(keyStatusError)")
            }
        })
    }
    // add operation to the queue
    SomeBackgroundQueue.addOperation(operation)
}

func createCompositionFromAsset(asset: AVAsset, repeatCount: UInt8 = 16) -> AVMutableComposition {
    let composition = AVMutableComposition()
    let timescale = asset.duration.timescale
    let duration = asset.duration.value
    let editRange = CMTimeRangeMake(CMTimeMake(0, timescale), CMTimeMake(duration, timescale))
    var error: NSError?
    let success = composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
    if success {
        for _ in 0 ..< repeatCount - 1 {
            composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
        }
    }
    return composition
}
If you look at Facebook's AsyncDisplayKit (the engine behind the Facebook and Instagram feeds), you can render video for the most part on background threads using their ASVideoNode. If you subnode that into an ASDisplayNode and add displayNode.view to whatever view you're scrolling (table/collection/scroll), you can achieve perfectly smooth scrolling (just make sure you create the node, the asset, and everything else on a background thread). The only problem is changing the video item, since that forces itself onto the main thread. If you only have a few videos on that particular view, you're fine using this method!
dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), {
    self.mainNode = ASDisplayNode()
    self.videoNode = ASVideoNode()
    self.videoNode!.asset = AVAsset(URL: self.videoUrl!)
    self.videoNode!.frame = CGRectMake(0.0, 0.0, self.bounds.width, self.bounds.height)
    self.videoNode!.gravity = AVLayerVideoGravityResizeAspectFill
    self.videoNode!.shouldAutoplay = true
    self.videoNode!.shouldAutorepeat = true
    self.videoNode!.muted = true
    self.videoNode!.playButton.hidden = true

    dispatch_async(dispatch_get_main_queue(), {
        self.mainNode!.addSubnode(self.videoNode!)
        self.addSubview(self.mainNode!.view)
    })
})
Here's a working solution for displaying a "video wall" in a UICollectionView:
1) Store all of your cells in an NSMapTable (from here on, you only ever access a cell object from the NSMapTable):
self.cellCache = [[NSMapTable alloc] initWithKeyOptions:NSPointerFunctionsWeakMemory
                                           valueOptions:NSPointerFunctionsStrongMemory
                                               capacity:AppDelegate.sharedAppDelegate.assetsFetchResults.count];

for (NSInteger i = 0; i < AppDelegate.sharedAppDelegate.assetsFetchResults.count; i++) {
    [self.cellCache setObject:(AssetPickerCollectionViewCell *)[self.collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:[NSIndexPath indexPathForItem:i inSection:0]]
                       forKey:[NSIndexPath indexPathForItem:i inSection:0]];
}
2) Add this method to your UICollectionViewCell subclass:
- (void)setupPlayer:(PHAsset *)phAsset {
    typedef void (^player) (void);
    player play = ^{
        NSString __autoreleasing *serialDispatchCellQueueDescription = ([NSString stringWithFormat:@"%@ serial cell queue", self]);
        dispatch_queue_t __autoreleasing serialDispatchCellQueue = dispatch_queue_create([serialDispatchCellQueueDescription UTF8String], DISPATCH_QUEUE_SERIAL);
        dispatch_async(serialDispatchCellQueue, ^{
            __weak typeof(self) weakSelf = self;
            __weak typeof(PHAsset) *weakPhAsset = phAsset;
            [[PHImageManager defaultManager] requestPlayerItemForVideo:weakPhAsset options:nil
                                                         resultHandler:^(AVPlayerItem * _Nullable playerItem, NSDictionary * _Nullable info) {
                if (![[info objectForKey:PHImageResultIsInCloudKey] boolValue]) {
                    AVPlayer __autoreleasing *player = [AVPlayer playerWithPlayerItem:playerItem];
                    __block typeof(AVPlayerLayer) *weakPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
                    [weakPlayerLayer setFrame:weakSelf.contentView.bounds]; //CGRectMake(self.contentView.bounds.origin.x, self.contentView.bounds.origin.y, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height * (9.0/16.0))];
                    [weakPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                    [weakPlayerLayer setBorderWidth:0.25f];
                    [weakPlayerLayer setBorderColor:[UIColor whiteColor].CGColor];
                    [player play];
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [weakSelf.contentView.layer addSublayer:weakPlayerLayer];
                    });
                }
            }];
        });
    };
    play();
}
3) Call the method above from your UICollectionView delegate like this:
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    if ([[self.cellCache objectForKey:indexPath] isKindOfClass:[AssetPickerCollectionViewCell class]])
        [self.cellCache setObject:(AssetPickerCollectionViewCell *)[collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:indexPath] forKey:indexPath];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        NSInvocationOperation *invOp = [[NSInvocationOperation alloc]
                                        initWithTarget:(AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath]
                                              selector:@selector(setupPlayer:)
                                                object:AppDelegate.sharedAppDelegate.assetsFetchResults[indexPath.item]];
        [[NSOperationQueue mainQueue] addOperation:invOp];
    });

    return (AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath];
}
By the way, here's how to populate a PHFetchResult collection with all of the videos in the Videos folder of the Photos app:
// Collect all videos in the Videos folder of the Photos app
- (PHFetchResult *)assetsFetchResults {
    __block PHFetchResult *i = self->_assetsFetchResults;
    if (!i) {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumVideos options:nil];
            PHAssetCollection *collection = smartAlbums.firstObject;
            if (![collection isKindOfClass:[PHAssetCollection class]]) collection = nil;

            PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
            allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];

            i = [PHAsset fetchAssetsInAssetCollection:collection options:allPhotosOptions];
            self->_assetsFetchResults = i;
        });
    }
    NSLog(@"assetsFetchResults (%ld)", self->_assetsFetchResults.count);
    return i;
}
If you want to filter out the videos that are in iCloud and keep only the local ones, which is what I assume you want since you're after smooth scrolling:
// Filter out videos that are stored in iCloud
- (NSArray *)phAssets {
    NSMutableArray *assets = [NSMutableArray arrayWithCapacity:self.assetsFetchResults.count];
    [[self assetsFetchResults] enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
        if (asset.sourceType == PHAssetSourceTypeUserLibrary)
            [assets addObject:asset];
    }];
    return [NSArray arrayWithArray:(NSArray *)assets];
}
I've played around with all of the answers above and found that they only go so far.
The easiest and simplest approach that has worked for me so far is to assign the AVPlayerItem to your AVPlayer instance on a background thread. I noticed that assigning the AVPlayerItem to the player on the main thread (even after the AVPlayerItem object is ready) always takes a toll on your performance and frame rate.
Swift 4
Ex.
let mediaUrl = //your media string
let player = AVPlayer()
let playerItem = AVPlayerItem(url: mediaUrl)

DispatchQueue.global(qos: .default).async {
    player.replaceCurrentItem(with: playerItem)
}
I managed to create a horizontal, feed-like view with an avplayer in each cell. I did it like so:
Buffering - create a manager so you can preload (buffer) the videos. The number of AVPlayers you want to buffer depends on the experience you're looking for. In my app I manage only 3 AVPlayers, so one player is playing now and the previous and next players are being buffered. All the buffering manager does is make sure the correct videos are being buffered at any given point.
Reused cells - let the TableView / CollectionView reuse the cells in cellForRowAtIndexPath:. All you have to do after dequeuing a cell is hand it its correct player (I just give the buffering manager an indexPath for the cell and it returns the correct one).
AVPlayer KVO's - every time the buffering manager gets a call to load a new video to buffer, the AVPlayer creates all of its assets and notifications; just call them like so:
// player
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    self.videoContainer.playerLayer.player = self.videoPlayer;
    self.asset = [AVURLAsset assetWithURL:[NSURL URLWithString:self.videoUrl]];
    NSString *tracksKey = @"tracks";
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
                NSError *error;
                AVKeyValueStatus status = [self.asset statusOfValueForKey:tracksKey error:&error];
                if (status == AVKeyValueStatusLoaded) {
                    self.playerItem = [AVPlayerItem playerItemWithAsset:self.asset];
                    // add the notifications on the video item
                    // set up the observations we need at run time on the player & item
                    // a notification if the current item's status has changed
                    [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:contextItemStatus];
                    // a notification if the playing item has not yet started to buffer
                    [self.playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferEmpty];
                    // a notification if the playing item has fully buffered
                    [self.playerItem addObserver:self forKeyPath:@"playbackBufferFull" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferFull];
                    // a notification if the playing item is likely to keep up with the current buffering rate
                    [self.playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:contextPlaybackLikelyToKeepUp];
                    // a notification to get information about the duration of the playing item
                    [self.playerItem addObserver:self forKeyPath:@"duration" options:NSKeyValueObservingOptionNew context:contextDurationUpdate];
                    // a notification for when the video has finished playing
                    [NotificationCenter addObserver:self selector:@selector(itemDidFinishedPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.playerItem];
                    self.didRegisterWhenLoad = YES;

                    self.videoPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];

                    // a notification if the player has changed its rate (play/pause)
                    [self.videoPlayer addObserver:self forKeyPath:@"rate" options:NSKeyValueObservingOptionNew context:contextRateDidChange];
                    // a notification to get the buffering rate of the current playing item
                    [self.videoPlayer addObserver:self forKeyPath:@"currentItem.loadedTimeRanges" options:NSKeyValueObservingOptionNew context:contextTimeRanges];
                }
            });
        }];
    });
});
Where: videoContainer is the view you want to add the player to.
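The buffering manager itself isn't shown here, but a minimal sketch of the "3 AVPlayers" idea could look like the following (Swift, hypothetical names, with synchronization and eviction left out):
import AVFoundation

// One player for the visible row, the other two for the previous/next rows.
// A sketch of the idea described above, not the answer's actual manager.
final class PlayerBufferManager {
    private let players = [AVPlayer(), AVPlayer(), AVPlayer()]
    private var preloadedItems: [Int: AVPlayerItem] = [:]   // keyed by row, main-thread only

    /// Preload the asset for a row off the main thread so the item is cheap to attach later.
    func preload(url: URL, forRow row: Int) {
        DispatchQueue.global(qos: .utility).async {
            let asset = AVURLAsset(url: url)
            asset.loadValuesAsynchronously(forKeys: ["tracks", "duration"]) {
                let item = AVPlayerItem(asset: asset)
                DispatchQueue.main.async { self.preloadedItems[row] = item }
            }
        }
    }

    /// Called while configuring a cell; rows map onto the three players round-robin,
    /// so the previous and next rows keep their own (already buffering) players.
    func player(forRow row: Int) -> AVPlayer {
        let player = players[row % players.count]
        if let item = preloadedItems[row], player.currentItem !== item {
            player.replaceCurrentItem(with: item)
        }
        return player
    }
}
In cellForRowAtIndexPath: you would then hand the dequeued cell the result of player(forRow: indexPath.row), which mirrors the reuse flow described above.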
Let me know if you need any help or more explanations.
Good luck :)