I followed this document to build a chroma key filter for an AVPlayerItem on iOS. I want every pixel that matches the condition to become transparent. For now, the condition is whether a pixel's hue lies between 0.3 and 0.4 (green pixels).
My filter:
- (CGFloat) hueFromRed:(CGFloat)red green:(CGFloat)green blue:(CGFloat)blue {
    UIColor* color = [UIColor colorWithRed:red green:green blue:blue alpha:1];
    CGFloat hue, saturation, brightness;
    [color getHue:&hue saturation:&saturation brightness:&brightness alpha:nil];
    return hue;
}
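For reference, pure green maps to a hue of 1/3 (about 0.33), which is why the 0.3 to 0.4 range above targets green pixels. A quick sanity check with this helper:

CGFloat greenHue = [self hueFromRed:0 green:1 blue:0]; // ~0.333, inside [0.3, 0.4]
CGFloat redHue = [self hueFromRed:1 green:0 blue:0];   // 0.0, outside the range
NSLog(@"green hue: %f, red hue: %f", greenHue, redHue);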
- (CIFilter<CIColorCube> *) chromaKeyFilterHuesFrom:(CGFloat)minHue to:(CGFloat)maxHue {
    const unsigned int size = 64;
    const size_t cubeDataSize = size * size * size * 4;
    NSMutableData* cubeData = [[NSMutableData alloc] initWithCapacity:(cubeDataSize * sizeof(float))];

    for (int z = 0; z < size; z++) {
        CGFloat blue = ((double)z)/(size-1);
        for (int y = 0; y < size; y++) {
            CGFloat green = ((double)y)/(size-1);
            for (int x = 0; x < size; x++) {
                CGFloat red = ((double)x)/(size-1);
                CGFloat hue = [self hueFromRed:red green:green blue:blue];
                // Fully transparent inside [minHue, maxHue], fully opaque otherwise.
                float alpha = (hue >= minHue && hue <= maxHue) ? 0 : 1;
                // CIColorCube expects premultiplied RGBA float values.
                float premultipliedRed = red * alpha;
                float premultipliedGreen = green * alpha;
                float premultipliedBlue = blue * alpha;
                [cubeData appendBytes:&premultipliedRed length:sizeof(float)];
                [cubeData appendBytes:&premultipliedGreen length:sizeof(float)];
                [cubeData appendBytes:&premultipliedBlue length:sizeof(float)];
                [cubeData appendBytes:&alpha length:sizeof(float)];
            }
        }
    }

    CIFilter<CIColorCube> *colorCubeFilter = CIFilter.colorCubeFilter;
    colorCubeFilter.cubeDimension = size;
    colorCubeFilter.cubeData = cubeData;
    return colorCubeFilter;
}
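To check that the cube data itself produces transparency, the filter can be applied to a still frame and rendered through a CIContext into an image that keeps its alpha channel. A minimal sketch, assuming a test UIImage named testImage:

CIFilter<CIColorCube> *filter = [self chromaKeyFilterHuesFrom:0.3 to:0.4];
[filter setValue:[CIImage imageWithCGImage:testImage.CGImage] forKey:kCIInputImageKey];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef keyedImage = [context createCGImage:filter.outputImage fromRect:filter.outputImage.extent];
UIImage *result = [UIImage imageWithCGImage:keyedImage]; // green areas should be transparent here
CGImageRelease(keyedImage);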
In the ViewController, I created a button that starts the video player and applies the filter to the AVPlayerItem. [self chromaKeyFilterHuesFrom:0.3 to:0.4] means the green pixels get filtered:
- (AVVideoComposition*) createVideoComposition:(AVPlayerItem *)_playerItem {
    CIFilter<CIColorCube>* chromaKeyFilter = [self chromaKeyFilterHuesFrom:0.3 to:0.4];
    AVMutableVideoComposition *composition = [AVMutableVideoComposition videoCompositionWithAsset: _playerItem.asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *_Nonnull request) {
        // Clamp to avoid edge artifacts, filter, then crop back to the original extent.
        CIImage *image = request.sourceImage.imageByClampingToExtent;
        [chromaKeyFilter setValue:image forKey:kCIInputImageKey];
        CIImage *output = [chromaKeyFilter.outputImage imageByCroppingToRect:request.sourceImage.extent];
        [request finishWithImage:output context:nil];
    }];
    return composition;
}
- (IBAction)playVideo:(id)sender {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"my_video_has_green_pixels_zone" ofType:@"mp4"];
    NSURL *url = [NSURL fileURLWithPath:path];
    AVPlayerItem* playerItem = [AVPlayerItem playerItemWithURL:url];
    playerItem.videoComposition = [self createVideoComposition:playerItem];

    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    player.actionAtItemEnd = AVPlayerActionAtItemEndNone;

    AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
    controller.player = player;
    controller.view.frame = self.view.bounds;
    controller.view.backgroundColor = UIColor.redColor;
    [[self view] addSubview:controller.view];
    [self presentViewController:controller animated:YES completion:nil];
    [player play];
}
The filter "works", except that the green pixels turn black instead of transparent. I don't understand why, or what is making those pixels black.
The pixel buffers used by AVPlayerViewController probably don't support an alpha channel by default (to save memory). But you can change the pixel format like this:
AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
controller.pixelBufferAttributes = @{(NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
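If you drive playback through an AVPlayerLayer instead, the layer exposes the same pixelBufferAttributes property (iOS 9+). A minimal sketch, assuming the player object from the question:

AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.pixelBufferAttributes = @{(NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];

With a BGRA buffer the alpha written by the color cube survives, so whatever sits behind the player (the red background in the question) shows through the keyed areas.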