AVFoundation and video effects

Question · 3 votes · 2 answers

I'm experimenting with video editing. I have sequencing and mixing of video/audio working correctly, and even some basic slow motion! :) Now I'd like to integrate video filters, not only into the layer itself (otherwise I'd use AVPlayerItemVideoOutput together with CIFilter), but also into the final exported video file. So I'm currently looking into "rendering" the above-mentioned CIFilter into the final video, while still keeping very precise control over timing via CMTime.

Any suggestions?

video avfoundation cifilter
2 Answers
3 votes

You can implement a custom compositor using the AVVideoCompositing protocol together with AVAsynchronousVideoCompositionRequest.

// request is the incoming AVAsynchronousVideoCompositionRequest.
CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID];
CIImage *theImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
// Apply the filter and read back its output image.
CIImage *motionBlurredImage = [[CIFilter filterWithName:@"CIMotionBlur"
                                          keysAndValues:kCIInputImageKey, theImage, nil]
                                  valueForKey:kCIOutputImageKey];
// Render into the destination pixel buffer with a GPU-backed context.
CIContext *someCIContext = [CIContext contextWithEAGLContext:eaglContext];
[someCIContext render:motionBlurredImage toCVPixelBuffer:outputBuffer];

Then render the pixel buffer using OpenGL, as covered in Apple's documentation. This lets you implement as many transitions or filters as you need.
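As a rough Swift sketch of how such a compositor fits together (the FilterCompositor name, pixel format, and filter parameters are illustrative assumptions, and a single source track is assumed per instruction):

import AVFoundation
import CoreImage

// Illustrative custom compositor: filters each source frame with Core Image.
final class FilterCompositor: NSObject, AVVideoCompositing {
    private let ciContext = CIContext()   // created once, reused per frame

    var sourcePixelBufferAttributes: [String: Any]? {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }
    var requiredPixelBufferAttributesForRenderContext: [String: Any] {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let trackID = request.sourceTrackIDs.first?.int32Value,
              let sourceBuffer = request.sourceFrame(byTrackID: trackID),
              let outputBuffer = request.renderContext.newPixelBuffer() else {
            request.finish(with: NSError(domain: "FilterCompositor", code: -1))
            return
        }
        // Filter the source frame and render the result into the new buffer.
        let filtered = CIImage(cvPixelBuffer: sourceBuffer)
            .applyingFilter("CIMotionBlur", parameters: [kCIInputRadiusKey: 20])
        ciContext.render(filtered, to: outputBuffer)
        request.finish(withComposedVideoFrame: outputBuffer)
    }
}

The compositor is attached by setting customVideoCompositorClass = FilterCompositor.self on an AVMutableVideoComposition; the composition's instructions then determine which time ranges and tracks reach startRequest.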


0 votes

A WWDC session from 2015 explains how to do this.

Watch from 20:32: https://developer.apple.com/videos/play/wwdc2015/510/

Export:

Step 01:

let vidComp = AVVideoComposition(asset: avAsset,
    applyingCIFiltersWithHandler: { request in
        // Clamp so the blur can sample beyond the frame's edges.
        var filtered = request.sourceImage.imageByClampingToExtent()
        filtered = filtered.imageByApplyingFilter("CIGaussianBlur",
            withInputParameters: [kCIInputRadiusKey: 100])
        // Crop back to the original extent after filtering.
        filtered = filtered.imageByCroppingToRect(request.sourceImage.extent)
        // request.compositionTime (a CMTime) is available here for
        // frame-accurate, time-based filtering.
        // cicontext is a CIContext created once and reused.
        request.finishWithImage(filtered, context: cicontext)
 })

Step 02:

let export = AVAssetExportSession(asset: avAsset,
    presetName: AVAssetExportPreset1920x1080)!   // failable initializer
export.outputFileType = AVFileTypeQuickTimeMovie
export.outputURL = outURL
export.videoComposition = vidComp
// Remove any stale file first; the export fails if outURL already exists.
_ = try? NSFileManager.defaultManager().removeItemAtURL(outURL)
export.exportAsynchronouslyWithCompletionHandler {
    // Inspect export.status / export.error here.
}
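For reference, a minimal sketch of the same export step in current Swift (assuming avAsset, vidComp, and outURL exist as above; AVFileType.mov replaces the old AVFileTypeQuickTimeMovie constant):

import AVFoundation

// Modern-Swift equivalent of the export step above.
guard let export = AVAssetExportSession(asset: avAsset,
                                        presetName: AVAssetExportPreset1920x1080) else { return }
export.outputFileType = .mov
export.outputURL = outURL
export.videoComposition = vidComp
try? FileManager.default.removeItem(at: outURL)   // clear any stale file
export.exportAsynchronously {
    if export.status == .completed {
        // The filtered movie is now at outURL.
    }
}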

Playback:

Step 01:

let vidComp = AVVideoComposition(asset: avAsset,
    applyingCIFiltersWithHandler: {
 // same as earlier example
 })

Step 02:

let playerItem = AVPlayerItem(asset: avAsset)
playerItem.videoComposition = vidComp
let player = AVPlayer(playerItem: playerItem)
player.play()

Jonathan's answer is also correct. However, Apple has since deprecated OpenGL. Here is the same code in Swift, using Metal:

let theImage = CIImage(cvImageBuffer: foregroundPixelBuffer)

let blurFilter = CIFilter(name: "CIMotionBlur")
blurFilter?.setValue(theImage, forKey: kCIInputImageKey)

if let destinationImage = blurFilter?.outputImage {
    context?.render(destinationImage, to: outputBuffer)
}

The context should be declared as follows:

context = CIContext(mtlDevice: device)

and the device as follows:

// Ask for the default Metal device; this represents our GPU.
guard let defaultMetalDevice = MTLCreateSystemDefaultDevice() else {
    print("Metal is not supported on this device.")
    return nil
}
device = defaultMetalDevice

The context and device instances should be created once and reused, to benefit from their internal caching.
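A minimal sketch of that reuse pattern (the FrameFilterRenderer name and structure are illustrative, not from the answer): create both objects once, then call render for every frame.

import CoreImage
import CoreVideo
import Metal

// Illustrative: one Metal device and one CIContext, shared across all frames.
final class FrameFilterRenderer {
    private let device: MTLDevice
    private let context: CIContext

    init?() {
        // Ask for the default Metal device; this represents our GPU.
        guard let defaultMetalDevice = MTLCreateSystemDefaultDevice() else {
            print("Metal is not supported on this device.")
            return nil
        }
        device = defaultMetalDevice
        context = CIContext(mtlDevice: device)
    }

    // Render a filtered CIImage into an output pixel buffer, reusing the context.
    func render(_ image: CIImage, to outputBuffer: CVPixelBuffer) {
        context.render(image, to: outputBuffer)
    }
}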
