How to capture depth data from the camera in iOS 11 and Swift 4?

0 votes · 4 answers

I'm trying to get depth data from the camera in iOS 11 using AVDepthData, but when I set up the photo output with an AVCapturePhotoCaptureDelegate, photo.depthData is nil.

So I tried setting up an AVCaptureDepthDataOutput with an AVCaptureDepthDataOutputDelegate, although I don't know how to capture a depth photo with it.

Has anyone ever gotten an image from AVDepthData?

Edit:

Here is the code I tried:

// delegates: AVCapturePhotoCaptureDelegate & AVCaptureDepthDataOutputDelegate

@IBOutlet var image_view: UIImageView!
@IBOutlet var capture_button: UIButton!

var captureSession: AVCaptureSession?
var sessionOutput: AVCapturePhotoOutput?
var depthOutput: AVCaptureDepthDataOutput?
var previewLayer: AVCaptureVideoPreviewLayer?

@IBAction func capture(_ sender: Any) {
    
    self.sessionOutput?.capturePhoto(with: AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg]), delegate: self)
}

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    
    self.previewLayer?.removeFromSuperlayer()
    self.image_view.image = UIImage(data: photo.fileDataRepresentation()!)

    let depth_map = photo.depthData?.depthDataMap
    print("depth_map:", depth_map) // is nil
}

func depthDataOutput(_ output: AVCaptureDepthDataOutput, didOutput depthData: AVDepthData, timestamp: CMTime, connection: AVCaptureConnection) {

    print("depth data") // never called
}

override func viewDidLoad() {
    super.viewDidLoad()
    
    self.captureSession = AVCaptureSession()
    self.captureSession?.sessionPreset = .photo
    
    self.sessionOutput = AVCapturePhotoOutput()
    self.depthOutput = AVCaptureDepthDataOutput()
    self.depthOutput?.setDelegate(self, callbackQueue: DispatchQueue(label: "depth queue"))
    
    do {
        
        let device = AVCaptureDevice.default(for: .video)
        let input = try AVCaptureDeviceInput(device: device!)
        if(self.captureSession?.canAddInput(input))!{
            self.captureSession?.addInput(input)
            
            if(self.captureSession?.canAddOutput(self.sessionOutput!))!{
                self.captureSession?.addOutput(self.sessionOutput!)
                
                
                if(self.captureSession?.canAddOutput(self.depthOutput!))!{
                    self.captureSession?.addOutput(self.depthOutput!)
                    
                    self.previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession!)
                    self.previewLayer?.frame = self.image_view.bounds
                    self.previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
                    self.previewLayer?.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
                    self.image_view.layer.addSublayer(self.previewLayer!)    
                }
            }
        }
    } catch {}
    
    self.captureSession?.startRunning()
}

I'm trying two things: in one, the depth data ends up nil, and in the other, I try to get the depth delegate method called.

Does anyone know what I'm missing?

swift camera ios11 color-depth
4 Answers
10 votes

First of all, you need to use the dual camera, otherwise you won't get any depth data.

let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)

And keep a reference to your queue:

let dataOutputQueue = DispatchQueue(label: "data queue", qos: .userInitiated, attributes: [], autoreleaseFrequency: .workItem)

You probably also want to synchronize the video and depth data:

var outputSynchronizer: AVCaptureDataOutputSynchronizer?

Then you can synchronize the two outputs in your viewDidLoad() method like this:

if sessionOutput?.isDepthDataDeliverySupported {
    sessionOutput?.isDepthDataDeliveryEnabled = true
    depthDataOutput?.connection(with: .depthData)!.isEnabled = true
    depthDataOutput?.isFilteringEnabled = true
    outputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [sessionOutput!, depthDataOutput!])
    outputSynchronizer!.setDelegate(self, queue: self.dataOutputQueue)
}
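
With the synchronizer in place, the matched frames arrive through the AVCaptureDataOutputSynchronizerDelegate callback. Here is a minimal sketch of what that looks like (my addition, assuming depthDataOutput is the output registered above):

// AVCaptureDataOutputSynchronizerDelegate
func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {

    // Pull the depth data that arrived with this batch, skipping dropped frames
    guard let syncedDepth = synchronizedDataCollection.synchronizedData(for: depthDataOutput!) as? AVCaptureSynchronizedDepthData,
        !syncedDepth.depthDataWasDropped else { return }

    let depthData = syncedDepth.depthData // AVDepthData
    print("synchronized depth data at", syncedDepth.timestamp, "-", depthData.depthDataMap)
}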

I would recommend watching WWDC session 507 - they also provide a full sample app that does exactly what you want:

https://developer.apple.com/videos/play/wwdc2017/507/


4 votes

To add more details to @klinger's answer, here is what you need to do to get the depth data for each pixel. I wrote some comments, hope it helps!

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

    //## Convert Disparity to Depth ##

    guard let depthData = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) else { return }
    let depthDataMap = depthData.depthDataMap //AVDepthData -> CVPixelBuffer

    //## Data Analysis ##

    // Useful data
    let width = CVPixelBufferGetWidth(depthDataMap) //768 on an iPhone 7+
    let height = CVPixelBufferGetHeight(depthDataMap) //576 on an iPhone 7+
    CVPixelBufferLockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))

    // Convert the base address to a safe pointer of the appropriate type
    let floatBuffer = unsafeBitCast(CVPixelBufferGetBaseAddress(depthDataMap), to: UnsafeMutablePointer<Float32>.self)

    // Read the data (returns a value of type Float32)
    // Valid indices run from 0 to width * height - 1, so the value at
    // pixel (x, y) lives at index y * width + x
    // (this assumes rows are tightly packed; strictly, use CVPixelBufferGetBytesPerRow)
    let x = 100, y = 100 // example pixel coordinates
    let distanceAtXYPoint = floatBuffer[y * width + x]
    print("distance at (\(x), \(y)):", distanceAtXYPoint)

    CVPixelBufferUnlockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))
}
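
And since the original question asks how to get a picture out of AVDepthData: one quick way to visualize the map (my own sketch, not part of the answer above - the raw float values aren't normalized, so the result can look washed out) is to wrap the pixel buffer in a CIImage:

import CoreImage

// Wrap the depth CVPixelBuffer in a CIImage for quick visualization
let ciImage = CIImage(cvPixelBuffer: depthDataMap)
let depthImage = UIImage(ciImage: ciImage)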

2 votes

There are two ways to do this, and you are trying to do both at once:

  1. Capture the depth data along with the image. This is done by using the photo.depthData object from photoOutput(_:didFinishProcessingPhoto:error:). I explain below why this didn't work for you.
  2. Use an AVCaptureDepthDataOutput and implement depthDataOutput(_:didOutput:timestamp:connection:). I'm not sure why this isn't working for you, but implementing depthDataOutput(_:didDrop:timestamp:connection:reason:) might help you figure out why.

I think #1 is the better option because it pairs the depth data with the image. Here is how to do it:

@IBAction func capture(_ sender: Any) {

    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    settings.isDepthDataDeliveryEnabled = true
    self.sessionOutput?.capturePhoto(with: settings, delegate: self)

}

// ...

override func viewDidLoad() {
    // ...
    self.sessionOutput = AVCapturePhotoOutput()
    self.sessionOutput?.isDepthDataDeliveryEnabled = true
    // ...
}
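
One caveat (my addition, not from the original answer): setting isDepthDataDeliveryEnabled raises an exception when the configuration doesn't support it, and support is only reported once the dual-camera input and the photo output have been added to the session. A guarded sketch:

// Only enable depth delivery once the session can actually support it
if self.sessionOutput?.isDepthDataDeliverySupported == true {
    self.sessionOutput?.isDepthDataDeliveryEnabled = true
}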

Then depth_map shouldn't be nil. Be sure to read this and this (separate but similar pages) for more information about obtaining depth data.

As for #2, I'm not quite sure why depthDataOutput(_:didOutput:timestamp:connection:) is never called, but you should implement depthDataOutput(_:didDrop:timestamp:connection:reason:) to check whether the depth data is being dropped for some reason.
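
A minimal sketch of that delegate method (the reason value tells you whether frames arrived late, ran out of buffers, or hit a discontinuity):

func depthDataOutput(_ output: AVCaptureDepthDataOutput, didDrop depthData: AVDepthData, timestamp: CMTime, connection: AVCaptureConnection, reason: AVCaptureOutput.DataDroppedReason) {

    // reason is .lateData, .outOfBuffers or .discontinuity - a hint at why frames never arrive
    print("dropped depth data, reason:", reason.rawValue)
}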


0 votes

The way you are initializing the capture device is incorrect.

You should use the dual camera mode.

In Objective-C it looks like this:

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInDualCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack];
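
The Swift 4 equivalent (the same line shown in the top answer):

let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)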