How to combine AudioKit and ARKit


I have a situation where I want to:

  1. Detect objects with ARKit ✅
  2. Synthesize audio with AudioKit ✅
  3. Pipe the synthesized audio into an SCNNode

Step one: set up AudioKit.

    func initKit() {
        // set up the audio engine
        do {
            try audioSession.setCategory(.playback)
            try audioSession.setActive(true)
            print("Started audio session")
        } catch {
            print(error)
        }
        
        do {
            try engine.start()
        } catch {
            print(error)
        }
    }

Then detect objects in the scene with the camera:

    func rendererDidAdd(node: SCNNode, planeAnchor: ARPlaneAnchor) {
        // save the node so we can remove the planes later
        sceneNodes.insert(node)
        // Create a custom object to visualize the plane geometry and extent.
        let plane = Plane(anchor: planeAnchor, in: arView, attachedNode: node)
        // store the planes for access later
        planes.append(plane)
        
        // filter out unwanted planes
        switch planeAnchor.classification {

        ...

        default:
            // table, seat, window, door
            addNoiseFor(node: node, plane: plane)
            // add the plane to view if applicable
            addPlaneToView(node: node, plane: plane, planeAnchor: planeAnchor)
        }
        
    }
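For context, a helper like rendererDidAdd(node:planeAnchor:) is typically driven from the standard ARSCNViewDelegate callback. A minimal sketch of that wiring (the delegate method is real ARKit API; forwarding to the helper above is an assumption about this project's setup):

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Only plane anchors are interesting here; forward them to the custom handler
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    rendererDidAdd(node: node, planeAnchor: planeAnchor)
}
```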

Then add noise to the node and start it:

    func addNoiseFor(node: SCNNode, plane: Plane) {
        let pink = PinkNoise(amplitude: 0.5)
        
        let player = SCNAudioPlayer(avAudioNode: pink.avAudioNode)
        player.willStartPlayback = {
            print("Started playing")
        }
        node.addAudioPlayer(player)
        pink.start()
        
        print("Playing audio for \(plane.identifier)")
    }

However, this doesn't work, because pink needs an AVAudioEngine instance that its AVAudioNode is attached to, and SCNNode doesn't create one. In fact, when using SCNAudioPlayer(avAudioNode:), it never seems to attach one at any point.

It also never calls back willStartPlayback().

Now, I did some investigating, and ARKit does attach an AVAudioEngine instance right after rendering, which can be checked in this callback:

    func renderer(_ renderer: any SCNSceneRenderer, didRenderScene scene: SCNScene, atTime time: TimeInterval) {}

…and if, instead of attaching an SCNAudioPlayer(avAudioNode:), you attach a file-based audio player created with SCNAudioSource(fileNamed:) (what a confusing naming convention), it does play audio. That path seems to perform some internal initialization that doesn't happen in the avAudioNode variant.
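Building on that observation, one thing worth trying is attaching the generator to the engine that SceneKit itself exposes: SCNSceneRenderer has an audioEngine property. A hedged sketch (the attached flag, and access to pink from the delegate, are my assumptions, not part of the original code):

```swift
var attached = false  // assumed one-shot guard

func renderer(_ renderer: any SCNSceneRenderer, didRenderScene scene: SCNScene, atTime time: TimeInterval) {
    guard !attached else { return }
    // SCNSceneRenderer exposes the AVAudioEngine SceneKit uses for positional audio
    renderer.audioEngine.attach(pink.avAudioNode)
    renderer.audioEngine.connect(pink.avAudioNode,
                                 to: renderer.audioEngine.mainMixerNode,
                                 format: nil)
    attached = true
}
```

Whether SceneKit then re-routes the node once it is wrapped in an SCNAudioPlayer is exactly the open question here.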

The crux of the problem is that I don't know how you're supposed to set up SCNAudioPlayer(avAudioNode:) so that the audio actually "starts". Something must be missing! Any help would be appreciated.

swift scenekit arkit audiokit avaudioengine
1 Answer

Here is an example of how to play a sound with an SCNAudioPlayer backed by an AVAudioPlayerNode:

    import UIKit
    import QuartzCore
    import SceneKit
    import AVFoundation

    class GameViewController: UIViewController {

        var sceneView = SCNView()
        let gameScene = SCNScene()

        let audioEngine = AVAudioEngine()
        let audioPlayerNode = AVAudioPlayerNode()
        var audioBuffer: AVAudioPCMBuffer!

        override func viewDidLoad() {
            super.viewDidLoad()

            // create and add a camera to the scene
            let cameraNode = SCNNode()
            cameraNode.camera = SCNCamera()
            gameScene.rootNode.addChildNode(cameraNode)

            // place the camera
            cameraNode.position = SCNVector3(x: 0, y: 0, z: 15)

            // create and add a light to the scene
            let lightNode = SCNNode()
            lightNode.light = SCNLight()
            lightNode.light!.type = .omni
            lightNode.position = SCNVector3(x: 0, y: 10, z: 10)
            gameScene.rootNode.addChildNode(lightNode)

            // create and add an ambient light to the scene
            let ambientLightNode = SCNNode()
            ambientLightNode.light = SCNLight()
            ambientLightNode.light!.type = .ambient
            ambientLightNode.light!.color = UIColor.darkGray
            gameScene.rootNode.addChildNode(ambientLightNode)

            sceneView = self.view as! SCNView

            // set the scene to the view
            sceneView.scene = gameScene

            // allows the user to manipulate the camera
            sceneView.allowsCameraControl = true

            // show statistics such as fps and timing information
            sceneView.showsStatistics = false

            // configure the view
            sceneView.backgroundColor = UIColor.black

            // add a tap gesture recognizer
            let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
            sceneView.addGestureRecognizer(tapGesture)

            self.setupAudioEngine()
        }

        func setupAudioEngine() {
            audioEngine.attach(audioPlayerNode)

            // Load your audio file into an AVAudioPCMBuffer
            guard let url = Bundle.main.url(forResource: "art.scnassets/some_sound", withExtension: "mp3") else {
                fatalError("Audio file not found")
            }
            let audioFile = try! AVAudioFile(forReading: url)
            let audioFormat = audioFile.processingFormat
            let audioFrameCount = UInt32(audioFile.length)
            audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount)!
            try! audioFile.read(into: audioBuffer)

            audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: audioBuffer.format)

            try! audioEngine.start()
        }

        func setupNode() {
            let node = SCNNode(geometry: SCNSphere(radius: 1.0))

            let material = SCNMaterial()
            material.diffuse.contents = UIColor.red
            material.lightingModel = .physicallyBased

            node.geometry?.firstMaterial = material

            gameScene.rootNode.addChildNode(node)

            // Create an SCNAudioPlayer using the AVAudioNode
            let scnAudioPlayer = SCNAudioPlayer(avAudioNode: audioPlayerNode)
            node.addAudioPlayer(scnAudioPlayer)

            // Schedule the buffer here so every tap plays the sound again
            // audioPlayerNode.scheduleBuffer(audioBuffer, at: nil, options: .loops, completionHandler: nil) // Looping
            audioPlayerNode.scheduleBuffer(audioBuffer, at: nil, options: [], completionHandler: nil) // No looping

            // Play the audio
            audioPlayerNode.play()
        }

        @objc func handleTap(_ gestureRecognize: UIGestureRecognizer) {
            self.setupNode()
        }

        override var prefersStatusBarHidden: Bool {
            return true
        }

        override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
            if UIDevice.current.userInterfaceIdiom == .phone {
                return .allButUpsideDown
            } else {
                return .all
            }
        }

    }

Add a sound file to your project (adjust the naming accordingly).

Run the app and tap the screen. (Turn your sound on.)
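To adapt this pattern to the AudioKit generator from the question, the key step is attaching and connecting the generator's avAudioNode to a running engine before wrapping it in an SCNAudioPlayer. A hedged sketch (assuming PinkNoise from AudioKit and an existing node, as in the question's code; whether SceneKit re-routes the node afterwards is untested):

```swift
let engine = AVAudioEngine()
let pink = PinkNoise(amplitude: 0.5)

// Attach and connect the AudioKit node's underlying AVAudioNode first
engine.attach(pink.avAudioNode)
engine.connect(pink.avAudioNode, to: engine.mainMixerNode, format: nil)

try engine.start()
pink.start()

// Only now wrap it for SceneKit's spatialization
let player = SCNAudioPlayer(avAudioNode: pink.avAudioNode)
node.addAudioPlayer(player)
```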
