I'm developing an immersive visionOS app based on RealityKit and SwiftUI. The app has `ModelEntities` that have a `PerspectiveCamera` entity as a child. I create the camera and add it to the entity like this:
```swift
let cameraEntity = PerspectiveCamera()
cameraEntity.camera.far = 10000
cameraEntity.camera.fieldOfViewInDegrees = 60
cameraEntity.camera.near = 0.01
entity.addChild(cameraEntity)
```
There are posts on SO, e.g. this, that apparently display the view of such a camera as part of an `arView`. My app, however, is not AR; the immersive view is generated programmatically.

My question: How can I display the camera's view in a SwiftUI 2D window?
I contacted Apple, and they confirmed that this is a bug in visionOS 1.0 … 2.0: a `PerspectiveCamera` can currently be defined, but there is no way to use its output. They suggested using a `RealityRenderer` instead. Here is the suggested code, which I have not yet tested:
```swift
@Observable
@MainActor
final class OffscreenRenderModel {
    private let renderer: RealityRenderer
    private let colorTexture: MTLTexture

    init(scene: Entity) throws {
        renderer = try RealityRenderer()
        renderer.entities.append(scene)

        // The renderer needs its own active camera; a camera that is
        // merely part of the scene hierarchy is not picked up.
        let camera = PerspectiveCamera()
        renderer.activeCamera = camera
        renderer.entities.append(camera)

        // Offscreen color target the renderer draws into.
        let textureDesc = MTLTextureDescriptor()
        textureDesc.pixelFormat = .rgba8Unorm
        textureDesc.width = 512
        textureDesc.height = 512
        textureDesc.usage = [.renderTarget, .shaderRead]

        let device = MTLCreateSystemDefaultDevice()!
        colorTexture = device.makeTexture(descriptor: textureDesc)!
    }

    func render() throws {
        let cameraOutputDesc = RealityRenderer.CameraOutput.Descriptor.singleProjection(colorTexture: colorTexture)
        let cameraOutput = try RealityRenderer.CameraOutput(cameraOutputDesc)
        try renderer.updateAndRender(deltaTime: 0.1, cameraOutput: cameraOutput, onComplete: { renderer in
            guard let colorTexture = cameraOutput.colorTextures.first else { fatalError() }
            // The colorTexture holds the rendered scene.
        })
    }
}
```
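To actually answer the original question — showing the result in a SwiftUI 2D window — the rendered `MTLTexture` still has to be turned into something SwiftUI can display. Below is a minimal sketch of one way to do that via Core Image. It assumes the model above exposes its `colorTexture` (it is `private` in the code as posted), and note that `updateAndRender` completes asynchronously via `onComplete`, so a production version would trigger the image update from that callback rather than reading back immediately. `makeCGImage` and `CameraPreview` are hypothetical names, not part of any API:

```swift
import CoreImage
import Metal
import SwiftUI

// Converts the offscreen render target to a CGImage. Assumes the
// .rgba8Unorm format used in the descriptor above. CIImage(mtlTexture:)
// yields a vertically flipped image, so we flip it back first.
func makeCGImage(from texture: MTLTexture) -> CGImage? {
    guard let ciImage = CIImage(mtlTexture: texture, options: nil) else { return nil }
    let flipped = ciImage.transformed(
        by: CGAffineTransform(scaleX: 1, y: -1)
            .translatedBy(x: 0, y: -ciImage.extent.height))
    let context = CIContext()
    return context.createCGImage(flipped, from: flipped.extent)
}

struct CameraPreview: View {
    // Assumes OffscreenRenderModel exposes colorTexture publicly.
    let model: OffscreenRenderModel
    @State private var image: CGImage?

    var body: some View {
        Group {
            if let image {
                Image(decorative: image, scale: 1)
                    .resizable()
                    .scaledToFit()
            } else {
                Color.black
            }
        }
        .task {
            // One render pass for illustration; a real app would redraw
            // per frame and update `image` from the onComplete callback.
            try? model.render()
            image = makeCGImage(from: model.colorTexture)
        }
    }
}
```

Creating a fresh `CIContext` per conversion is wasteful; for continuous updates you would cache one context and reuse it, or skip Core Image entirely and blit the texture into a `CAMetalLayer`-backed view.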