WebGPU: multiple render passes to a single canvas


I'm a WebGPU beginner and want to process an image through multiple processing stages, each in its own shader module:

  1. Desaturation
  2. Edge detection
  3. Compression

A fourth step then takes the compressed texture and converts it to ASCII. I've done this in Three.js by running multiple render passes through an EffectComposer, but I'm not sure how to do the same in WebGPU. My assumption is that I need to specify an output somewhere and use it as a texture binding.

How do I "chain" the resulting textures and feed them into the next shader module? Is this the right approach? Is splitting shaders up into smaller functions a bad idea?

This is the code I'm currently working with:

// SETUP
if (!navigator.gpu) {
    throw new Error('WebGPU not supported on this browser :(');
}
const adapter = await navigator.gpu.requestAdapter();
if (!adapter) {
    throw new Error('No appropriate GPUAdapter found :(')
}
const device = await adapter.requestDevice();

const canvas = document.querySelector('canvas');
const context = canvas.getContext('webgpu');
const canvasFormat = navigator.gpu.getPreferredCanvasFormat();
context.configure({
    device: device,
    format: canvasFormat,
});


// DO STUFF!
// IMAGE -> TEXTURE for the sampler
const url = './someImg.jpg'
async function loadImageBitmap(url) {
    const res = await fetch(url);
    const blob = await res.blob();
    return await createImageBitmap(blob, { colorSpaceConversion: 'none' });
}
const source = await loadImageBitmap(url);
canvas.style.width = source.width + 'px' // adjusts the canvas based on img resolution
canvas.style.height = source.height + 'px'
canvas.width = source.width
canvas.height = source.height

// texture
const texture = device.createTexture({
    label: 'imgTexture',
    format: 'rgba8unorm',
    size: [source.width, source.height],
    usage:
        GPUTextureUsage.TEXTURE_BINDING |
        GPUTextureUsage.COPY_DST |
        GPUTextureUsage.RENDER_ATTACHMENT,
})
device.queue.copyExternalImageToTexture(
    { source, flipY: true },
    { texture },
    { width: source.width, height: source.height },
);

// SHADER #1 (desaturation)
// module
const module = device.createShaderModule({
    label: 'monochrome filter shader module',
    code: monoFilter, // WGSL file
});

// render pipeline
const pipeline = device.createRenderPipeline({
    label: 'monoFilter render pipeline',
    layout: 'auto',
    vertex: {
        module, // note: `targets` belongs to the fragment stage, not the vertex stage
    },
    fragment: {
        module,
        targets: [{ format: canvasFormat }],
    }
})

// sampler
const sampler = device.createSampler({
    magFilter: 'linear',
    minFilter: 'linear',
})

// resolution buffer (vec2(x,y) for the shader)
const resolutionArray = new Float32Array([source.width, source.height])
const resolutionBuffer = device.createBuffer({
    label: 'resolution buffer',
    size: resolutionArray.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_DST,
})
device.queue.writeBuffer(resolutionBuffer, 0, resolutionArray)

// bindgroup
const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [
        { binding: 0, resource: sampler },
        { binding: 1, resource: texture.createView() },
        { binding: 2, resource: { buffer: resolutionBuffer } }
    ]
})

// SHADER #2 (edge detection)
// module, pipeline, bindgroup ...

// CREATE AND DRAW RENDER PASS
function render() {
    const encoder = device.createCommandEncoder({
        label: 'render quad encoder',
    });
    const pass = encoder.beginRenderPass({
        colorAttachments: [{
            view: context.getCurrentTexture().createView(),
            clearValue: [0.2, 0.0, 0.3, 1.0],
            loadOp: 'clear',
            storeOp: 'store',
        }],
    });
    pass.setPipeline(pipeline);
    pass.setBindGroup(0, bindGroup);
    pass.draw(6);
    pass.end();

    // render pass for the next processing step ...

    device.queue.submit([encoder.finish()]);
}

render()

Most of this I learned from the tutorial at https://codelabs.developers.google.com/your-first-webgpu-app#0

For some context: I've already done single-pass rendering with WGSL shaders, with custom bindings for the sampler, the texture, and other arbitrary array buffers. Beyond that, I'm still learning the basics of the API.

javascript graphics shader webgpu wgsl
1 Answer

Using the canvas in step 1 would be unusual. Even in Three.js, the canvas isn't used until the last step. See the Three.js manual.

So, create 1 or more intermediate textures, as needed.

Following the example from the Three.js manual:

[render pass diagram]

That diagram effectively shows 2 intermediate textures (created in the sketch after this list):

rtA
rtB
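
In WebGPU those are just ordinary textures whose usage includes both RENDER_ATTACHMENT (so a pass can render into them) and TEXTURE_BINDING (so the next pass can sample them). A minimal sketch, reusing canvasFormat and the canvas size from the question; the helper name is made up:

function makeRenderTarget(label) {
    return device.createTexture({
        label,
        format: canvasFormat,
        size: [canvas.width, canvas.height],
        // RENDER_ATTACHMENT: a render pass can draw into it
        // TEXTURE_BINDING: the following pass can sample it
        usage: GPUTextureUsage.RENDER_ATTACHMENT | GPUTextureUsage.TEXTURE_BINDING,
    });
}
const rtA = makeRenderTarget('rtA');
const rtB = makeRenderTarget('rtB');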

So, in WebGPU, if you copied that setup, you'd do something like this:

// render to rtA
{
  pass = encoder.beginRenderPass({
    colorAttachments: [{ view: rtA.createView(), ... }],
    ...
  });
  // render scene to `rtA`
  pass.end();
}

// bloom pass to rtB
{
  pass = encoder.beginRenderPass({
    colorAttachments: [{ view: rtB.createView(), ... }],
    ...
  });
  pass.setPipeline(bloomPipeline);
  pass.setBindGroup(..., someBindGroupThatReferencesRTA);
  pass.draw(...);
  pass.end();
}

// film pass to rtA
{
  pass = encoder.beginRenderPass({
    colorAttachments: [{ view: rtA.createView(), ... }],
    ...
  });
  pass.setPipeline(filmPipeline);
  pass.setBindGroup(..., someBindGroupThatReferencesRTB);
  pass.draw(...);
  pass.end();
}

// copy pass to canvas
{
  pass = encoder.beginRenderPass({
    colorAttachments: [{ view: context.getCurrentTexture().createView(), ... }],
    ...
  });
  pass.setPipeline(copyPipeline);
  pass.setBindGroup(..., someBindGroupThatReferencesRTA);
  pass.draw(...);
  pass.end();
}

Whether you have a copy pass, or render the film pass (or whatever your last effect is) directly to the canvas, is up to you.
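
For completeness, here is a minimal sketch of what copyPipeline and someBindGroupThatReferencesRTA could look like, reusing the sampler from the question; the inline WGSL and the entry point names are illustrative assumptions, not part of the original answer:

const copyModule = device.createShaderModule({
    label: 'copy shader module',
    code: /* wgsl */ `
        @vertex fn vs(@builtin(vertex_index) i: u32) -> @builtin(position) vec4f {
            // full-screen triangle, no vertex buffer needed
            var pos = array<vec2f, 3>(vec2f(-1, -1), vec2f(3, -1), vec2f(-1, 3));
            return vec4f(pos[i], 0, 1);
        }

        @group(0) @binding(0) var samp: sampler;
        @group(0) @binding(1) var tex: texture_2d<f32>;

        @fragment fn fs(@builtin(position) p: vec4f) -> @location(0) vec4f {
            // convert pixel coordinates to 0..1 uv and copy the source texture
            return textureSample(tex, samp, p.xy / vec2f(textureDimensions(tex)));
        }
    `,
});

const copyPipeline = device.createRenderPipeline({
    label: 'copy pipeline',
    layout: 'auto',
    vertex: { module: copyModule, entryPoint: 'vs' },
    fragment: { module: copyModule, entryPoint: 'fs', targets: [{ format: canvasFormat }] },
});

const someBindGroupThatReferencesRTA = device.createBindGroup({
    layout: copyPipeline.getBindGroupLayout(0),
    entries: [
        { binding: 0, resource: sampler }, // the linear sampler from the question
        { binding: 1, resource: rtA.createView() }, // output of the previous pass
    ],
});

The bind group that references rtB is built the same way, so each pass samples whatever the previous pass rendered into (ping-ponging between rtA and rtB), and the copy pass draws 3 vertices (pass.draw(3)) to cover the whole canvas.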
