How do I decode a WebM / VP9 with a transparent background?

I've been fighting with this for days. I have a video that plays with a transparent background in the browser, but I can't reproduce that behaviour on a native Android device. Here is the ffprobe output for the file:


ffprobe output_with_alpha_fixed.webm
  libavutil      59. 39.100 / 59. 39.100
  libavcodec     61. 19.100 / 61. 19.100
  libavformat    61.  7.100 / 61.  7.100
  libavdevice    61.  3.100 / 61.  3.100
  libavfilter    10.  4.100 / 10.  4.100
  libswscale      8.  3.100 /  8.  3.100
  libswresample   5.  3.100 /  5.  3.100
  libpostproc    58.  3.100 / 58.  3.100
Input #0, matroska,webm, from 'output_with_alpha_fixed.webm':
  Metadata:
    ENCODER         : Lavf61.7.100
  Duration: 00:00:02.00, start: 0.000000, bitrate: 833 kb/s
  Stream #0:0: Video: vp9 (Profile 0), yuv420p(tv, unknown/smpte170m/bt709, progressive), 512x512, SAR 1:1 DAR 1:1, 25 fps, 25 tbr, 1k tbn
    Metadata:
      alpha_mode      : 1
      ENCODER         : Lavc61.19.100 libvpx-vp9
      DURATION        : 00:00:02.000000000

Notes from the ffprobe output:

  • A single stream (no separate alpha stream)
  • alpha_mode = 1
  • yuv420p

I am using libvpx to decode the stream manually. I'll put the code below, but the TL;DR is that I see exactly what ffprobe indicates above: the WebM container has only one stream, and libvpx interprets it as YUV, not YUVA. The frames render correctly, but with a black background. I'm not sure what to do here. How can I decode this with a transparent background, the way a web browser (and FFmpeg) can?

Here is my log output while decoding:

Tracks: 1
Frame received: 512x512 (bit-depth: 8)
Image format: 0x102
Alpha detected: No
Strides - Y: 576, U: 288, V: 288, Alpha: 576
VPX decoder destroyed
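As a side note, if I'm reading vpx/vpx_image.h correctly, the 0x102 in that log is just plain I420 with no alpha bit set, which matches the "Alpha detected: No" line. A tiny check along these lines should confirm it (assuming the header still defines VPX_IMG_FMT_HAS_ALPHA, which my decode code below already relies on; the numeric values in the comments are from the copy of the header I have):

#include <cstdio>
#include <vpx/vpx_image.h>

// 0x102 == VPX_IMG_FMT_PLANAR (0x100) | 2 == VPX_IMG_FMT_I420 in my copy of
// vpx/vpx_image.h, and the VPX_IMG_FMT_HAS_ALPHA bit (0x400) is not set.
int main() {
  const vpx_img_fmt_t fmt = static_cast<vpx_img_fmt_t>(0x102);
  std::printf("is I420: %s\n", fmt == VPX_IMG_FMT_I420 ? "yes" : "no");
  std::printf("has alpha flag: %s\n",
              (fmt & VPX_IMG_FMT_HAS_ALPHA) ? "yes" : "no");
  return 0;
}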

The C++ (JNI) code I use to decode is:

vpx_codec_ctx_t codec;
vpx_codec_dec_cfg_t cfg = {0};
vpx_codec_iface_t* iface = vpx_codec_vp9_dx();
cfg.threads = 4; // Enable multi-threading

if (vpx_codec_dec_init(&codec, iface, &cfg, 0)) {
    LOGE("Failed to initialize VPX decoder");
    return nullptr;
}

unsigned int bufferLength = static_cast<unsigned int>(length);
if (bufferLength == 0) {
    LOGE("Buffer is empty");
    vpx_codec_destroy(&codec);
    return nullptr;
}

vpx_codec_err_t decodeStatus = vpx_codec_decode(&codec, buffer, bufferLength, nullptr, 0);
if (decodeStatus != VPX_CODEC_OK) {
    LOGE("Failed to decode VP9 video. Error: %s", vpx_codec_error(&codec));
    vpx_codec_destroy(&codec);
    return nullptr;
}

vpx_codec_iter_t iter = nullptr;
vpx_image_t* img = vpx_codec_get_frame(&codec, &iter);
if (img == nullptr) {
    LOGE("No frame received from decoder");
    vpx_codec_destroy(&codec);
    return nullptr;
}

LOGI("Frame received: %dx%d (bit-depth: %d)", img->d_w, img->d_h, img->bit_depth);
LOGI("Image format: 0x%x", img->fmt);

bool has_alpha = (img->fmt & VPX_IMG_FMT_HAS_ALPHA) != 0;
LOGI("Alpha detected: %s", has_alpha ? "Yes" : "No");

int width = img->d_w;
int height = img->d_h;
int frameSize = width * height * 4; // RGBA buffer
uint8_t* rgba = new uint8_t[frameSize];

uint8_t* y = img->planes[VPX_PLANE_Y];
uint8_t* u = img->planes[VPX_PLANE_U];
uint8_t* v = img->planes[VPX_PLANE_V];
uint8_t* alpha = img->planes[VPX_PLANE_ALPHA]; // Extract alpha plane

int y_stride = img->stride[VPX_PLANE_Y];
int u_stride = img->stride[VPX_PLANE_U];
int v_stride = img->stride[VPX_PLANE_V];
int a_stride = img->stride[VPX_PLANE_ALPHA];
LOGI("Strides - Y: %d, U: %d, V: %d, Alpha: %d", y_stride, u_stride, v_stride, a_stride);

// YUV to RGBA conversion with alpha handling
for (int j = 0; j < height; ++j) {
    for (int i = 0; i < width; ++i) {
        int yIndex = j * y_stride + i;
        int uvIndex = (j / 2) * u_stride + (i / 2);

        int Y = y[yIndex];
        int U = u[uvIndex] - 128;
        int V = v[uvIndex] - 128;

        int R = Y + 1.402 * V;
        int G = Y - 0.344 * U - 0.714 * V;
        int B = Y + 1.772 * U;

        R = (R > 255) ? 255 : (R < 0) ? 0 : R;
        G = (G > 255) ? 255 : (G < 0) ? 0 : G;
        B = (B > 255) ? 255 : (B < 0) ? 0 : B;

        int rgbaIndex = (j * width + i) * 4;
        rgba[rgbaIndex + 0] = static_cast<uint8_t>(R);
        rgba[rgbaIndex + 1] = static_cast<uint8_t>(G);
        rgba[rgbaIndex + 2] = static_cast<uint8_t>(B);
        rgba[rgbaIndex + 3] = (alpha) ? alpha[j * a_stride + i] : 255; // Alpha or fully opaque
    }
}

vpx_codec_destroy(&codec);
LOGI("VPX decoder destroyed");

jbyteArray result = env->NewByteArray(frameSize);
env->SetByteArrayRegion(result, 0, frameSize, reinterpret_cast<jbyte*>(rgba));
delete[] rgba;
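From what I've read of the WebM container docs, when alpha_mode is 1 the alpha isn't inside the VP9 bitstream at all: each Matroska Block carries a BlockAdditional (BlockAddID 1) holding a second VP9-encoded frame whose luma plane is the alpha, and that seems to be what browsers and FFmpeg's libvpx-vp9 decoder merge back in. If that's right, libvpx alone will never set VPX_IMG_FMT_HAS_ALPHA for this file, and I'd need my demuxer to hand me the additional bytes for every frame. Below is a rough sketch of what I think the merge would look like; decode_frame_with_alpha, alphaBuffer/alphaLength and rgbaOut are hypothetical names, and I haven't verified that this is the intended approach:

#include <cstddef>
#include <cstdint>
#include <vpx/vpx_decoder.h>
#include <vpx/vp8dx.h>

// Sketch: decode the colour frame and its alpha side-frame with two
// independent VP9 decoder instances. buffer/length is the normal Block
// payload; alphaBuffer/alphaLength is assumed to be the BlockAdditional
// payload (BlockAddID 1) for the same frame, provided by the demuxer.
bool decode_frame_with_alpha(const uint8_t* buffer, size_t length,
                             const uint8_t* alphaBuffer, size_t alphaLength,
                             uint8_t* rgbaOut /* width * height * 4, RGBA */) {
  vpx_codec_ctx_t color = {}, alpha = {};
  vpx_codec_dec_cfg_t cfg = {};
  if (vpx_codec_dec_init(&color, vpx_codec_vp9_dx(), &cfg, 0) != VPX_CODEC_OK)
    return false;
  if (vpx_codec_dec_init(&alpha, vpx_codec_vp9_dx(), &cfg, 0) != VPX_CODEC_OK) {
    vpx_codec_destroy(&color);
    return false;
  }

  bool ok =
      vpx_codec_decode(&color, buffer, static_cast<unsigned int>(length), nullptr, 0) == VPX_CODEC_OK &&
      vpx_codec_decode(&alpha, alphaBuffer, static_cast<unsigned int>(alphaLength), nullptr, 0) == VPX_CODEC_OK;

  vpx_codec_iter_t ci = nullptr, ai = nullptr;
  vpx_image_t* cimg = ok ? vpx_codec_get_frame(&color, &ci) : nullptr;
  vpx_image_t* aimg = ok ? vpx_codec_get_frame(&alpha, &ai) : nullptr;

  if (cimg && aimg) {
    const int w = static_cast<int>(cimg->d_w);
    const int h = static_cast<int>(cimg->d_h);
    for (int j = 0; j < h; ++j) {
      for (int i = 0; i < w; ++i) {
        // Same YUV -> RGB math as in my decoder above.
        const int Y = cimg->planes[VPX_PLANE_Y][j * cimg->stride[VPX_PLANE_Y] + i];
        const int U = cimg->planes[VPX_PLANE_U][(j / 2) * cimg->stride[VPX_PLANE_U] + i / 2] - 128;
        const int V = cimg->planes[VPX_PLANE_V][(j / 2) * cimg->stride[VPX_PLANE_V] + i / 2] - 128;
        int R = static_cast<int>(Y + 1.402 * V);
        int G = static_cast<int>(Y - 0.344 * U - 0.714 * V);
        int B = static_cast<int>(Y + 1.772 * U);
        R = R < 0 ? 0 : (R > 255 ? 255 : R);
        G = G < 0 ? 0 : (G > 255 ? 255 : G);
        B = B < 0 ? 0 : (B > 255 ? 255 : B);
        uint8_t* px = rgbaOut + (static_cast<size_t>(j) * w + i) * 4;
        px[0] = static_cast<uint8_t>(R);
        px[1] = static_cast<uint8_t>(G);
        px[2] = static_cast<uint8_t>(B);
        // The alpha side-frame is itself 4:2:0 VP9; its luma plane is the opacity.
        px[3] = aimg->planes[VPX_PLANE_Y][j * aimg->stride[VPX_PLANE_Y] + i];
      }
    }
  }

  vpx_codec_destroy(&color);
  vpx_codec_destroy(&alpha);
  return cimg != nullptr && aimg != nullptr;
}

If someone can confirm whether this two-decoder direction (and pulling the BlockAdditional out of the container in the first place) is the right way to go, that would already help a lot.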

Yes, I'm hitting the same problem... maybe you could try MPV.

webm libvpx vp9