Rendering a Flutter animation directly to a video

Problem description · votes: 0 · answers: 3

Given that Flutter uses its own graphics engine, is there a way to render Flutter animations directly to a video, or to create screenshots frame by frame?

One use case is that this would make demonstrations much easier for an audience.

For example, an author wants to create a Flutter animation tutorial in which they build a demo app and write a companion blog post, using animated GIFs/videos rendered directly by Flutter.

Another example: a developer outside the UI team spots a subtle bug in a complex animation. Without having to actually study the animation code, they could render the animation to a video and edit a short annotated clip, then send it to the UI team for diagnosis.

flutter dart flutter-animation
3 Answers

14 votes

It's not pretty, but I've managed to get a prototype working. First, all animations need to be driven by a single master animation controller, so that we can step to any point in the animation we want. Second, the widget tree we want to record must be wrapped in a RepaintBoundary with a global key. The RepaintBoundary and its key can produce a snapshot of the widget tree like this:

Future<Uint8List> _capturePngToUint8List() async {
  // renderBoxKey is the GlobalKey of my RepaintBoundary
  RenderRepaintBoundary boundary = renderBoxKey.currentContext.findRenderObject();

  // pixelRatio lets you render at a higher resolution than the widget's
  // actual size in the application.
  ui.Image image = await boundary.toImage(pixelRatio: 2.0);
  ByteData byteData = await image.toByteData(format: ui.ImageByteFormat.png);
  Uint8List pngBytes = byteData.buffer.asUint8List();

  return pngBytes;
}
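For reference, the wrapping described above might look like this (a minimal sketch; renderBoxKey matches the key used in the capture method, while myAnimatedWidget is a placeholder name, not from the original answer):

```dart
// Sketch of the RepaintBoundary wrapping; renderBoxKey is the GlobalKey
// read by _capturePngToUint8List.
final GlobalKey renderBoxKey = GlobalKey();

@override
Widget build(BuildContext context) {
  return RepaintBoundary(
    key: renderBoxKey,
    child: myAnimatedWidget, // the subtree driven by the master controller
  );
}
```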

The method above can then be used in a loop that captures the widget tree into pngBytes and steps the animationController forward by a deltaT determined by your desired frame rate:

double t = 0;
int i = 1;

setState(() {
  animationController.value = 0.0;
});

Map<int, Uint8List> frames = {};
// Fraction of the animation to advance per frame, for 60 fps output.
double dt = (1 / 60) / animationController.duration.inSeconds.toDouble();

while (t <= 1.0) {
  print("Rendering... ${t * 100}%");
  var bytes = await _capturePngToUint8List();
  frames[i] = bytes;

  t += dt;
  setState(() {
    animationController.value = t;
  });
  i++;
}

Finally, all of these png frames can be piped into an ffmpeg subprocess to write the video. I haven't managed to get that part working nicely yet (update: scroll down for the solution), so what I did instead was write all the png frames out to actual png files, and then manually ran ffmpeg in the folder they were written to. (Note: I used Flutter desktop to get access to my installed ffmpeg, but there is a package on pub.dev that provides ffmpeg on mobile as well.)

List<Future<File>> fileWriterFutures = [];

frames.forEach((key, value) {
  fileWriterFutures.add(_writeFile(bytes: value, location: r"D:\path\to\my\images\folder\" + "frame_$key.png"));
});

await Future.wait(fileWriterFutures);

_runFFmpeg();

Here is my file writer helper function:

Future<File> _writeFile({@required String location, @required Uint8List bytes}) async {
  File file = File(location);
  return file.writeAsBytes(bytes);
}

And here is my FFmpeg runner function:

void _runFFmpeg() async {
  // ffmpeg -y -r 60 -start_number 1 -i frame_%d.png -c:v libx264 -preset medium -tune animation -pix_fmt yuv420p test.mp4
  var process = await Process.start(
      "ffmpeg",
      [
        "-y", // replace output file if it already exists
        "-r", "60", // framerate
        "-start_number", "1",
        "-i", r"./test/frame_%d.png", // <- Change to location of images
        "-an", // don't expect audio
        "-c:v", "libx264rgb", // H.264 encoding
        "-preset", "medium",
        "-crf",
        "10", // CRF ranges 0-51, from lossless to worst compression. Sane options are 0-30
        "-tune", "animation",
        "-preset", "medium",
        "-pix_fmt", "yuv420p",
        r"./test/test.mp4" // <- Change to location of output
      ],
      mode: ProcessStartMode.inheritStdio // This mode causes some issues at times, so just remove it if it doesn't work. I use it mostly to debug the ffmpeg process' output
   );

  print("Done Rendering");
}

Update:

Since posting this answer, I have figured out how to pipe the images directly into ffmpeg without writing all the files out first. Below is the updated render function taken from one of my widgets. A few variables exist in the widget's context, but I hope their values can be inferred from it:

void render([double? pixelRatio]) async {
    // If already rendering, return
    if (isRendering) return;

    String outputFileLocation = "final.mp4";

    setState(() {
      isRendering = true;
    });

    timeline.stop();

    await timeline.animateTo(0.0, duration: const Duration(milliseconds: 700), curve: Curves.easeInOutQuad);
    setState(() {
      timeline.value = 0.0;
    });

    await Future.delayed(const Duration(milliseconds: 100));

    try {
      int width = canvasSize.width.toInt();
      int height = canvasSize.height.toInt();
      int frameRate = 60;
      int numberOfFrames = frameRate * (timeline.duration!.inSeconds);

      print("starting ffmpeg..");
      var process = await Process.start(
          "ffmpeg",
          [
            "-y", // replace output file if it already exists
            // "-f", "rawvideo",
            // "-pix_fmt", "rgba",
            "-s", "${width}x$height", // size
            "-r", "$frameRate", // framerate
            "-i", "-",
            "-frames", "$numberOfFrames",
            "-an", // don't expect audio
            "-c:v", "libx264rgb", // H.264 encoding
            "-preset", "medium",
            "-crf",
            "10", // CRF ranges 0-51, from lossless to worst compression. Sane options are 0-30
            "-tune", "animation",
            "-preset", "medium",
            "-pix_fmt", "yuv420p",
            "-vf",
            "pad=ceil(iw/2)*2:ceil(ih/2)*2", // ensure width and height is divisible by 2
            outputFileLocation
          ],
          mode: ProcessStartMode.detachedWithStdio,
          runInShell: true);

      print("writing to ffmpeg...");
      RenderRepaintBoundary boundary = paintKey.currentContext!.findRenderObject()! as RenderRepaintBoundary;

      pixelRatio = pixelRatio ?? 1.0;
      print("Pixel Ratio: $pixelRatio");

      for (int i = 0; i <= numberOfFrames; i++) {
        Timeline.startSync("Render Video Frame");
        double t = (i.toDouble() / numberOfFrames.toDouble());
        // await timeline.animateTo(t, duration: Duration.zero);
        timeline.value = t;

        ui.Image image = await boundary.toImage(pixelRatio: pixelRatio);
        ByteData? rawData = await image.toByteData(format: ui.ImageByteFormat.png);
        var rawIntList = rawData!.buffer.asInt8List().toList();
        Timeline.finishSync();

        if (i % frameRate == 0) {
          print("${((t * 100.0) * 100).round() / 100}%");
        }

        process.stdin.add(rawIntList);

        image.dispose();
      }
      await process.stdin.flush();

      print("stopping ffmpeg...");
      await process.stdin.close();
      process.kill();
      print("done!");
    } catch (e) {
      print(e);
    } finally {
      await timeline.animateTo(beforeValue, duration: const Duration(milliseconds: 500), curve: Curves.easeInOutQuad);
      setState(() {
        isRendering = false;
      });
    }
  }

2 votes

I used Erik's answer as the starting point for my own implementation and wanted to add to his original answer.

After saving all png images to the target location, I used Flutter's ffmpeg package to create a video from all the images. Since it took me a while to find the right settings to produce a video that also plays in QuickTime Player, I want to share them:

// Requires the flutter_ffmpeg package and intl (for DateFormat).
final FlutterFFmpeg _flutterFFmpeg =
    new FlutterFFmpeg(); // Create new ffmpeg instance somewhere in your code

// Function to create the video. All png files must be available in your target location prior to calling this function.
Future<String> _createVideoFromPngFiles(String location, int framerate) async {
  final dateAsString = DateFormat('ddMMyyyy_hhmmss').format(DateTime.now());
  final filePath =
      "$location/video_$dateAsString.mov"; // had to use mov to be able to play the video on QuickTime

  var arguments = [
    "-y", // Replace output file if it already exists
    "-r", "$framerate", // Your target framerate
    "-start_number", "1",
    "-i",
    "$location/frame_%d.png", // The location where you saved all your png files
    "-an", // Don't expect audio
    "-c:v",
    "libx264", // H.264 encoding, make sure to use the full-gpl ffmpeg package version
    "-preset", "medium",
    "-crf",
    "10", // Ranges 0-51 indicates lossless compression to worst compression. Sane options are 0-30
    "-tune", "animation",
    "-preset", "medium",
    "-pix_fmt",
    "yuv420p", // Set the pixel format to make it compatible for QuickTime
    "-vf",
    "pad=ceil(iw/2)*2:ceil(ih/2)*2", // Make sure that height and width are divisible by 2
    filePath
  ];

  final result = await _flutterFFmpeg.executeWithArguments(arguments);
  return result == 0
      ? filePath
      : ''; // Result == 0 indicates that video creation was successful
}
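A hypothetical call site (framesDir is an illustrative name, not from the original answer; the png frames must already exist in that directory):

```dart
// Illustrative usage; framesDir is assumed to contain frame_1.png ... frame_N.png.
final String videoPath = await _createVideoFromPngFiles(framesDir.path, 60);
if (videoPath.isEmpty) {
  print("Video creation failed");
} else {
  print("Video written to $videoPath");
}
```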

If you are using libx264, make sure to follow the flutter_ffmpeg package's instructions: you must use the full-gpl version, which includes the x264 library.

Depending on the length of your animation, the desired frame rate, the pixel ratio, and your device's memory, keeping all frames in memory before writing them to files can lead to memory issues. Depending on your use case, you may therefore want to pause/resume the animation and write the files in batches, so that you don't risk exceeding the available memory.
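The batching idea can be sketched like this (a minimal sketch, assuming the _capturePngToUint8List helper and master animationController from the accepted answer; renderFramesInBatches, location, and batchSize are illustrative names, not from the original answers):

```dart
// Sketch only: capture frames in batches and wait for each batch to be
// written to disk before capturing the next, so at most batchSize encoded
// frames are held in memory at once. Assumes this lives in the same State
// as _capturePngToUint8List and animationController.
Future<void> renderFramesInBatches(String location, {int batchSize = 30}) async {
  const int frameRate = 60;
  final int totalFrames = frameRate * animationController.duration.inSeconds;

  for (int start = 1; start <= totalFrames; start += batchSize) {
    final int end = (start + batchSize - 1 < totalFrames)
        ? start + batchSize - 1
        : totalFrames;

    final List<Future<File>> writes = [];
    for (int i = start; i <= end; i++) {
      // Step the master controller to this frame's position in [0, 1].
      animationController.value = i / totalFrames;
      final bytes = await _capturePngToUint8List();
      writes.add(File("$location/frame_$i.png").writeAsBytes(bytes));
    }

    // Block until this batch is on disk before starting the next batch.
    await Future.wait(writes);
  }
}
```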


0 votes

Render package (January 2023)

I have since developed an advanced render package that takes @Erik W.'s approach, optimizes the repaint-boundary capturing (the image writing and conversion process), and makes the associated ffmpeg functions easier to use.

The Render widget wraps your widget:

import 'package:render/render.dart';

final controller = RenderController();

@override
Widget build(BuildContext context) {
   return Render(
      controller: controller,
      child: Container(),
   );
}

Then capture the motion with the controller:

final result = await controller.captureMotion(
     duration,
     format: Format.gif,
);

final file = result.output;
