How do I render a video stream captured by an ESP32 camera?

Votes: 0 · Answers: 1

I'm new to Stack Overflow! Nice to meet you.

I want to render live video captured by an ESP32 camera in Unity.

But when I play the project, the Game view only shows Unity's default background, and the console shows no errors. Please help me render the ESP32 camera's video in Unity!

I know the camera's stream address (e.g. http://192.1.1.1:81/stream), and I want to display the captured video (an MJPEG stream) on a RawImage object on a Canvas. I wrote a script that does the following (with help from ChatGPT):

using UnityEngine;
using UnityEngine.UI;
using System.IO;
using System.Net;
using System.Collections;
using System.Linq;  // for SequenceEqual

public class MJPEGStream : MonoBehaviour
{
    public string url = "http://<ESP32_IP>:81/stream"; // MJPEG stream URL
    public RawImage rawImage; // RawImage component that displays the stream

    private Texture2D texture;

    void Start()
    {
        if (rawImage == null)
        {
            Debug.LogError("RawImage 컴포넌트가 할당되지 않았습니다!");
            return; // rawImage가 null이면 함수 종료
        }

        // texture = new Texture2D(2, 2); // initialize with an empty texture
        texture = new Texture2D(0, 0);
        rawImage.texture = texture; // assign the texture to the RawImage

        StartCoroutine(StartStream());
    }

    IEnumerator StartStream()
    {
        if (string.IsNullOrEmpty(url))
        {
            Debug.LogError("스트리밍 URL이 비어있습니다!");
            yield break;
        }

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        WebResponse response = request.GetResponse();
        Stream stream = response.GetResponseStream();

        byte[] boundary = System.Text.Encoding.ASCII.GetBytes("--123456789000000000000987654321"); // multipart boundary

        while (true)
        {
            byte[] jpegData = ReadNextJPEGFrame(stream, boundary);
            if (jpegData != null)
            {
                texture.LoadImage(jpegData);
                Debug.Log("Texture width: " + texture.width + ", height: " + texture.height);
                rawImage.texture = texture;
                Debug.Log("Texture format: " + texture.format);
            }
            yield return null;
        }
    }

    // Reads the next JPEG frame from the stream, delimited by the boundary
    byte[] ReadNextJPEGFrame(Stream stream, byte[] boundary)
    {
        MemoryStream jpegStream = new MemoryStream();
        bool foundBoundary = false;

        while (true)
        {
            int byteRead = stream.ReadByte();
            if (byteRead == -1)
                return null; // stop when the end of the stream is reached

            jpegStream.WriteByte((byte)byteRead);

            // look for the boundary and extract the JPEG data that precedes it
            if (jpegStream.Length >= boundary.Length)
            {
                byte[] lastBytes = new byte[boundary.Length];
                jpegStream.Position = jpegStream.Length - boundary.Length;
                jpegStream.Read(lastBytes, 0, boundary.Length);

                if (lastBytes.SequenceEqual(boundary))  // byte-wise comparison
                {
                    // boundary found
                    byte[] jpegData = new byte[jpegStream.Length - boundary.Length];
                    jpegStream.Position = 0;
                    jpegStream.Read(jpegData, 0, jpegData.Length);
                    return jpegData; // return the JPEG frame
                }
            }
        }
    }
}
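
One thing worth double-checking with a script like this is that the hard-coded boundary matches what the camera actually sends; the stream's Content-Type header carries it (e.g. "multipart/x-mixed-replace;boundary=..."). A minimal sketch of a helper that derives it from the header instead of hard-coding it (my own addition, not part of the script above):

    // Sketch: derive the multipart boundary from the Content-Type header
    // instead of hard-coding it; falls back to a given default value.
    static byte[] GetBoundary(System.Net.WebResponse response, string fallback)
    {
        string contentType = response.ContentType ?? "";
        int i = contentType.IndexOf("boundary=");
        string boundary = i >= 0 ? contentType.Substring(i + "boundary=".Length) : fallback;
        return System.Text.Encoding.ASCII.GetBytes("--" + boundary.Trim('"'));
    }

Inside StartStream this would replace the byte[] boundary = ... line, called as GetBoundary(response, "123456789000000000000987654321").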

I added this component to the RawImage object. This screenshot shows the RawImage object's setup: rawimageInspector

Tags: esp32, unityscript
1 Answer
0 votes

I found a good example for my case! After a small fix to this project (github.com/DanielArnett/SampleUnityMjpegViewer), I was able to stream the camera's video in Unity.

When I played the project, I ran into the following two errors:

1. Assets\Scripts\MjpegTexture.cs(4,19): error CS0234: The type or namespace name 'UI' does not exist in the namespace 'UnityEngine' (are you missing an assembly reference?)
2. Assets\Scripts\MjpegProcessor.cs(14,12): error CS0246: The type or namespace name 'Bitmap' could not be found (are you missing a using directive or an assembly reference?)

To fix the first error, I did the following (see the sketch after this list):

  1. Window > Package Manager.
  2. Search for "Unity UI" in the Unity Registry and install it.
  3. Add "using UnityEngine.UI;" to "MjpegTexture.cs".
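
After step 3, the top of MjpegTexture.cs looks roughly like this (a sketch; the other using lines are whatever the repository already declares):

using UnityEngine;
using UnityEngine.UI; // added so that RawImage and the other UI types resolve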

To fix the second error, I replaced the "Bitmap"-related code with "Texture2D".
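
As a rough illustration of that replacement (a sketch with my own naming, not the repository's exact code): wherever MjpegProcessor.cs turned a frame's JPEG bytes into a System.Drawing.Bitmap, Unity can decode them directly into a Texture2D with LoadImage:

using UnityEngine;

public static class JpegFrameDecoder
{
    // Decodes one MJPEG frame's JPEG bytes into a Texture2D.
    // Reusing the same texture across frames avoids per-frame allocations.
    public static Texture2D Decode(byte[] jpegBytes, Texture2D reuse = null)
    {
        Texture2D tex = reuse != null ? reuse : new Texture2D(2, 2);
        // LoadImage replaces the texture's contents and resizes it to the
        // JPEG's own dimensions; it returns false on malformed data.
        if (!tex.LoadImage(jpegBytes))
        {
            Debug.LogWarning("Failed to decode JPEG frame.");
        }
        return tex;
    }
}

Unlike Bitmap, the resulting Texture2D can be assigned straight to a RawImage, so the System.Drawing dependency disappears entirely.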

I hope this helps someone in the future!
