How to play and seek fragmented MP4 audio using an MSE SourceBuffer?


Note:

If you've ended up here, you probably want to take a look at shaka-player and the accompanying shaka-streamer. Use them. Don't implement this yourself unless you really have to.

I have been trying for quite some time now to be able to play an audio track on Chrome, Firefox, Safari, etc., but I keep hitting a wall. My current problem is that I cannot seek within a fragmented MP4 (or MP3).

Currently I am converting audio files such as MP3 to fragmented MP4 (fMP4) and sending them to the client in chunks. What I do is define a

CHUNK_DURATION_SEC
(chunk duration in seconds) and calculate the chunk size like this:

chunksTotal = Math.ceil(this.track.duration / CHUNK_DURATION_SEC);
chunkSize = Math.ceil(this.track.fileSize / this.chunksTotal);

With this I partition the audio file and can fetch all of it by jumping

chunkSize
bytes for each chunk:

-----------------------------------------
| chunk 1 | chunk 2 |   ...   | chunk n |
-----------------------------------------
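The partitioning above can be sketched as a small helper. This is only an illustration of the byte math; the track duration and file size in the example are made up:

```typescript
const CHUNK_DURATION_SEC = 20;

/** Computes the byte range for one chunk, mirroring the partitioning above. */
function chunkRange(durationSec: number, fileSize: number, chunkIndex: number) {
  const chunksTotal = Math.ceil(durationSec / CHUNK_DURATION_SEC);
  const chunkSize = Math.ceil(fileSize / chunksTotal);
  const start = chunkIndex * chunkSize;
  // Clamp the last chunk so the range never runs past the end of the file.
  const end = Math.min(start + chunkSize - 1, fileSize - 1);
  return { chunksTotal, chunkSize, start, end };
}

// A 200 s track of 10,000,000 bytes → 10 chunks of 1,000,000 bytes each;
// chunk 2 covers bytes 2,000,000 to 2,999,999.
console.log(chunkRange(200, 10_000_000, 2));
```

(As the answer below points out, this even byte-based partitioning is exactly the assumption that breaks for fMP4.)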

How the audio file is converted to fMP4

ffmpeg -i input.mp3 -acodec aac -b:a 256k -f mp4 \
       -movflags faststart+frag_every_frame+empty_moov+default_base_moof \
        output.mp4

This seems to work with Chrome and Firefox (so far).

How the chunks are appended

After following this example, and realizing that it simply doesn't work as explained here, I threw it away and started over from scratch. Unfortunately, without success. It still doesn't work.

The following code should play a track from beginning to end. However, I also need to be able to seek. So far, that doesn't work at all. Seeking simply stops the audio after the

seeking
event has been triggered.

The code

/* Desired chunk duration in seconds. */
const CHUNK_DURATION_SEC = 20;

const AUDIO_EVENTS = [
  'ended',
  'error',
  'play',
  'playing',
  'seeking',
  'seeked',
  'pause',
  'timeupdate',
  'canplay',
  'loadedmetadata',
  'loadstart',
  'updateend',
];


class ChunksLoader {

  /** The total number of chunks for the track. */
  public readonly chunksTotal: number;

  /** The length of one chunk in bytes */
  public readonly chunkSize: number;

  /** Keeps track of requested chunks. */
  private readonly requested: boolean[];

  /** URL of endpoint for fetching audio chunks. */
  private readonly url: string;

  constructor(
    private track: Track,
    private sourceBuffer: SourceBuffer,
    private logger: NGXLogger,
  ) {

    this.chunksTotal = Math.ceil(this.track.duration / CHUNK_DURATION_SEC);
    this.chunkSize = Math.ceil(this.track.fileSize / this.chunksTotal);

    this.requested = [];
    for (let i = 0; i < this.chunksTotal; i++) {
      this.requested[i] = false;
    }

    this.url = `${environment.apiBaseUrl}/api/tracks/${this.track.id}/play`;
  }

  /**
   * Fetch the first chunk.
   */
  public begin() {
    this.maybeFetchChunk(0);
  }

  /**
   * Handler for the "timeupdate" event. Checks if the next chunk should be fetched.
   *
   * @param currentTime
   *  The current time of the track which is currently played.
   */
  public handleOnTimeUpdate(currentTime: number) {

    const nextChunkIndex = Math.floor(currentTime / CHUNK_DURATION_SEC) + 1;
    const hasAllChunks = this.requested.every(val => !!val);

    if (nextChunkIndex === (this.chunksTotal - 1) && hasAllChunks) {
      this.logger.debug('Last chunk. Calling mediaSource.endOfStream();');
      return;
    }

    if (this.requested[nextChunkIndex] === true) {
      return;
    }

    if (currentTime < CHUNK_DURATION_SEC * (nextChunkIndex - 1 + 0.25)) {
      return;
    }

    this.maybeFetchChunk(nextChunkIndex);
  }

  /**
   * Fetches the chunk if it hasn't been requested yet. After the request finished, the returned
   * chunk gets appended to the SourceBuffer-instance.
   *
   * @param chunkIndex
   *  The chunk to fetch.
   */
  private maybeFetchChunk(chunkIndex: number) {

    const start = chunkIndex * this.chunkSize;
    const end = start + this.chunkSize - 1;

    if (this.requested[chunkIndex] == true) {
      return;
    }

    this.requested[chunkIndex] = true;

    if ((end - start) == 0) {
      this.logger.warn('Nothing to fetch.');
      return;
    }

    const totalKb = ((end - start) / 1000).toFixed(2);
    this.logger.debug(`Starting to fetch bytes ${start} to ${end} (total ${totalKb} kB). Chunk ${chunkIndex + 1} of ${this.chunksTotal}`);

    const xhr = new XMLHttpRequest();
    xhr.open('get', this.url);
    xhr.setRequestHeader('Authorization', `Bearer ${AuthenticationService.getJwtToken()}`);
    xhr.setRequestHeader('Range', 'bytes=' + start + '-' + end);
    xhr.responseType = 'arraybuffer';
    xhr.onload = () => {
      this.logger.debug(`Range ${start} to ${end} fetched`);
      this.logger.debug(`Requested size:        ${end - start + 1}`);
      this.logger.debug(`Fetched size:          ${xhr.response.byteLength}`);
      this.logger.debug('Appending chunk to SourceBuffer.');
      this.sourceBuffer.appendBuffer(xhr.response);
    };
    xhr.send();
  };

}

export enum StreamStatus {
  NOT_INITIALIZED,
  INITIALIZING,
  PLAYING,
  SEEKING,
  PAUSED,
  STOPPED,
  ERROR
}

export class PlayerState {
  status: StreamStatus = StreamStatus.NOT_INITIALIZED;
}


/**
 *
 */
@Injectable({
  providedIn: 'root'
})
export class MediaSourcePlayerService {

  public track: Track;

  private mediaSource: MediaSource;

  private sourceBuffer: SourceBuffer;

  private audioObj: HTMLAudioElement;

  private chunksLoader: ChunksLoader;

  private state: PlayerState = new PlayerState();

  private state$ = new BehaviorSubject<PlayerState>(this.state);

  public stateChange = this.state$.asObservable();

  private currentTime$ = new BehaviorSubject<number>(null);

  public currentTimeChange = this.currentTime$.asObservable();

  constructor(
    private httpClient: HttpClient,
    private logger: NGXLogger
  ) {
  }

  get canPlay() {
    const state = this.state$.getValue();
    const status = state.status;
    return status == StreamStatus.PAUSED;
  }

  get canPause() {
    const state = this.state$.getValue();
    const status = state.status;
    return status == StreamStatus.PLAYING || status == StreamStatus.SEEKING;
  }

  public playTrack(track: Track) {
    this.logger.debug('playTrack');
    this.track = track;
    this.startPlayingFrom(0);
  }

  public play() {
    this.logger.debug('play()');
    this.audioObj.play().then();
  }

  public pause() {
    this.logger.debug('pause()');
    this.audioObj.pause();
  }

  public stop() {
    this.logger.debug('stop()');
    this.audioObj.pause();
  }

  public seek(seconds: number) {
    this.logger.debug('seek()');
    this.audioObj.currentTime = seconds;
  }

  private startPlayingFrom(seconds: number) {
    this.logger.info(`Start playing from ${seconds.toFixed(2)} seconds`);
    this.mediaSource = new MediaSource();
    this.mediaSource.addEventListener('sourceopen', this.onSourceOpen);

    this.audioObj = document.createElement('audio');
    this.addEvents(this.audioObj, AUDIO_EVENTS, this.handleEvent);
    this.audioObj.src = URL.createObjectURL(this.mediaSource);

    this.audioObj.play().then();
  }

  private onSourceOpen = () => {

    this.logger.debug('onSourceOpen');

    this.mediaSource.removeEventListener('sourceopen', this.onSourceOpen);
    this.mediaSource.duration = this.track.duration;

    this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
    // this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/mpeg');

    this.chunksLoader = new ChunksLoader(
      this.track,
      this.sourceBuffer,
      this.logger
    );

    this.chunksLoader.begin();
  };

  private handleEvent = (e) => {

    const currentTime = this.audioObj.currentTime.toFixed(2);
    const totalDuration = this.track.duration.toFixed(2);
    this.logger.warn(`MediaSource event: ${e.type} (${currentTime} of ${totalDuration} sec)`);

    this.currentTime$.next(this.audioObj.currentTime);

    const currentStatus = this.state$.getValue();

    switch (e.type) {
      case 'playing':
        currentStatus.status = StreamStatus.PLAYING;
        this.state$.next(currentStatus);
        break;
      case 'pause':
        currentStatus.status = StreamStatus.PAUSED;
        this.state$.next(currentStatus);
        break;
      case 'timeupdate':
        this.chunksLoader.handleOnTimeUpdate(this.audioObj.currentTime);
        break;
      case 'seeking':
        currentStatus.status = StreamStatus.SEEKING;
        this.state$.next(currentStatus);
        if (this.mediaSource.readyState == 'open') {
          this.sourceBuffer.abort();
        }
        this.chunksLoader.handleOnTimeUpdate(this.audioObj.currentTime);
        break;
    }
  };

  private addEvents(obj, events, handler) {
    events.forEach(event => obj.addEventListener(event, handler));
  }

}

Running it gives me the following output:

(screenshot of the Chrome console output)

Apologies for the screenshot, but it's not possible to copy just the output without all of the stack traces from Chrome.

I have also tried to follow this example and to call

sourceBuffer.abort()
, but that didn't work. It looks more like a hack from a few years ago, but it is still referenced in the docs (see "Example" → "You can see something similar in action in Nick Desaulnier's bufferWhenNeeded demo .."):

case 'seeking':
  currentStatus.status = StreamStatus.SEEKING;
  this.state$.next(currentStatus);        
  if (this.mediaSource.readyState === 'open') {
    this.sourceBuffer.abort();
  } 
  break;

Trying it with MP3

I have also tested the code above under Chrome with the track converted to MP3:

ffmpeg -i input.mp3 -acodec aac -b:a 256k -f mp3 output.mp3

and creating the

SourceBuffer
with

audio/mpeg
as the type:

this.mediaSource.addSourceBuffer('audio/mpeg')

I run into the same problem when seeking.

The problem without seeking

The code above has another problem:

After two minutes of playback, the audio starts to stutter and stops early. So the audio plays up to a certain point and then stops for no apparent reason.

For whatever reason, there is another

canplay
playing

event. A few seconds later, the audio simply stops.

(screenshot of the console output)

ffmpeg mp4 audio-streaming media-source fmp4
1 Answer

This assumption is wrong:

chunkSize = Math.ceil(this.track.fileSize / this.chunksTotal);

The comments got it right: a "box" refers to the internal structure of an MP4 file. An fMP4 stream can only be split at box boundaries, so cutting it into equally sized byte ranges produces fragments the decoder cannot parse.
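For illustration: an MP4 box header is just a 32-bit big-endian size followed by a four-character type, and each box must stay intact. A minimal sketch of reading top-level box headers (64-bit and to-end-of-file sizes are not handled here):

```typescript
/** Lists top-level MP4 box headers. Each box declares its own length,
 *  which is why an fMP4 can only be cut at box boundaries. */
function listBoxes(data: Uint8Array): { type: string; size: number }[] {
  const boxes: { type: string; size: number }[] = [];
  const view = new DataView(data.buffer, data.byteOffset, data.byteLength);
  let offset = 0;
  while (offset + 8 <= data.byteLength) {
    const size = view.getUint32(offset); // big-endian 32-bit box size
    const type = String.fromCharCode(...Array.from(data.subarray(offset + 4, offset + 8)));
    boxes.push({ type, size });
    if (size < 8) break; // size 0 (to EOF) or 1 (64-bit) is not handled in this sketch
    offset += size;
  }
  return boxes;
}
```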

Without going deep into the MP4 structure, here is a command that creates segmented MP4 files from a movie:

ffmpeg -y -i ./Coffee_Run_Blender_Open_Movie.mp4 -c copy -f hls -hls_segment_type fmp4 -hls_list_size 0 ./test.m3u8

Now,

./test.m3u8
contains the segment file names and their durations in plain text for you to parse:

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-MAP:URI="init.mp4"
#EXTINF:2.333333,
test0.m4s
#EXTINF:1.958333,
test1.m4s
...

Note how the segment durations, and therefore the file sizes, are not all the same.
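The durations can be pulled out of the playlist with a few lines of parsing. A minimal sketch that only handles the #EXTINF lines shown above, not the full HLS syntax:

```typescript
interface PlaylistSegment { uri: string; start: number; duration: number; }

/** Extracts segment URIs and durations from an HLS playlist and
 *  accumulates each segment's absolute start time. */
function parsePlaylist(m3u8: string): PlaylistSegment[] {
  const lines = m3u8.split('\n').map(line => line.trim());
  const segments: PlaylistSegment[] = [];
  let start = 0;
  for (let i = 0; i < lines.length; i++) {
    const match = lines[i].match(/^#EXTINF:([\d.]+)/);
    if (match) {
      const duration = parseFloat(match[1]);
      segments.push({ uri: lines[i + 1], start, duration }); // the URI follows its #EXTINF line
      start += duration;
    }
  }
  return segments;
}
```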

Here is the HTML code that plays the segments sequentially:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>MSE Demo</title>
</head>
<body>
<h1>MSE Demo</h1>
<div>
  <video controls width="80%"></video>
</div>

<script type="text/javascript">
  function playVideo() {
    var baseUrl = './';
    var initUrl = baseUrl + 'init.mp4';
    var templateUrl = baseUrl + 'test$Number$.m4s';
    var sourceBuffer;
    var index = 0;
    var numberOfChunks = 93;
    var video = document.querySelector('video');

    if (!window.MediaSource) {
      console.error('No Media Source API available');
      return;
    }

    var ms = new MediaSource();
    video.src = window.URL.createObjectURL(ms);
    video.addEventListener('error', function(e) { console.error("video error", e); });
    ms.addEventListener('sourceopen', onMediaSourceOpen);
    ms.addEventListener('sourceended', function(e) { console.warn("sourceended"); });
    ms.addEventListener('sourceclose', function(e) { console.warn("sourceclose"); });

    function onMediaSourceOpen() {
      // codec string obtained via: mp4info init.mp4 | grep Codec
      sourceBuffer = ms.addSourceBuffer('video/mp4; codecs="avc1.64002A,mp4a.40.2"');
      sourceBuffer.addEventListener('updateend', nextSegment);

      GET(initUrl, appendToBuffer);

      video.play();
    }

    function nextSegment() {
      console.log("nextSegment", video.currentTime);
      var url = templateUrl.replace('$Number$', index);
      GET(url, appendToBuffer);
      index++;
      if (index > numberOfChunks) {
        sourceBuffer.removeEventListener('updateend', nextSegment);
      }
    }

    function appendToBuffer(videoChunk) {
      if (videoChunk) {
        // Uncaught DOMException: Failed to execute 'appendBuffer' on 'SourceBuffer': The SourceBuffer is full, and cannot free space to append additional buffers.
        sourceBuffer.appendBuffer(new Uint8Array(videoChunk));
      }
    }

    function GET(url, callback) {
      console.log("GET", url);
      var xhr = new XMLHttpRequest();
      xhr.open('GET', url);
      xhr.responseType = 'arraybuffer';

      xhr.onload = function(e) {
        console.log("FETCHED", url, xhr.status);
        if (xhr.status != 200) {
          console.warn('Unexpected status code ' + xhr.status + ' for ' + url);
          return false;
        }
        callback(xhr.response);
      };

      xhr.send();
    }
  }
</script>
<button onclick="playVideo()">(PLAY)</button>
</body>
</html>

To skip some segments and to seek within a given segment:

This is a two-step seek calculation. First, find the segment that contains the absolute seek point; then compute the seek position relative to that segment, taking into account the order in which the segments were appended.
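That two-step calculation could be sketched as follows, assuming a segment list with the cumulative start times already known (for example from parsing the playlist):

```typescript
interface Segment { start: number; duration: number; }

/** Step 1: find the segment containing the absolute seek time.
 *  Step 2: return the offset relative to that segment's start. */
function locateSeek(
  segments: Segment[],
  absoluteTime: number,
): { index: number; offset: number } | null {
  for (let i = 0; i < segments.length; i++) {
    const { start, duration } = segments[i];
    if (absoluteTime >= start && absoluteTime < start + duration) {
      return { index: i, offset: absoluteTime - start };
    }
  }
  return null; // seek point is past the end of the track
}

// Seeking to t = 3.0 s with the playlist above lands in the second
// segment (test1.m4s), about 0.666667 s into it.
console.log(locateSeek(
  [{ start: 0, duration: 2.333333 }, { start: 2.333333, duration: 1.958333 }],
  3.0,
));
```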

MediaSource does not work on the iPhone, but you can point the video element directly at

./test.m3u8
and do an absolute seek, and it just works.
