I've been trying to use an .srt file as a timed-text source (only available on Android 4.1+, http://developer.android.com/about/versions/android-4.1.html#Multimedia). The first problem is getting a file descriptor for the .srt file (how else would you bundle it with the app than in the assets folder?). The file gets compressed automatically, so you can't even read it without changing the build settings or doing a custom build. The easiest workaround is to rename the .srt file to .jpg so that it is not compressed and the openFd method still works. I then add the timed-text source:
_myMP.addTimedTextSource(getAssets().openFd("captions.jpg").getFileDescriptor(), MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);
The file now loads correctly, and fetching the track list with myMP.getTrackInfo() shows that, after adding the timed-text source, the sixth track has type "3", which is the timed-text track type. I selected this track with selectTrack as described in the Google docs, but after doing so no subtitles show up in my TimedTextListener:
_myMP.setOnTimedTextListener(new OnTimedTextListener() {
    @Override
    public void onTimedText(MediaPlayer mp, TimedText text) {
        if (text != null)
            Log.d("TimedText", text.getText());
    }
});
It fires only once (I have 20 timed-text events in the file), and the text parameter is always null. I have searched and cannot find a single working code sample that uses TimedText; it does not appear in any of the sample projects, and there is almost no documentation beyond Google's API reference. As far as I can tell, nobody has published a working example of it. I tested this on a Google Nexus updated to Android 4.2.
I was able to make this work, and since it is still an open issue I will include the full solution here.

While the idea of changing the file extension to prevent compression is nice, I preferred to copy the .srt file from the resources to the app's local directory on the device. For completeness, though, here is the list of extensions that are not compressed:

".jpg", ".jpeg", ".png", ".gif", ".wav", ".mp2", ".mp3", ".ogg", ".aac", ".mpg", ".mpeg", ".mid", ".midi", ".smf", ".jet", ".rtttl", ".imy", ".xmf", ".mp4", ".m4a", ".m4v", ".3gp", ".3gpp", ".3g2", ".3gpp2", ".amr", ".awb", ".wma", ".wmv"
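Since the build tool decides compression purely by file extension, the rename trick can be sanity-checked with a tiny helper. This is just an illustrative sketch; the class and method names are my own, not part of any Android API:

```java
import java.util.Arrays;
import java.util.List;

public class NoCompressCheck {
    // Extensions left uncompressed in the APK (the list above).
    private static final List<String> NO_COMPRESS_EXT = Arrays.asList(
            ".jpg", ".jpeg", ".png", ".gif", ".wav", ".mp2", ".mp3", ".ogg",
            ".aac", ".mpg", ".mpeg", ".mid", ".midi", ".smf", ".jet", ".rtttl",
            ".imy", ".xmf", ".mp4", ".m4a", ".m4v", ".3gp", ".3gpp", ".3g2",
            ".3gpp2", ".amr", ".awb", ".wma", ".wmv");

    // True if an asset with this name is stored uncompressed, so that
    // AssetManager.openFd() can return a usable file descriptor for it.
    public static boolean isStoredUncompressed(String fileName) {
        String lower = fileName.toLowerCase();
        for (String ext : NO_COMPRESS_EXT) {
            if (lower.endsWith(ext)) return true;
        }
        return false;
    }
}
```

This is exactly why `captions.srt` fails with openFd while the same file renamed to `captions.jpg` works.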
The solution steps are simple:

1. Create your MediaPlayer instance and prepare it by calling MediaPlayer.create(), or player.setDataSource() followed by player.prepare().
2. Call player.addTimedTextSource() with a String containing the full path to the subtitle file on the device, and MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP as the second parameter.
3. Select the timed-text track by searching the TrackInfo[] returned by player.getTrackInfo() for the index of the timedTextType track (I found it is usually 2), and pass that index to player.selectTrack().
4. Set a listener with player.setOnTimedTextListener(), then start playing the media file with player.start().

Here is the full class:
To run this exact class you will need the two files video.mp4 and sub.srt under the res/raw folder. Then define a TextView (or anything that extends it) with the id txtDisplay. Finally, your project/device/emulator must support API 16.

import android.app.Activity;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnTimedTextListener;
import android.media.MediaPlayer.TrackInfo;
import android.media.TimedText;
import android.os.Bundle;
import android.os.Handler;
import android.util.Log;
import android.widget.TextView;

import java.io.Closeable;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Locale;

public class MainActivity extends Activity implements OnTimedTextListener {
    private static final String TAG = "TimedTextTest";
    private TextView txtDisplay;
    private static Handler handler = new Handler();

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        txtDisplay = (TextView) findViewById(R.id.txtDisplay);

        MediaPlayer player = MediaPlayer.create(this, R.raw.video);
        try {
            player.addTimedTextSource(getSubtitleFile(R.raw.sub),
                    MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);
            int textTrackIndex = findTrackIndexFor(
                    TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT, player.getTrackInfo());
            if (textTrackIndex >= 0) {
                player.selectTrack(textTrackIndex);
            } else {
                Log.w(TAG, "Cannot find text track!");
            }
            player.setOnTimedTextListener(this);
            player.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private int findTrackIndexFor(int mediaTrackType, TrackInfo[] trackInfo) {
        int index = -1;
        for (int i = 0; i < trackInfo.length; i++) {
            if (trackInfo[i].getTrackType() == mediaTrackType) {
                return i;
            }
        }
        return index;
    }

    private String getSubtitleFile(int resId) {
        String fileName = getResources().getResourceEntryName(resId);
        File subtitleFile = getFileStreamPath(fileName);
        if (subtitleFile.exists()) {
            Log.d(TAG, "Subtitle already exists");
            return subtitleFile.getAbsolutePath();
        }
        Log.d(TAG, "Subtitle does not exist, copy it from res/raw");

        // Copy the file from the res/raw folder to your app folder on the
        // device
        InputStream inputStream = null;
        OutputStream outputStream = null;
        try {
            inputStream = getResources().openRawResource(resId);
            outputStream = new FileOutputStream(subtitleFile, false);
            copyFile(inputStream, outputStream);
            return subtitleFile.getAbsolutePath();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            closeStreams(inputStream, outputStream);
        }
        return "";
    }

    private void copyFile(InputStream inputStream, OutputStream outputStream)
            throws IOException {
        final int BUFFER_SIZE = 1024;
        byte[] buffer = new byte[BUFFER_SIZE];
        int length = -1;
        while ((length = inputStream.read(buffer)) != -1) {
            outputStream.write(buffer, 0, length);
        }
    }

    // A handy method I use to close all the streams
    private void closeStreams(Closeable... closeables) {
        if (closeables != null) {
            for (Closeable stream : closeables) {
                if (stream != null) {
                    try {
                        stream.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }

    @Override
    public void onTimedText(final MediaPlayer mp, final TimedText text) {
        if (text != null) {
            handler.post(new Runnable() {
                @Override
                public void run() {
                    int seconds = mp.getCurrentPosition() / 1000;
                    txtDisplay.setText("[" + secondsToDuration(seconds) + "] "
                            + text.getText());
                }
            });
        }
    }

    // To display the seconds in the duration format 00:00:00
    public String secondsToDuration(int seconds) {
        return String.format(Locale.US, "%02d:%02d:%02d", seconds / 3600,
                (seconds % 3600) / 60, seconds % 60);
    }
}
Here is the subtitle file I used as an example:

1
00:00:00,220 --> 00:00:01,215
First Text Example
2
00:00:03,148 --> 00:00:05,053
Second Text Example
3
00:00:08,004 --> 00:00:09,884
Third Text Example
4
00:00:11,300 --> 00:00:12,900
Fourth Text Example
5
00:00:15,500 --> 00:00:16,700
Fifth Text Example
6
00:00:18,434 --> 00:00:20,434
Sixth Text Example
7
00:00:22,600 --> 00:00:23,700
Last Text Example
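The cue timestamps above use the hh:mm:ss,mmm format; converting one to milliseconds is worth sketching on its own, since any SRT parser has to do it (standalone Java, the class name is my own):

```java
public class SrtTime {
    // Converts an SRT timestamp such as "00:00:05,053" to milliseconds.
    public static long toMillis(String timestamp) {
        String[] parts = timestamp.trim().split(":");
        long hours = Long.parseLong(parts[0]);
        long minutes = Long.parseLong(parts[1]);
        String[] secMs = parts[2].split(",");
        long seconds = Long.parseLong(secMs[0]);
        long millis = Long.parseLong(secMs[1]);
        return ((hours * 60 + minutes) * 60 + seconds) * 1000 + millis;
    }
}
```

For example, the first cue's start time "00:00:00,220" converts to 220 ms.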
Here are some screenshots from the test app showing the TextView changing automatically (i.e. being read from the subtitle file) as the media file progresses.

EDIT: here is an example project.

Here is the SubtitleView class used by the alternate approach explained below the code:

package ca.yourpackage.yourapp;
import android.content.Context;
import android.media.MediaPlayer;
import android.util.AttributeSet;
import android.util.Log;
import android.widget.TextView;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.LineNumberReader;
import java.util.Locale;
import java.util.Map;
import java.util.TreeMap;
/**
* Created by MHDante on 2015-07-26.
*/
public class SubtitleView extends TextView implements Runnable {
    private static final String TAG = "SubtitleView";
    private static final boolean DEBUG = false;
    private static final int UPDATE_INTERVAL = 300;
    private MediaPlayer player;
    private TreeMap<Long, Line> track;

    public SubtitleView(Context context) {
        super(context);
    }

    public SubtitleView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public SubtitleView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
    }

    @Override
    public void run() {
        if (player != null && track != null) {
            int seconds = player.getCurrentPosition() / 1000;
            setText((DEBUG ? "[" + secondsToDuration(seconds) + "] " : "")
                    + getTimedText(player.getCurrentPosition()));
        }
        postDelayed(this, UPDATE_INTERVAL);
    }

    private String getTimedText(long currentPosition) {
        String result = "";
        for (Map.Entry<Long, Line> entry : track.entrySet()) {
            if (currentPosition < entry.getKey()) break;
            if (currentPosition < entry.getValue().to) result = entry.getValue().text;
        }
        return result;
    }

    // To display the seconds in the duration format 00:00:00
    public String secondsToDuration(int seconds) {
        return String.format(Locale.US, "%02d:%02d:%02d", seconds / 3600,
                (seconds % 3600) / 60, seconds % 60);
    }

    @Override
    protected void onAttachedToWindow() {
        super.onAttachedToWindow();
        postDelayed(this, UPDATE_INTERVAL);
    }

    @Override
    protected void onDetachedFromWindow() {
        super.onDetachedFromWindow();
        removeCallbacks(this);
    }

    public void setPlayer(MediaPlayer player) {
        this.player = player;
    }

    public void setSubSource(int resId, String mime) {
        if (mime.equals(MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP))
            track = getSubtitleFile(resId);
        else
            throw new UnsupportedOperationException("Parser only built for SRT subs");
    }

    /////////////Utility Methods:
    //Based on https://github.com/sannies/mp4parser/
    //Apache 2.0 Licence at: https://github.com/sannies/mp4parser/blob/master/LICENSE
    public static TreeMap<Long, Line> parse(InputStream is) throws IOException {
        LineNumberReader r = new LineNumberReader(new InputStreamReader(is, "UTF-8"));
        TreeMap<Long, Line> track = new TreeMap<>();
        while ((r.readLine()) != null) /*Read cue number*/ {
            String timeString = r.readLine();
            String lineString = "";
            String s;
            while (!((s = r.readLine()) == null || s.trim().equals(""))) {
                lineString += s + "\n";
            }
            long startTime = parse(timeString.split("-->")[0]);
            long endTime = parse(timeString.split("-->")[1]);
            track.put(startTime, new Line(startTime, endTime, lineString));
        }
        return track;
    }

    private static long parse(String in) {
        long hours = Long.parseLong(in.split(":")[0].trim());
        long minutes = Long.parseLong(in.split(":")[1].trim());
        long seconds = Long.parseLong(in.split(":")[2].split(",")[0].trim());
        long millies = Long.parseLong(in.split(":")[2].split(",")[1].trim());
        return hours * 60 * 60 * 1000 + minutes * 60 * 1000 + seconds * 1000 + millies;
    }

    private TreeMap<Long, Line> getSubtitleFile(int resId) {
        InputStream inputStream = null;
        try {
            inputStream = getResources().openRawResource(resId);
            return parse(inputStream);
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (inputStream != null) {
                try {
                    inputStream.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        return null;
    }

    public static class Line {
        long from;
        long to;
        String text;

        public Line(long from, long to, String text) {
            this.from = from;
            this.to = to;
            this.text = text;
        }
    }
}
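A side note on the lookup strategy: getTimedText above walks every cue on each update tick. Because the cues are keyed by start time in a TreeMap, the same result can be obtained with floorEntry in O(log n), assuming the cues do not overlap. A standalone sketch (the Line class here mirrors the one above; it is not a drop-in replacement, just the idea):

```java
import java.util.Map;
import java.util.TreeMap;

public class CueLookup {
    public static class Line {
        final long from;
        final long to;
        final String text;

        public Line(long from, long to, String text) {
            this.from = from;
            this.to = to;
            this.text = text;
        }
    }

    // Returns the text of the cue covering position, or "" if none does.
    // floorEntry finds the last cue starting at or before position.
    public static String textAt(TreeMap<Long, Line> track, long position) {
        Map.Entry<Long, Line> entry = track.floorEntry(position);
        if (entry != null && position < entry.getValue().to) {
            return entry.getValue().text;
        }
        return "";
    }
}
```

With a 300 ms polling interval and a handful of cues the linear scan is perfectly fine; the floorEntry form mainly matters for very long subtitle tracks.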
EDIT: I should point out that in the years since, post-KitKat versions of Android have come to account for the majority of devices running most apps. The implementation below was an attempt to bridge compatibility with older devices. At this point I recommend using the TimedText framework (which works well as of KitKat), or one of the newer alternatives published by Android, since a custom solution can carry a significant maintenance cost.

I spent two days going over the Android source code trying to work around all the bugs caused by the TimedText framework. My recommendation is to skip their implementation entirely. It is incomplete and inconsistent. In earlier versions, much of the text synchronization is done inside the native media player, so it is prone to state errors.

My alternate approach is to use a TextView subclass (the SubtitleView class above); it is used like this:
// I used and recommend prepareAsync(); create() is shown for brevity
MediaPlayer mp = MediaPlayer.create(context, R.raw.video);
SubtitleView subView = (SubtitleView) findViewById(R.id.subs_box);
subView.setPlayer(mp);
subView.setSubSource(R.raw.subs_intro, MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);
And in your layout XML:
<ca.yourpackage.yourapp.SubtitleView
android:layout_width="300dp"
android:layout_height="300dp"
android:text="Subtitles go Here"
android:id="@+id/subs_box"/>
In your layout XML file, just create a TextView where you would like the subtitles to show up, then change the class to ca.yourpackage.yourapp.SubtitleView.
Good luck.

To make this work with an .mp3 file, call player.start(); right after creating the new MediaPlayer and before the addTimedTextSource code, i.e. right after this line:

MediaPlayer player = MediaPlayer.create(this, R.raw.video);