I want to call a function that builds a video from a list of images and then saves it locally on the device:
public void CreateAndSaveVideoFile(List<Bitmap> MyBitmapArray)
{
// ..
}
What I have tried:
Creating an MPEG-4 file from all of the JPEG files in the current directory:
mencoder mf://*.jpg -mf w=800:h=600:fps=25:type=jpg -ovc lavc -lavcopts vcodec=mpeg4:mbd=2:trell -oac copy -o output.avi
I don't know how to use the above in a Java/Android project.
Can anyone help guide me and/or provide me with an approach to accomplish my task? Thanks in advance.
You can use JCodec's SequenceEncoder to convert a sequence of images into an MP4 file.
Sample code:
import org.jcodec.api.awt.SequenceEncoder;
...
SequenceEncoder enc = new SequenceEncoder(new File("filename"));
// GOP size will be supported in 0.2
// enc.getEncoder().setKeyInterval(25);
for(...) {
BufferedImage image = ... // Obtain an image to encode
enc.encodeImage(image);
}
enc.finish();
It's a pure Java library, so it's easy to import into an Android project; you don't have to use the NDK as you would with ffmpeg.
See http://jcodec.org/ for sample code and downloads.
Using JCodec
public static void main(String[] args) throws IOException {
SequenceEncoder encoder = new SequenceEncoder(new File("video.mp4"));
for (int i = 1; i < 100; i++) {
BufferedImage bi = ImageIO.read(new File(String.format("img%08d.png", i)));
encoder.encodeImage(bi);
}
encoder.finish();
}
Now, to convert a Bitmap to a BufferedImage, you can use this class:
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.awt.image.DataBufferInt;
import java.io.IOException;
import java.io.InputStream;
/**
* Utility class for loading windows bitmap files
* <p>
* Based on code from author Abdul Bezrati and Pepijn Van Eeckhoudt
*/
public class BitmapLoader {
/**
* Static method to load a bitmap file based on the filename passed in.
* Based on the bit count, this method will either call the 8 or 24 bit
* bitmap reader methods
*
* @param file The name of the bitmap file to read
* @throws IOException
* @return A BufferedImage of the bitmap
*/
public static BufferedImage loadBitmap(String file) throws IOException {
BufferedImage image;
InputStream input = null;
try {
input = new java.io.FileInputStream(file); // original used an undefined ResourceRetriever helper
int bitmapFileHeaderLength = 14;
int bitmapInfoHeaderLength = 40;
byte bitmapFileHeader[] = new byte[bitmapFileHeaderLength];
byte bitmapInfoHeader[] = new byte[bitmapInfoHeaderLength];
readBuffer(input, bitmapFileHeader);
readBuffer(input, bitmapInfoHeader);
int nSize = bytesToInt(bitmapFileHeader, 2);
int nWidth = bytesToInt(bitmapInfoHeader, 4);
int nHeight = bytesToInt(bitmapInfoHeader, 8);
int nBiSize = bytesToInt(bitmapInfoHeader, 0);
int nPlanes = bytesToShort(bitmapInfoHeader, 12);
int nBitCount = bytesToShort(bitmapInfoHeader, 14);
int nSizeImage = bytesToInt(bitmapInfoHeader, 20);
int nCompression = bytesToInt(bitmapInfoHeader, 16);
int nColoursUsed = bytesToInt(bitmapInfoHeader, 32);
int nXPixelsMeter = bytesToInt(bitmapInfoHeader, 24);
int nYPixelsMeter = bytesToInt(bitmapInfoHeader, 28);
int nImportantColours = bytesToInt(bitmapInfoHeader, 36);
if (nBitCount == 24) {
image = read24BitBitmap(nSizeImage, nHeight, nWidth, input);
} else if (nBitCount == 8) {
image = read8BitBitmap(nColoursUsed, nBitCount, nSizeImage, nWidth, nHeight, input);
} else {
System.out.println("Not a 24-bit or 8-bit Windows Bitmap, aborting...");
image = null;
}
} finally {
try {
if (input != null)
input.close();
} catch (IOException e) {
}
}
return image;
}
/**
* Static method to read a 8 bit bitmap
*
* @param nColoursUsed Number of colors used
* @param nBitCount The bit count
* @param nSizeImage The size of the image in bytes
* @param nWidth The width of the image
* @param input The input stream corresponding to the image
* @throws IOException
* @return A BufferedImage of the bitmap
*/
private static BufferedImage read8BitBitmap(int nColoursUsed, int nBitCount, int nSizeImage, int nWidth, int nHeight, InputStream input) throws IOException {
int nNumColors = (nColoursUsed > 0) ? nColoursUsed : 1 << nBitCount;
if (nSizeImage == 0) {
nSizeImage = ((((nWidth * nBitCount) + 31) & ~31) >> 3);
nSizeImage *= nHeight;
}
int npalette[] = new int[nNumColors];
byte bpalette[] = new byte[nNumColors * 4];
readBuffer(input, bpalette);
int nindex8 = 0;
for (int n = 0; n < nNumColors; n++) {
npalette[n] = (255 & 0xff) << 24 |
(bpalette[nindex8 + 2] & 0xff) << 16 |
(bpalette[nindex8 + 1] & 0xff) << 8 |
(bpalette[nindex8 + 0] & 0xff);
nindex8 += 4;
}
int npad8 = (nSizeImage / nHeight) - nWidth;
BufferedImage bufferedImage = new BufferedImage(nWidth, nHeight, BufferedImage.TYPE_INT_ARGB);
DataBufferInt dataBufferByte = ((DataBufferInt) bufferedImage.getRaster().getDataBuffer());
int[][] bankData = dataBufferByte.getBankData();
byte bdata[] = new byte[(nWidth + npad8) * nHeight];
readBuffer(input, bdata);
nindex8 = 0;
for (int j8 = nHeight - 1; j8 >= 0; j8--) {
for (int i8 = 0; i8 < nWidth; i8++) {
bankData[0][j8 * nWidth + i8] = npalette[((int) bdata[nindex8] & 0xff)];
nindex8++;
}
nindex8 += npad8;
}
return bufferedImage;
}
/**
* Static method to read a 24 bit bitmap
*
* @param nSizeImage size of the image in bytes
* @param nHeight The height of the image
* @param nWidth The width of the image
* @param input The input stream corresponding to the image
* @throws IOException
* @return A BufferedImage of the bitmap
*/
private static BufferedImage read24BitBitmap(int nSizeImage, int nHeight, int nWidth, InputStream input) throws IOException {
int npad = (nSizeImage / nHeight) - nWidth * 3;
if (npad == 4 || npad < 0)
npad = 0;
int nindex = 0;
BufferedImage bufferedImage = new BufferedImage(nWidth, nHeight, BufferedImage.TYPE_4BYTE_ABGR);
DataBufferByte dataBufferByte = ((DataBufferByte) bufferedImage.getRaster().getDataBuffer());
byte[][] bankData = dataBufferByte.getBankData();
byte brgb[] = new byte[(nWidth + npad) * 3 * nHeight];
readBuffer(input, brgb);
for (int j = nHeight - 1; j >= 0; j--) {
for (int i = 0; i < nWidth; i++) {
int base = (j * nWidth + i) * 4;
bankData[0][base] = (byte) 255;
bankData[0][base + 1] = brgb[nindex];
bankData[0][base + 2] = brgb[nindex + 1];
bankData[0][base + 3] = brgb[nindex + 2];
nindex += 3;
}
nindex += npad;
}
return bufferedImage;
}
/**
* Converts bytes to an int
*
* @param bytes An array of bytes
* @param index
* @returns A int representation of the bytes
*/
private static int bytesToInt(byte[] bytes, int index) {
return (bytes[index + 3] & 0xff) << 24 |
(bytes[index + 2] & 0xff) << 16 |
(bytes[index + 1] & 0xff) << 8 |
bytes[index + 0] & 0xff;
}
/**
* Converts bytes to a short
*
* @param bytes An array of bytes
* @param index
* @returns A short representation of the bytes
*/
private static short bytesToShort(byte[] bytes, int index) {
return (short) (((bytes[index + 1] & 0xff) << 8) |
(bytes[index + 0] & 0xff));
}
/**
* Reads the buffer
*
* @param in An InputStream
* @param buffer An array of bytes
* @throws IOException
*/
private static void readBuffer(InputStream in, byte[] buffer) throws IOException {
int bytesRead = 0;
int bytesToRead = buffer.length;
while (bytesToRead > 0) {
int read = in.read(buffer, bytesRead, bytesToRead);
bytesRead += read;
bytesToRead -= read;
}
}
}
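Note that BitmapLoader reads Windows .bmp files from disk. If you already hold an android.graphics.Bitmap in memory, a simpler route (a sketch; the class and method names here are mine) is to pull the packed ARGB pixels with Bitmap.getPixels and hand that int[] straight to a BufferedImage:

```java
import java.awt.image.BufferedImage;

public class PixelsToImage {
    /**
     * Builds a BufferedImage from packed 0xAARRGGBB pixels, i.e. the int[]
     * that android.graphics.Bitmap.getPixels(...) fills on Android.
     */
    public static BufferedImage fromArgb(int[] pixels, int width, int height) {
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        // setRGB copies the packed ARGB values row by row into the raster.
        image.setRGB(0, 0, width, height, pixels, 0, width);
        return image;
    }

    public static void main(String[] args) {
        // 2x2 image: red, green, blue, white
        int[] pixels = {0xFFFF0000, 0xFF00FF00, 0xFF0000FF, 0xFFFFFFFF};
        BufferedImage img = fromArgb(pixels, 2, 2);
        System.out.println(Integer.toHexString(img.getRGB(0, 0)));
    }
}
```

On Android you would first call `bitmap.getPixels(pixels, 0, width, 0, 0, width, height)` to fill the array; that call is not shown here since android.graphics.Bitmap is not available off-device.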
If your app's minimum Android SDK version is 16 (Android 4.1) or higher, the best way to encode video is the Android MediaCodec API.
On Android 4.1 (SDK 16), MediaCodec requires you to feed the encoder with ByteBuffer arrays, but Android 4.3 (SDK 18) lets you use a Surface as the encoder's input. This allows you, for example, to encode frames from an existing video file or frames generated with OpenGL ES.
MediaMuxer was added in Android 4.3 (SDK 18), so to conveniently write the MP4 file with MediaMuxer you should have SDK >= 18.
With the MediaCodec API you get hardware-accelerated encoding, and you can easily encode at up to 60 FPS.
You can start with 1) How to encode Bitmaps into a video using MediaCodec?, or use 2) Google's Grafika, or 3) Bigflake.
Start from Grafika's RecordFBOActivity.java. Replace the Choreographer event with your own callback containing the bitmap to encode, remove the on-screen drawing, load your bitmap as an OpenGL texture, and draw it on the MediaCodec input Surface.
jCodec has added Android support.
You need to add these to your gradle...
compile 'org.jcodec:jcodec:0.2.3'
compile 'org.jcodec:jcodec-android:0.2.3'
...and
android {
...
configurations.all {
resolutionStrategy.force 'com.google.code.findbugs:jsr305:3.0.2'
}
}
I can confirm this works as expected, with caveats. First, I tried some full-size images and the file was written, but it gave an error on playback. When I scaled down, I would get an error if the width or height of the image wasn't even, since the YUV420J colorspace requires dimensions that are multiples of 2.
Also worth noting: this makes your package heavy. My small project went over the dex limit by adding it and required enabling multidex.
FileChannelWrapper out = null;
File dir = ...; // whatever directory you use
File file = new File(dir, "test.mp4");
try {
out = NIOUtils.writableFileChannel(file.getAbsolutePath());
AndroidSequenceEncoder encoder = new AndroidSequenceEncoder(out, Rational.R(15, 1));
for (Bitmap bitmap : bitmaps) {
encoder.encodeImage(bitmap);
}
encoder.finish();
} finally {
NIOUtils.closeQuietly(out);
}
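Because of the even-dimension requirement mentioned above (YUV 4:2:0 subsamples chroma over 2x2 blocks), it is worth rounding width and height down to even numbers before scaling your bitmaps. A tiny helper (the class and method names are mine):

```java
public class EvenDim {
    /**
     * Rounds a non-negative pixel dimension down to the nearest even number,
     * as required by 4:2:0 chroma subsampling. Clearing the lowest bit of
     * an int rounds it down to a multiple of 2.
     */
    public static int roundDownToEven(int dimension) {
        return dimension & ~1;
    }

    public static void main(String[] args) {
        System.out.println(roundDownToEven(801)); // 800
        System.out.println(roundDownToEven(800)); // 800
    }
}
```

On Android you could then scale with something like `Bitmap.createScaledBitmap(bitmap, roundDownToEven(w), roundDownToEven(h), true)` before passing frames to the encoder.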
You can use Bitmp4 to convert a sequence of images into an MP4 file.
Sample code:
...
val encoder = MP4Encoder()
encoder.setFrameDelay(50)
encoder.setOutputFilePath(exportedFile.path)
encoder.setOutputSize(width, width)
startExport()
addFrame(bitmap) // called at intervals
stopExport()
It's a Java library, so it's easy to import into an Android project; you don't have to use the NDK as you would with ffmpeg.
See https://github.com/dbof10/Bitmp4 for sample code and the download.
Abhishek V is right; for more about jcodec's SequenceEncoder, see Android make animated video from list of images.
I recently built a real-time video system using a Raspberry Pi and an Android device and ran into the same problem as you. Instead of saving a list of image files, I used a real-time streaming protocol such as RTP/RTCP to stream the data to the user. If your requirements allow it, perhaps you could change your strategy.
Another suggestion is to explore some C/C++ libraries, using the NDK/JNI to get around the limitations of Java.
Hope these suggestions make sense to you :)
You can use the ffmpeg library to make a video from an array of images. The FFmpeg library is very useful for producing video. Maybe the following links will help you: http://osric.com/chris/accidental-developer/2012/04/using-ffmpeg-to-programmatically-slice-and-splice-video/ https://groups.google.com/forum/#!topic/android-ndk/sxDYlGYK-Xg
You have bitmaps, and you can turn them into a video using JCodec.
Here is a sample image sequence encoder:
You can modify it by replacing BufferedImage with Bitmap.
Use these methods according to your needs.
public static Picture fromBitmap(Bitmap src) {
Picture dst = Picture.create(src.getWidth(), src.getHeight(), ColorSpace.RGB);
fromBitmap(src, dst);
return dst;
}
public static void fromBitmap(Bitmap src, Picture dst) {
int[] dstData = dst.getPlaneData(0);
int[] packed = new int[src.getWidth() * src.getHeight()];
src.getPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());
for (int i = 0, srcOff = 0, dstOff = 0; i < src.getHeight(); i++) {
for (int j = 0; j < src.getWidth(); j++, srcOff++, dstOff += 3) {
int rgb = packed[srcOff];
dstData[dstOff] = (rgb >> 16) & 0xff;
dstData[dstOff + 1] = (rgb >> 8) & 0xff;
dstData[dstOff + 2] = rgb & 0xff;
}
}
}
public static Bitmap toBitmap(Picture src) {
Bitmap dst = Bitmap.createBitmap(src.getWidth(), src.getHeight(), Bitmap.Config.ARGB_8888);
toBitmap(src, dst);
return dst;
}
public static void toBitmap(Picture src, Bitmap dst) {
int[] srcData = src.getPlaneData(0);
int[] packed = new int[src.getWidth() * src.getHeight()];
for (int i = 0, dstOff = 0, srcOff = 0; i < src.getHeight(); i++) {
for (int j = 0; j < src.getWidth(); j++, dstOff++, srcOff += 3) {
packed[dstOff] = 0xff000000 | (srcData[srcOff] << 16) | (srcData[srcOff + 1] << 8) | srcData[srcOff + 2];
}
}
dst.setPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());
}
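The two pixel loops above are inverses of each other: fromBitmap unpacks each packed ARGB int into planar R, G, B samples, and toBitmap repacks them into opaque ARGB ints. Stripped of the Android and JCodec types, the core logic is just the following (a standalone sketch; the class and method names are mine):

```java
public class RgbPacking {
    /**
     * Unpacks packed 0xAARRGGBB pixels into a flat R,G,B,R,G,B... array,
     * mirroring the inner loop of fromBitmap above (alpha is dropped).
     */
    public static int[] unpack(int[] packed) {
        int[] planar = new int[packed.length * 3];
        for (int i = 0, off = 0; i < packed.length; i++, off += 3) {
            planar[off] = (packed[i] >> 16) & 0xff;    // R
            planar[off + 1] = (packed[i] >> 8) & 0xff; // G
            planar[off + 2] = packed[i] & 0xff;        // B
        }
        return planar;
    }

    /**
     * Repacks R,G,B triplets into opaque 0xFFRRGGBB ints, mirroring the
     * inner loop of toBitmap above.
     */
    public static int[] pack(int[] planar) {
        int[] packed = new int[planar.length / 3];
        for (int i = 0, off = 0; i < packed.length; i++, off += 3) {
            packed[i] = 0xff000000 | (planar[off] << 16) | (planar[off + 1] << 8) | planar[off + 2];
        }
        return packed;
    }

    public static void main(String[] args) {
        int[] src = {0xff123456, 0xffabcdef};
        int[] roundTrip = pack(unpack(src));
        System.out.println(Integer.toHexString(roundTrip[0])); // ff123456
    }
}
```

Round-tripping an opaque pixel through unpack and pack returns the original value, which is a quick sanity check when porting these loops.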