Trouble saving a video file with WebRTC in Android
I'm developing a video chat application based on WebRTC. Video calls are working, but now I want to record video from the remote video stream using a VideoFileRenderer. There are several implementations of the interface, for example: https://chromium.googlesource.com/external/webrtc/+/master/sdk/android/api/org/webrtc/VideoFileRenderer.java
This is the implementation I'm using. It saves the video to a file without a problem, but the file is a .y4m, not an .mp4, so I can only play it on the desktop with a player that ships its own codecs. When I try to play it with a VideoView, it says it can't play the video; the video player that comes with Android can't play it either. I can only play it with MXPlayer, VLC, or any other application that bundles the needed codecs.
To simplify the question:
How can I play a video.y4m file in the native Android VideoView?
Let me simplify even further: assume I don't know the format of the recorded file. Here is the code I use to record it.
When recording starts:
remoteVideoFileRenderer = new VideoFileRenderer(
        fileToRecordTo.getAbsolutePath(),
        640,
        480,
        rootEglBase.getEglBaseContext());
remoteVideoTrack.addSink(remoteVideoFileRenderer);
When recording finishes:
remoteVideoFileRenderer.release();
Now the question again: I have a "fileToRecordTo", and this video file plays in GOM (Windows), VLC (Windows, Mac, and Android), and MXPlayer (Android), but I can neither play it with the player embedded in Android (which I would have used in my app had it worked) nor with the native Android VideoView.
Any help is appreciated.
Recording video only
I had a similar case in my project. At first I tried WebRTC's default VideoFileRenderer, but the resulting files were huge because no compression is applied.
I found this repository, and it really helped me:
https://github.com/cloudwebrtc/flutter-webrtc
Here is a step-by-step guide, with some adjustments of my own.
Add this class to your project. It has plenty of options for configuring the final video format.
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.view.Surface;
import org.webrtc.EglBase;
import org.webrtc.GlRectDrawer;
import org.webrtc.VideoFrame;
import org.webrtc.VideoFrameDrawer;
import org.webrtc.VideoSink;
import java.io.IOException;
import java.nio.ByteBuffer;
/**
 * Draws incoming WebRTC VideoFrames onto a MediaCodec input surface, encodes them
 * as H.264, and writes the encoded stream into an MP4 file through MediaMuxer.
 */
class FileEncoder implements VideoSink {
    private static final String TAG = "FileRenderer";
    private static final String MIME_TYPE = "video/avc"; // H.264 Advanced Video Coding
    private static final int FRAME_RATE = 30; // 30fps
    private static final int IFRAME_INTERVAL = 5; // 5 seconds between I-frames

    private final HandlerThread renderThread;
    private final Handler renderThreadHandler;
    private int outputFileWidth = -1;
    private int outputFileHeight = -1;
    private ByteBuffer[] encoderOutputBuffers;
    private EglBase eglBase;
    private EglBase.Context sharedContext;
    private VideoFrameDrawer frameDrawer;
    private MediaMuxer mediaMuxer;
    private MediaCodec encoder;
    private MediaCodec.BufferInfo bufferInfo;
    private int trackIndex = -1;
    private boolean isRunning = true;
    private GlRectDrawer drawer;
    private Surface surface;
    private boolean encoderStarted = false;
    private volatile boolean muxerStarted = false;
    private long videoFrameStart = 0;

    FileEncoder(String outputFile, final EglBase.Context sharedContext) throws IOException {
        renderThread = new HandlerThread(TAG + "RenderThread");
        renderThread.start();
        renderThreadHandler = new Handler(renderThread.getLooper());
        bufferInfo = new MediaCodec.BufferInfo();
        this.sharedContext = sharedContext;
        mediaMuxer = new MediaMuxer(outputFile, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    private void initVideoEncoder() {
        // Size the encoder to the first frame; a hard-coded size would not match the
        // viewport used in renderFrameOnRenderThread() and would distort the output.
        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, outputFileWidth, outputFileHeight);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
        try {
            encoder = MediaCodec.createEncoderByType(MIME_TYPE);
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            renderThreadHandler.post(() -> {
                eglBase = EglBase.create(sharedContext, EglBase.CONFIG_RECORDABLE);
                surface = encoder.createInputSurface();
                eglBase.createSurface(surface);
                eglBase.makeCurrent();
                drawer = new GlRectDrawer();
            });
        } catch (Exception e) {
            Log.wtf(TAG, e);
        }
    }

    @Override
    public void onFrame(VideoFrame frame) {
        frame.retain();
        if (outputFileWidth == -1) {
            // Lazily initialize the encoder with the dimensions of the first frame.
            outputFileWidth = frame.getRotatedWidth();
            outputFileHeight = frame.getRotatedHeight();
            initVideoEncoder();
        }
        renderThreadHandler.post(() -> renderFrameOnRenderThread(frame));
    }

    private void renderFrameOnRenderThread(VideoFrame frame) {
        if (frameDrawer == null) {
            frameDrawer = new VideoFrameDrawer();
        }
        frameDrawer.drawFrame(frame, drawer, null, 0, 0, outputFileWidth, outputFileHeight);
        frame.release();
        drainEncoder();
        eglBase.swapBuffers();
    }

    /**
     * Release all resources. All already posted frames will be rendered first.
     */
    void release() {
        isRunning = false;
        renderThreadHandler.post(() -> {
            if (encoder != null) {
                encoder.stop();
                encoder.release();
            }
            eglBase.release();
            // Stopping a muxer that was never started throws IllegalStateException.
            if (muxerStarted) {
                mediaMuxer.stop();
            }
            mediaMuxer.release();
            renderThread.quit();
        });
    }

    private void drainEncoder() {
        if (!encoderStarted) {
            // First call: start the encoder; output is drained from the next frame on.
            encoder.start();
            encoderOutputBuffers = encoder.getOutputBuffers();
            encoderStarted = true;
            return;
        }
        while (true) {
            int encoderStatus = encoder.dequeueOutputBuffer(bufferInfo, 10000);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                break;
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                encoderOutputBuffers = encoder.getOutputBuffers();
                Log.e(TAG, "encoder output buffers changed");
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Expected once before the first output buffer; the format carries the
                // codec-specific data the muxer needs, so the video track is added here.
                MediaFormat newFormat = encoder.getOutputFormat();
                Log.i(TAG, "encoder output format changed: " + newFormat);
                trackIndex = mediaMuxer.addTrack(newFormat);
                if (!muxerStarted) {
                    mediaMuxer.start();
                    muxerStarted = true;
                }
            } else if (encoderStatus < 0) {
                Log.e(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
            } else { // encoderStatus >= 0
                try {
                    ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                    if (encodedData == null) {
                        Log.e(TAG, "encoderOutputBuffer " + encoderStatus + " was null");
                        break;
                    }
                    // It's usually necessary to adjust the ByteBuffer values to match BufferInfo.
                    encodedData.position(bufferInfo.offset);
                    encodedData.limit(bufferInfo.offset + bufferInfo.size);
                    // Rebase presentation timestamps so the recording starts at zero.
                    if (videoFrameStart == 0 && bufferInfo.presentationTimeUs != 0) {
                        videoFrameStart = bufferInfo.presentationTimeUs;
                    }
                    bufferInfo.presentationTimeUs -= videoFrameStart;
                    if (muxerStarted) {
                        mediaMuxer.writeSampleData(trackIndex, encodedData, bufferInfo);
                    }
                    isRunning = isRunning && (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) == 0;
                    encoder.releaseOutputBuffer(encoderStatus, false);
                    if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        break;
                    }
                } catch (Exception e) {
                    Log.wtf(TAG, e);
                    break;
                }
            }
        }
    }
}
Now, in your Activity/Fragment class, declare a variable of the class above:
FileEncoder recording;
When you receive the stream you want to record (remote or local), you can initialize the recording:
recording = new FileEncoder("path/to/video", rootEglBase.getEglBaseContext()); // throws IOException
remoteVideoTrack.addSink(recording);
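Since the FileEncoder constructor is declared to throw IOException (MediaMuxer can fail to create the output file), in practice you will want to guard the call. A minimal sketch, where startRecording is a hypothetical helper name:

private void startRecording(String outputPath) {
    try {
        // Create the encoder and start feeding it the remote frames.
        recording = new FileEncoder(outputPath, rootEglBase.getEglBaseContext());
        remoteVideoTrack.addSink(recording);
    } catch (IOException e) {
        Log.e("Recording", "Could not create output file", e);
    }
}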
When the call ends, stop and release the recording:
remoteVideoTrack.removeSink(recording);
recording.release();
This is enough to record the video, but without audio.
Recording video and audio
To record the audio of the local peer, you need to use this class (https://webrtc.googlesource.com/src/+/master/examples/androidapp/src/org/appspot/apprtc/RecordedAudioToFileController.java). But first you need to set up an AudioDeviceModule object:
AudioDeviceModule adm = createJavaAudioDevice();
peerConnectionFactory = PeerConnectionFactory.builder()
        .setOptions(options)
        .setAudioDeviceModule(adm)
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .createPeerConnectionFactory();
// The factory keeps its own reference to the module, so the local one can be released.
adm.release();
private AudioDeviceModule createJavaAudioDevice() {
    // Implement AudioRecordErrorCallback
    // Implement AudioTrackErrorCallback
    return JavaAudioDeviceModule.builder(this)
            .setSamplesReadyCallback(audioRecorder)
            // The default audio source is VOICE_COMMUNICATION, which is good for VoIP
            // sessions. You can change it to whichever audio source you need.
            .setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION)
            .setAudioRecordErrorCallback(audioRecordErrorCallback)
            .setAudioTrackErrorCallback(audioTrackErrorCallback)
            .createAudioDeviceModule();
}
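The builder above references audioRecorder, audioRecordErrorCallback, and audioTrackErrorCallback without showing their declarations. A minimal sketch of that wiring, assuming the apprtc RecordedAudioToFileController linked above has been copied into your project (the error callbacks here only log); remember to call audioRecorder.start() when the call starts and audioRecorder.stop() when it ends:

// Assumes: import java.util.concurrent.Executors; import org.webrtc.audio.JavaAudioDeviceModule;
private final RecordedAudioToFileController audioRecorder =
        new RecordedAudioToFileController(Executors.newSingleThreadExecutor());

private final JavaAudioDeviceModule.AudioRecordErrorCallback audioRecordErrorCallback =
        new JavaAudioDeviceModule.AudioRecordErrorCallback() {
            @Override
            public void onWebRtcAudioRecordInitError(String errorMessage) {
                Log.e("AudioDevice", "Record init error: " + errorMessage);
            }
            @Override
            public void onWebRtcAudioRecordStartError(
                    JavaAudioDeviceModule.AudioRecordStartErrorCode errorCode, String errorMessage) {
                Log.e("AudioDevice", "Record start error: " + errorCode + " " + errorMessage);
            }
            @Override
            public void onWebRtcAudioRecordError(String errorMessage) {
                Log.e("AudioDevice", "Record error: " + errorMessage);
            }
        };

private final JavaAudioDeviceModule.AudioTrackErrorCallback audioTrackErrorCallback =
        new JavaAudioDeviceModule.AudioTrackErrorCallback() {
            @Override
            public void onWebRtcAudioTrackInitError(String errorMessage) {
                Log.e("AudioDevice", "Track init error: " + errorMessage);
            }
            @Override
            public void onWebRtcAudioTrackStartError(
                    JavaAudioDeviceModule.AudioTrackStartErrorCode errorCode, String errorMessage) {
                Log.e("AudioDevice", "Track start error: " + errorCode + " " + errorMessage);
            }
            @Override
            public void onWebRtcAudioTrackError(String errorMessage) {
                Log.e("AudioDevice", "Track error: " + errorMessage);
            }
        };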
Merging audio and video
Add this dependency:
implementation 'com.googlecode.mp4parser:isoparser:1.1.22'
Then, after the call ends, add this code. Make sure both the video and the audio recording have been stopped and released properly first. Note that MovieCreator.build() expects an MP4/ISO container, while the apprtc RecordedAudioToFileController writes raw PCM; the audio has to be encoded into a compatible container (for example AAC in MP4) before this merge will work.
import com.coremedia.iso.boxes.Container;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.Track;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;
import com.googlecode.mp4parser.authoring.container.mp4.MovieCreator;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.channels.FileChannel;

try {
    Movie video = MovieCreator.build("path/to/recorded/video");
    Movie audio = MovieCreator.build("path/to/recorded/audio");
    // Append the first audio track to the video's track list.
    Track audioTrack = audio.getTracks().get(0);
    video.addTrack(audioTrack);
    Container out = new DefaultMp4Builder().build(video);
    FileChannel fc = new FileOutputStream(new File("path/to/final/output")).getChannel();
    out.writeContainer(fc);
    fc.close();
} catch (IOException e) {
    e.printStackTrace();
}
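MovieCreator.build() reads the entire file, so it's best to keep the merge off the UI thread. A minimal sketch, assuming the block above is wrapped in a helper named mergeAudioVideo (a hypothetical name):

// mergeAudioVideo is a hypothetical helper wrapping the try/catch block above.
new Thread(() -> mergeAudioVideo(
        "path/to/recorded/video",
        "path/to/recorded/audio",
        "path/to/final/output")).start();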
I know this is not the best solution for recording audio and video in an Android WebRTC video call. If anyone knows how to extract the audio using WebRTC itself, please add a comment.