WebRTC cannot record screen

I am trying to build a screen-sharing application with WebRTC. I have code that can capture and share the video stream from the camera, and I need to modify it to capture the video through the MediaProjection API instead. Based on this post I changed my code to use org.webrtc.ScreenCapturerAndroid, but no video output is shown, only a black screen. If I use the camera, everything works fine (I can see the camera output on the screen). Could someone look at my code and point me in the right direction? I have been stuck on this for three days.

Here is my code:

public class MainActivity extends AppCompatActivity {

    private static final String TAG = "VIDEO_CAPTURE";

    private static final int CAPTURE_PERMISSION_REQUEST_CODE = 1;
    private static final String VIDEO_TRACK_ID = "video_stream";

    PeerConnectionFactory peerConnectionFactory;

    SurfaceViewRenderer localVideoView;
    ProxyVideoSink localSink;

    VideoSource videoSource;
    VideoTrack localVideoTrack;

    EglBase rootEglBase;

    boolean camera = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        rootEglBase = EglBase.create();
        localVideoView = findViewById(R.id.local_gl_surface_view);

        localVideoView.init(rootEglBase.getEglBaseContext(), null);

        startScreenCapture();
    }

    @TargetApi(21)
    private void startScreenCapture() {
        MediaProjectionManager mMediaProjectionManager = (MediaProjectionManager) getApplication().getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(mMediaProjectionManager.createScreenCaptureIntent(), CAPTURE_PERMISSION_REQUEST_CODE);
    }

    @Override
    public void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode != CAPTURE_PERMISSION_REQUEST_CODE) { return; }

        start(data);
    }

    private void start(Intent permissionData) {

        //Initialize PeerConnectionFactory globals.
        PeerConnectionFactory.InitializationOptions initializationOptions =
                PeerConnectionFactory.InitializationOptions.builder(this)
                        .setEnableVideoHwAcceleration(true)
                        .createInitializationOptions();
        PeerConnectionFactory.initialize(initializationOptions);

        //Create a new PeerConnectionFactory instance - using Hardware encoder and decoder.
        PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
        DefaultVideoEncoderFactory defaultVideoEncoderFactory = new DefaultVideoEncoderFactory(
                rootEglBase.getEglBaseContext(), true,true);
        DefaultVideoDecoderFactory defaultVideoDecoderFactory = new DefaultVideoDecoderFactory(rootEglBase.getEglBaseContext());

        peerConnectionFactory = PeerConnectionFactory.builder()
                .setOptions(options)
                .setVideoDecoderFactory(defaultVideoDecoderFactory)
                .setVideoEncoderFactory(defaultVideoEncoderFactory)
                .createPeerConnectionFactory();

        VideoCapturer videoCapturerAndroid;
        if (camera) {
            videoCapturerAndroid = createCameraCapturer(new Camera1Enumerator(false));
        } else {
            videoCapturerAndroid = new ScreenCapturerAndroid(permissionData, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    super.onStop();
                    Log.e(TAG, "user has revoked permissions");
                }
            });
        }

        videoSource = peerConnectionFactory.createVideoSource(videoCapturerAndroid);

        DisplayMetrics metrics = new DisplayMetrics();
        MainActivity.this.getWindowManager().getDefaultDisplay().getRealMetrics(metrics);
        videoCapturerAndroid.startCapture(metrics.widthPixels, metrics.heightPixels, 30);

        localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
        localVideoTrack.setEnabled(true);

        //localVideoTrack.addRenderer(new VideoRenderer(localRenderer));
        localSink = new ProxyVideoSink().setTarget(localVideoView);
        localVideoTrack.addSink(localSink);
    }

    // Find the first available camera; this part works without problems.
    private VideoCapturer createCameraCapturer(CameraEnumerator enumerator) {
        final String[] deviceNames = enumerator.getDeviceNames();

        // First, try to find front facing camera
        Logging.d(TAG, "Looking for front facing cameras.");
        for (String deviceName : deviceNames) {
            if (enumerator.isFrontFacing(deviceName)) {
                Logging.d(TAG, "Creating front facing camera capturer.");
                VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);

                if (videoCapturer != null) {
                    return videoCapturer;
                }
            }
        }

        // Front facing camera not found, try something else
        Logging.d(TAG, "Looking for other cameras.");
        for (String deviceName : deviceNames) {
            if (!enumerator.isFrontFacing(deviceName)) {
                Logging.d(TAG, "Creating other camera capturer.");
                VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);

                if (videoCapturer != null) {
                    return videoCapturer;
                }
            }
        }

        return null;
    }
}

ProxyVideoSink

public class ProxyVideoSink implements VideoSink {

    private VideoSink target;

    synchronized ProxyVideoSink setTarget(VideoSink target) { this.target = target; return this; }

    @Override
    public void onFrame(VideoFrame videoFrame) {

        if (target == null) {
            Log.w("VideoSink", "Dropping frame in proxy because target is null.");
            return;
        }

        target.onFrame(videoFrame);
    }
}

In logcat I can see that frames are being rendered, but nothing is displayed (only a black screen):

06-18 17:42:44.750 11357-11388/com.archona.webrtcscreencapturetest I/org.webrtc.Logging: EglRenderer: local_gl_surface_viewDuration: 4000 ms. Frames received: 117. Dropped: 0. Rendered: 117. Render fps: 29.2. Average render time: 4754 μs. Average swapBuffer time: 2913 μs.
06-18 17:42:48.752 11357-11388/com.archona.webrtcscreencapturetest I/org.webrtc.Logging: EglRenderer: local_gl_surface_viewDuration: 4001 ms. Frames received: 118. Dropped: 0. Rendered: 118. Render fps: 29.5. Average render time: 5015 μs. Average swapBuffer time: 3090 μs.

I am using the latest version of the WebRTC library: implementation 'org.webrtc:google-webrtc:1.0.23546'. My device runs API level 24 (Android 7.0), but I have tested this code on three different devices with different API levels, so I don't suspect a device-specific problem. I also built another app that uses the MediaProjection API (without WebRTC), and there I can see correct output in a SurfaceView. I have tried downgrading the WebRTC library, but nothing seems to help.
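
(For reference, the WebRTC-free check described above amounts to pointing a VirtualDisplay at the SurfaceView's Surface. The following is only a rough sketch with hypothetical names (surfaceView, resultCode, data), not the actual test app:)

MediaProjectionManager manager =
        (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
// resultCode and data come from the createScreenCaptureIntent() round trip,
// exactly as in onActivityResult() above.
MediaProjection projection = manager.getMediaProjection(resultCode, data);

DisplayMetrics metrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getRealMetrics(metrics);

// Render the captured screen straight into the SurfaceView, no WebRTC involved.
projection.createVirtualDisplay("screen-test",
        metrics.widthPixels, metrics.heightPixels, metrics.densityDpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        surfaceView.getHolder().getSurface(),
        null /* callback */, null /* handler */);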

Thank you for your help.

Answer

I had the same problem with the WebRTC library org.webrtc:google-webrtc:1.0.22672, on an Android 7.0 device. Video calls worked fine; the problem was screen sharing, which always produced a black screen.

Then I added the following call:

peerConnectionFactory.setVideoHwAccelerationOptions(rootEglBase.getEglBaseContext(), rootEglBase.getEglBaseContext());

Now everything works.
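
For completeness, here is roughly where that call fits into the start() method from the question. On these older google-webrtc builds (1.0.22672 / 1.0.23546) PeerConnectionFactory still exposes setVideoHwAccelerationOptions(localEglContext, remoteEglContext); the ordering below (factory first, then the EGL contexts, then the video source) is my assumption based on the question's code, not something stated in the answer:

peerConnectionFactory = PeerConnectionFactory.builder()
        .setOptions(options)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .createPeerConnectionFactory();

// Give the factory the shared EGL context so hardware-accelerated frames
// end up in a context the SurfaceViewRenderer can actually display.
peerConnectionFactory.setVideoHwAccelerationOptions(
        rootEglBase.getEglBaseContext(),    // local (capture/encode) context
        rootEglBase.getEglBaseContext());   // remote (decode/render) context

// Then create the video source from the screen capturer as before.
videoSource = peerConnectionFactory.createVideoSource(videoCapturerAndroid);

(In later releases of the library this method was removed; the EGL context is instead passed through the hardware encoder/decoder factories and a SurfaceTextureHelper handed to the capturer's initialize() call.)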