Mixed Reality WebRTC - Screen capturing with GraphicsCapturePicker

Setup

Hey,

I am trying to capture my screen and send/communicate the stream via MR-WebRTC. Communication between two PCs, or between a PC and a HoloLens, already works for me with a webcam, so I figured the next step could be streaming my screen. I took the UWP application I already had working with my webcam and tried to get screen capture working:

This is where I am stuck:

  1. I get a frame from the screen capture, but its type is Direct3D11CaptureFrame. You can see it in the code snippet below.
  2. MR-WebRTC expects frames of type I420AVideoFrame (also shown in the code).

How do I "connect" the two?

Code snippet for the Direct3D frame:

_framePool = Direct3D11CaptureFramePool.Create(
                _canvasDevice,                             // D3D device
                DirectXPixelFormat.B8G8R8A8UIntNormalized, // Pixel format
                3,                                         // Number of frames
                _item.Size);                               // Size of the buffers

_session = _framePool.CreateCaptureSession(_item);
_session.StartCapture();
_framePool.FrameArrived += (s, a) =>
{
    using (var frame = _framePool.TryGetNextFrame())
    {
        // Here I would take the Frame and call the MR-WebRTC method LocalI420AFrameReady  
    }
};

Code snippet for the WebRTC frame:

// This is the way with the webcam; so LocalI420 was subscribed to
// the event I420AVideoFrameReady and got the frame from there
_webcamSource = await DeviceVideoTrackSource.CreateAsync();
_webcamSource.I420AVideoFrameReady += LocalI420AFrameReady;

// Handler for I420AVideoFrameReady: on the first frame it sets up local
// playback, then every frame is enqueued into the video bridge, which
// delivers it when the Media Foundation playback pipeline requests it.
private void LocalI420AFrameReady(I420AVideoFrame frame)
{
    lock (_localVideoLock)
    {
        if (!_localVideoPlaying)
        {
            _localVideoPlaying = true;

            // Capture the resolution into local variables usable from the lambda below
            uint width = frame.width;
            uint height = frame.height;

            // Defer UI-related work to the main UI thread
            RunOnMainThread(() =>
            {
                // Bridge the local video track with the local media player UI
                int framerate = 30; // assumed, for lack of an actual value
                _localVideoSource = CreateI420VideoStreamSource(
                    width, height, framerate);
                var localVideoPlayer = new MediaPlayer();
                localVideoPlayer.Source = MediaSource.CreateFromMediaStreamSource(
                    _localVideoSource);
                localVideoPlayerElement.SetMediaPlayer(localVideoPlayer);
                localVideoPlayer.Play();
            });
        }
    }
    // Enqueue the incoming frame into the video bridge; the media player will
    // later dequeue it as soon as it's ready.
    _localVideoBridge.HandleIncomingVideoFrame(frame);
}

I found the solution to my problem by creating an issue on the GitHub repository. The answer was provided by KarthikRichie:

  1. You have to use an ExternalVideoTrackSource
  2. You can convert the Direct3D11CaptureFrame to an Argb32VideoFrame

// Setting up external video track source
_screenshareSource = ExternalVideoTrackSource.CreateFromArgb32Callback(FrameCallback);

struct WebRTCFrameData
{
    public IntPtr Data;
    public uint Height;
    public uint Width;
    public int Stride;
}

public void FrameCallback(in FrameRequest frameRequest)
{
    try
    {
        if (FramePool != null)
        {
            using (Direct3D11CaptureFrame _currentFrame = FramePool.TryGetNextFrame())
            {
                if (_currentFrame != null)
                {
                    // Note: ProcessBitmap is async; blocking on .Result keeps this
                    // callback synchronous but stalls the capture thread until the
                    // conversion completes.
                    WebRTCFrameData webRTCFrameData = ProcessBitmap(_currentFrame.Surface).Result;
                    frameRequest.CompleteRequest(new Argb32VideoFrame()
                    {
                        data = webRTCFrameData.Data,
                        height = webRTCFrameData.Height,
                        width = webRTCFrameData.Width,
                        stride = webRTCFrameData.Stride
                    });
                }
            }
        }
    }
    catch (Exception ex)
    {
        // Swallowing exceptions here silently drops frames; at minimum log
        // them so capture/conversion failures are visible.
        System.Diagnostics.Debug.WriteLine(ex);
    }
}
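
Once the callback source is in place, it can be attached to a local video track the same way the webcam source was. A sketch of that wiring is below; `LocalVideoTrack.CreateFromSource`, `LocalVideoTrackInitConfig`, and the `videoTransceiver` variable follow the MixedReality-WebRTC 2.x API shape, so verify the names against the version you are using:

```csharp
// Sketch: wire the screen-share source into a local video track and attach
// it to an existing video transceiver, mirroring the webcam setup above.
_screenshareSource = ExternalVideoTrackSource.CreateFromArgb32Callback(FrameCallback);
LocalVideoTrack screenshareTrack = LocalVideoTrack.CreateFromSource(
    _screenshareSource,
    new LocalVideoTrackInitConfig { trackName = "screen_share" });
videoTransceiver.LocalVideoTrack = screenshareTrack; // assumes a video transceiver already exists
```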

private async Task<WebRTCFrameData> ProcessBitmap(IDirect3DSurface surface)
{
    SoftwareBitmap softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(
        surface, Windows.Graphics.Imaging.BitmapAlphaMode.Straight);

    byte[] imageBytes = new byte[4 * softwareBitmap.PixelWidth * softwareBitmap.PixelHeight];
    softwareBitmap.CopyToBuffer(imageBytes.AsBuffer());

    WebRTCFrameData argb32VideoFrame = new WebRTCFrameData();
    argb32VideoFrame.Data = GetByteIntPtr(imageBytes);
    argb32VideoFrame.Height = (uint)softwareBitmap.PixelHeight;
    argb32VideoFrame.Width = (uint)softwareBitmap.PixelWidth;

    // Read the stride from the (single) BGRA8 plane; dispose the buffer lock when done.
    using (BitmapBuffer buffer = softwareBitmap.LockBuffer(BitmapBufferAccessMode.Read))
    {
        int planeCount = buffer.GetPlaneCount();
        BitmapPlaneDescription plane = buffer.GetPlaneDescription(planeCount - 1);
        argb32VideoFrame.Stride = plane.Stride;
    }

    return argb32VideoFrame;
}

private IntPtr GetByteIntPtr(byte[] byteArr)
{
    // Caution: UnsafeAddrOfPinnedArrayElement assumes the array is already
    // pinned. A plain managed array can be relocated by the GC, so this
    // pointer is only safe while the array stays reachable and unmoved.
    return System.Runtime.InteropServices.Marshal.UnsafeAddrOfPinnedArrayElement(byteArr, 0);
}