Can VideoToolbox decode H264 Annex B natively? Error Code -8969 BadData

My goal is to mirror an iDevice's screen to OSX with as little latency as possible.

As far as I know, there are two ways to do this:

  1. Airplay Mirroring (e.g. Reflector)
  2. CoreMediaIO via Lightning (e.g. QuickTime recording)

I chose to go with the second method, because (as far as I know) a connected device can be recognized automatically as a DAL device after a one-time setup.

The main resource on how to do this is this blog: https://nadavrub.wordpress.com/2015/07/06/macos-media-capture-using-coremediaio/

That blog goes very deep into how to use CoreMediaIO, but it seems you can work with AVFoundation once the connected iDevice has been recognized as an AVCaptureDevice.

This question posted a solution on how to grab every frame of the H264 (Annex B) muxed data stream that the iDevice delivers.
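That solution boils down to copying the raw bytes out of each delivered CMSampleBuffer's block buffer, roughly along these lines (a simplified sketch to illustrate the idea, not a quote of that answer; error handling omitted):

static NSData *copyMuxedPayload(CMSampleBufferRef sampleBuffer) {
    // Assumption: the muxed device delivers the encoded stream in the sample
    // buffer's data (block) buffer rather than as a decoded pixel buffer.
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (blockBuffer == NULL) {
        return nil;
    }
    size_t length = CMBlockBufferGetDataLength(blockBuffer);
    NSMutableData *data = [NSMutableData dataWithLength:length];
    CMBlockBufferCopyDataBytes(blockBuffer, 0, length, data.mutableBytes);
    return data;
}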

However, my problem is that VideoToolbox will not decode it correctly (error code -8969, BadData), even though there shouldn't be any difference in the code.

vtDecompressionDuctDecodeSingleFrame signalled err=-8969 (err) (VTVideoDecoderDecodeFrame returned error) at /SourceCache/CoreMedia_frameworks/CoreMedia-1562.240/Sources/VideoToolbox/VTDecompressionSession.c line 3241
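For context, the decode side reduces to building a CMVideoFormatDescriptionRef from the SPS/PPS, opening a VTDecompressionSession, and then feeding each frame to VTDecompressionSessionDecodeFrame, which is the call that returns -8969 (kVTVideoDecoderBadDataErr). A minimal sketch of that setup (not my exact code; spsData, ppsData and didDecompress are placeholders):

@import VideoToolbox; // in addition to the imports shown in the full code below

// spsData/ppsData are NSData holding the raw SPS/PPS NAL units (no start codes);
// didDecompress is the VTDecompressionOutputCallback that receives decoded frames.
static VTDecompressionSessionRef createDecompressionSession(NSData *spsData,
                                                            NSData *ppsData,
                                                            VTDecompressionOutputCallback didDecompress) {
    const uint8_t *parameterSets[2] = { spsData.bytes, ppsData.bytes };
    const size_t parameterSetSizes[2] = { spsData.length, ppsData.length };

    // With a 4-byte NAL length header, VideoToolbox expects length-prefixed
    // (AVCC-style) NAL units, not Annex B start codes.
    CMVideoFormatDescriptionRef formatDescription = NULL;
    CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault,
                                                        2, parameterSets,
                                                        parameterSetSizes,
                                                        4, &formatDescription);

    VTDecompressionOutputCallbackRecord callbackRecord = { didDecompress, NULL };
    VTDecompressionSessionRef session = NULL;
    VTDecompressionSessionCreate(kCFAllocatorDefault, formatDescription,
                                 NULL, NULL, &callbackRecord, &session);
    CFRelease(formatDescription);
    return session;
}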

Full code:

#import "ViewController.h"

@import CoreMediaIO;
@import AVFoundation;
@import AppKit;

@implementation ViewController

AVCaptureSession *session;
AVCaptureDeviceInput *newVideoDeviceInput;
AVCaptureVideoDataOutput *videoDataOutput;

- (void)viewDidLoad {
    [super viewDidLoad];
}

- (instancetype)initWithCoder:(NSCoder *)coder
{
    self = [super initWithCoder:coder];
    if (self) {
        // Allow iOS Devices Discovery
        CMIOObjectPropertyAddress prop =
        { kCMIOHardwarePropertyAllowScreenCaptureDevices,
            kCMIOObjectPropertyScopeGlobal,
            kCMIOObjectPropertyElementMaster };
        UInt32 allow = 1;
        CMIOObjectSetPropertyData( kCMIOObjectSystemObject,
                                  &prop, 0, NULL,
                                  sizeof(allow), &allow );

        // Get devices
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeMuxed];
        BOOL deviceAttached = false;
        for (int i = 0; i < [devices count]; i++) {
            AVCaptureDevice *device = devices[i];
            if ([[device uniqueID] isEqualToString:@"b48defcadf92f300baf5821923f7b3e2e9fb3947"]) {
                deviceAttached = true;
                [self startSession:device];
                break;
            }
        }

    }
    return self;
}

- (void) deviceConnected:(AVCaptureDevice *)device {
    if ([[device uniqueID] isEqualToString:@"b48defcadf92f300baf5821923f7b3e2e9fb3947"]) {
        [self startSession:device];
    }
}

- (void) startSession:(AVCaptureDevice *)device {

    // Init capturing session
    session = [[AVCaptureSession alloc] init];

    // Star session configuration
    [session beginConfiguration];

    // Add session input
    NSError *error;
    newVideoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (newVideoDeviceInput == nil) {
        dispatch_async(dispatch_get_main_queue(), ^(void) {
            NSLog(@"%@", error);
        });
    } else {
        [session addInput:newVideoDeviceInput];
    }

    // Add session output
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (id)kCVPixelBufferPixelFormatTypeKey];

    dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);

    [videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
    [session addOutput:videoDataOutput];

    // Finish session configuration
    [session commitConfiguration];

    // Start the session
    [session startRunning];
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    //NSImage *resultNSImage = [self imageFromSampleBuffer:sampleBuffer];

    //self.imageView.image = [self nsImageFromSampleBuffer:sampleBuffer];
    NSData *frameData = imageToBuffer(sampleBuffer);
    dispatch_async(dispatch_get_main_queue(), ^(void) {
        // Update the UI on the main queue; this delegate runs on videoQueue.
        self.imageView.image = [[NSImage alloc] initWithData:frameData];
    });
}

NSData* imageToBuffer( CMSampleBufferRef source) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return data;
}

@end

No, you have to remove the Annex B start codes and replace them with size values, just like in the MP4 format.
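A rough sketch of that conversion, assuming every NAL unit is preceded by a 4-byte 00 00 00 01 start code (3-byte start codes would require re-packing rather than an in-place rewrite); the function name is just illustrative:

static NSMutableData *annexBToAVCC(NSData *annexB) {
    // Replace each 4-byte Annex B start code with the big-endian length of the
    // NAL unit that follows it, which is what VideoToolbox expects when the
    // format description was created with a 4-byte NAL length header.
    NSMutableData *avcc = [annexB mutableCopy];
    uint8_t *bytes = avcc.mutableBytes;
    size_t length = avcc.length;

    size_t start = 0;   // offset of the start code in front of the current NAL unit
    size_t i = 4;       // assumes the buffer begins with a 4-byte start code
    while (i <= length) {
        BOOL atEnd = (i == length);
        BOOL atStartCode = (!atEnd && i + 4 <= length &&
                            bytes[i] == 0x00 && bytes[i + 1] == 0x00 &&
                            bytes[i + 2] == 0x00 && bytes[i + 3] == 0x01);
        if (atEnd || atStartCode) {
            uint32_t naluLength = (uint32_t)(i - start - 4);
            bytes[start]     = (uint8_t)(naluLength >> 24);  // write the length
            bytes[start + 1] = (uint8_t)(naluLength >> 16);  // big-endian, on top
            bytes[start + 2] = (uint8_t)(naluLength >> 8);   // of the 4 bytes that
            bytes[start + 3] = (uint8_t)(naluLength & 0xFF); // held the start code
            start = i;
            i += 4;      // skip over the next start code
        } else {
            i += 1;
        }
    }
    return avcc;
}

The SPS and PPS NAL units should not be sent to the decoder as frame data at all; they belong in the CMVideoFormatDescription (e.g. via CMVideoFormatDescriptionCreateFromH264ParameterSets), and only the remaining length-prefixed NAL units get wrapped in a CMBlockBuffer/CMSampleBuffer and passed to VTDecompressionSessionDecodeFrame.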