AVAssetWriter startSessionAtSourceTime not accepting CMTime value
My app is designed to record video and analyze the generated frames under iOS 11.4, using Xcode 10.0 as the IDE. Recording video with AVCaptureMovieFileOutput works, but I need to analyze the frames, so I switched to AVAssetWriter and modeled the code after RosyWriter [ https://github.com/WildDylan/appleSample/tree/master/RosyWriter ]. The code is written in ObjC.
I am stuck in the captureOutput: didOutputSampleBuffer: fromConnection: delegate. Once the first frame is captured, the AVAssetWriter and its inputs (video and audio) are configured with settings extracted from that first frame. Once the user chooses to record, each captured sampleBuffer is analyzed and written. I tried using AVAssetWriter startSessionAtSourceTime:, but apparently something is wrong with the way CMSampleBufferGetPresentationTimeStamp returns a CMTime from the sample buffer. The sampleBuffer log seems to show a CMTime with valid values.
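(setupVideoRecorder, referenced in the delegate code further down, is not shown in the question. A minimal sketch of what such a setup might look like, assuming a temporary output URL and H.264 re-encoding derived from the captured format description; both are assumptions, not the original implementation:)

- (void)setupVideoRecorder
{
    NSError *error = nil;

    // Hypothetical output location; the URL the project actually uses is not shown in the question.
    NSURL *outputURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"]];
    self->assetWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];

    // Derive dimensions from the captured format description and re-encode to H.264.
    CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(self.outputVideoFormatDescription);
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @(dims.width),
                                AVVideoHeightKey : @(dims.height) };
    self->videoAWInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:settings
                                                          sourceFormatHint:self.outputVideoFormatDescription];
    self->videoAWInput.expectsMediaDataInRealTime = YES;

    if ([self->assetWriter canAddInput:self->videoAWInput])
    {
        [self->assetWriter addInput:self->videoAWInput];
    }
    [self->assetWriter startWriting];
    // startSessionAtSourceTime: is deferred until the first frame arrives while recording.
}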
If I implement:
CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
[self->assetWriter startSessionAtSourceTime:sampleTime];
the error generated is '*** -[AVAssetWriter startSessionAtSourceTime:] invalid parameter not satisfying: CMTIME_IS_NUMERIC(startTime)'.
If I use [self->assetWriter startSessionAtSourceTime:kCMTimeZero] instead, the output is "warning: could not execute support code to read Objective-C class data in the process. This may reduce the quality of type information available.".
When I log sampleTime, I read value=0, timescale=0, epoch=0 & flags=0. I also logged the sampleBuffer, shown below, followed by the relevant code:
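(flags=0 means the kCMTimeFlags_Valid bit is clear, i.e. the CMTime reaching the writer is kCMTimeInvalid, even though the timing array in the log below carries a numeric PTS. A small diagnostic guard along these lines, illustrative only and not part of the original code, makes that visible before the writer throws:

CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (CMSampleBufferIsValid(sampleBuffer) && CMTIME_IS_NUMERIC(sampleTime))
{
    [self->assetWriter startSessionAtSourceTime:sampleTime];
}
else
{
    // A cleared valid flag means the buffer (or its timing info) is no longer usable at this point.
    NSLog(@"%s : presentation timestamp is not numeric", __FUNCTION__);
    CMTimeShow(sampleTime);
}
)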
SampleBuffer Content =
2018-10-17 12:07:04.540816+0300 MyApp[10664:2111852] -[CameraCaptureManager captureOutput:didOutputSampleBuffer:fromConnection:] : sampleBuffer - CMSampleBuffer 0x100e388c0 retainCount: 1 allocator: 0x1c03a95e0
invalid = NO
dataReady = YES
makeDataReadyCallback = 0x0
makeDataReadyRefcon = 0x0
buffer-level attachments:
Orientation(P) = 1
{Exif} (P) = <CFBasicHash 0x28161ce80 [0x1c03a95e0]>{type = mutable dict, count = 24,
entries => .....A LOT OF CAMERA DATA HERE.....
}
DPIWidth (P) = 72
{TIFF} (P) = <CFBasicHash 0x28161c540 [0x1c03a95e0]>{type = mutable dict, count = 7,
entries => .....MORE CAMERA DATA HERE.....
}
DPIHeight (P) = 72
{MakerApple}(P) = {
1 = 3;
10 = 0;
14 = 0;
3 = {
epoch = 0;
flags = 1;
timescale = 1000000000;
value = 390750488472916;
};
4 = 0;
5 = 221;
6 = 211;
7 = 1;
8 = (
"-0.04894018",
"-0.6889497",
"-0.7034443"
);
9 = 0;
}
formatDescription = <CMVideoFormatDescription 0x280ddc780 [0x1c03a95e0]> {
mediaType:'vide'
mediaSubType:'BGRA'
mediaSpecific: {
codecType: 'BGRA' dimensions: 720 x 1280
}
extensions: {<CFBasicHash 0x28161f880 [0x1c03a95e0]>{type = immutable dict, count = 5,
entries =>
0 : <CFString 0x1c0917068 [0x1c03a95e0]>{contents = "CVImageBufferYCbCrMatrix"} = <CFString 0x1c09170a8 [0x1c03a95e0]>{contents = "ITU_R_601_4"}
1 : <CFString 0x1c09171c8 [0x1c03a95e0]>{contents = "CVImageBufferTransferFunction"} = <CFString 0x1c0917088 [0x1c03a95e0]>{contents = "ITU_R_709_2"}
2 : <CFString 0x1c093f348 [0x1c03a95e0]>{contents = "CVBytesPerRow"} = <CFNumber 0x81092876519e5903 [0x1c03a95e0]>{value = +2880, type = kCFNumberSInt32Type}
3 : <CFString 0x1c093f3c8 [0x1c03a95e0]>{contents = "Version"} = <CFNumber 0x81092876519eed23 [0x1c03a95e0]>{value = +2, type = kCFNumberSInt32Type}
5 : <CFString 0x1c0917148 [0x1c03a95e0]>{contents = "CVImageBufferColorPrimaries"} = <CFString 0x1c0917088 [0x1c03a95e0]>{contents = "ITU_R_709_2"}
}
}
}
sbufToTrackReadiness = 0x0
numSamples = 1
sampleTimingArray[1] = {
{PTS = {390750488483992/1000000000 = 390750.488}, DTS = {INVALID}, duration = {INVALID}},
}
imageBuffer = 0x2832ad2c0
====================================================
//AVCaptureVideoDataOutput Delegates
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (connection == videoConnection)
    {
        if (self.outputVideoFormatDescription == NULL)
        {
            // First video frame: grab its format description and configure the writer.
            self.outputVideoFormatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setupVideoRecorder];
        }
        else if (self.status == RecorderRecording)
        {
            NSLog(@"%s : self.outputVideoFormatDescription - %@", __FUNCTION__, self.outputVideoFormatDescription);
            [self.cmDelegate manager:self capturedFrameBuffer:sampleBuffer];
            NSLog(@"%s : sampleBuffer - %@", __FUNCTION__, sampleBuffer);
            dispatch_async(vidWriteQueue, ^
            {
                if (!self->wroteFirstFrame)
                {
                    // Start the writer session at the first frame's presentation timestamp.
                    CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
                    NSLog(@"%s : sampleTime value - %lld, timescale - %i, epoch - %lli, flags - %u", __FUNCTION__, sampleTime.value, sampleTime.timescale, sampleTime.epoch, sampleTime.flags);
                    [self->assetWriter startSessionAtSourceTime:sampleTime];
                    self->wroteFirstFrame = YES;
                }
                if (self->videoAWInput.readyForMoreMediaData)
                //else if (self->videoAWInput.readyForMoreMediaData)
                {
                    BOOL appendSuccess = [self->videoAWInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"%s : appendSuccess - %i", __FUNCTION__, appendSuccess);
                    if (!appendSuccess) NSLog(@"%s : failed to append video buffer - %@", __FUNCTION__, self->assetWriter.error.localizedDescription);
                }
            });
        }
    }
    else if (connection == audioConnection)
    {
        // Audio handling not implemented yet.
    }
}
My bad... The problem was that I was dispatching the frame handling onto the very queue I had already passed to AVCaptureVideoDataOutput's setSampleBufferDelegate:queue:, i.e. recursively enqueuing work onto the same queue from inside that queue. Posting the answer in case another idiot like me makes the same stupid mistake...
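(A minimal sketch of the corrected wiring, assuming a single serial queue used for both frame delivery and writing; the queue name, the setupCaptureOutput method and the omitted format-description branch are illustrative, not the original project's code. The point is that the delegate callback already runs on that queue, so the buffer is written synchronously instead of being dispatch_async-ed back onto it:

- (void)setupCaptureOutput
{
    // One serial queue for both frame delivery and writing. Because frames are
    // delivered on this queue, the delegate must NOT dispatch_async back onto it.
    vidWriteQueue = dispatch_queue_create("com.example.videoWriteQueue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:vidWriteQueue];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (connection != videoConnection || self.status != RecorderRecording) return;

    // Already running on vidWriteQueue: write synchronously, while the sampleBuffer is still valid.
    if (!self->wroteFirstFrame)
    {
        CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        if (CMTIME_IS_NUMERIC(sampleTime))
        {
            [self->assetWriter startSessionAtSourceTime:sampleTime];
            self->wroteFirstFrame = YES;
        }
    }
    if (self->videoAWInput.readyForMoreMediaData)
    {
        [self->videoAWInput appendSampleBuffer:sampleBuffer];
    }
}

If the append really must happen on a different queue, the sample buffer has to be retained across the asynchronous hop (CFRetain before the dispatch, CFRelease inside the block) so it is still valid when the block runs.)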