iOS GPUImage filters can't mix into video buffer
I'm developing a real-time app and need to apply a filter to the video buffers.
I'm using the GPUImage framework and wrote a filter. The on-screen preview looks right, but the buffers delivered to the `willOutputSampleBuffer:` delegate method don't show any filter effect.
Here is the key code:
self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:self.sessionPreset cameraPosition:AVCaptureDevicePositionFront];
self.videoCamera.delegate = self;
self.videoCamera.horizontallyMirrorFrontFacingCamera = YES;
self.filterView = [[GPUImageView alloc] init];
GPUImageBeautifyFilter *beautifyFilter = [[GPUImageBeautifyFilter alloc] init];
[self.videoCamera addTarget:beautifyFilter];
[beautifyFilter addTarget:self.filterView];
dispatch_async(dispatch_get_main_queue(), ^{
    [self.view insertSubview:self.filterView atIndex:1];
    [self.filterView mas_makeConstraints:^(MASConstraintMaker *make) {
        make.edges.equalTo(self.view);
    }];
    [self.videoCamera startCameraCapture];
});
Is there some detail I've overlooked? Thanks!!!
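For context on why the buffers look unfiltered: GPUImageVideoCamera's delegate callback hands over the raw capture frames before any of the camera's targets (the filters) have run. A minimal sketch of the delegate method, assuming the view controller conforms to `GPUImageVideoCameraDelegate`:

```objectivec
// GPUImageVideoCameraDelegate callback: this fires with the *unprocessed*
// camera output, before beautifyFilter (or any other target) has rendered.
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef cameraBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // cameraBuffer is the raw capture frame here; no filter effect is applied yet.
}
```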
I needed to add a new output as a target of the filter, so I added the code below to my project, and then I got buffers with the filter applied.
GPUImageRawDataOutput *rawDataOutput = [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(720, 1280) resultsInBGRAFormat:YES];
[self.beautifyFilter addTarget:rawDataOutput];

// Weak/strong dance to avoid a retain cycle between the output and its block.
__weak GPUImageRawDataOutput *weakOutput = rawDataOutput;
[rawDataOutput setNewFrameAvailableBlock:^{
    __strong GPUImageRawDataOutput *strongOutput = weakOutput;
    [strongOutput lockFramebufferForReading];
    // These bytes are only valid while the framebuffer is locked.
    GLubyte *outputBytes = [strongOutput rawBytesForImage];
    NSInteger bytesPerRow = [strongOutput bytesPerRowInOutput];
    CVPixelBufferRef pixelBuffer = NULL;
    // Wraps outputBytes without copying; no release callback is needed here
    // because the buffer is created and released within the locked section.
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, 720, 1280, kCVPixelFormatType_32BGRA, outputBytes, bytesPerRow, NULL, NULL, NULL, &pixelBuffer);
    // Do something with pixelBuffer
    [strongOutput unlockFramebufferAfterReading];
    CFRelease(pixelBuffer);
}];
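One caveat: the pointer from `rawBytesForImage` is only guaranteed valid while the framebuffer is locked, and `CVPixelBufferCreateWithBytes` wraps that memory rather than copying it. If the pixel buffer must outlive the block (for example, handed to an `AVAssetWriter`), one sketch is to copy the rows into a freshly allocated buffer instead, keeping the hard-coded 720×1280 size from the code above:

```objectivec
// Allocate a pixel buffer that owns its own backing store.
CVPixelBufferRef copiedBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, 720, 1280,
                    kCVPixelFormatType_32BGRA, NULL, &copiedBuffer);
CVPixelBufferLockBaseAddress(copiedBuffer, 0);
uint8_t *dst = CVPixelBufferGetBaseAddress(copiedBuffer);
size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(copiedBuffer);
// Copy row by row: the source and destination strides may differ.
for (size_t row = 0; row < 1280; row++) {
    memcpy(dst + row * dstBytesPerRow,
           outputBytes + row * bytesPerRow,
           MIN(dstBytesPerRow, (size_t)bytesPerRow));
}
CVPixelBufferUnlockBaseAddress(copiedBuffer, 0);
// copiedBuffer remains valid after unlockFramebufferAfterReading;
// the caller is responsible for CFRelease(copiedBuffer) when done.
```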