Detection of sharpness of a photo

I am looking for a framework to help detect the sharpness of a photo. I have read this post, which points out ways to do it, but I would rather work with a library than roll my own.

Apple's Core Image documentation says:

Core Image can analyze the quality of an image and provide a set of filters with optimal settings for adjusting such things as hue, contrast, and tone color, and for correcting for flash artifacts such as red eye. It does all this with one method call on your part.

How do I accomplish the 'analyze image quality' part? I would love to see some sample code.

I don't think Core Image will help you here. You can use auto enhancement to get a set of suggested filters and values, but there is no sharpness (edge contrast) filter, only overall image contrast. See the full list here.
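
For what it's worth, here is a minimal sketch of that "one method call" auto-enhancement API (my own illustration, assuming a UIImage input and a hypothetical wrapper function name). Note again that none of the returned filters measures sharpness:

#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

// Apply Core Image's suggested auto-enhancement filters to a photo.
// None of these filters reports or measures sharpness.
CIImage *autoEnhanced(UIImage *photo) {
    CIImage *ciImage = [CIImage imageWithCGImage:photo.CGImage];
    NSArray *adjustments =
        [ciImage autoAdjustmentFiltersWithOptions:@{ kCIImageAutoAdjustRedEye : @NO }];
    for (CIFilter *filter in adjustments) {
        [filter setValue:ciImage forKey:kCIInputImageKey];
        ciImage = filter.outputImage;
    }
    return ciImage;
}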

There is Apple's vDSP API, which can perform fast Fourier transforms:

The vDSP API provides mathematical functions for applications such as speech, sound, audio, and video processing, diagnostic medical imaging, radar signal processing, seismic analysis, and scientific data processing.

You should be able to use it to analyze your image.

For a conceptual overview, see Using Fourier Transforms, and search for tutorials on vDSP. There are also related questions and answers here on Stack Overflow.
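
Purely as an illustration of the idea (my own sketch, not sample code from Apple): one crude heuristic is to treat the fraction of spectral energy in the high-frequency bins as a sharpness score. The function below assumes a grayscale buffer gray of width * height floats in [0, 1] with a power-of-two width, runs vDSP's 1-D real FFT row by row, and uses an arbitrary cutoff bin of width / 8.

#include <Accelerate/Accelerate.h>
#include <math.h>
#include <stdlib.h>

// Fraction of spectral energy above the cutoff bin, summed over all rows.
// Higher values mean more fine detail, i.e. a sharper image.
static float highFrequencyEnergyRatio(const float *gray,
                                      vDSP_Length width,
                                      vDSP_Length height) {
    vDSP_Length log2w = (vDSP_Length)lrintf(log2f((float)width));
    FFTSetup setup = vDSP_create_fftsetup(log2w, kFFTRadix2);

    float *realp = malloc(width / 2 * sizeof(float));
    float *imagp = malloc(width / 2 * sizeof(float));
    float *mags  = malloc(width / 2 * sizeof(float));
    DSPSplitComplex split = { .realp = realp, .imagp = imagp };

    double total = 0.0, high = 0.0;
    vDSP_Length cutoff = width / 8;               // heuristic cutoff bin

    for (vDSP_Length y = 0; y < height; y++) {
        // Pack the row into split-complex form and run an in-place real FFT.
        vDSP_ctoz((const DSPComplex *)(gray + y * width), 2, &split, 1, width / 2);
        vDSP_fft_zrip(setup, &split, 1, log2w, FFT_FORWARD);

        // Squared magnitudes of the packed positive-frequency bins.
        vDSP_zvmags(&split, 1, mags, 1, width / 2);

        // Skip bin 0, which packs the DC and Nyquist terms together.
        for (vDSP_Length k = 1; k < width / 2; k++) {
            total += mags[k];
            if (k >= cutoff) high += mags[k];
        }
    }

    free(realp); free(imagp); free(mags);
    vDSP_destroy_fftsetup(setup);
    return total > 0.0 ? (float)(high / total) : 0.0f;
}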

Perhaps the best approach is the polar edge coherence metric:

Baroncini, V., et al. "The polar edge coherence: a quasi blind metric for video quality assessment." EUSIPCO 2009, Glasgow (2009): 564-568.

It works equally well on images and video. It directly measures how sharp edges are. If you apply a sharpening filter, you can compare the values before and after; if you over-sharpen, the value starts to drop again. It only requires a few convolutions with the complex-valued kernels described in the paper.
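
Just for orientation (this is not the metric itself, and the kernel coefficients have to be taken from the paper): convolving an image with a complex-valued kernel simply means two real 2-D convolutions, one with the real part and one with the imaginary part of the kernel, which vDSP_imgfir can do:

#include <Accelerate/Accelerate.h>

// Convolve a real image with a complex-valued kernelSize x kernelSize kernel
// (kernelSize must be odd) by running two real 2-D convolutions. kernelRe and
// kernelIm hold the real and imaginary parts of the kernel from the paper.
void complexConvolve(const float *image, vDSP_Length rows, vDSP_Length cols,
                     const float *kernelRe, const float *kernelIm,
                     vDSP_Length kernelSize, float *outRe, float *outIm) {
    vDSP_imgfir(image, rows, cols, kernelRe, outRe, kernelSize, kernelSize);
    vDSP_imgfir(image, rows, cols, kernelIm, outIm, kernelSize, kernelSize);
}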

We did it with the GPUImage framework like this (calculating brightness and sharpness). Here are some snippets that might help you:

-(BOOL) calculateBrightness:(UIImage *) image {
    float result = 0;
    int i = 0;
    for (int y = 0; y < image.size.height; y++) {
        for (int x = 0; x < image.size.width; x++) {
            // colorAt:atX:andY: is our own helper that reads the pixel color.
            UIColor *color = [self colorAt:image
                                       atX:x
                                      andY:y];
            const CGFloat *colors = CGColorGetComponents(color.CGColor);
            float r = colors[0];
            float g = colors[1];
            float b = colors[2];
            // Standard Rec. 601 luma weights.
            result += 0.299 * r + 0.587 * g + 0.114 * b;
            i++;
        }
    }
    float brightness = result / (float)i;
    NSLog(@"Image Brightness : %f", brightness);
    // Reject images that are too bright or too dark.
    if (brightness > 0.8 || brightness < 0.3) {
        return NO;
    }
    return YES;
}

-(BOOL) calculateSharpness:(UIImage *) image {
    // Edge-detect the image, then run our own distance transform on the
    // resulting binary image (BinaryImageDistanceTransform and
    // getBinaryImageAsArray: are our own helpers).
    GPUImageCannyEdgeDetectionFilter *filter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
    BinaryImageDistanceTransform *binImagTrans = [[BinaryImageDistanceTransform alloc] init];
    NSArray *resultArray = [binImagTrans twoDimDistanceTransform:
                               [self getBinaryImageAsArray:[filter imageByFilteringImage:image]]];

    if (resultArray == nil) {
        return NO;
    }

    // Sum the maximum distance value of each column.
    int sum = 0;
    for (int x = 0; x < resultArray.count; x++) {
        NSMutableArray *col = resultArray[x];
        sum += [[col valueForKeyPath:@"@max.intValue"] intValue];
    }

    // Log the value for analysis and threshold tuning.
    NSLog(@"Image Sharp : %i", sum);
    if (sum < 26250000) { // tested - bad sharpness is under ca. 26250000
        return NO;
    }
    return YES;
}

But it is very slow: it takes roughly 40 seconds for a single image from the iPad camera.
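
Not part of the original code, just a guess at the bottleneck: reading every pixel through a UIColor object is very expensive. A sketch of the same brightness average computed on raw pixel bytes through a CGBitmapContext (same Rec. 601 weights, hypothetical function name) typically runs in a small fraction of the time:

#import <UIKit/UIKit.h>
#include <stdlib.h>

// Average luma of an image, computed from raw RGBA bytes instead of UIColor
// objects. Returns a value normalized to [0, 1].
static float averageLuma(UIImage *image) {
    CGImageRef cg = image.CGImage;
    size_t w = CGImageGetWidth(cg), h = CGImageGetHeight(cg);
    uint8_t *buf = calloc(w * h * 4, 1);
    CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(buf, w, h, 8, w * 4, cs,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, w, h), cg);

    double sum = 0.0;
    for (size_t i = 0; i < w * h; i++) {
        uint8_t r = buf[4 * i], g = buf[4 * i + 1], b = buf[4 * i + 2];
        sum += 0.299 * r + 0.587 * g + 0.114 * b;   // same weights as above
    }

    CGContextRelease(ctx);
    CGColorSpaceRelease(cs);
    free(buf);
    return (float)(sum / (255.0 * (double)(w * h)));
}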