Scaling an image with Core Image crops instead of scaling
I have a function that scales an image using a Core Image filter. It takes an NSImage and a dimension as arguments and returns a scaled NSImage. I've been using Apple's "Processing an Image Using Built-in Filters" guide as a reference. The code seems to be cropping the image rather than resizing it. I swear I had this working at one point, so where did I go wrong?
func imageCIScale(_ image: NSImage, dimension: CGFloat) -> NSImage? {
    guard let data = image.tiffRepresentation,
          let scaleFilter = CIFilter(name: "CILanczosScaleTransform"),
          let ciImage = CIImage(data: data)
    else {
        print("Failure! Abject failure! Couldn't even get started")
        return nil
    }

    let scaleFactor = dimension / image.size.height

    scaleFilter.setValue(ciImage, forKey: kCIInputImageKey)
    scaleFilter.setValue(scaleFactor, forKey: kCIInputScaleKey)
    scaleFilter.setValue(1.0, forKey: kCIInputAspectRatioKey)

    let outputImage = scaleFilter.outputImage
    let context = CIContext()

    guard let scaledImage = context.createCGImage(ciImage, from: outputImage!.extent) else {
        print("Failed to create CGImage")
        return nil
    }

    return NSImage(cgImage: scaledImage, size: NSZeroSize)
}
The function is called like this:
let myScaledNSImage = imageCIScale(myOriginalNSImage, dimension: 32.0)
and should produce a 32 × 32 image (for a square source, since the scale factor is computed from the image's height).
OK, after a good night's sleep I figured out what went wrong. I was using the wrong image source when creating the CGImage. Passing the original ciImage renders the unscaled input cropped to the scaled output's smaller extent, which is exactly why the result looked cropped instead of scaled:
let scaledImage = context.createCGImage(ciImage, from: outputImage!.extent)
should have been
let scaledImage = context.createCGImage(outputImage!, from: outputImage!.extent)
or, better:
let context = CIContext()

guard let outputImage = scaleFilter.outputImage,
      let scaledImage = context.createCGImage(outputImage, from: outputImage.extent)
else {
    print("Failed to create CGImage")
    return nil
}
I'll leave the question up because, once corrected, I think it makes a reasonable example of image scaling with Swift 4.2 on macOS.
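For reference, here is a sketch of the complete corrected function with the guard-based fix folded in. The imports are added so it stands alone; otherwise this is the same code as above:

import AppKit
import CoreImage

func imageCIScale(_ image: NSImage, dimension: CGFloat) -> NSImage? {
    guard let data = image.tiffRepresentation,
          let scaleFilter = CIFilter(name: "CILanczosScaleTransform"),
          let ciImage = CIImage(data: data)
    else {
        print("Failure! Abject failure! Couldn't even get started")
        return nil
    }

    // Scale factor is based on height; aspect ratio 1.0 preserves proportions.
    let scaleFactor = dimension / image.size.height

    scaleFilter.setValue(ciImage, forKey: kCIInputImageKey)
    scaleFilter.setValue(scaleFactor, forKey: kCIInputScaleKey)
    scaleFilter.setValue(1.0, forKey: kCIInputAspectRatioKey)

    let context = CIContext()

    // Render the filter's output, not the original ciImage; that was the bug.
    guard let outputImage = scaleFilter.outputImage,
          let scaledImage = context.createCGImage(outputImage, from: outputImage.extent)
    else {
        print("Failed to create CGImage")
        return nil
    }

    return NSImage(cgImage: scaledImage, size: NSZeroSize)
}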