Swift 5: obtain brightness/light level from the camera
I have this code, which lives in an extension conforming to AVCaptureMetadataOutputObjectsDelegate:
internal func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
    guard let captureSession = captureSession else { return }
    captureSession.stopRunning()
    if let metadataObject = metadataObjects.first {
        guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
        guard let stringValue = readableObject.stringValue else { return }
        AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
        found(code: stringValue)
    }
}
and it is called when a QR code comes into view:
let metadataOutput = AVCaptureMetadataOutput()
if captureSession.canAddOutput(metadataOutput) {
    captureSession.addOutput(metadataOutput)
    metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
    metadataOutput.metadataObjectTypes = [.qr]
}
What I would like to do is add a new feature: as soon as the camera opens, find out how bright the scene in front of the rear camera is.
Everywhere I look, people use this:
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection)
but as far as I can tell, it is no longer in AVFoundation.
I assume that by the camera's "brightness" you mean some kind of light-level metric. I know a few ways to measure it.
I suppose your code must already have a videoDevice defined somewhere: let videoDevice: AVCaptureDevice. If you don't store it separately, you can get it from videoInput.device.
Check videoDevice.iso. The lower the value, the brighter the lighting conditions. It is a KVO-observable property, so you can watch it change in real time.
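A minimal sketch of watching the ISO via block-based KVO. Assumptions: the class name LightLevelWatcher and method names are made up for illustration, and the observation token must be retained for as long as you want updates:

```swift
import AVFoundation

final class LightLevelWatcher {
    // The token must stay alive for the observation to keep firing.
    private var isoObservation: NSKeyValueObservation?

    func startObservingISO(on videoDevice: AVCaptureDevice) {
        isoObservation = videoDevice.observe(\.iso, options: [.initial, .new]) { device, _ in
            // Lower ISO generally corresponds to a brighter scene.
            print("Current ISO: \(device.iso)")
        }
    }

    func stopObservingISO() {
        isoObservation?.invalidate()
        isoObservation = nil
    }
}
```

Remember to invalidate the observation when you tear the session down, or the closure keeps firing for every auto-exposure adjustment.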
Check videoDevice.exposureDuration. Same story: lower values → brighter lighting conditions. Exposure duration is essentially what the iOS system camera dials up to get better night-mode shots.
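Since exposureDuration is a CMTime, a small helper to read it in seconds makes comparisons easier. The looksDark heuristic below is purely illustrative: the ISO and duration cut-offs are my guesses, not documented thresholds, so tune them on a real device:

```swift
import AVFoundation

// exposureDuration is a CMTime; convert it to seconds for easy comparison.
func exposureSeconds(of device: AVCaptureDevice) -> Double {
    CMTimeGetSeconds(device.exposureDuration)
}

// Rough heuristic only: the cut-off values are illustrative guesses.
// High ISO or a long exposure both suggest a dim scene.
func looksDark(_ device: AVCaptureDevice) -> Bool {
    device.iso > 800 || exposureSeconds(of: device) > 1.0 / 15.0
}
```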
As you noted, you can also grab the live pixel buffers from the camera for analysis, e.g. build a histogram and compare bright pixels against dark ones.
In your camera class:
/// You already have the session
private let session = AVCaptureSession()
/// Define a video output (probably you did that already,
/// otherwise how would your camera scan QRs at all)
private let videoOutput = AVCaptureVideoDataOutput()
/// Define a queue for sample buffer
private let videoSampleBufferQueue = DispatchQueue(label: "videoSampleBufferQueue")
Then add the output to the session:
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}
videoOutput.setSampleBufferDelegate(self, queue: videoSampleBufferQueue)
And implement AVCaptureVideoDataOutputSampleBufferDelegate:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    // Handle the pixelBuffer the way you like
}
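One cheap way to handle the pixel buffer is to average the luma plane. This sketch assumes the output delivers bi-planar YCbCr frames (e.g. kCVPixelFormatType_420YpCbCr8BiPlanarFullRange configured through videoOutput.videoSettings); plane 0 then holds the luma, whose mean is a rough brightness estimate:

```swift
import AVFoundation
import CoreVideo

// Average luma of plane 0, sampled sparsely so it stays cheap enough
// to run in a per-frame delegate callback. Returns roughly 0 (dark)
// to 255 (bright) for 8-bit YCbCr buffers.
func averageLuma(of pixelBuffer: CVPixelBuffer) -> Double {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0) else { return 0 }
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let bytes = base.assumingMemoryBound(to: UInt8.self)

    var total = 0
    var samples = 0
    // Sample every 8th pixel in each direction.
    for y in stride(from: 0, to: height, by: 8) {
        for x in stride(from: 0, to: width, by: 8) {
            total += Int(bytes[y * bytesPerRow + x])
            samples += 1
        }
    }
    return samples > 0 ? Double(total) / Double(samples) : 0
}
```

Call it from captureOutput(_:didOutput:from:) and threshold the result to decide "dark" vs. "bright"; for a full histogram you would bucket the sampled values instead of summing them.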