Firebase ML Kit misaligned bounding box

I am trying to use the new Detect and Track Objects with ML Kit on iOS, but I seem to have run into a problem with the object detection bounding boxes.

Taking a LEGO brick as an example: according to the documentation, the image orientation is always converted to .up, yet the bounding box is offset even though the image orientation is correct. The same behaviour shows up with other objects as well; the box is shifted.
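For reference, one way to rule orientation in or out is to redraw the image so that its imageOrientation is .up before handing it to the detector. This is only a minimal sketch, and the helper name is my own rather than anything from ML Kit:

import UIKit

extension UIImage {
    // Redraws the image so that imageOrientation becomes .up,
    // baking any EXIF rotation into the pixel data.
    func normalizedToUpOrientation() -> UIImage {
        guard imageOrientation != .up else { return self }
        UIGraphicsBeginImageContextWithOptions(size, false, scale)
        defer { UIGraphicsEndImageContext() }
        draw(in: CGRect(origin: .zero, size: size))
        return UIGraphicsGetImageFromCurrentImageContext() ?? self
    }
}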

// Configure the detector for single-image mode with a single object.
let options = VisionObjectDetectorOptions()
options.detectorMode = .singleImage
options.shouldEnableMultipleObjects = false

let objectDetector = Vision.vision().objectDetector(options: options)

let image = VisionImage(image: self.originalImage)

objectDetector.process(image) { detectedObjects, error in
    guard error == nil else {
        print(error)
        return
    }
    guard let detectedObjects = detectedObjects, !detectedObjects.isEmpty else {
        print("No objects detected")
        return
    }

    // Take the first (and only) detected object and draw its frame.
    let primaryObject = detectedObjects.first

    print(primaryObject as Any)

    guard let objectFrame = primaryObject?.frame else { return }

    print(objectFrame)

    self.imageView.image = self.drawOccurrencesOnImage([objectFrame], self.originalImage)
}

And the function that draws the red box:

private func drawOccurrencesOnImage(_ occurrences: [CGRect], _ image: UIImage) -> UIImage? {
    let imageSize = image.size
    let scale: CGFloat = 0.0  // 0.0 means use the device's main screen scale
    UIGraphicsBeginImageContextWithOptions(imageSize, false, scale)

    image.draw(at: CGPoint.zero)
    let ctx = UIGraphicsGetCurrentContext()

    ctx?.addRects(occurrences)
    ctx?.setStrokeColor(UIColor.red.cgColor)
    ctx?.setLineWidth(20)
    ctx?.strokePath()

    guard let drawnImage = UIGraphicsGetImageFromCurrentImageContext() else {
        return nil
    }

    UIGraphicsEndImageContext()
    return drawnImage
}

According to image.size, the image dimensions are (3024.0, 4032.0) and the detected frame is (1274.0, 569.0, 1299.0, 2023.0). Any insight into this behaviour would be much appreciated.

In the end, the misalignment was caused by not scaling the image correctly.
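If you would rather keep the full-resolution image, an alternative is to map the detected frame from image coordinates into the image view's coordinate space instead. This is a rough sketch assuming the view uses .scaleAspectFit; the helper is mine, not an ML Kit API:

// Rough sketch: map a frame from image coordinates into the coordinates
// of a UIImageView whose contentMode is .scaleAspectFit.
func convertToViewCoordinates(_ rect: CGRect, of image: UIImage, in imageView: UIImageView) -> CGRect {
    let scale = min(imageView.bounds.width / image.size.width,
                    imageView.bounds.height / image.size.height)
    // Letterboxing offsets introduced by aspect-fit.
    let offsetX = (imageView.bounds.width - image.size.width * scale) / 2
    let offsetY = (imageView.bounds.height - image.size.height * scale) / 2
    return CGRect(x: rect.origin.x * scale + offsetX,
                  y: rect.origin.y * scale + offsetY,
                  width: rect.width * scale,
                  height: rect.height * scale)
}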

In my case, though, the function below is what ultimately solved the problem.

public func updateImageView(with image: UIImage) {
  let orientation = UIApplication.shared.statusBarOrientation
  var scaledImageWidth: CGFloat = 0.0
  var scaledImageHeight: CGFloat = 0.0
  switch orientation {
  case .portrait, .portraitUpsideDown, .unknown:
    scaledImageWidth = imageView.bounds.size.width
    scaledImageHeight = image.size.height * scaledImageWidth / image.size.width
  case .landscapeLeft, .landscapeRight:
    // Set the height first so the width calculation uses a non-zero value.
    scaledImageHeight = imageView.bounds.size.height
    scaledImageWidth = image.size.width * scaledImageHeight / image.size.height
  }
  DispatchQueue.global(qos: .userInitiated).async {
    // Scale image while maintaining aspect ratio so it displays better in the UIImageView.
    var scaledImage = image.scaledImage(
      with: CGSize(width: scaledImageWidth, height: scaledImageHeight)
    )
    scaledImage = scaledImage ?? image
    guard let finalImage = scaledImage else { return }
    DispatchQueue.main.async {
      self.imageView.image = finalImage
      self.processImage(finalImage)
    }
  }
}
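Note that scaledImage(with:) is not a UIKit API; it comes from a UIImage extension in the ML Kit quickstart samples. A minimal sketch of such a helper, in case you are not using the quickstart's extensions:

extension UIImage {
    // Returns a copy of the image redrawn at the given size.
    func scaledImage(with size: CGSize) -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
        defer { UIGraphicsEndImageContext() }
        draw(in: CGRect(origin: .zero, size: size))
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}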