How to make a UIViewController class reusable to pass data back to a viewController that calls it

I am using the code from the following site:

https://www.hackingwithswift.com/example-code/media/how-to-scan-a-qr-code

The code works perfectly; see the link above for the full listing.

It is a piece of code that captures a QR code/barcode from the camera and converts it into a string.

The part of the code that displays the string is:

func found(code: String) {
    print(code)
}

After the code string is "printed", the code calls "dismiss" and returns to the previous UIViewController.

I want to take the "code" string and pass that data back to the previous UIViewController.

The only way I have been able to do this so far is with the following code:

func found(code: String) {
    print("code: \(code)")
    ResenhaEquideoIdentificaAnimal1Controller.shared.microchipAnimalTextField.text = code
}

But this code only works when the class is called from "ResenhaEquideoIdentificaAnimal1Controller".

I call the new UIViewController from a UIButton in the "ResenhaEquideoIdentificaAnimal1Controller" class with the following code:

let myScannerViewController = MyScannerViewController()
present(myScannerViewController, animated: true, completion: nil)

How can I make this class reusable, so that I can call the "MyScannerViewController" class and have it send the data back to the view that called it?

You want to use a "delegate pattern", where you delegate the handling of a discovered code (or a failure) to some other party.

For example, you could modify the existing example to add support for a simple delegate...

import AVFoundation
import UIKit

// Adopted by the presenting view controller so it can receive scan results
protocol ScannerDelegate: AnyObject {
    func scanner(_ controller: ScannerViewController, didDiscoverCode code: String)
    func failedToScanner(_ controller: ScannerViewController)
}

class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    var captureSession: AVCaptureSession!
    var previewLayer: AVCaptureVideoPreviewLayer!
    
    // Weak so the scanner does not retain the presenting view controller
    weak var scannerDelegate: ScannerDelegate?

    override func viewDidLoad() {
        super.viewDidLoad()

        view.backgroundColor = UIColor.black
        captureSession = AVCaptureSession()

        guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
        let videoInput: AVCaptureDeviceInput

        do {
            videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
        } catch {
            return
        }

        if (captureSession.canAddInput(videoInput)) {
            captureSession.addInput(videoInput)
        } else {
            failed()
            return
        }

        let metadataOutput = AVCaptureMetadataOutput()

        if (captureSession.canAddOutput(metadataOutput)) {
            captureSession.addOutput(metadataOutput)

            metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
            metadataOutput.metadataObjectTypes = [.qr]
        } else {
            failed()
            return
        }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.layer.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        captureSession.startRunning()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        if (captureSession?.isRunning == false) {
            captureSession.startRunning()
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        if (captureSession?.isRunning == true) {
            captureSession.stopRunning()
        }
    }

    // Tear down the capture session and report the failure to the delegate
    private func failed() {
        captureSession = nil
        scannerDelegate?.failedToScanner(self)
    }

    // Forward the scanned string to the delegate
    private func didFind(code: String) {
        scannerDelegate?.scanner(self, didDiscoverCode: code)
    }

    override var prefersStatusBarHidden: Bool {
        return true
    }

    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        return .portrait
    }

    // MARK: AVCaptureMetadataOutputObjectsDelegate
    
    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
        captureSession.stopRunning()

        if let metadataObject = metadataObjects.first {
            guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
            guard let stringValue = readableObject.stringValue else { return }
            AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
            didFind(code: stringValue)
        }
    }
}

When you want to scan something, your calling view controller can adopt the protocol...

extension ViewController: ScannerDelegate {
    func failedToScanner(_ controller: ScannerViewController) {
        controller.dismiss(animated: true) {
            let ac = UIAlertController(title: "Scanning not supported", message: "Your device does not support scanning a code from an item. Please use a device with a camera.", preferredStyle: .alert)
            ac.addAction(UIAlertAction(title: "OK", style: .default))
            self.present(ac, animated: true)
        }
    }
    
    func scanner(_ controller: ScannerViewController, didDiscoverCode code: String) {
        codeLabel.text = code
        controller.dismiss(animated: true)
    }
}

When you want to present the scanner view controller, you simply set your view controller as its delegate...

let controller = ScannerViewController()
controller.scannerDelegate = self
present(controller, animated: true)
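
Applied to the controller from the question, the same idea replaces the shared-instance hack. A minimal sketch, assuming microchipAnimalTextField is an outlet on ResenhaEquideoIdentificaAnimal1Controller as in the original code:

extension ResenhaEquideoIdentificaAnimal1Controller: ScannerDelegate {
    func scanner(_ controller: ScannerViewController, didDiscoverCode code: String) {
        // Update the field on self instead of going through a shared instance
        microchipAnimalTextField.text = code
        controller.dismiss(animated: true)
    }

    func failedToScanner(_ controller: ScannerViewController) {
        controller.dismiss(animated: true)
    }
}

You would then present the scanner exactly as shown above, setting controller.scannerDelegate = self from ResenhaEquideoIdentificaAnimal1Controller.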

The nice thing about this is that it also makes it easy to reject codes you're not interested in, simply by modifying the delegate workflow.
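
For example, if only purely numeric codes were of interest, a scanner(_:didDiscoverCode:) implementation could look like the sketch below (the all-digits rule is just a hypothetical filter, not part of the original example); anything that fails the check is ignored and the scanner is simply dismissed:

func scanner(_ controller: ScannerViewController, didDiscoverCode code: String) {
    // Hypothetical filter: only accept codes made up entirely of digits
    guard !code.isEmpty, code.allSatisfy({ $0.isNumber }) else {
        // Not a code we are interested in; dismiss without updating the UI
        controller.dismiss(animated: true)
        return
    }

    codeLabel.text = code
    controller.dismiss(animated: true)
}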