What is a correct way to manage AudioKit's lifecycle?

I'm building an app that has to track the amplitude of the user's microphone input. AudioKit has a bunch of handy objects that fit my needs: AKAmplitudeTracker and so on. But I haven't found any reliable information on how AudioKit is supposed to be started, how tracking should begin, and so on.

At the moment, all of the AudioKit initialization code sits in the viewDidLoad method of the root VC of my recorder module. This isn't right, because random errors occur and I can't track them down. The code below shows how I'm using AudioKit right now.

var silence: AKBooster!
var tracker: AKAmplitudeTracker!
var mic: AKMicrophone!

...

override func viewDidLoad() {
    super.viewDidLoad()

    switch AVAudioSession.sharedInstance().recordPermission() {

    case AVAudioSessionRecordPermission.granted:
        self.mic = AKMicrophone()
        self.tracker = AKAmplitudeTracker(self.mic)
        AKSettings.audioInputEnabled = true
        AudioKit.output = self.tracker
        AudioKit.start()
        self.mic.start()
        self.tracker.start()
        break

    case AVAudioSessionRecordPermission.undetermined:
        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }

    case AVAudioSessionRecordPermission.denied:
        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }

    default:
        print("")
    }

    ...

}

Please help me figure out how to manage AudioKit properly.

As far as I can tell, it looks like it should work, and the problem is probably somewhere else in your code. I made a stripped-down demo to test the basics, and it works. I just added a timer to poll the amplitude.

import UIKit
import AudioKit

class ViewController: UIViewController {

    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    override func viewDidLoad() {
        super.viewDidLoad()

        mic = AKMicrophone()
        tracker = AKAmplitudeTracker(mic)
        AudioKit.output = tracker
        AudioKit.start()

        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { (timer) in
            print(self.tracker.amplitude)
        }
    }
}
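
If you do keep the engine inside a view controller like this, it's also worth tearing it down when the controller goes away. Here's a minimal sketch of that idea under the same AudioKit 4 API as the demo above; the MeterViewController name and the stored timer property are my additions, not part of the original demo:

import UIKit
import AudioKit

class MeterViewController: UIViewController {

    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!
    var timer: Timer?

    override func viewDidLoad() {
        super.viewDidLoad()

        mic = AKMicrophone()
        tracker = AKAmplitudeTracker(mic)
        AudioKit.output = tracker
        AudioKit.start()

        // Capture self weakly so the repeating timer doesn't retain the
        // controller, which would otherwise keep deinit from ever running.
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let tracker = self?.tracker else { return }
            print(tracker.amplitude)
        }
    }

    deinit {
        timer?.invalidate() // stop polling
        AudioKit.stop()     // shut the engine down along with the controller
    }
}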

Aleksey,

My suggestion for managing AudioKit's lifecycle is to put it in a singleton class. That's how it's set up in some of the AudioKit examples included in the repo, such as Analog Synth X and Drums. This way it isn't tied to a specific ViewController's viewDidLoad, and it can be accessed from multiple ViewControllers or from the AppDelegate that manages the app's state. It also ensures that you'll only create one instance of it.

Here's an example where AudioKit is initialized in a class called Conductor (it could also be called AudioManager, etc.):

import AudioKit
import AudioKitUI

// Treat the conductor like a manager for the audio engine.
class Conductor {

    // Singleton of the Conductor class to avoid multiple instances of the audio engine
    static let sharedInstance = Conductor()

    // Create instance variables
    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    // Add effects
    var delay: AKDelay!
    var reverb: AKCostelloReverb!

    // Balance between the delay and reverb mix.
    var reverbAmountMixer = AKDryWetMixer()

    init() {

        // Allow audio to play while the iOS device is muted.
        AKSettings.playbackWhileMuted = true

        AKSettings.defaultToSpeaker = true

        // Capture mic input
        mic = AKMicrophone()

        // Pull mic output into the tracker node.
        tracker = AKAmplitudeTracker(mic)

        // Pull the tracker output into the delay effect node.
        delay = AKDelay(tracker)
        delay.time = 2.0
        delay.feedback = 0.1
        delay.dryWetMix = 0.5

        // Pull the delay output into the reverb effect node.
        reverb = AKCostelloReverb(delay)
        reverb.presetShortTailCostelloReverb()

        // Mix the amount of reverb to the delay output node.
        reverbAmountMixer = AKDryWetMixer(delay, reverb, balance: 0.8)

        // Assign the reverbAmountMixer output to be the final audio output
        AudioKit.output = reverbAmountMixer

        // Start the AudioKit engine
        // This is in its own method so that the audio engine will start and stop via the AppDelegate's current state.
        startAudioEngine()

    }

    internal func startAudioEngine() {
        AudioKit.start()
        print("Audio engine started")
    }

    internal func stopAudioEngine() {
        AudioKit.stop()
        print("Audio engine stopped")
    }
}
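
Because the Conductor isn't tied to any view controller, the AppDelegate can start and stop the engine as the app changes state, which is what startAudioEngine() and stopAudioEngine() are there for. Here's a minimal sketch of that wiring; the particular lifecycle hooks chosen here are my assumption, not something the Conductor requires:

import UIKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    func applicationDidBecomeActive(_ application: UIApplication) {
        // Touching the singleton the first time creates it (and its init
        // already starts the engine); afterwards this restarts the engine
        // when the app returns from the background.
        Conductor.sharedInstance.startAudioEngine()
    }

    func applicationDidEnterBackground(_ application: UIApplication) {
        // Release the audio session while the app is backgrounded.
        Conductor.sharedInstance.stopAudioEngine()
    }
}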

Here's how you can access the amplitude tracking data from the Conductor singleton class within a ViewController:

import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance

    override func viewDidLoad() {
        super.viewDidLoad()

        Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [unowned self] (timer) in
            print(self.conductor.tracker.amplitude)
        }

    }
}
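
And since your original code gates everything on microphone permission, that check can run before the Conductor singleton is first touched. Here's a rough sketch using the same AVAudioSession API from your question; the startConductorIfPermitted helper is purely illustrative:

import AVFoundation

func startConductorIfPermitted() {
    switch AVAudioSession.sharedInstance().recordPermission() {
    case .granted:
        // Creating the singleton starts the engine via Conductor's init.
        _ = Conductor.sharedInstance
    case .undetermined:
        AVAudioSession.sharedInstance().requestRecordPermission { granted in
            guard granted else { return }
            DispatchQueue.main.async {
                _ = Conductor.sharedInstance
            }
        }
    case .denied:
        // iOS won't show the permission prompt again once it has been
        // denied, so point the user to Settings instead of re-requesting.
        print("Microphone access denied")
    default:
        break
    }
}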

You can download this GitHub repo from here:

https://github.com/markjeschke/AudioKit-Amplitude-Tracker

Hope this helps.

Take care,
Mark