Timing issues: Metronome using AVAudioEngine scheduleBuffer's completion handler

I want to build a simple metronome app using AVAudioEngine with solid timing, two different click sounds (an accented one on beat 1 and a regular one on beats 2, 3 and 4), and a label that always shows the current beat.

So I created two short click sounds (26 ms / 1150 samples @ 16 bit / 44.1 kHz, stereo WAV files) and loaded them into two buffers. Each buffer's frame length is then set so that the buffer represents one full period (one beat).
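Just to make the numbers concrete, the per-beat buffer length follows directly from bpm and sample rate, mirroring the periodLengthInSamples property in the code below:

let sampleRate = 44100.0
let bpm = 120.0
let periodLengthInSamples = 60.0 / bpm * sampleRate   // 22050 samples = 0.5 s per beat at 120 bpm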

My UI setup is simple: a button to toggle start/pause and a label that displays the current beat (my "counter" variable).

When I use scheduleBuffer's .loops option the timing is fine, but since I need two different sounds and a way to sync/update my UI while the clicks are looping, I can't use it. Instead I came up with the idea of using the completionHandler to restart my playClickLoop() function – see my code attached below.
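For comparison, the loop-based variant with rock-solid timing (but only one sound and no per-beat UI update) is essentially just:

player.scheduleBuffer(buffer1, at: nil, options: [.loops], completionHandler: nil)
player.play()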

Unfortunately I didn't really measure the timing accuracy while implementing this. As it turns out now, with bpm set to 120 the app plays the loop at only about 117.5 bpm – quite stable, but still way too slow. With bpm set to 180 it plays at about 172.3 bpm.
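A back-of-the-envelope check of how much extra time that slowdown implies per beat (assuming my bpm measurements are accurate):

let extraAt120 = 60.0 / 117.5 - 60.0 / 120.0   // ≈ 0.0106 s, i.e. ~10.6 ms added per beat
let extraAt180 = 60.0 / 172.3 - 60.0 / 180.0   // ≈ 0.0149 s, i.e. ~14.9 ms added per beat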

What is going on here? Is this latency introduced by using the completionHandler? Is there any way to improve the timing? Or is my whole approach wrong?

Thanks in advance! Alex

import UIKit
import AVFoundation

class ViewController: UIViewController {
    
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    
    private let fileName1 = "sound1.wav"
    private let fileName2 = "sound2.wav"
    private var file1: AVAudioFile! = nil
    private var file2: AVAudioFile! = nil
    private var buffer1: AVAudioPCMBuffer! = nil
    private var buffer2: AVAudioPCMBuffer! = nil
    
    private let sampleRate: Double = 44100
    
    private var bpm: Double = 180.0
    private var periodLengthInSamples: Double { 60.0 / bpm * sampleRate }
    private var counter: Int = 0
    
    private enum MetronomeState {case run; case stop}
    private var state: MetronomeState = .stop
    
    @IBOutlet weak var label: UILabel!
    
    override func viewDidLoad() {
        
        super.viewDidLoad()
        
        //
        // MARK: Loading buffer1
        //
        let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
        let url1 = URL(fileURLWithPath: path1)
        do {file1 = try AVAudioFile(forReading: url1)
            buffer1 = AVAudioPCMBuffer(
                pcmFormat: file1.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file1.read(into: buffer1!)
            buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer1 \(error)") }
        
        //
        // MARK: Loading buffer2
        //
        let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
        let url2 = URL(fileURLWithPath: path2)
        do {file2 = try AVAudioFile(forReading: url2)
            buffer2 = AVAudioPCMBuffer(
                pcmFormat: file2.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file2.read(into: buffer2!)
            buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer2 \(error)") }
        
        //
        // MARK: Configure + start engine
        //
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
        engine.prepare()
        do { try engine.start() } catch { print(error) }
    }
    
    //
    // MARK: Play / Pause toggle action
    //
    @IBAction func buttonPresed(_ sender: UIButton) {
        
        sender.isSelected = !sender.isSelected
        
        if player.isPlaying {
            state = .stop
        } else {
            state = .run
            
            try! engine.start()
            player.play()
            
            playClickLoop()
        }
    }
    
    private func playClickLoop() {
        
        //
        //  MARK: Completion handler
        //
        let scheduleBufferCompletionHandler = { [unowned self] /*(_: AVAudioPlayerNodeCompletionCallbackType)*/ in
            
            DispatchQueue.main.async {
                
                switch state {
                
                case .run:
                    self.playClickLoop()
            
                case .stop:
                    engine.stop()
                    player.stop()
                    counter = 0
                }
            }
        }
        
        //
        // MARK: Schedule buffer + play
        //
        if engine.isRunning {
            
            counter += 1; if counter > 4 {counter = 1} // Counting from 1 to 4 only
            
            if counter == 1 {
                //
                // MARK: Playing sound1 on beat 1
                //
                player.scheduleBuffer(buffer1,
                                      at: nil,
                                      options: [.interruptsAtLoop],
                                      //completionCallbackType: .dataPlayedBack,
                                      completionHandler: scheduleBufferCompletionHandler)
            } else {
                //
                // MARK: Playing sound2 on beats 2, 3 & 4
                //
                player.scheduleBuffer(buffer2,
                                      at: nil,
                                      options: [.interruptsAtLoop],
                                      //completionCallbackType: .dataRendered,
                                      completionHandler: scheduleBufferCompletionHandler)
            }
            //
            // MARK: Display current beat on UILabel + to console
            //
            DispatchQueue.main.async {
                self.label.text = String(self.counter)
                print(self.counter)
            }
        }
    }
}

How accurate is the tool or process you are using to take those measurements?

Since I am not a C programmer, I can't tell for sure whether your files hold the correct number of PCM frames. It looks as if data from the WAV header gets included when the file is loaded. That makes me wonder whether playback incurs some latency from repeatedly processing that header information at the start of each play or loop.

I had good luck building a metronome in Java by setting up a plan of continuous output of an endless stream derived from reading PCM frames. Timing is achieved by counting PCM frames and routing either silence (PCM data points = 0) or the click's PCM data into the output, based on the period of the chosen metronome setting and the length of the click in PCM frames.
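A rough sketch of that frame-counting idea in Swift (hypothetical names, just to illustrate the principle): for every output frame, its position inside the current period decides whether a click sample or silence is emitted:

// Hypothetical sketch: pick click data or silence purely by counting frames.
func outputSample(frameIndex: Int, click: [Float], periodInFrames: Int) -> Float {
    let positionInPeriod = frameIndex % periodInFrames
    // During the first click.count frames of each period play the click, otherwise silence (0).
    return positionInPeriod < click.count ? click[positionInPeriod] : 0.0
}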

As suggested above by Phil Freihofner, here is the solution to my own question:

The most important lesson I learned: the completionHandler callback provided by the scheduleBuffer command is not called early enough to trigger the re-scheduling of another buffer while the current one is still playing. This results in (inaudible) gaps between the sounds and messes up the timing. There already has to be another buffer "in reserve", i.e. one that was scheduled before the currently playing buffer finishes.

Using scheduleBuffer's completionCallbackType parameter did not change much with regard to when the completion callback arrives: with it set to .dataRendered or .dataConsumed the callback was already too late to re-schedule another buffer. Using .dataPlayedBack only made things worse :-)
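For reference, the callback-type variant I experimented with looks roughly like this (sketch only):

player.scheduleBuffer(buffer2,
                      at: nil,
                      options: [],
                      completionCallbackType: .dataRendered) { callbackType in
    // fires when the data has been rendered, consumed or played back,
    // depending on the AVAudioPlayerNodeCompletionCallbackType passed in
}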

So, to achieve seamless playback (with correct timing!) I simply activated a timer that fires twice per period. All odd-numbered timer events re-schedule another buffer.

Sometimes the solution is so simple it's embarrassing... but sometimes you have to try almost every wrong approach first before you find it ;-)

My complete working solution (including the two sound files and the UI) can be found on GitHub:

https://github.com/Alexander-Nagel/Metronome-using-AVAudioEngine

import UIKit
import AVFoundation

private let DEBUGGING_OUTPUT = true

class ViewController: UIViewController{
    
    private var engine = AVAudioEngine()
    private var player = AVAudioPlayerNode()
    private var mixer = AVAudioMixerNode()
    
    private let fileName1 = "sound1.wav"
    private let fileName2 = "sound2.wav"
    private var file1: AVAudioFile! = nil
    private var file2: AVAudioFile! = nil
    private var buffer1: AVAudioPCMBuffer! = nil
    private var buffer2: AVAudioPCMBuffer! = nil
    
    private let sampleRate: Double = 44100
    
    private var bpm: Double = 133.33
    private var periodLengthInSamples: Double {
        60.0 / bpm * sampleRate
    }
    private var timerEventCounter: Int = 1
    private var currentBeat: Int = 1
    private var timer: Timer! = nil
    
    private enum MetronomeState {case running; case stopped}
    private var state: MetronomeState = .stopped
        
    @IBOutlet weak var beatLabel: UILabel!
    @IBOutlet weak var bpmLabel: UILabel!
    @IBOutlet weak var playPauseButton: UIButton!
    
    override func viewDidLoad() {
        
        super.viewDidLoad()
        
        bpmLabel.text = "\(bpm) BPM"
        
        setupAudio()
    }
    
    private func setupAudio() {
        
        //
        // MARK: Loading buffer1
        //
        let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
        let url1 = URL(fileURLWithPath: path1)
        do {file1 = try AVAudioFile(forReading: url1)
            buffer1 = AVAudioPCMBuffer(
                pcmFormat: file1.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file1.read(into: buffer1!)
            buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer1 \(error)") }
        
        //
        // MARK: Loading buffer2
        //
        let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
        let url2 = URL(fileURLWithPath: path2)
        do {file2 = try AVAudioFile(forReading: url2)
            buffer2 = AVAudioPCMBuffer(
                pcmFormat: file2.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file2.read(into: buffer2!)
            buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer2 \(error)") }
        
        //
        // MARK: Configure + start engine
        //
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
        engine.prepare()
        do { try engine.start() } catch { print(error) }
    }
    
    //
    // MARK: Play / Pause toggle action
    //
    @IBAction func buttonPresed(_ sender: UIButton) {
        
        sender.isSelected = !sender.isSelected
        
        if state == .running {
            
            //
            // PAUSE: Stop timer and reset counters
            //
            state = .stopped
            
            timer.invalidate()
            
            timerEventCounter = 1
            currentBeat = 1
            
        } else {
            
            //
            // START: Pre-load first sound and start timer
            //
            state = .running
            
            scheduleFirstBuffer()
            
            startTimer()
        }
    }
    
    private func startTimer() {
        
        if DEBUGGING_OUTPUT {
            print("# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #  ")
            print()
        }
        
        //
        // Compute interval for 2 events per period and set up timer
        // (dividing periodLengthInSamples by sampleRate converts samples
        // to seconds, which is the unit Timer.scheduledTimer expects)
        //
        let timerIntervallInSamples = 0.5 * self.periodLengthInSamples / sampleRate
        
        timer = Timer.scheduledTimer(withTimeInterval: timerIntervallInSamples, repeats: true) { timer in
            
            //
            // Only for debugging: Print counter values at start of timer event
            //
            // Values at begin of timer event
            if DEBUGGING_OUTPUT {
                print("timerEvent #\(self.timerEventCounter) at \(self.bpm) BPM")
                print("Entering \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) ")
            }
            
            //
            // Schedule next buffer at 1st, 3rd, 5th & 7th timerEvent
            //
            var bufferScheduled: String = "" // only needed for debugging / console output
            switch self.timerEventCounter {
            case 7:
                
                //
                // Schedule main sound
                //
                self.player.scheduleBuffer(self.buffer1, at:nil, options: [], completionHandler: nil)
                bufferScheduled = "buffer1"
                
            case 1, 3, 5:
                
                //
                // Schedule subdivision sound
                //
                self.player.scheduleBuffer(self.buffer2, at:nil, options: [], completionHandler: nil)
                bufferScheduled = "buffer2"
                
            default:
                bufferScheduled = ""
            }
            
            //
            // Display current beat & increase currentBeat (1...4) at 2nd, 4th, 6th & 8th timerEvent
            //
            if self.timerEventCounter % 2 == 0 {
                DispatchQueue.main.async {
                    self.beatLabel.text = String(self.currentBeat)
                }
                self.currentBeat += 1; if self.currentBeat > 4 {self.currentBeat = 1}
            }
            
            //
            // Increase timerEventCounter, two events per beat.
            //
            self.timerEventCounter += 1; if self.timerEventCounter > 8 {self.timerEventCounter = 1}
            
            
            //
            // Only for debugging: Print counter values at end of timer event
            //
            if DEBUGGING_OUTPUT {
                print("Exiting \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) \tscheduling: \(bufferScheduled)")
                print()
            }
        }
    }
    
    private func scheduleFirstBuffer() {
        
        player.stop()
        
        //
        // pre-load accented main sound (for beat "1") before trigger starts
        //
        player.scheduleBuffer(buffer1, at: nil, options: [], completionHandler: nil)
        player.play()
        beatLabel.text = String(currentBeat)
    }
}

Thanks so much for your help, everyone! This is a wonderful community.

Alex