AVAudioPlayerNode completion callbacks not called for files scheduled consecutively

When scheduling files or buffers with AVAudioPlayerNode, you can supply a completion callback. You can also specify which event the callback should be invoked for; the options are:

dataConsumed
dataRendered
dataPlayedBack

In most cases this works as expected. However, when files are scheduled one after another, the callbacks for all files except the last are not invoked if the callback type is dataRendered or dataPlayedBack.

Here is a self-contained example that demonstrates the problem. (This code can be pasted over the contents of the ViewController.swift file in Xcode's iOS 'App' template. For the project to work, it also needs to include the audio file referenced in the code.)

import AVFoundation
import UIKit

class ViewController: UIViewController {
    private let engine = AVAudioEngine()
    private let filePlayer = AVAudioPlayerNode()
    private let bufferPlayer = AVAudioPlayerNode()
    private var file: AVAudioFile! = nil
    private var buffer: AVAudioPCMBuffer! = nil

    override func viewDidLoad() {
        super.viewDidLoad()

        let url = Bundle.main.resourceURL!.appendingPathComponent("audio.m4a")
        
        file = try! AVAudioFile(forReading: url)

        buffer = AVAudioPCMBuffer(
            pcmFormat: file.processingFormat,
            frameCapacity: AVAudioFrameCount(file.length)
        )
        
        try! file.read(into: buffer)
        
        let format = file.processingFormat

        engine.attach(filePlayer)
        engine.attach(bufferPlayer)
        engine.connect(filePlayer, to: engine.mainMixerNode, format: format)
        engine.connect(bufferPlayer, to: engine.mainMixerNode, format: format)
        
        try! engine.start()
        
        filePlayer.play()
        bufferPlayer.play()
        
        for i in 0 ..< 3 {
            filePlayer.scheduleFile(
                file, at: nil, completionCallbackType: .dataPlayedBack
            ) { type in
                print("File \(i)")
            }
        }
        for i in 0 ..< 3 {
            bufferPlayer.scheduleBuffer(
                buffer, at: nil, completionCallbackType: .dataPlayedBack
            ) { type in
                print("Buff \(i)")
            }
        }
    }
}

Here is the output I get:

File 2
Buff 0
Buff 1
Buff 2

As you can see, the callback is invoked for all three instances of the buffer, but only for the last instance of the file.

Likewise, it is only with dataRendered and dataPlayedBack that the callbacks are not invoked. With dataConsumed, everything works correctly.

Has anyone else run into this? Can anyone confirm this behavior? It looks like a bug, but it is also possible that I'm doing something wrong.

Edit:

Here is another version of the code, in response to an idea raised in the comments. In this version, rather than scheduling the same file instance three times, three separate instances of the same file are scheduled consecutively:

import AVFoundation
import UIKit

class ViewController: UIViewController {
    private let engine = AVAudioEngine()
    private let filePlayer = AVAudioPlayerNode()
    private let bufferPlayer = AVAudioPlayerNode()
    private var files = [AVAudioFile]()
    private var buffer: AVAudioPCMBuffer! = nil

    override func viewDidLoad() {
        super.viewDidLoad()

        let url = Bundle.main.resourceURL!.appendingPathComponent("audio.m4a")
        
        for _ in 0 ..< 3 {
            files.append(try! AVAudioFile(forReading: url))
        }
        
        let file = files[0]

        buffer = AVAudioPCMBuffer(
            pcmFormat: file.processingFormat,
            frameCapacity: AVAudioFrameCount(file.length)
        )
        
        try! file.read(into: buffer)
        
        let format = file.processingFormat

        engine.attach(filePlayer)
        engine.attach(bufferPlayer)
        engine.connect(filePlayer, to: engine.mainMixerNode, format: format)
        engine.connect(bufferPlayer, to: engine.mainMixerNode, format: format)
        
        try! engine.start()
        
        filePlayer.play()
        bufferPlayer.play()
        
        for i in 0 ..< 3 {
            filePlayer.scheduleFile(
                files[i], at: nil, completionCallbackType: .dataPlayedBack
            ) { type in
                print("File \(i)")
            }
        }
        for i in 0 ..< 3 {
            bufferPlayer.scheduleBuffer(
                buffer, at: nil, completionCallbackType: .dataPlayedBack
            ) { type in
                print("Buff \(i)")
            }
        }
    }
}

The result is the same. Setting aside the practicality of reading the same file multiple times, this is an interesting diagnostic because it tells us more about the behavior: if this is a bug, it apparently doesn't matter whether the scheduled files are the same AVAudioFile instance.

Since AVAudioPlayerNode is essentially a wrapper around kAudioUnitSubType_ScheduledSoundPlayer (probably with some of the file reading and buffering code from kAudioUnitSubType_AudioFilePlayer thrown in, but using ExtAudioFile instead), I ran an experiment to see whether the lower-level counterpart exhibits the same behavior.

It isn't an exact apples-to-apples comparison, but kAudioUnitSubType_ScheduledSoundPlayer seems to work as expected, so this may well be a bug in AVAudioPlayerNode.

The code I used for testing is below. kAudioUnitSubType_ScheduledSoundPlayer is used to schedule three slices (buffers). They all come from the same file, but that is irrelevant here, since kAudioUnitSubType_ScheduledSoundPlayer only knows about buffers, not files.

All three slices had their callbacks invoked as expected. So it looks like the problem is probably in how AVAudioPlayerNode handles these callbacks internally and routes them to a non-real-time dispatch queue (the callbacks for kAudioUnitSubType_ScheduledSoundPlayer are handled on the HAL's real-time IO thread, and clients can't be trusted not to block the IO thread).

//  ViewController.m

#import "ViewController.h"

@import AudioToolbox;
@import AVFoundation;
@import os.log;

@interface ViewController ()
{
    AUGraph _graph;
    AUNode _player;
    AUNode _mixer;
    AUNode _output;
    ScheduledAudioSlice _slice [3];
    AVAudioPCMBuffer *_buf;
}
- (void)scheduledAudioSliceCompleted:(ScheduledAudioSlice *)slice;
@end

void myScheduledAudioSliceCompletionProc(void * __nullable userData, ScheduledAudioSlice *slice)
{
    // ⚠️ WARNING ⚠️
    // THIS FUNCTION IS CALLED FROM THE REAL TIME RENDERING THREAD.
    // OBJ-C USE HERE IS FOR TESTING CALLBACK FUNCTIONALITY ONLY
    // OBJ-C IS NOT REAL TIME SAFE
    // DO NOT DO THIS IN PRODUCTION CODE!!!
    [(__bridge ViewController *)userData scheduledAudioSliceCompleted:slice];
}

@implementation ViewController

- (void)dealloc {
    [self closeGraph];
}

- (void)viewDidLoad {
    [super viewDidLoad];
    [self openGraph];
    [self schedule];
    [self startPlayer];
    [self startGraph];
}

-(OSStatus)openGraph {
    OSStatus result = NewAUGraph(&_graph);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "NewAUGraph failed: %d", result);
        return result;
    }

    // The graph will look like:
    // Player -> MultiChannelMixer -> Output
    AudioComponentDescription desc;

    // Player
    desc.componentType          = kAudioUnitType_Generator;
    desc.componentSubType       = kAudioUnitSubType_ScheduledSoundPlayer;
    desc.componentManufacturer  = kAudioUnitManufacturer_Apple;
    desc.componentFlags         = kAudioComponentFlag_SandboxSafe;
    desc.componentFlagsMask     = 0;

    result = AUGraphAddNode(_graph, &desc, &_player);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphAddNode failed: %d", result);

        result = DisposeAUGraph(_graph);
        if(noErr != result)
            os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);

        _graph = NULL;
        return result;
    }

    // Mixer
    desc.componentType          = kAudioUnitType_Mixer;
    desc.componentSubType       = kAudioUnitSubType_MultiChannelMixer;
    desc.componentManufacturer  = kAudioUnitManufacturer_Apple;
    desc.componentFlags         = kAudioComponentFlag_SandboxSafe;
    desc.componentFlagsMask     = 0;

    result = AUGraphAddNode(_graph, &desc, &_mixer);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphAddNode failed: %d", result);

        result = DisposeAUGraph(_graph);
        if(noErr != result)
            os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);

        _graph = NULL;
        return result;
    }

    // Output
    desc.componentType          = kAudioUnitType_Output;
    desc.componentSubType       = kAudioUnitSubType_HALOutput;
    desc.componentFlags         = kAudioComponentFlag_SandboxSafe;
    desc.componentManufacturer  = kAudioUnitManufacturer_Apple;
    desc.componentFlagsMask     = 0;

    result = AUGraphAddNode(_graph, &desc, &_output);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphAddNode failed: %d", result);

        result = DisposeAUGraph(_graph);
        if(noErr != result)
            os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);

        _graph = NULL;
        return result;
    }

    // Connections
    result = AUGraphConnectNodeInput(_graph, _player, 0, _mixer, 0);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphConnectNodeInput failed: %d", result);

        result = DisposeAUGraph(_graph);
        if(noErr != result)
            os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);

        _graph = NULL;
        return result;
    }

    result = AUGraphConnectNodeInput(_graph, _mixer, 0, _output, 0);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphConnectNodeInput failed: %d", result);

        result = DisposeAUGraph(_graph);
        if(noErr != result)
            os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);

        _graph = NULL;
        return result;
    }

    // Open the graph
    result = AUGraphOpen(_graph);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphOpen failed: %d", result);

        result = DisposeAUGraph(_graph);
        if(noErr != result)
            os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);

        _graph = NULL;
        return result;
    }

    // Set the mixer's volume on the input and output
    AudioUnit au = NULL;
    result = AUGraphNodeInfo(_graph, _mixer, NULL, &au);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphNodeInfo failed: %d", result);

        result = DisposeAUGraph(_graph);
        if(noErr != result)
            os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);

        _graph = NULL;
        return result;
    }

    result = AudioUnitSetParameter(au, kMultiChannelMixerParam_Volume, kAudioUnitScope_Input, 0, 1.f, 0);
    if(noErr != result)
        os_log_error(OS_LOG_DEFAULT, "AudioUnitSetParameter (kMultiChannelMixerParam_Volume, kAudioUnitScope_Input) failed: %d", result);

    result = AudioUnitSetParameter(au, kMultiChannelMixerParam_Volume, kAudioUnitScope_Output, 0, 1.f, 0);
    if(noErr != result)
        os_log_error(OS_LOG_DEFAULT, "AudioUnitSetParameter (kMultiChannelMixerParam_Volume, kAudioUnitScope_Output) failed: %d", result);

    // Initialize the graph
    result = AUGraphInitialize(_graph);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphInitialize failed: %d", result);

        result = DisposeAUGraph(_graph);
        if(noErr != result)
            os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);

        _graph = NULL;
        return result;
    }

    return noErr;
}

- (OSStatus)closeGraph {
    Boolean graphIsRunning = NO;
    OSStatus result = AUGraphIsRunning(_graph, &graphIsRunning);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphIsRunning failed: %d", result);
        return result;
    }

    if(graphIsRunning) {
        result = AUGraphStop(_graph);
        if(noErr != result) {
            os_log_error(OS_LOG_DEFAULT, "AUGraphStop failed: %d", result);
            return result;
        }
    }

    Boolean graphIsInitialized = false;
    result = AUGraphIsInitialized(_graph, &graphIsInitialized);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphIsInitialized failed: %d", result);
        return result;
    }

    if(graphIsInitialized) {
        result = AUGraphUninitialize(_graph);
        if(noErr != result) {
            os_log_error(OS_LOG_DEFAULT, "AUGraphUninitialize failed: %d", result);
            return result;
        }
    }

    result = AUGraphClose(_graph);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphClose failed: %d", result);
        return result;
    }

    result = DisposeAUGraph(_graph);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "DisposeAUGraph failed: %d", result);
        return result;
    }

    _graph = NULL;
    _player = -1;
    _mixer = -1;
    _output = -1;

    return noErr;
}

- (OSStatus)startGraph {
    OSStatus result = AUGraphStart(_graph);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphStart failed: %d", result);
        return result;
    }

    return noErr;
}

- (OSStatus)stopGraph {
    OSStatus result = AUGraphStop(_graph);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphStop failed: %d", result);
        return result;
    }

    return noErr;
}

- (OSStatus)startPlayer {
    AudioUnit au;
    OSStatus result = AUGraphNodeInfo(_graph, _player, NULL, &au);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphNodeInfo failed: %d", result);
        return result;
    }

    AudioTimeStamp ts = {0};

    ts.mFlags           = kAudioTimeStampSampleTimeValid;
    ts.mSampleTime      = 0;

    result = AudioUnitSetProperty(au, kAudioUnitProperty_ScheduleStartTimeStamp, kAudioUnitScope_Global, 0, &ts, sizeof(ts));
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AudioUnitSetProperty failed: %d", result);
        return result;
    }

    return noErr;
}

- (OSStatus)schedule {
    AudioUnit au;
    OSStatus result = AUGraphNodeInfo(_graph, _player, NULL, &au);
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AUGraphNodeInfo failed: %d", result);
        return result;
    }

    AVAudioFile *file = [[AVAudioFile alloc] initForReading:[NSURL fileURLWithPath:@"/tmp/test.wav" isDirectory:NO] commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:nil];
    if(!file)
        return paramErr;

    _buf = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(file.processingFormat.sampleRate * 2)];

    if(![file readIntoBuffer:_buf error:nil])
        return paramErr;

    AudioTimeStamp ts = {0};

    ts.mFlags           = kAudioTimeStampSampleTimeValid;
    ts.mSampleTime      = 0;

    _slice[0].mTimeStamp                = ts;
    _slice[0].mCompletionProc           = myScheduledAudioSliceCompletionProc;
    _slice[0].mCompletionProcUserData   = (__bridge void *)self;
    _slice[0].mNumberFrames             = _buf.frameLength;
    _slice[0].mBufferList               = _buf.mutableAudioBufferList;

    result = AudioUnitSetProperty(au, kAudioUnitProperty_ScheduleAudioSlice, kAudioUnitScope_Global, 0, &_slice[0], sizeof(_slice[0]));
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AudioUnitSetProperty failed: %d", result);
        return result;
    }

    ts.mSampleTime      += _slice[0].mNumberFrames;

    _slice[1]                           = _slice[0];
    _slice[1].mTimeStamp                = ts;

    result = AudioUnitSetProperty(au, kAudioUnitProperty_ScheduleAudioSlice, kAudioUnitScope_Global, 0, &_slice[1], sizeof(_slice[1]));
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AudioUnitSetProperty failed: %d", result);
        return result;
    }

    ts.mSampleTime      += _slice[1].mNumberFrames;

    _slice[2]                           = _slice[1];
    _slice[2].mTimeStamp                = ts;

    result = AudioUnitSetProperty(au, kAudioUnitProperty_ScheduleAudioSlice, kAudioUnitScope_Global, 0, &_slice[2], sizeof(_slice[2]));
    if(noErr != result) {
        os_log_error(OS_LOG_DEFAULT, "AudioUnitSetProperty failed: %d", result);
        return result;
    }

    return noErr;
}

- (void)scheduledAudioSliceCompleted:(ScheduledAudioSlice *)slice {
    if(slice == &_slice[0])
        NSLog(@"_slice[0] scheduledAudioSliceCompleted:%p, mFlags = 0x%.2x", slice, slice->mFlags);
    else if(slice == &_slice[1])
        NSLog(@"_slice[1] scheduledAudioSliceCompleted:%p, mFlags = 0x%.2x", slice, slice->mFlags);
    else if(slice == &_slice[2])
        NSLog(@"_slice[2] scheduledAudioSliceCompleted:%p, mFlags = 0x%.2x", slice, slice->mFlags);
    else
        NSLog(@"scheduledAudioSliceCompleted:%p, mFlags = 0x%.2x for unknown slice", slice, slice->mFlags);
}

@end

Output:

XXX _slice[0] scheduledAudioSliceCompleted:0x7f82ee41add0, mFlags = 0x03
XXX _slice[1] scheduledAudioSliceCompleted:0x7f82ee41ae40, mFlags = 0x03
XXX _slice[2] scheduledAudioSliceCompleted:0x7f82ee41aeb0, mFlags = 0x03

mFlags equates to kScheduledAudioSliceFlag_Complete | kScheduledAudioSliceFlag_BeganToRender.