AVAudioPCMBuffer for music files

I've been trying to get music playing in my SpriteKit game, using the AVAudioPlayerNode class to do so via AVAudioPCMBuffers. Every time I exported my OS X project it would crash and give me an error about audio playback. After banging my head against the wall for the past 24 hours, I decided to re-watch WWDC session 501 (see 54:17). My solution to this problem was the approach the presenter uses: break the buffer's frames into smaller pieces so that the audio file is read in chunks.

NSError *error = nil;
NSURL *someFileURL = ...
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: someFileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;

AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
while (audioFile.framePosition < fileLength) {
    AVAudioFramePosition readPosition = audioFile.framePosition;
    if (![audioFile readIntoBuffer: readBuffer error: &error])
        return NO;
    if (readBuffer.frameLength == 0) //end of file reached
        break;
}

My problem at the moment is that the player only plays the last frame that was read into the buffer. The music I'm playing is only 2 minutes long, and apparently that's too long to simply read into a single buffer. Is the buffer being overwritten each time readIntoBuffer: is called inside the loop? I'm a total noob at this stuff... how can I get the whole file to play?
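I suspect the answer is yes: each call to readIntoBuffer: refills the same readBuffer, so only the last chunk is left by the time anything plays. Presumably each chunk needs its own buffer, scheduled on the player node as soon as it is read. A minimal sketch of that idea (reusing the variables above and assuming an AVAudioPlayerNode called player that has already been attached to an engine and connected to its mixer):

while (audioFile.framePosition < fileLength) {
    //one new buffer per chunk, so previously scheduled chunks are not overwritten
    AVAudioPCMBuffer *chunk = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
    if (![audioFile readIntoBuffer: chunk error: &error])
        break;
    if (chunk.frameLength == 0) //end of file reached
        break;
    [player scheduleBuffer: chunk completionHandler: nil]; //player is a hypothetical, already connected AVAudioPlayerNode
}
[player play];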

If I can't get this to work, what would be a good way to play music (two different files) across multiple SKScenes?

This is the solution I came up with. It still isn't perfect, but hopefully it will help someone who ends up in the same situation I was in. I created a singleton class to handle this work. One improvement that could be made in the future is to load only the sound effects and music files a particular SKScene needs, and only when they are needed. I had so many issues with this code that I don't want to mess with it now. For the moment I don't have many sounds, so it isn't using an excessive amount of memory.

Overview
My strategy was the following:

  1. Store the game's audio file names in a plist
  2. Read from that plist and create two dictionaries (one for music and one for short sound effects) (a sample plist layout is sketched after this list)
  3. The sound-effects dictionary holds an AVAudioPCMBuffer and an AVAudioPlayerNode for each sound
  4. The music dictionary holds, for each track, an array of AVAudioPCMBuffers, an array of timestamps at which those buffers should be played in the queue, an AVAudioPlayerNode, and the sample rate of the original audio file

    • The sample rate is needed to work out the time at which each buffer should start playing (you will see the calculation done in the code)
  5. Create an AVAudioEngine, get the main mixer from the engine, and attach all of the AVAudioPlayerNodes to that mixer (as per usual)

  6. Play a sound effect or a music track using the various methods
    • Sound-effect playback is simple... call the method -(void) playSfxFile:(NSString*)file; and it plays the sound
    • For music, I couldn't find a good solution without help from the scene that is trying to play the music. The scene calls -(void) playMusicFile:(NSString*)file; and it schedules the buffers to play in the order in which they were created. I couldn't find a good way to have the music repeat once it finished from within my AudioEngine class, so I decided to have the scene check in its update: method whether the music for a particular file is playing and, if not, play it again (not a very slick solution, but it works; a rough sketch of that check also follows this list)
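For reference, AudioInfo.plist is just a dictionary with a "music" array and an "sfx" array of file names (no extensions; the code below assumes .aif for music and .mp3 for sound effects). It looks roughly like this; the music names come from the code below, while the sound-effect names are only placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>music</key>
    <array>
        <string>menuscenemusic</string>
        <string>levelscenemusic</string>
    </array>
    <key>sfx</key>
    <array>
        <!-- placeholder sound effect names -->
        <string>buttonclick</string>
        <string>explosion</string>
    </array>
</dict>
</plist>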
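And the repeat check in the scene's update: method looks something like this (just a sketch; the scene imports AudioEngine.h, and the file name is one of the music entries used below):

-(void)update:(NSTimeInterval)currentTime {
    //restart the track if it has finished (or was never started)
    if (![[AudioEngine sharedData] isPlayingMusic:@"menuscenemusic"]) {
        [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];
    }
}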

AudioEngine.h

#import <Foundation/Foundation.h>

@interface AudioEngine : NSObject

+(instancetype)sharedData;
-(void) playSfxFile:(NSString*)file;
-(void) playMusicFile:(NSString*)file;
-(void) pauseMusic:(NSString*)file;
-(void) unpauseMusic:(NSString*)file;
-(void) stopMusicFile:(NSString*)file;
-(void) setVolumePercentages;
-(bool) isPlayingMusic:(NSString*)file;

@end

AudioEngine.m

#import "AudioEngine.h"
#import <AVFoundation/AVFoundation.h>
#import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount)

@interface AudioEngine()

@property AVAudioEngine *engine;
@property AVAudioMixerNode *mixer;

@property NSMutableDictionary *musicDict;
@property NSMutableDictionary *sfxDict;

@property NSString *audioInfoPList;

@property float musicVolumePercent;
@property float sfxVolumePercent;
@property float fadeVolume;
@property float timerCount;

@end

@implementation AudioEngine

int const FADE_ITERATIONS = 10;
static NSString * const MUSIC_PLAYER = @"player";
static NSString * const MUSIC_BUFFERS = @"buffers";
static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions";
static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate";

static NSString * const SFX_BUFFER = @"buffer";
static NSString * const SFX_PLAYER = @"player";

+(instancetype) sharedData {
    static AudioEngine *sharedInstance = nil;

    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[self alloc] init];
        [sharedInstance startEngine];
    });

    return sharedInstance;
}

-(instancetype) init {
    if (self = [super init]) {
        _engine = [[AVAudioEngine alloc] init];
        _mixer = [_engine mainMixerNode];

        _audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist

        [self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played
        [self initMusic];
        [self initSfx];
    }
    return self;
}

//opens all music files, creates multiple buffers depending on the length of the file and a player
-(void) initMusic {
    _musicDict = [NSMutableDictionary dictionary];

    _audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"];
    NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];

    for (NSString *musicFileName in audioInfoData[@"music"]) {
        [self loadMusicIntoBuffer:musicFileName];
        AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
        [_engine attachNode:player];

        AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0];
        [_engine connect:player to:_mixer format:buffer.format];
        [_musicDict[musicFileName] setObject:player forKey:MUSIC_PLAYER];
    }
}

//opens a music file and creates an array of buffers
-(void) loadMusicIntoBuffer:(NSString *)filename
{
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"];
    //NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]];
    NSAssert(audioFileURL, @"Error creating URL to audio file");
    NSError *error = nil;
    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
    NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);

    AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file
    float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file
    [_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename];
    [_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE];

    NSMutableArray *buffers = [NSMutableArray array];
    NSMutableArray *framePositions = [NSMutableArray array];

    const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //the size of my buffer...can be made bigger or smaller 512 * 1024L would be half the size
    while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer
        [framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]];
        AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
        if (![audioFile readIntoBuffer:readBuffer error:&error]) {
            NSLog(@"failed to read audio file: %@", error);
            return;
        }
        if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop
            break;
        }
        [buffers addObject:readBuffer];
    }

    [_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS];
    [_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS];
}

-(void) initSfx {
    _sfxDict = [NSMutableDictionary dictionary];

    NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];

    for (NSString *sfxFileName in audioInfoData[@"sfx"]) {
        AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
        [_engine attachNode:player];

        [self loadSoundIntoBuffer:sfxFileName];
        AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER];
        [_engine connect:player to:_mixer format:buffer.format];
        [_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER];
    }
}

//WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space
-(void) loadSoundIntoBuffer:(NSString *)filename
{
    NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"mp3"]];
    NSAssert(audioFileURL, @"Error creating URL to audio file");
    NSError *error = nil;
    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
    NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);

    AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
    [audioFile readIntoBuffer:readBuffer error:&error];

    [_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename];
    [_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER];
}

-(void)startEngine {
    [_engine startAndReturnError:nil];
}

-(void) playSfxFile:(NSString*)file {
    AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:SFX_PLAYER];
    AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER];
    [player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil];
    [player setVolume:_sfxVolumePercent];
    [player play];
}

-(void) playMusicFile:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];

    if ([player isPlaying] == NO) {
        NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS];

        double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue];


        for (int i = 0; i < [buffers count]; i++) {
            long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue];
            AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];

            AVAudioPCMBuffer *buffer  = [buffers objectAtIndex:i];
            [player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
                if (i == [buffers count] - 1) {
                    [player stop];
                }
            }];
            [player setVolume:_musicVolumePercent];
            [player play];
        }
    }
}

-(void) stopOtherMusicPlayersNotNamed:(NSString*)file {
    if ([file isEqualToString:@"menuscenemusic"]) {
        AVAudioPlayerNode *player = [_musicDict[@"levelscenemusic"] objectForKey:MUSIC_PLAYER];
        [player stop];
    }
    else {
        AVAudioPlayerNode *player = [_musicDict[@"menuscenemusic"] objectForKey:MUSIC_PLAYER];
        [player stop];
    }
}

//stops the player for a particular sound
-(void) stopMusicFile:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];

    if ([player isPlaying]) {
        _timerCount = FADE_ITERATIONS;
        _fadeVolume = _musicVolumePercent;
        [self fadeOutMusicForPlayer:player]; //fade out the music
    }
}

//helper method for stopMusicFile:
-(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player {
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES];
}

//helper method for stopMusicFile:
-(void) handleTimer:(NSTimer*)timer {
    AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo;
    if (_timerCount > 0) {
        _timerCount--;
        _fadeVolume = _musicVolumePercent * (_timerCount / FADE_ITERATIONS);
        [player setVolume:_fadeVolume];
    }
    else {
        [player stop];
        [player setVolume:_musicVolumePercent];
        [timer invalidate];
    }
}

-(void) pauseMusic:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    if ([player isPlaying]) {
        [player pause];
    }
}

-(void) unpauseMusic:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    [player play];
}

//sets the volume of the player based on user preferences in GameData class
-(void) setVolumePercentages {
    NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"];
    _musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet:
                             [[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
                            componentsJoinedByString:@""] floatValue] / 100;
    NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"];
    _sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet:
                           [[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
                          componentsJoinedByString:@""] floatValue] / 100;

    //immediately sets music to new volume
    for (NSString *file in [_musicDict allKeys]) {
        AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
        [player setVolume:_musicVolumePercent];
    }
}

-(bool) isPlayingMusic:(NSString *)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    if ([player isPlaying])
        return YES;
    return NO;
}

@end