Setting an AVAudioFormat in the connect function causes a crash
I have been trying to play 16-bit streaming data with AVAudioEngine.
But passing an AVAudioFormat to the connect function always causes a crash.
The code is as follows:
let AUDIO_OUTPUT_SAMPLE_RATE = 44100
let AUDIO_OUTPUT_CHANNELS = 2
let AUDIO_OUTPUT_BITS = 16
var audioEngine: AVAudioEngine?
var audioPlayer: AVAudioPlayerNode?
...
audioEngine = AVAudioEngine()
audioPlayer = AVAudioPlayerNode()
audioEngine?.attach(audioPlayer!)
let mixer = audioEngine?.mainMixerNode
mixer!.outputVolume = 1.0
let stereoFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: Double(AUDIO_OUTPUT_SAMPLE_RATE), channels: 2, interleaved: false)
audioEngine!.connect(audioPlayer!, to: mixer!, format: stereoFormat)
...
The audioEngine!.connect(...) line is where it crashes.
I'm using Xcode 8 beta 6 on OS X El Capitan, and the crash happens on both the simulator and a device.
Here is part of the crash message:
ERROR: >avae> AVAudioNode.mm:751: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'
...
3 AVFAudio 0x000000011e0a5630 _Z19AVAE_RaiseExceptionP8NSStringz + 176
4 AVFAudio 0x000000011e0f270d _ZN19AVAudioNodeImplBase11AUSetFormatEP28OpaqueAudioComponentInstancejjP13AVAudioFormat + 213
5 AVFAudio 0x000000011e0f2630 _ZN19AVAudioNodeImplBase15SetOutputFormatEmP13AVAudioFormat + 46
6 AVFAudio 0x000000011e0f9663 _ZN21AVAudioPlayerNodeImpl15SetOutputFormatEmP13AVAudioFormat + 25
7 AVFAudio 0x000000011e099cfd _ZN18AVAudioEngineGraph8_ConnectEP19AVAudioNodeImplBaseS1_jjP13AVAudioFormat + 2377
8 AVFAudio 0x000000011e09d15f _ZN18AVAudioEngineGraph7ConnectEP11AVAudioNodeS1_mmP13AVAudioFormat + 355
9 AVFAudio 0x000000011e0fc80e _ZN17AVAudioEngineImpl7ConnectEP11AVAudioNodeS1_mmP13AVAudioFormat + 348
Playing with a buffer and format taken from an audio file works without any problem.
What am I doing wrong?
Thanks.
-10868 is kAudioUnitErr_FormatNotSupported, so it seems your .pcmFormatInt16 isn't accepted there. Changing it to .pcmFormatFloat32 does work.
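For example, a minimal sketch of the same connection with the float format (reusing the variable names from the question):
// Sketch: same graph as above, but with .pcmFormatFloat32, which the mixer's input accepts.
let floatFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                sampleRate: Double(AUDIO_OUTPUT_SAMPLE_RATE),
                                channels: 2,
                                interleaved: false)
audioEngine!.connect(audioPlayer!, to: mixer!, format: floatFormat)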
From Apple's documentation for AVAudioPlayerNode:
When playing buffers, there's an implicit assumption that the buffers are at the same sample rate as the node's output format.
So print the output format of the AVAudioPlayerNode, which is engine.mainMixerNode.inputFormat:
open func connectNodes() {
    print(engine.mainMixerNode.inputFormat(forBus: 0))
    engine.connect(playerNode, to: engine.mainMixerNode, format: readFormat)
}
The result is:
<AVAudioFormat 0x6000024c18b0: 2 ch, 44100 Hz, Float32, non-inter>
So choose .pcmFormatFloat32 instead of .pcmFormatInt16.
You don't need to create the audio format yourself; take the audio format from the actual audio data (a PCM buffer or an audio file).
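For example, a minimal sketch of taking the format from an AVAudioFile instead of building one by hand (fileURL here is a hypothetical local file URL):
import AVFoundation

// Sketch: let the audio data dictate the format; fileURL is a hypothetical local file URL.
let file = try AVAudioFile(forReading: fileURL)
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
// Connect with the file's processing format rather than a hand-built AVAudioFormat.
engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
player.scheduleFile(file, at: nil)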
The audio format topic is covered in WWDC 2016, Delivering an Exceptional Audio Experience. The actual audio format theory is in WWDC 2015, What's New in Core Audio:
- the actual audio format of the output
- the actual audio format of the input
- do audio channel mapping, with the audio bit depth staying unchanged
You can use a signed 16-bit audio format, but you need to convert it first:
// Setup your own format
let inputFormat = AVAudioFormat(
    commonFormat: .pcmFormatInt16,
    sampleRate: 44100,
    channels: AVAudioChannelCount(2),
    interleaved: true
)!
let engine = AVAudioEngine()
// Use system format as output format
let outputFormat = engine.mainMixerNode.outputFormat(forBus: 0)
self.converter = AVAudioConverter(from: inputFormat, to: outputFormat)!
self.playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)
...
// Prepare input and output buffer
let inputBuffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: maxSamplesPerBuffer)!
let outputBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: maxSamplesPerBuffer)!
// When you fill your Int16 buffer with data, send it to converter
self.converter.convert(to: outputBuffer, error: nil) { inNumPackets, outStatus in
    outStatus.pointee = .haveData
    return inputBuffer
}
// Now in outputBuffer sound in system format and we can play it
self.playerNode.scheduleBuffer(outputBuffer)
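As a rough sketch (not part of the original answer) of how the Int16 input buffer could be filled from a streamed chunk before conversion, assuming samples is a hypothetical [Int16] array of interleaved stereo data:
// Sketch: copy a hypothetical interleaved [Int16] chunk into inputBuffer before convert(to:error:withInputFrom:).
let frameCount = AVAudioFrameCount(samples.count / Int(inputFormat.channelCount))
inputBuffer.frameLength = frameCount   // must not exceed frameCapacity
samples.withUnsafeBufferPointer { src in
    // For this interleaved format, channel 0 points at the single interleaved buffer.
    inputBuffer.int16ChannelData![0].assign(from: src.baseAddress!, count: samples.count)
}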