Trying to stream audio from microphone to another phone via multipeer connectivity

I'm trying to stream audio from the microphone to another iPhone via the Apple Multipeer Connectivity framework. For audio capture and playback I'm using AVAudioEngine (many thanks to Rhythmic Fistman's answer).

I receive data from the microphone by installing a tap on the input node; from it I get an AVAudioPCMBuffer, which I then convert to a UInt8 array and stream to the other phone.

But when I convert the array back to an AVAudioPCMBuffer, I get an EXC_BAD_ACCESS crash, pointing at the method where I convert the byte array back into an AVAudioPCMBuffer.

Here is the code where I capture, convert, and stream the input:

input.installTap(onBus: 0, bufferSize: 2048, format: input.inputFormat(forBus: 0), block: {
    (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in

    let audioBuffer = self.typetobinary(buffer)
    stream.write(audioBuffer, maxLength: audioBuffer.count)
})

My two functions for converting the data (taken from Martin.R's answer):

func binarytotype <T> (_ value: [UInt8], _: T.Type) -> T {
    return value.withUnsafeBufferPointer {
        UnsafeRawPointer($0.baseAddress!).load(as: T.self)
    }
}

func typetobinary<T>(_ value: T) -> [UInt8] {
    var data = [UInt8](repeating: 0, count: MemoryLayout<T>.size)
    data.withUnsafeMutableBufferPointer {
        UnsafeMutableRawPointer($0.baseAddress!).storeBytes(of: value, as: T.self)
    }
    return data
}
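
For what it's worth, the helpers do round-trip a plain, trivially copyable value correctly, for example:

let original = 42
let bytes = typetobinary(original)            // MemoryLayout<Int>.size == 8 bytes on 64-bit
let restored = binarytotype(bytes, Int.self)  // restored == 42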

On the receiving side:

func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {
    if streamName == "voice" {

        stream.schedule(in: RunLoop.current, forMode: .defaultRunLoopMode)
        stream.open()

        var bytes = [UInt8](repeating: 0, count: 8)
        stream.read(&bytes, maxLength: bytes.count)

        let audioBuffer = self.binarytotype(bytes, AVAudioPCMBuffer.self) // Here is where the app crashes

        do {
            try engine.start()

            audioPlayer.scheduleBuffer(audioBuffer, completionHandler: nil)
            audioPlayer.play()
        } catch let error {
            print(error.localizedDescription)
        }
    }
}

The thing is, I can convert the byte array back and forth and play sound from it before streaming (on the same phone), but I can't create the AVAudioPCMBuffer on the receiving end. Does anyone know why the conversion doesn't work on the receiving side? Is this even the right approach?

Any help, thoughts, or input would be appreciated.

Your AVAudioPCMBuffer serialisation/deserialisation is wrong.
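
That's the root of the crash: AVAudioPCMBuffer is a class, so typetobinary serialises only its object reference, never the sample data. A quick way to see this (an illustration, assuming a 64-bit device):

import AVFoundation

// For a class type, MemoryLayout reports the size of the reference, not the object it points to.
print(MemoryLayout<AVAudioPCMBuffer>.size)  // 8: just an object pointer

Those 8 bytes stay valid within one process, which is why the round trip appeared to work locally, but on the receiving phone they point at nothing, hence the EXC_BAD_ACCESS.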

Conversion has changed a lot in Swift 3, and it seems to require more copying than Swift 2 did.

Here's how to convert between [UInt8] and AVAudioPCMBuffer:

N.B.: This code assumes mono float data at 44.1 kHz. You may want to change that.

func copyAudioBufferBytes(_ audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // initialize bytes to 0 (how to avoid?)
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)

    // copy data from buffer
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }

    return audioByteArray
}

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    // format assumption! make this part of your protocol?
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]
    // for stereo
    // let dstRight = audioBuffer.floatChannelData![1]

    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }

    return audioBuffer
}
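
Hooked up to the code from the question, the two sides might look something like the sketch below. This is only a sketch: it keeps the mono/44.1 kHz assumption, and a real protocol should frame the chunks so a read never splits a Float32 frame.

// Sending: serialize the sample data, not the object reference.
input.installTap(onBus: 0, bufferSize: 2048, format: input.inputFormat(forBus: 0)) { buffer, time in
    let bytes = self.copyAudioBufferBytes(buffer)
    stream.write(bytes, maxLength: bytes.count)
}

// Receiving: read whatever arrived and rebuild a playable buffer from it.
var bytes = [UInt8](repeating: 0, count: 8192)
let length = stream.read(&bytes, maxLength: bytes.count)
if length > 0 {
    let audioBuffer = self.bytesToAudioBuffer(Array(bytes[0..<length]))
    audioPlayer.scheduleBuffer(audioBuffer, completionHandler: nil)
}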