Mac - Swift 3 - queuing audio files and playing them

I want to write an app in Swift 3 that plays queued audio files without any gap, crackle or noise between them.

My first attempt used AVAudioPlayer and AVAudioPlayerDelegate, but I don't know how to preload the next track to avoid the gap. Even if I knew how, I'm not sure it would be the best way to reach my goal. AVQueuePlayer looks like a better candidate for the job, since it is made for exactly this purpose, but I can't find any example to get me started. Maybe it is just a matter of preloading or buffering? I'm a bit lost in this sea of possibilities.
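
For clarity, this is roughly the AVQueuePlayer direction I mean. It is only an untested sketch, "audiofile1"/"audiofile2" stand in for my own bundled samples, and I don't know whether AVQueuePlayer preloads the next item early enough to be truly gapless:

import AVFoundation

// Untested sketch: wrap each bundled file in an AVPlayerItem and hand the
// items to an AVQueuePlayer, which plays them back to back.
let names = ["audiofile1", "audiofile2"]
let items = names.flatMap { name -> AVPlayerItem? in
    guard let url = Bundle.main.url(forResource: name, withExtension: "aif") else { return nil }
    return AVPlayerItem(url: url)
}

let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play()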

Any suggestions are welcome.

It's far from perfect, especially if you want to run it two or more times (a "file exists" error), but it can serve as a basis.

What it does is take two files (mine are approx. 4-second AIFF samples), encode them into a single file and play the resulting file. Imagine it with hundreds of them, combined however you like (or not): it gets interesting.

All credit for the mergeAudioFiles function goes to @Peyman and @Pigeon_39.

Swift 3

import Cocoa
import AVFoundation

// The player is kept at top level so it is not deallocated while playing.
var action = AVAudioPlayer()
// The two source files to concatenate; both must be included in the app bundle.
let path = Bundle.main.path(forResource: "audiofile1.aif", ofType: nil)!
let url = URL(fileURLWithPath: path)
let path2 = Bundle.main.path(forResource: "audiofile2.aif", ofType: nil)!
let url2 = URL(fileURLWithPath: path2)
let array1 = NSMutableArray(array: [url, url2])


class ViewController: NSViewController, AVAudioPlayerDelegate
{

    @IBOutlet weak var LanceStop: NSButton!

    override func viewDidLoad()
    {
        super.viewDidLoad()
    }
    override var representedObject: Any?
    {
        didSet
        {
        // Update the view, if already loaded.
        }
    }

    @IBAction func Lancer(_ sender: NSButton)
    {
        mergeAudioFiles(audioFileUrls: array1)
        // Path of the exported file; it must match the Documents directory used in mergeAudioFiles.
        let url3 = URL(fileURLWithPath: "/Users/ADDUSERNAMEHERE/Documents/FinalAudio.m4a")

        do
        {
            action = try AVAudioPlayer(contentsOf: url3)
            action.delegate = self
            action.numberOfLoops = 0
            action.prepareToPlay()
            action.volume = 1
            action.play()
        }
        catch { print("playback error: \(error)") }

    }


    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool)
    {
        if flag == true
        {
            // Playback of the merged file finished normally; the next action could go here.
        }
    }

    var mergeAudioURL = NSURL()

    // Appends each audio file to an AVMutableComposition, then exports the result
    // as a single M4A file (FinalAudio.m4a) in the user's Documents folder.
    func mergeAudioFiles(audioFileUrls: NSArray) {
        let composition = AVMutableComposition()

        for i in 0 ..< audioFileUrls.count {

            // Add a new audio track to the composition for this source file.
            let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

            let asset = AVURLAsset(url: (audioFileUrls[i] as! NSURL) as URL)

            let track = asset.tracks(withMediaType: AVMediaTypeAudio)[0]

            // Take the whole file and append it at the current end of the composition.
            let timeRange = CMTimeRange(start: kCMTimeZero, duration: track.timeRange.duration)

            try! compositionAudioTrack.insertTimeRange(timeRange, of: track, at: composition.duration)
        }

        // Export destination: FinalAudio.m4a in the user's Documents folder.
        let documentDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        self.mergeAudioURL = documentDirectoryURL.appendingPathComponent("FinalAudio.m4a") as NSURL

        let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
        assetExport?.outputFileType = AVFileTypeAppleM4A
        assetExport?.outputURL = mergeAudioURL as URL
        assetExport?.exportAsynchronously(completionHandler:
            {
                switch assetExport!.status
                {
                case .completed:
                    print("Audio Concatenation Complete")
                case .failed:
                    print("failed \(assetExport?.error)")
                case .cancelled:
                    print("cancelled \(assetExport?.error)")
                case .unknown:
                    print("unknown \(assetExport?.error)")
                case .waiting:
                    print("waiting \(assetExport?.error)")
                case .exporting:
                    print("exporting \(assetExport?.error)")
                }
        })
    }
}
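
One possible way around the "file exists" error mentioned above, and around the fact that exportAsynchronously returns before the file is written, would be to delete any previous output and to start playback from the export completion handler. Below is a rough, untested sketch; exportAndPlay and its parameters are illustrative and not part of the code above.

import AVFoundation

// Sketch: export a composition to outputURL, removing any previous file first
// (AVAssetExportSession fails if the output file already exists), and only
// create and start the player once the export has actually completed.
func exportAndPlay(_ composition: AVMutableComposition, to outputURL: URL,
                   completion: @escaping (AVAudioPlayer?) -> Void)
{
    try? FileManager.default.removeItem(at: outputURL)

    guard let export = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A) else
    {
        completion(nil)
        return
    }
    export.outputFileType = AVFileTypeAppleM4A
    export.outputURL = outputURL

    export.exportAsynchronously {
        guard export.status == .completed,
            let player = try? AVAudioPlayer(contentsOf: outputURL) else
        {
            completion(nil)
            return
        }
        DispatchQueue.main.async {
            player.prepareToPlay()
            player.play()
            completion(player) // keep a strong reference to the player, e.g. in the `action` property
        }
    }
}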