Concatenating AVAssets seamlessly
I have some simple AVFoundation code that concatenates a bunch of four-second-long mp4 files together, like so:
func compose(parts inParts: [Part], progress inProgress: (CMTime) -> ()) -> AVAsset?
{
    guard
        let composition = self.composition,
        let videoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid),
        let audioTrack = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
    else
    {
        debugLog("Unable to create tracks for composition")
        return nil
    }

    do
    {
        var time = CMTime.zero
        for p in inParts
        {
            let asset = AVURLAsset(url: p.path.url)
            if let track = asset.tracks(withMediaType: .video).first
            {
                try videoTrack.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration), of: track, at: time)
            }
            if let track = asset.tracks(withMediaType: .audio).first
            {
                try audioTrack.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration), of: track, at: time)
            }
            time = CMTimeAdd(time, asset.duration)
            inProgress(time)
        }
    }
    catch (let e)
    {
        debugLog("Error adding clips: \(e)")
        return nil
    }

    return composition
}
Unfortunately, every four seconds you can hear the audio cut out for a moment, which tells me this isn't an entirely seamless concatenation. Is there anything I can do to improve it?
Solution
Thanks to NoHalfBits's excellent answer below, I updated the loop above with the following, and it works great:
for p in inParts
{
    let asset = AVURLAsset(url: p.path.url)

    // It’s possible (and turns out, it’s often the case with UniFi NVR recordings)
    // for the audio and video tracks to be of slightly different start time
    // and duration. Find the intersection of the two tracks’ time ranges and
    // use that range when inserting both tracks into the composition…

    // Calculate the common time range between the video and audio tracks…
    let sourceVideo = asset.tracks(withMediaType: .video).first
    let sourceAudio = asset.tracks(withMediaType: .audio).first
    var commonTimeRange = CMTimeRange.zero
    if sourceVideo != nil && sourceAudio != nil
    {
        commonTimeRange = CMTimeRangeGetIntersection(sourceVideo!.timeRange, otherRange: sourceAudio!.timeRange)
    }
    else if sourceVideo != nil
    {
        commonTimeRange = sourceVideo!.timeRange
    }
    else if sourceAudio != nil
    {
        commonTimeRange = sourceAudio!.timeRange
    }
    else
    {
        // There’s neither a video nor an audio track, bail…
        continue
    }

    debugLog("Asset duration: \(asset.duration.seconds), common time range duration: \(commonTimeRange.duration.seconds)")

    // Insert the video and audio tracks…
    if sourceVideo != nil
    {
        try videoTrack.insertTimeRange(commonTimeRange, of: sourceVideo!, at: time)
    }
    if sourceAudio != nil
    {
        try audioTrack.insertTimeRange(commonTimeRange, of: sourceAudio!, at: time)
    }

    time = time + commonTimeRange.duration
    inProgress(time)
}
In an mp4 container, every track can have its own start time and duration. Especially with recorded material, it is not uncommon for the audio and video tracks to have slightly different time ranges (insert some CMTimeRangeShow(track.timeRange) near the insertTimeRange calls to see this).
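For reference, here is a minimal sketch of that kind of diagnostic; the helper name is mine, not from the original post. It prints each track's media type and time range so you can see how far the audio and video ranges diverge:

import AVFoundation
import CoreMedia

// Hypothetical debugging helper: dump every track's time range so an
// audio/video mismatch becomes visible in the console.
func logTrackRanges(of asset: AVAsset)
{
    for track in asset.tracks
    {
        print(track.mediaType.rawValue)
        CMTimeRangeShow(track.timeRange)   // prints e.g. {{0/600 = 0.000}, {2402/600 = 4.003}}
    }
}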
To overcome this, instead of blindly inserting from CMTime.zero for the whole duration of the asset (the maximum end time of all its tracks):

- get the timeRange of the source audio and video tracks
- compute their common time range (CMTimeRangeGetIntersection does this for you)
- use the common time range when inserting the segments from the source tracks into the destination tracks
- increment your time by the duration of the common time range
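As a rough sketch (not from the original answer), the same steps can also be written with the Swift overlay's CMTimeRange.intersection(_:) and optional pattern matching rather than the C-style CMTimeRangeGetIntersection and force unwraps. It assumes videoTrack, audioTrack, time, inParts, and inProgress exist as in the loop above, and that the loop runs inside the same do/catch:

for p in inParts
{
    let asset = AVURLAsset(url: p.path.url)
    let sourceVideo = asset.tracks(withMediaType: .video).first
    let sourceAudio = asset.tracks(withMediaType: .audio).first

    // The common range is the intersection when both tracks exist,
    // otherwise whichever track is present; skip assets with neither.
    let commonTimeRange: CMTimeRange
    switch (sourceVideo, sourceAudio)
    {
    case let (video?, audio?):
        commonTimeRange = video.timeRange.intersection(audio.timeRange)
    case let (video?, nil):
        commonTimeRange = video.timeRange
    case let (nil, audio?):
        commonTimeRange = audio.timeRange
    default:
        continue
    }

    if let video = sourceVideo
    {
        try videoTrack.insertTimeRange(commonTimeRange, of: video, at: time)
    }
    if let audio = sourceAudio
    {
        try audioTrack.insertTimeRange(commonTimeRange, of: audio, at: time)
    }

    time = time + commonTimeRange.duration
    inProgress(time)
}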