AVAssetExportSession combine video files and freeze frame between videos

I have an app that combines video files together to make one long video. There can be a delay between the videos (e.g., V1 starts at t=0s and runs for 5 seconds, and V2 starts at t=10s). In that case, I want the video to freeze on the last frame of V1 until V2 starts.

I am using the code below, but in between the videos the whole output turns white.

Any ideas on how to get the effect I'm looking for?

Thanks!

@interface VideoJoins : NSObject

-(instancetype)initWithURL:(NSURL*)url
                  andDelay:(NSTimeInterval)delay;

@property (nonatomic, strong) NSURL* url;
@property (nonatomic) NSTimeInterval delay;

@end

+(void)joinVideosSequentially:(NSArray*)videoJoins
                 withFileType:(NSString*)fileType
                     toOutput:(NSURL*)outputVideoURL
                 onCompletion:(dispatch_block_t) onCompletion
                      onError:(ErrorBlock) onError
                     onCancel:(dispatch_block_t) onCancel
{
  //From original question on 
  // Didn't add support for portrait+landscape.
  AVMutableComposition *composition = [AVMutableComposition composition];

  AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

  AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

  CMTime startTime = kCMTimeZero;

  /*videoJoins is an array of the video clips to combine, in order*/

  //for loop to combine clips into a single video
  for (NSInteger i=0; i < [videoJoins count]; i++)
  {
    VideoJoins* vj = videoJoins[i];
    NSURL *url  = vj.url;
    NSTimeInterval nextDelayTI = 0;
    if(i+1 < [videoJoins count])
    {
      VideoJoins* vjNext = videoJoins[i+1];
      nextDelayTI = vjNext.delay;
    }

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];

    CMTime assetDuration = [asset duration];
    CMTime assetDurationWithNextDelay = assetDuration;
    if(nextDelayTI != 0)
    {
      CMTime nextDelay = CMTimeMakeWithSeconds(nextDelayTI, 1000000);
      assetDurationWithNextDelay = CMTimeAdd(assetDuration, nextDelay);
    }

    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    //set the orientation
    if(i == 0)
    {
      [compositionVideoTrack setPreferredTransform:videoTrack.preferredTransform];
    }

    NSError *insertError = nil;
    //Video is padded with the next clip's delay; audio keeps its real length.
    BOOL ok = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetDurationWithNextDelay) ofTrack:videoTrack atTime:startTime error:&insertError];
    ok = ok && [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetDuration) ofTrack:audioTrack atTime:startTime error:&insertError];
    if (!ok)
    {
      NSLog(@"Insert failed: %@", insertError);
    }

    startTime = CMTimeAdd(startTime, assetDurationWithNextDelay);
  }

  //Delete output video if it exists
  NSString* outputVideoString = [outputVideoURL absoluteString];
  if ([[NSFileManager defaultManager] fileExistsAtPath:outputVideoString])
  {
    [[NSFileManager defaultManager] removeItemAtPath:outputVideoString error:nil];
  }

  //export the combined video
  AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                    presetName:AVAssetExportPresetHighestQuality];

  exporter.outputURL = outputVideoURL;
  exporter.outputFileType = fileType;
  exporter.shouldOptimizeForNetworkUse = YES;

  [exporter exportAsynchronouslyWithCompletionHandler:^(void)
  {
    switch (exporter.status)
    {
      case AVAssetExportSessionStatusCompleted: {
        onCompletion();
        break;
      }
      case AVAssetExportSessionStatusFailed:
      {
        NSLog(@"Export Failed");
        NSError* err = exporter.error;
        NSLog(@"ExportSessionError: %@", [err localizedDescription]);
        onError(err);
        break;
      }
      case AVAssetExportSessionStatusCancelled:
        NSLog(@"Export Cancelled");
        NSLog(@"ExportSessionError: %@", [exporter.error localizedDescription]);
        onCancel();
        break;
      default:
        break;
    }
  }];
}
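
For reference, a call into this method might look like the sketch below; `ErrorBlock` is assumed to be a typedef like `void (^)(NSError *error)`, the URLs are placeholders, and I'm assuming the class method lives on the same VideoCreator class used later in this post:

//Hypothetical usage: V1 plays in full, then V2 starts after a 5-second gap.
NSArray *joins = @[[[VideoJoins alloc] initWithURL:v1URL andDelay:0],
                   [[VideoJoins alloc] initWithURL:v2URL andDelay:5]];

[VideoCreator joinVideosSequentially:joins
                        withFileType:AVFileTypeQuickTimeMovie
                            toOutput:outputURL
                        onCompletion:^{ NSLog(@"Export done"); }
                             onError:^(NSError *error) { NSLog(@"Failed: %@", error); }
                            onCancel:^{ NSLog(@"Cancelled"); }];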

EDIT: Got it working. Here is how I extract the image and generate a video from that image:

+ (void)writeImageAsMovie:(UIImage*)image
                   toPath:(NSURL*)url
                 fileType:(NSString*)fileType
                 duration:(NSTimeInterval)duration
               completion:(VoidBlock)completion
{
  NSError *error = nil;
  AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:url
                                                         fileType:fileType
                                                            error:&error];
  NSParameterAssert(videoWriter);

  CGSize size = image.size;

  NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                 AVVideoCodecH264, AVVideoCodecKey,
                                 [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                 [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                 nil];
  AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                      assetWriterInputWithMediaType:AVMediaTypeVideo
                                      outputSettings:videoSettings];

  AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                   assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                   sourcePixelBufferAttributes:nil];
  NSParameterAssert(writerInput);
  NSParameterAssert([videoWriter canAddInput:writerInput]);
  [videoWriter addInput:writerInput];

  //Start a session:
  [videoWriter startWriting];
  [videoWriter startSessionAtSourceTime:kCMTimeZero];

  //Write samples. Appending the same pixel buffer at the start, middle and
  //end produces a still clip of the requested duration.
  CMTime halfTime = CMTimeMakeWithSeconds(duration/2, 100000);
  CMTime endTime = CMTimeMakeWithSeconds(duration, 100000);
  CVPixelBufferRef buffer = [VideoCreator pixelBufferFromCGImage:image.CGImage];
  [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
  [adaptor appendPixelBuffer:buffer withPresentationTime:halfTime];
  [adaptor appendPixelBuffer:buffer withPresentationTime:endTime];
  CVPixelBufferRelease(buffer); //appendPixelBuffer: does not take ownership

  //Finish the session:
  [writerInput markAsFinished];
  [videoWriter endSessionAtSourceTime:endTime];
  [videoWriter finishWritingWithCompletionHandler:^{
    if(videoWriter.error)
    {
      NSLog(@"Error:%@", [videoWriter.error localizedDescription]);
    }
    if(completion)
    {
      completion();
    }
  }];
}
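
The `pixelBufferFromCGImage:` helper used above isn't included in the post; a common shape for it is sketched below (an assumption, not necessarily the author's exact implementation):

+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
  size_t width = CGImageGetWidth(image);
  size_t height = CGImageGetHeight(image);

  NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey: @YES,
                            (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES};
  CVPixelBufferRef pxbuffer = NULL;
  CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                      kCVPixelFormatType_32ARGB,
                      (__bridge CFDictionaryRef)options, &pxbuffer);

  CVPixelBufferLockBaseAddress(pxbuffer, 0);

  CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
  CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pxbuffer),
                                               width, height, 8,
                                               CVPixelBufferGetBytesPerRow(pxbuffer),
                                               rgbColorSpace,
                                               (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
  //Render the CGImage into the buffer's backing memory
  CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

  CGContextRelease(context);
  CGColorSpaceRelease(rgbColorSpace);
  CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

  return pxbuffer; //caller is responsible for CVPixelBufferRelease
}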

+(void)generateVideoImageFromURL:(NSURL*)url
                          atTime:(CMTime)thumbTime
                     withMaxSize:(CGSize)maxSize
                      completion:(ImageBlock)handler
{
  AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];

  if(!asset)
  {
    if(handler)
    {
      handler(nil);
    }
    return;
  }
  if(CMTIME_IS_POSITIVE_INFINITY(thumbTime))
  {
    thumbTime = asset.duration;
  }
  else if(CMTIME_IS_NEGATIVE_INFINITY(thumbTime) || CMTIME_IS_INVALID(thumbTime) || CMTIME_IS_INDEFINITE(thumbTime))
  {
    thumbTime = CMTimeMake(0, 30);
  }

  AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
  generator.appliesPreferredTrackTransform = YES;
  generator.maximumSize = maxSize;

  CMTime actualTime;
  NSError* error = nil;
  CGImageRef image = [generator copyCGImageAtTime:thumbTime actualTime:&actualTime error:&error];
  UIImage *thumb = nil;
  if(image)
  {
    thumb = [[UIImage alloc] initWithCGImage:image];
    CGImageRelease(image);
  }

  if(handler)
  {
    handler(thumb);
  }
}

AVMutableComposition can only stitch videos together. I got it done by doing two things:

  • Extracting the last frame of the first video as an image.
  • Making a video from that image (the duration depends on your requirement).

Then you can compose those three videos (V1, V2, and your single-image video). Both tasks are very easy to do; a rough sketch of the glue between them follows below.
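
For illustration only, chaining the two methods shown above might look like this; `v1URL`, `tempFreezeURL`, and `gapDuration` are hypothetical placeholders, and the block typedefs are assumed to be `void (^)(UIImage *)` and `void (^)(void)`:

//Hypothetical glue code between the steps above.
[VideoCreator generateVideoImageFromURL:v1URL
                                 atTime:kCMTimePositiveInfinity //clamped to the clip's end
                            withMaxSize:CGSizeZero              //CGSizeZero = no scaling limit
                             completion:^(UIImage *lastFrame) {
  //Turn the frozen frame into a clip that exactly covers the gap.
  [VideoCreator writeImageAsMovie:lastFrame
                           toPath:tempFreezeURL
                         fileType:AVFileTypeQuickTimeMovie
                         duration:gapDuration
                       completion:^{
    //Finally join V1 + freeze clip + V2 back to back (all delays zero)
    //with joinVideosSequentially: from the question.
  }];
}];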

To extract an image from a video, have a look at this link. If you don't want to use MPMoviePlayerController, which the accepted answer there uses, check out Steve's other answer.

To make a video out of an image, check out this link. That question is about audio, but I don't think you need audio here, so the method mentioned in the question itself should be all you need.

Update: There is an easier way, but it comes with a drawback. You can have two AVPlayers: the first plays your exported video, which has the blank frames in between; the other sits behind it, paused on the last frame of video 1. So while the middle part plays through, you see the second AVPlayer showing that frozen frame, and as a whole it looks like video 1 is simply paused. Trust me, the naked eye can't tell when the players are swapped. But the obvious downside is that your exported video still contains the blank frames, so this approach only works if you just want to play the result inside your app. A minimal sketch of the layering follows below.
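
A rough sketch of that layering, assuming it runs inside a view controller; `firstVideoURL` and `compositeURL` are hypothetical placeholders, and real code would keep the players in strong properties:

#import <AVFoundation/AVFoundation.h>

//Back layer: paused on the last frame of video 1.
AVPlayer *freezePlayer = [AVPlayer playerWithURL:firstVideoURL];
AVPlayerLayer *freezeLayer = [AVPlayerLayer playerLayerWithPlayer:freezePlayer];
freezeLayer.frame = self.view.bounds;
[self.view.layer addSublayer:freezeLayer];

//Seek as close to the end as possible so the layer displays the final frame.
CMTime end = freezePlayer.currentItem.asset.duration;
[freezePlayer seekToTime:end
         toleranceBefore:kCMTimePositiveInfinity
          toleranceAfter:kCMTimeZero];

//Front layer: plays the composite video, added last so it sits on top.
AVPlayer *mainPlayer = [AVPlayer playerWithURL:compositeURL];
AVPlayerLayer *mainLayer = [AVPlayerLayer playerLayerWithPlayer:mainPlayer];
mainLayer.frame = self.view.bounds;
[self.view.layer addSublayer:mainLayer];
[mainPlayer play];

//If the gap renders as an opaque blank instead of nothing, toggle
//mainLayer.hidden over the gap interval (times from your VideoJoins data)
//so the frozen frame behind it shows through.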

The first frame of a video asset always comes out black or white. To work around it, skip one frame at the start of each clip:

 CMTime delta = CMTimeMake(1, 25); //1 frame (if fps = 25)
 //Start the clip one frame in so the black/white first frame is skipped
 CMTimeRange timeRangeInVideoAsset = CMTimeRangeMake(delta, clipVideoTrack.timeRange.duration);
 nextVideoClipStartTime = CMTimeAdd(nextVideoClipStartTime, timeRangeInVideoAsset.duration);
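
In context, that offset would be applied when inserting each clip into the composition. A sketch, where `clipVideoTrack`, `compositionVideoTrack`, and `nextVideoClipStartTime` come from the surrounding loop; shortening the range by one frame is my adjustment to keep it inside the source track:

CMTime delta = CMTimeMake(1, 25); //1 frame (if fps = 25)
CMTimeRange range = CMTimeRangeMake(delta,
                                    CMTimeSubtract(clipVideoTrack.timeRange.duration, delta));
NSError *error = nil;
[compositionVideoTrack insertTimeRange:range
                               ofTrack:clipVideoTrack
                                atTime:nextVideoClipStartTime
                                 error:&error];
nextVideoClipStartTime = CMTimeAdd(nextVideoClipStartTime, range.duration);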

This approach has merged more than 400 short clips into one video.