AVPlayerItem that consists of an image

I need to create a variable-length, silent "video" (really just a single image) that I can use with AVPlayer on iOS.

Does anyone know of a way to create an AVPlayerItem that consists only of an image displayed for n seconds?

If I have to generate a .mov file, I need that file to be very small.

You could create a very short .mov video from the image, say one second long, and then loop it:

// Keep the player on the last frame instead of pausing when the item ends.
yourplayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;

// Get notified whenever the item finishes so we can rewind it.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[yourplayer currentItem]];


- (void)playerItemDidReachEnd:(NSNotification *)notification {
    // Rewind to the start so the clip loops seamlessly.
    [[yourplayer currentItem] seekToTime:kCMTimeZero];
}

If the video needs to last n seconds, you can keep a counter in the playerItemDidReachEnd: method and stop looping once it reaches a limit.
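A minimal sketch of that counter approach, assuming a hypothetical loopCount instance variable and an arbitrary limit of five loops:

// Assumes a hypothetical ivar: NSInteger loopCount; the limit of 5 is arbitrary.
- (void)playerItemDidReachEnd:(NSNotification *)notification {
    loopCount++;
    if (loopCount < 5) {
        // Not at the limit yet: rewind so the one-second clip keeps looping.
        [[yourplayer currentItem] seekToTime:kCMTimeZero];
    } else {
        // Reached the desired total length: stop and remove the observer.
        [yourplayer pause];
        [[NSNotificationCenter defaultCenter] removeObserver:self
                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                      object:[yourplayer currentItem]];
    }
}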

OK, I ended up writing my own video. It turns out that if you write a video containing the image you want as the first and last keyframes (and those are the only frames), you get a nicely compact video that doesn't take "too" long to write.

My code is as follows:

- (CVPixelBufferRef) createPixelBufferOfSize: (CGSize) size fromUIImage: (UIImage*) pImage
{
    NSNumber*           numYes      = [NSNumber numberWithBool: YES];
    NSDictionary*       pOptions    = [NSDictionary dictionaryWithObjectsAndKeys:   numYes, kCVPixelBufferCGImageCompatibilityKey,
                                                                            numYes, kCVPixelBufferCGBitmapContextCompatibilityKey,
                                                                            nil];

    CVPixelBufferRef    retBuffer   = NULL;
    CVReturn            status      = CVPixelBufferCreate( kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)pOptions, &retBuffer );
    if ( status != kCVReturnSuccess )
    {
        return NULL;
    }

    CVPixelBufferLockBaseAddress( retBuffer, 0 );
    void*               pPixelData  = CVPixelBufferGetBaseAddress( retBuffer );

    // Wrap the pixel buffer's memory in a bitmap context so we can draw the image straight into it.
    CGColorSpaceRef     colourSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef        context     = CGBitmapContextCreate( pPixelData, size.width, size.height, 8, 4 * size.width, colourSpace, (CGBitmapInfo)kCGImageAlphaNoneSkipFirst );

    // Compare aspect ratios to decide whether to letterbox (bars top/bottom) or pillarbox (bars left/right).
    CGSize              inSize      = pImage.size;
    float               inAspect    = inSize.width  / inSize.height;
    float               outAspect   = size.width    / size.height;

    CGRect drawRect;
    if ( inAspect > outAspect )
    {
        // Image is wider than the target: fit to width, centre vertically.
        float   scale   = inSize.width / size.width;
        CGSize  outSize = CGSizeMake( size.width, inSize.height / scale );

        drawRect        = CGRectMake( 0, (size.height / 2) - (outSize.height / 2), outSize.width, outSize.height );
    }
    else
    {
        // Image is taller than the target: fit to height, centre horizontally.
        float   scale   = inSize.height / size.height;
        CGSize  outSize = CGSizeMake( inSize.width / scale, size.height );

        drawRect        = CGRectMake( (size.width / 2) - (outSize.width / 2), 0, outSize.width, outSize.height );
    }

    CGContextDrawImage( context, drawRect, [pImage CGImage] );

    CGColorSpaceRelease( colourSpace );
    CGContextRelease( context );

    CVPixelBufferUnlockBaseAddress( retBuffer, 0 );

    return retBuffer;
}

- (void) writeVideo: (NSURL*) pURL withImage: (UIImage*) pImage ofLength: (NSTimeInterval) length
{
    // Remove any previous file at the destination; AVAssetWriter will not overwrite an existing file.
    [[NSFileManager defaultManager] removeItemAtURL: pURL error: nil];

    NSError*                pError              = nil;
    AVAssetWriter*          pAssetWriter        = [AVAssetWriter assetWriterWithURL: pURL fileType: AVFileTypeQuickTimeMovie error: &pError];

    const int               kVidWidth           = 1920; //pImage.size.width;
    const int               kVidHeight          = 1080; //pImage.size.height;

    NSNumber*               numVidWidth         = [NSNumber numberWithInt: kVidWidth];
    NSNumber*               numVidHeight        = [NSNumber numberWithInt: kVidHeight];

    // Compression settings for the writer input.
    NSDictionary*           pVideoSettings      = [NSDictionary dictionaryWithObjectsAndKeys:   AVVideoCodecH264,   AVVideoCodecKey,
                                                                                                numVidWidth,        AVVideoWidthKey,
                                                                                                numVidHeight,       AVVideoHeightKey,
                                                                                                nil];

    AVAssetWriterInput*     pAssetWriterInput   = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                                                     outputSettings: pVideoSettings];
    [pAssetWriter addInput: pAssetWriterInput];

    // Attributes describing the pixel buffers we append (not the compression settings).
    NSDictionary*           pBufferAttributes   = [NSDictionary dictionaryWithObjectsAndKeys:
                                                        [NSNumber numberWithInt: kCVPixelFormatType_32ARGB], (NSString*)kCVPixelBufferPixelFormatTypeKey,
                                                        numVidWidth,                                         (NSString*)kCVPixelBufferWidthKey,
                                                        numVidHeight,                                        (NSString*)kCVPixelBufferHeightKey,
                                                        nil];

    AVAssetWriterInputPixelBufferAdaptor*    pAssetWriterInputPixelBufferAdaptor    =
                                                [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput: pAssetWriterInput
                                                                                                                 sourcePixelBufferAttributes: pBufferAttributes];

    __block volatile int finished = 0;

    [pAssetWriter   startWriting];
    [pAssetWriter   startSessionAtSourceTime: kCMTimeZero];

    // Write the image as both the first and last frame; these are the only two frames in the file.
    CVPixelBufferRef        pixelBuffer     = [self createPixelBufferOfSize: CGSizeMake( kVidWidth, kVidHeight ) fromUIImage: pImage];

    [pAssetWriterInputPixelBufferAdaptor appendPixelBuffer: pixelBuffer withPresentationTime: kCMTimeZero];
    [pAssetWriterInputPixelBufferAdaptor appendPixelBuffer: pixelBuffer withPresentationTime: CMTimeMake( length * 1000000, 1000000 )];

    CVPixelBufferRelease( pixelBuffer );

    [pAssetWriterInput  markAsFinished];

    // Set end time accurate to micro-seconds.
    [pAssetWriter   endSessionAtSourceTime: CMTimeMake( length * 1000000, 1000000 )];
    [pAssetWriter   finishWritingWithCompletionHandler: ^
        {
            OSAtomicIncrement32( &finished );   // Requires <libkern/OSAtomic.h>.
        }];

    // Block until the asynchronous write has completed.
    while( finished == 0 )
    {
        [NSThread sleepForTimeInterval: 0.01];
    }
}

You may notice that I set the video to always be 1920x1080 and letterbox the image into place as appropriate.
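For completeness, a minimal sketch of how the written file could then be wrapped in an AVPlayerItem; the temporary-directory path and the 10-second length are just illustrative placeholders, and pImage is assumed to be the UIImage you want to show:

// Write a 10-second movie containing only the supplied image (hypothetical output path).
NSURL*          pMovieURL   = [NSURL fileURLWithPath: [NSTemporaryDirectory() stringByAppendingPathComponent: @"still.mov"]];
[self writeVideo: pMovieURL withImage: pImage ofLength: 10.0];

// Wrap the result in an AVPlayerItem / AVPlayer as usual.
AVPlayerItem*   pPlayerItem = [AVPlayerItem playerItemWithURL: pMovieURL];
AVPlayer*       pPlayer     = [AVPlayer playerWithPlayerItem: pPlayerItem];
[pPlayer play];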