converting images to video has wrong FPS ios

Hi all, quick help needed here; I am trying to convert images to a video, following an online tutorial. The process completes and the result plays correctly on iPhone/i-devices and in QuickTime, but it will not play in any other player (VLC, RealPlayer, web players, etc.). Inspecting the final file, I can see that the video's FPS is somewhere between 0 and 0.16 (ideally it should be 24-30 fps). Can anyone suggest a way to fix the FPS? The code I am using is below:

+ (void)createVideoWithImages:(NSArray <NSString *> *)images withDuration:(long int)duration completion:(void (^) (BOOL success, NSString *output, NSError *error))completion
{
    __block BOOL success                = NO;
    __block NSString *videoOutputPath   = [ViewController newfNameWithPath];
    NSError *error                      = nil;

    double width                    = 0;
    double height                   = 0;
    CGSize imageSize                = CGSizeZero;
    NSUInteger framesPerSecond      = 30;
    NSMutableArray *imageArray      = [[NSMutableArray alloc] initWithCapacity:images.count];

    for (NSUInteger i = 0; i < [images count]; i++)
    {
        NSString *imagePath = [images objectAtIndex:i];
        [imageArray addObject:[UIImage imageWithContentsOfFile:imagePath]];

        UIImage *newImage = [ViewController createImage:((UIImage *)[imageArray objectAtIndex:i])];

        UIImage *img = [ViewController imageOrientation:newImage];
        [imageArray replaceObjectAtIndex:i withObject:img];

        width   = (width > ((UIImage *)[imageArray lastObject]).size.width) ? width : ((UIImage *)[imageArray lastObject]).size.width;
        height  = (height > ((UIImage *)[imageArray lastObject]).size.height) ? height : ((UIImage *)[imageArray lastObject]).size.height;
    }

    // NOTE: the width/height computed above are ignored; output is fixed at 720p.
    imageSize  = CGSizeMake(1280, 720);

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[ViewController getURL:videoOutputPath] fileType:AVFileTypeQuickTimeMovie error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                [NSNumber numberWithInt:1280], AVVideoCleanApertureWidthKey,
                                                [NSNumber numberWithInt:720], AVVideoCleanApertureHeightKey,
                                                [NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
                                                [NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey,
                                                nil];

    NSDictionary *aspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                         [NSNumber numberWithInt:3], AVVideoPixelAspectRatioHorizontalSpacingKey,
                                         [NSNumber numberWithInt:3],AVVideoPixelAspectRatioVerticalSpacingKey,
                                         nil];

    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:960000], AVVideoAverageBitRateKey,
                                   [NSNumber numberWithInt:1],AVVideoMaxKeyFrameIntervalKey,
                                   videoCleanApertureSettings, AVVideoCleanApertureKey,
                                   aspectRatioSettings, AVVideoPixelAspectRatioKey,
                                   //AVVideoProfileLevelH264Main30, AVVideoProfileLevelKey,
                                   nil];

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   codecSettings,AVVideoCompressionPropertiesKey,
                                   [NSNumber numberWithInt: 1280], AVVideoWidthKey,
                                   [NSNumber numberWithInt: 720], AVVideoHeightKey,
                                   nil];


    AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    // NO for offline encoding; YES is meant for live capture and makes
    // readyForMoreMediaData far less forgiving.
    videoWriterInput.expectsMediaDataInRealTime = NO;
    [videoWriter addInput:videoWriterInput];


    videoWriterInput.transform = CGAffineTransformMakeRotation(M_PI); // 180 degrees

    if (![videoWriter startWriting])
    {
        completion(NO, nil, videoWriter.error);
        return;
    }

    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CVPixelBufferRef buffer         = NULL;
    int frameCount                  = 0;
    // Use floating-point division; the original integer division truncates to 0
    // whenever there are more images than seconds, collapsing the FPS.
    double numberOfSecondsPerFrame  = (double)duration / (double)[imageArray count];
    double frameDuration            = framesPerSecond * numberOfSecondsPerFrame; // in 1/framesPerSecond units

    for(UIImage *image in imageArray)
    {
        buffer                      = [ViewController pixelBufferFromCGImage:[image CGImage] withSize:imageSize];
        BOOL completeWitnNoError    = NO;
        int counter                 = 0;

        while (!completeWitnNoError && counter < 30)
        {

            NSLog(@"writing frame: %d, attempt: %d", frameCount, counter);
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {

                // Explicit cast: CMTimeMake takes an int64_t value, so the
                // double product would otherwise be truncated implicitly.
                CMTime frameTime        = CMTimeMake((int64_t)(frameCount * frameDuration), (int32_t)framesPerSecond);
                CMTimeShow(frameTime);
                completeWitnNoError     = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

                if(!completeWitnNoError)
                {
                    if(buffer)
                        CVPixelBufferRelease(buffer);

                    completion(NO, nil, videoWriter.error);
                    return;
                }
            }
            else
            {
                // Do not release the buffer here: it is still needed for the
                // next append attempt once the adaptor becomes ready.
                NSLog(@"adaptor not ready %d, %d\n", frameCount, counter);
                [NSThread sleepForTimeInterval:0.1];
            }

            counter++;
        }

        if(buffer)
            CVPixelBufferRelease(buffer);

        if (!completeWitnNoError)
        {
            NSDictionary *userInfo = @{
                                       NSLocalizedDescriptionKey:               NSLocalizedString(@"Error", nil),
                                       NSLocalizedFailureReasonErrorKey:        NSLocalizedString(@"Images to video writing failed", nil),
                                       NSLocalizedRecoverySuggestionErrorKey:   NSLocalizedString(@"Images to video writing failed", nil)
                                       };

            NSError *error = [NSError errorWithDomain:@"Images to video error" code: -1005 userInfo:userInfo];
            completion(NO, nil, error);
            return;
        }

        frameCount++;
    }

    [videoWriterInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{

        // This is an AVAssetWriter, so the states are AVAssetWriterStatus*
        // (the original switched on AVAssetReaderStatus* constants).
        switch ([videoWriter status])
        {
            case AVAssetWriterStatusWriting:
                break;

            case AVAssetWriterStatusCompleted:
            {
                NSLog(@"IMAGES TO VIDEO WRITE SUCCESS");
                success = YES;
                completion(success, videoOutputPath, videoWriter.error);
                break;
            }

            case AVAssetWriterStatusCancelled:
            case AVAssetWriterStatusFailed:
            case AVAssetWriterStatusUnknown:
            {
                // cancelWriting must not be called from inside
                // finishWritingWithCompletionHandler:.
                NSLog(@"IMAGES TO VIDEO WRITE FAILURE");
                completion(success, nil, videoWriter.error);
                break;
            }
        }
    }];

    return;
}

And the pixel-buffer helper:

+ (CVPixelBufferRef) pixelBufferFromCGImage:(CGImageRef)image withSize:(CGSize)size
{

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          size.width,
                                          size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef) options,
                                          &pxbuffer);

    // Handle the failure gracefully instead of asserting; also return NULL so
    // the caller does not lock and draw into a NULL buffer.
    if (status != kCVReturnSuccess || pxbuffer == NULL)
    {
        NSDictionary *userInfo = @{
                                   NSLocalizedDescriptionKey:               NSLocalizedString(@"Failed to create pixel buffer.", nil),
                                   NSLocalizedFailureReasonErrorKey:        NSLocalizedString(@"Could not create buffer", nil),
                                   NSLocalizedRecoverySuggestionErrorKey:   NSLocalizedString(@"Try again", nil)
                                   };

        NSError *error = [NSError errorWithDomain:@"CVPixelBufferRef" code: -1005 userInfo:userInfo];
        NSLog(@"ALERT: %@", [error localizedDescription]);
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace   = CGColorSpaceCreateDeviceRGB();
    CGContextRef context            = CGBitmapContextCreate(pxdata, size.width,
                                                            size.height, 8, 4*size.width, rgbColorSpace,
                                                            kCGImageAlphaPremultipliedFirst);


    // Draw the image into the full buffer; the original offset formula
    // produced an off-screen rect for most image sizes (and the rotation by 0
    // was a no-op).
    CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

Any suggestions??

Try CEMovieMaker, which I have used before and which worked well; I think your problem is with the videoCodec settings.

https://github.com/cameronehrlich/CEMovieMaker

@interface ViewController ()

@property (nonatomic, strong) CEMovieMaker *movieMaker;

@end

- (IBAction)process:(id)sender
{
    NSMutableArray *frames = [[NSMutableArray alloc] init];

    UIImage *icon1 = [UIImage imageNamed:@"icon1"];
    UIImage *icon2 = [UIImage imageNamed:@"icon2"];
    UIImage *icon3 = [UIImage imageNamed:@"icon3"];

    NSDictionary *settings = [CEMovieMaker videoSettingsWithCodec:AVVideoCodecH264 withHeight:icon1.size.height andWidth:icon1.size.width];
    self.movieMaker = [[CEMovieMaker alloc] initWithSettings:settings];
    for (NSInteger i = 0; i < 10; i++) {
        [frames addObject:icon1];
        [frames addObject:icon2];
        [frames addObject:icon3];
    }

    [self.movieMaker createMovieFromImages:[frames copy] withCompletion:^(BOOL success, NSURL *fileURL){
        if (success) {
            [self viewMovieAtUrl:fileURL];
        }
    }];
}

- (void)viewMovieAtUrl:(NSURL *)fileURL
{
    // Note: MPMoviePlayerViewController is deprecated as of iOS 9;
    // AVPlayerViewController is the modern replacement. Presenting the
    // controller and also adding its view as a subview (as the original did)
    // is redundant, so the subview step is dropped here.
    MPMoviePlayerViewController *playerController = [[MPMoviePlayerViewController alloc] initWithContentURL:fileURL];
    [self presentMoviePlayerViewControllerAnimated:playerController];
    [playerController.moviePlayer prepareToPlay];
    [playerController.moviePlayer play];
}