
I am creating an MP4 video file using AVAssetWriter with an AVAssetWriterInputPixelBufferAdaptor.

The source is a video from a UIImagePickerController, either freshly captured from the camera or taken from the asset library. The quality right now is UIImagePickerControllerQualityTypeMedium.

Sometimes the writer fails. Its status is AVAssetWriterStatusFailed, and the AVAssetWriter object's error property is:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" 
UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210), 
NSUnderlyingError=0x4dd8e0 "The operation couldn’t be completed. (OSStatus error -536870210.)",
NSLocalizedDescription=The operation could not be completed

The error occurs approximately 20% of the times the code is run. It seems to fail more frequently on iPhone 4/4S than on iPhone 5.

It also occurs more frequently when the source video quality is higher. Using UIImagePickerControllerQualityTypeLow, the error happens less often; using UIImagePickerControllerQualityTypeHigh, it happens a bit more often.
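
(For reference, the quality being varied here is the picker's videoQuality property. A minimal configuration sketch; the picker instance and its setup below are my own illustration, not code from the app:)

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.mediaTypes = @[(NSString *)kUTTypeMovie]; // requires MobileCoreServices
picker.videoQuality = UIImagePickerControllerQualityTypeMedium; // or ...Low / ...High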

I have also noticed something else: it seems to come in waves. When it fails, the following runs often fail too, even if I delete the app and reinstall it. That leaves me wondering whether my program leaks some memory, and whether that memory could somehow stay alive even after the app is killed (is that even possible?).

Here is the code I use to render my video:

- (void)writeVideo
{
    offlineRenderingInProgress = YES;

/* --- Writer Setup --- */

    [locationQueue cancelAllOperations];

    [self stopWithoutRewinding];

    NSError *writerError = nil;

    BOOL success;

    success = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:nil];

    // DLog(@"Url: %@, success: %i", self.outputURL, success);

    writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError];
    //writer.shouldOptimizeForNetworkUse = NO;

    if (writerError) {
        DLog(@"Writer error: %@", writerError);
        return;
    }

    float bitsPerPixel;
    CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0]));
    int numPixels = dimensions.width * dimensions.height;
    int bitsPerSecond;

    // Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
    if ( numPixels < (640 * 480) )
        bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
    else
        bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.

    bitsPerSecond = numPixels * bitsPerPixel;

    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                          AVVideoCodecH264, AVVideoCodecKey,
                                          [NSNumber numberWithInteger:videoSize.width], AVVideoWidthKey,
                                          [NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey,
                                          [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSNumber numberWithInteger:bitsPerSecond], AVVideoAverageBitRateKey,
                                           [NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey,
                                           nil], AVVideoCompressionPropertiesKey,
                                          nil];

    writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
    writerVideoInput.transform =  movie.preferredTransform;
    writerVideoInput.expectsMediaDataInRealTime = YES;
    [writer addInput:writerVideoInput];

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                       [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

    writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput
                                                                                      sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    BOOL couldStart = [writer startWriting];

    if (!couldStart) {
        DLog(@"Could not start AVAssetWriter!");
        abort = YES;
        [locationQueue cancelAllOperations];
        return;
    }

    [self configureFilters];

    CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];


    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    if (!self.canEdit) {
        [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES];
    } else {
        [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNOVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES];
    }

    CMTime startOffset = reader.timeRange.start;

    DLog(@"startOffset: %lld", startOffset.value);

    [self.thumbnailEditView removeFromSuperview];
    //    self.thumbnailEditView = nil;

    [glLayer removeFromSuperlayer];
    glLayer = nil;

    [playerView removeFromSuperview];
    playerView = nil;

    glContext = nil;



    [writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{

        @try {


        BOOL didWriteSomething = NO;

        DLog(@"Preparing to write...");

        while ([writerVideoInput isReadyForMoreMediaData]) {

            if (abort) {
                NSLog(@"Abort == YES");
                [locationQueue cancelAllOperations];
                [writerVideoInput markAsFinished];
                videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                return; // Stop feeding samples once an abort has been requested.
            }

            if (writer.status == AVAssetWriterStatusFailed) {
                DLog(@"Writer.status: AVAssetWriterStatusFailed, error: %@", writer.error);
                DLog(@"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]);

                [[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:@"QualityOverride"];
                [[NSUserDefaults standardUserDefaults] synchronize];

                abort = YES;
                [locationQueue cancelAllOperations];
                videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                return;
            }

            DLog(@"Writing started...");

            CMSampleBufferRef buffer = nil;

            if (reader.status != AVAssetReaderStatusUnknown) {

                if (reader.status == AVAssetReaderStatusReading) {
                    buffer = [readerVideoOutput copyNextSampleBuffer];
                    if (didWriteSomething == NO) {
                        DLog(@"Copying sample buffers...");
                    }
                }

                if (!buffer) {

                    [writerVideoInput markAsFinished];

                    DLog(@"Finished...");

                    CGColorSpaceRelease(colorSpace);

                    [self offlineRenderingDidFinish];


                    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{

                        [writer finishWriting];
                        if (writer.error != nil) {
                            DLog(@"Error: %@", writer.error);
                        } else {
                            DLog(@"Success!");
                        }

                        if (writer.status == AVAssetWriterStatusCompleted) {

                            videoConvertCompletionBlock(YES, nil);
                        }

                        else {
                            abort = YES;
                            videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                        }

                    });


                    return;
                }

                didWriteSomething = YES;
            }
            else {

                DLog(@"Still waiting...");
                //Reader just needs a moment to get ready...
                continue;
            }

            CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);

            if (pixelBuffer == NULL) {
                DLog(@"Pixelbuffer == NULL");
                CFRelease(buffer); // Don't leak the sample buffer on early exits.
                continue;
            }

            //DLog(@"Sample call back! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer));

            //NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace];

            CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil];

            CIImage *outputImage = [self filteredImageWithImage:ciimage];


            CVPixelBufferRef outPixelBuffer = NULL;
            CVReturn status;

            CFDictionaryRef empty; // empty value for attr value.
            CFMutableDictionaryRef attrs;
            empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                                       NULL,
                                       NULL,
                                       0,
                                       &kCFTypeDictionaryKeyCallBacks,
                                       &kCFTypeDictionaryValueCallBacks);

            attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                              1,
                                              &kCFTypeDictionaryKeyCallBacks,
                                              &kCFTypeDictionaryValueCallBacks);

            CFDictionarySetValue(attrs,
                                 kCVPixelBufferIOSurfacePropertiesKey,
                                 empty);

            CFDictionarySetValue(attrs,
                                 kCVPixelBufferCGImageCompatibilityKey,
                                 (__bridge const void *)([NSNumber numberWithBool:YES]));

            CFDictionarySetValue(attrs,
                                 kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 (__bridge const void *)([NSNumber numberWithBool:YES]));


            status = CVPixelBufferCreate(kCFAllocatorDefault, ciimage.extent.size.width, ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer);

            // Release the attribute dictionaries; CVPixelBufferCreate retains what it needs.
            CFRelease(attrs);
            CFRelease(empty);

            //DLog(@"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer));

            if (status != kCVReturnSuccess) {
                DLog(@"Couldn't allocate output pixelBufferRef!");
                CFRelease(buffer);
                continue;
            }

            [offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace];

            CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer);
            CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset);
            CMTime duration = reader.timeRange.duration;
            if (CMTIME_IS_POSITIVE_INFINITY(duration)) {
                duration = movie.duration;
            }
            CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default);

            float durationFloat = (float)durationConverted.value;
            float progress =  ((float) currentTime.value) / durationFloat;

            //DLog(@"duration : %f, progress: %f", durationFloat, progress);

            [self updateOfflineRenderProgress:progress];

            if (pixelBuffer != NULL && writerVideoInput.readyForMoreMediaData) {
                [writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime];
            } else {
                // Not appended: release the buffers ourselves before skipping this frame.
                CFRelease(buffer);
                CVPixelBufferRelease(outPixelBuffer);
                continue;
            }

            if (writer.status == AVAssetWriterStatusWriting) {
                DLog(@"Writer.status: AVAssetWriterStatusWriting");
            }

            CFRelease(buffer);
            CVPixelBufferRelease(outPixelBuffer);
        }

        }

        @catch (NSException *exception) {
            DLog(@"Catching exception: %@", exception);
        }

    }];

}

1 Answer


OK, I think I solved it myself. The culprit was this line:

[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ ....

The global queue I was passing in is a concurrent queue. This allows a new callback to be made before the previous one has finished. The asset writer is not designed to be written to from more than one thread at a time.

Creating and using a new serial queue seems to remedy the problem:

assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);

[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{...
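
For reference, a minimal sketch of the fixed feed loop end to end, assuming the same writer, writerVideoInput, and readerVideoOutput objects as in the question (the appendNextFrame: helper is hypothetical and stands in for the filtering and appending work done in the original block):

assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);

[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{
    // Because the queue is serial, at most one invocation of this block is
    // in flight, so the writer only ever sees appends from a single thread.
    while ([writerVideoInput isReadyForMoreMediaData]) {
        CMSampleBufferRef buffer = [readerVideoOutput copyNextSampleBuffer];
        if (buffer == NULL) {
            // Reader is drained; close out the file exactly once.
            [writerVideoInput markAsFinished];
            [writer finishWriting];
            break;
        }
        [self appendNextFrame:buffer]; // hypothetical helper, see above
        CFRelease(buffer);
    }
}];

Since the serial queue guarantees the callbacks cannot overlap, each sample buffer is copied, processed, and appended before the next one is touched.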