
I'm creating UIImage objects from a CMSampleBufferRef. I'm doing this on a separate queue (in the background), so I've wrapped the processing in an @autoreleasepool. The problem is that memory keeps building up without any leak notification. Here is the method I'm using:

- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    @autoreleasepool {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image
        // (returned +1 retained; the caller is responsible for releasing it)
        UIImage *image = [[UIImage imageWithCGImage:quartzImage] retain];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return image;
    }
}

And here is how I'm using it:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    CFRetain(sampleBuffer);    // keep the buffer alive across the async hop
    dispatch_async(movieWritingQueue, ^{
    @autoreleasepool {

        if (self.returnCapturedImages && captureOutput != audioOutput) {

            UIImage *capturedImage = [self imageFromSampleBuffer: sampleBuffer];

            dispatch_async(callbackQueue, ^{

                @autoreleasepool {

                    if (self.delegate && [self.delegate respondsToSelector: @selector(recorderCapturedImage:)]) {
                        [self.delegate recorderCapturedImage: capturedImage];
                    }

                    [capturedImage release];
                }
            });
        }
        CFRelease(sampleBuffer);
    }
    });
}

2 Answers


I found a temporary solution: I do the same operations, but on the main queue. It's neither elegant nor efficient, but at least the memory no longer builds up.

Could this be an iOS bug...?

UPDATE: Here is how I process the CMSampleBuffers on the main thread:

[[NSOperationQueue mainQueue] addOperationWithBlock:^ {

    CGImageRef cgImage = [self cgImageFromSampleBuffer:sampleBuffer];
    UIImage *capturedImage = [UIImage imageWithCGImage:cgImage];

    //do something with the image - I suggest in a background thread
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
       // do something with the image
    });

    CGImageRelease( cgImage );
    CFRelease(sampleBuffer);
}];
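
Note that this block assumes the sample buffer was retained before hopping to the main queue; the CFRelease at the end pairs with that retain. A minimal sketch of the scheduling, reusing the delegate signature from the question:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Retain the buffer so it survives until the main-queue block is done;
    // the CFRelease inside the block above balances this retain.
    CFRetain(sampleBuffer);

    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        // ... convert, hand off, and CFRelease(sampleBuffer) as shown above
    }];
}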

- (CGImageRef) cgImageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);        // Lock the image buffer

    // Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);

    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    /* CVBufferRelease(imageBuffer); */  // do not call this!

    return newImage;    // +1: the caller must CGImageRelease() the returned image
}
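
As an alternative sketch (not what I ended up using): Core Image can do the same conversion without manually locking the pixel buffer or managing a bitmap context. This assumes iOS 5+ and a reusable CIContext stored in a hypothetical ciContext ivar:

- (CGImageRef) cgImageUsingCoreImageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // createCGImage:fromRect: follows the Create rule: the caller owns the
    // returned image and must CGImageRelease() it, same as the method above.
    return [ciContext createCGImage:ciImage fromRect:[ciImage extent]];
}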
Answered 2015-02-10T11:46:14.520

I actually ran into a similar problem a few days ago...

You're already releasing the CMSampleBufferRef, but try releasing the CVPixelBufferRef as well. For example:

- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    @autoreleasepool {

       // ...

       // Free up the context and color space
       CGContextRelease(context);
       CGColorSpaceRelease(colorSpace);

       // Create an image object from the Quartz image
       UIImage *image = [[UIImage imageWithCGImage:quartzImage] retain];

       // Release the Quartz image
       CGImageRelease(quartzImage);

       CVPixelBufferRelease(imageBuffer);    // <-- release your pixel buffer

       return (image);
   }
}
Answered 2015-01-29T16:05:14.100