I am using the following code to apply image filters. It works fine on scaled-down images, but when I apply more than 2 filters to a full-resolution image, the app receives a memory warning and then crashes.
When I open the Allocations instrument, I see that CFData (store) accounts for most of the memory used by the program. When I apply more than 2 filters to a full-resolution image, the "overall bytes" climb to 54 MB. The "live bytes" reading never seems to exceed 12 MB when I watch the numbers themselves, but the spikes in the graph show that live bytes also briefly reach that 54 MB figure before coming back down.
Where am I going wrong?
- (UIImage *)editImage:(UIImage *)imageToBeEdited tintValue:(float)tint
{
    CIImage *image = [[CIImage alloc] initWithImage:imageToBeEdited];
    NSLog(@"in edit Image:\ncheck image: %@\ncheck value: %f", image, tint);

    [tintFilter setValue:image forKey:kCIInputImageKey];
    [tintFilter setValue:[NSNumber numberWithFloat:tint] forKey:@"inputAngle"];
    CIImage *outputImage = [tintFilter outputImage];
    NSLog(@"check output image: %@", outputImage);

    return [self completeEditingUsingOutputImage:outputImage];
}
- (UIImage *)completeEditingUsingOutputImage:(CIImage *)outputImage
{
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:outputImage.extent];
    NSLog(@"check cgimg: %@", cgimg);

    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    NSLog(@"check newImage: %@", newImage);

    CGImageRelease(cgimg);
    return newImage;
}
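For context, `tintFilter` and `context` are ivars that are not shown above. A minimal sketch of the setup I am assuming (the exact filter name is an assumption; `CIHueAdjust` is the stock filter that takes `inputAngle`):

```objectivec
// Assumed one-time setup, e.g. in viewDidLoad (names are my ivars, the
// filter choice is an assumption based on the inputAngle key used above):
context = [CIContext contextWithOptions:nil];
tintFilter = [CIFilter filterWithName:@"CIHueAdjust"];
```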
Edit: I also tried setting cgimg to nil afterwards; it didn't help. I tried moving the context declaration and definition inside the second function; that didn't help either. I also tried moving the declarations and definitions of the filters inside the functions, again with no luck.
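To be concrete, the "context inside the second function" variant I tried looked roughly like this sketch (a local context created per call instead of reusing the ivar; this did not resolve the crash):

```objectivec
// Variant tried: create the CIContext locally instead of reusing an ivar.
// (Did not help; the crash on createCGImage:fromRect: still occurred.)
- (UIImage *)completeEditingUsingOutputImage:(CIImage *)outputImage
{
    CIContext *localContext = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [localContext createCGImage:outputImage
                                          fromRect:outputImage.extent];
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    return newImage;
}
```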
Also, the crash happens at:
CGImageRef cgimg = [context createCGImage:outputImage fromRect:outputImage.extent];