How to implement fast image filters on the iOS platform

Solution 1

For the filter in your example code, you could use a lookup table to make it much faster. I assume your input image is 8 bits per color channel and that you are converting it to float before passing it to this function. Each channel then has only 256 possible input values, and therefore only 256 possible output values. You could precompute these and store them in an array, which avoids both the pow() calculation and the bounds checking, since you can factor them into the precomputation.

It would look something like this:

// Precompute the gamma curve once: 256 inputs map to 256 outputs.
unsigned char table[256];
for (int i = 0; i < 256; i++) {
    float tmp = powf((float)i / 255.0f, 1.3f) * 255.0f; // powf, not pow -- see Solution 3
    table[i] = tmp > 255.0f ? 255 : (unsigned char)tmp; // clamp once, here
}

// The per-pixel loop then becomes a single table lookup per channel value.
for (int i = 0; i < length; ++i)
    m_OriginalPixelBuf[i] = table[m_OriginalPixelBuf[i]];

In this case, you only have to call powf() 256 times instead of 3 × 640 × 640 (≈ 1.2 million) times. You would also avoid the branching caused by the bounds checking in your main image loop, which can be costly, and you would not have to convert to float at all.

An even faster approach would be to precompute the table outside the program and simply embed the 256 coefficients in the code.

None of the operations you have listed there should require a convolution or even a matrix multiply. They are all pixel-wise operations, meaning that each output pixel only depends on the single corresponding input pixel. You would need to consider convolution for operations like blurring or sharpening where multiple input pixels affect a single output pixel.
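
For illustration, here is a minimal sketch of what a convolution-based operation looks like: a 3×3 sharpen on a single-channel 8-bit buffer. The kernel, buffer names, and layout are illustrative assumptions, not code from the question:

// Each output pixel depends on nine input pixels, unlike the
// pixel-wise operations above. Border pixels are skipped for brevity.
static const int kernel[3][3] = {
    {  0, -1,  0 },
    { -1,  5, -1 },
    {  0, -1,  0 }
};

void sharpen(const unsigned char *src, unsigned char *dst, int width, int height) {
    for (int y = 1; y < height - 1; y++) {
        for (int x = 1; x < width - 1; x++) {
            int sum = 0;
            for (int ky = -1; ky <= 1; ky++)
                for (int kx = -1; kx <= 1; kx++)
                    sum += kernel[ky + 1][kx + 1] * src[(y + ky) * width + (x + kx)];
            dst[y * width + x] = sum < 0 ? 0 : (sum > 255 ? 255 : (unsigned char)sum);
        }
    }
}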

Solution 2

If you're looking for the absolute fastest way to do this, you're going to want to use the GPU to handle the processing. It's built to do massively parallel operations, like color adjustments on single pixels.

As I've mentioned in other answers, I measured a 14X - 28X improvement in performance when running an image processing operation using OpenGL ES instead of on the CPU. You can use the Accelerate framework to do faster on-CPU image manipulation (I believe Apple claims a roughly 4-5X boost is possible there), but it won't be as fast as OpenGL ES. It can be easier to implement, however, which is why I've sometimes used Accelerate for this over OpenGL ES.
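
For illustration, here's a sketch of the Accelerate route using vImage's per-channel table lookup, reusing the gamma table from Solution 1. The function name, buffer names, and the ARGB8888 pixel format are assumptions about your setup:

#include <Accelerate/Accelerate.h>
#include <math.h>

// Sketch: apply a gamma lookup table to an ARGB8888 bitmap with vImage.
void applyGammaWithVImage(void *pixels, vImagePixelCount width, vImagePixelCount height, size_t bytesPerRow) {
    Pixel_8 gammaTable[256], identityTable[256];
    for (int i = 0; i < 256; i++) {
        float tmp = powf((float)i / 255.0f, 1.3f) * 255.0f;
        gammaTable[i] = tmp > 255.0f ? 255 : (Pixel_8)tmp;
        identityTable[i] = (Pixel_8)i; // leave the alpha channel untouched
    }

    vImage_Buffer buffer = { .data = pixels, .height = height, .width = width, .rowBytes = bytesPerRow };
    // Pointwise vImage calls like this one can operate in place (src == dest).
    vImageTableLookUp_ARGB8888(&buffer, &buffer, identityTable, gammaTable, gammaTable, gammaTable, kvImageNoFlags);
}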

iOS 5.0 also brings over Core Image from the desktop, which gives you a nice wrapper around these kinds of on-GPU image adjustments. However, there are some limitations in the iOS Core Image implementation (for example, you can't write custom filter kernels there) that you don't have when working with OpenGL ES 2.0 shaders directly.

I present an example of an OpenGL ES 2.0 shader image filter in my article here. The hardest part about doing this kind of processing is getting the OpenGL ES scaffolding set up. Using my sample application there, you should be able to extract that setup code and apply your own filters using it. To make this easier, I've created an open source framework called GPUImage that handles all of the OpenGL ES interaction for you. It has almost every filter you list above, and most run in under 2.5 ms for a 640x480 frame of video on an iPhone 4, so they're far faster than anything processed on the CPU.
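
As a rough usage sketch (the exact method names may differ between GPUImage versions, and inputImage is assumed to be your source UIImage), the gamma adjustment from Solution 1 could be run on the GPU like this:

#import "GPUImage.h"

GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
gammaFilter.gamma = 1.3; // same curve as the pow() example above

[source addTarget:gammaFilter];
[source processImage];
UIImage *filteredImage = [gammaFilter imageFromCurrentlyProcessedOutput];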

Solution 3

As I said in a comment, you should post this question on the official Apple Developer Forums as well.

That aside, one real quick check: are you calling pow() or powf()? Even if your data is float, calling pow() will get you the double-precision math library function, which is significantly slower than the single-precision variant powf() (and you'll have to pay for the extra conversions between float and double as well).
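
To make the distinction concrete:

#include <math.h>

float x = 0.5f;
double slow = pow(x, 1.3);   // x is promoted to double, pow runs in double precision, result narrowed back
float  fast = powf(x, 1.3f); // stays single-precision end to end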

And a second check: have you profiled your filters in Instruments? Do you actually know where the execution time is being spent, or are you guessing?

Solution 4

I actually wanted to do all this myself, but then I found Silverberg's Image Filters. You can apply various Instagram-style image filters to your images. It is much better than the other image filters out there, such as GLImageProcessing or Cimg.

Also check Instagram Image Filters on iPhone.

Hope this helps...

Solution 5

From iOS 5 upwards, you can use the Core Image filters to adjust a good range of image parameters.

To adjust contrast for example, this code works like a charm:

- (void)setImageContrast:(float)contrast forImageView:(UIImageView *)imageView {
    // MIN_CONTRAST and MAX_CONTRAST are assumed to be defined elsewhere (e.g. 0.0 and 4.0)
    if (contrast > MIN_CONTRAST && contrast < MAX_CONTRAST) {
        CIImage *inputImage = [[CIImage alloc] initWithImage:imageView.image];
        CIFilter *contrastFilter = [CIFilter filterWithName:@"CIColorControls"];
        [contrastFilter setDefaults];
        [contrastFilter setValue:inputImage forKey:@"inputImage"];
        [contrastFilter setValue:[NSNumber numberWithFloat:contrast] forKey:@"inputContrast"]; // default = 1.00
//      [contrastFilter setValue:[NSNumber numberWithFloat:1.0f] forKey:@"inputSaturation"]; // default = 1.00
//      [contrastFilter setValue:[NSNumber numberWithFloat:0.0f] forKey:@"inputBrightness"]; // default = 0.00
        CIImage *outputImage = [contrastFilter valueForKey:@"outputImage"];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:outputImage fromRect:outputImage.extent];
        imageView.image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage); // createCGImage follows the Create rule, so release it here
    }
}

N.B. The default value for contrast is 1.0 (the maximum suggested value is 4.0).
Also, the contrast here is applied to the imageView's current image, so calling this method repeatedly compounds the adjustment. For example, if you call it first with a contrast of 2.0 and then again with 3.0, you get the original image with its contrast multiplied by 6.0 (2.0 × 3.0), not 5.0.
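
If you want absolute rather than cumulative adjustments, one option (a sketch; originalImage is a hypothetical property you would add to your class) is to keep the unfiltered image around and always filter from that:

// Hypothetical property holding the pristine, unfiltered image:
@property (nonatomic, strong) UIImage *originalImage;

// Store it once, before the first adjustment:
self.originalImage = imageView.image;

// Then build the CIImage from the original instead of from imageView.image,
// so each call applies an absolute contrast value rather than compounding:
CIImage *inputImage = [[CIImage alloc] initWithImage:self.originalImage];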

Check the Apple documentation for more filters and parameters.

To list all available filters and parameters in code, just run this loop:

NSArray* filters = [CIFilter filterNamesInCategories:nil];
for (NSString* filterName in filters)
{
    NSLog(@"Filter: %@", filterName);
    NSLog(@"Parameters: %@", [[CIFilter filterWithName:filterName] attributes]);
}

Comments

  • Andrey Chernih, almost 2 years ago:

    I am working on an iOS application where the user can apply a certain set of photo filters. Each filter is basically a set of Photoshop actions with specific parameters. These actions are:

    • Levels adjustment
    • Brightness / Contrast
    • Hue / Saturation
    • Single and multiple overlay

    I've implemented all of these actions in my code using arithmetic expressions, looping through all the pixels in the image. But when I run my app on an iPhone 4, each filter takes about 3-4 seconds to apply, which is quite a long time for the user to wait. The image size is 640 × 640 px, which is @2x of my view size because it's displayed on a Retina display. I've found that my main problem is the levels modification, which calls the pow() C function each time I need to adjust the gamma. I am using floats, not doubles, of course, because ARMv6 and ARMv7 are slow with doubles. I tried enabling and disabling Thumb and got the same result.

    Here is an example of the simplest filter in my app, which runs pretty fast as it is (2 seconds). The other filters include more expressions and pow() calls, which makes them slower.

    https://gist.github.com/1156760

    I've seen some solutions that use the Accelerate framework's vDSP matrix transformations for fast image modifications. I've also seen OpenGL ES solutions. I am not sure whether they can cover my needs, but perhaps it's just a matter of translating my set of changes into a good convolution matrix?

    Any advice would be helpful.

    Thanks,
    Andrey.

  • Andrey Chernih, over 12 years ago:
    Thanks, Srikar, but 'ios-image-filters' is actually pretty slow as well (it uses the same pixel-by-pixel algorithms).
  • Andrey Chernih, over 12 years ago:
    Thanks, good catch with powf()! Indeed, I was using the double-precision version of that function.
  • Andrey Chernih, over 12 years ago:
    Brilliant, thanks so much! I am going to implement lookup tables in my code.