How to downscale a UIImage in iOS by the NSData size


Solution 1

Right now, you have a routine that says:

// Check if the image size is too large
if ((imageData.length/1024) >= 1024) {

    while ((imageData.length/1024) >= 1024) {
        NSLog(@"While start - The image data size is currently: %lu KB", (unsigned long)(imageData.length / 1024));

        // While the imageData is too large scale down the image

        // Get the current image size
        CGSize currentSize = CGSizeMake(image.size.width, image.size.height);

        // Resize the image
        image = [image resizedImage:CGSizeMake(roundf(((currentSize.width/100)*80)), roundf(((currentSize.height/100)*80))) interpolationQuality:kMESImageQuality];

        // Pass the NSData out again
        imageData = UIImageJPEGRepresentation(image, kMESImageQuality);

    }
}

I wouldn't advise repeatedly resizing the resized image. Every time you resize, you lose some quality (often manifesting itself as a "softening" of the image with loss of detail, with cumulative effects). You always want to go back to the original image and resize that smaller and smaller. (As a minor aside, that if statement is redundant, too.)

I might suggest the following:

NSData  *imageData    = UIImageJPEGRepresentation(image, kMESImageQuality);
double   factor       = 1.0;
double   adjustment   = 1.0 / sqrt(2.0);  // or use 0.8 or whatever you want
CGSize   size         = image.size;
CGSize   currentSize  = size;
UIImage *currentImage = image;

while (imageData.length >= (1024 * 1024))
{
    factor      *= adjustment;
    currentSize  = CGSizeMake(roundf(size.width * factor), roundf(size.height * factor));
    currentImage = [image resizedImage:currentSize interpolationQuality:kMESImageQuality];
    imageData    = UIImageJPEGRepresentation(currentImage, kMESImageQuality);
}

Note, I'm not touching image, the original image; rather, I assign currentImage by resizing from the original image on each pass, with a decreasing scale factor.

BTW, if you're wondering about my cryptic 1.0 / sqrt(2.0), I was trying to strike a compromise between your iterative 80% factor and my desire to favor resizing by a power of 2 where I can (because a reduction retains more sharpness when done by a power of 2). But use whatever adjustment factor you want.
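The loop above translates almost line-for-line out of UIKit. Below is a rough Python sketch of the same pattern; `fake_jpeg_size` and its proportional size model are illustrative assumptions standing in for UIImageJPEGRepresentation, not how JPEG actually behaves:

```python
import math

LIMIT = 1024 * 1024  # 1 MB cap, matching the loop above

def fake_jpeg_size(width, height, bytes_per_pixel=0.7):
    # Stand-in for UIImageJPEGRepresentation: assumes the encoded size
    # is roughly proportional to the pixel count (illustration only)
    return int(width * height * bytes_per_pixel)

def downscale_to_limit(width, height, limit=LIMIT):
    # Each pass recomputes the target size from the ORIGINAL dimensions
    # using a shrinking factor, instead of resizing the previous
    # (already softened) result
    factor = 1.0
    adjustment = 1.0 / math.sqrt(2.0)  # halves the pixel count per pass
    current = (width, height)
    size = fake_jpeg_size(width, height)
    while size >= limit:
        factor *= adjustment
        current = (round(width * factor), round(height * factor))
        size = fake_jpeg_size(*current)
    return current, size
```

Because the adjustment is 1/√2 per pass, the pixel count (and, under this size model, the byte count) halves on every iteration, so the loop terminates quickly.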

Finally, if you're doing this on huge images, you might think about using @autoreleasepool blocks. You'll want to profile your app in Allocations in Instruments and see where your high water mark is, as in the absence of autorelease pools, this may constitute a fairly aggressive use of memory.

Solution 2

Besides the maximum size you also need to choose a minimum size as well as decide on performance. For example, you could check the size of UIImageJPEGRepresentation(image, 1.0). If too big, do you then check at 0.95 or 0.1?

One possible approach is to get the size of UIImageJPEGRepresentation(image, 1.0) and see by what percent it is too big. For example, say it is 600kB. You should then compute 500.0 / 600 which is roughly 0.83. So then do UIImageJPEGRepresentation(image, 0.83). That won't give exactly 500kB but it may be close enough.
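That percentage idea fits in a couple of lines. The Python helper below is hypothetical and only illustrates the arithmetic; real JPEG output size is not linear in the quality parameter, so treat the result as a starting guess rather than an exact answer:

```python
def proportional_quality(current_bytes, target_bytes):
    # One-shot estimate: scale the JPEG quality parameter by how far
    # over budget the full-quality encoding is. Approximate only --
    # JPEG size does not actually vary linearly with quality.
    return min(1.0, target_bytes / current_bytes)

# The 600 kB example from above, targeting 500 kB
quality = proportional_quality(600 * 1024, 500 * 1024)  # ~0.83
```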

Another approach would be to start with UIImageJPEGRepresentation(image, 1.0). If it's too big, then do UIImageJPEGRepresentation(image, 0.5). If still too big, go with 0.25; if too small, go with 0.75. Keep splitting the difference until you get within an acceptable range of your desired size.
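The split-the-difference approach is just a binary search on the quality parameter. A Python sketch follows; `encoded_size` stands in for UIImageJPEGRepresentation(image, q) and is assumed to grow monotonically with q (the names and the 2% tolerance are illustrative assumptions):

```python
def bisect_quality(encoded_size, target_bytes, tolerance=0.02, max_steps=10):
    # Binary search on the JPEG quality parameter, as described above
    lo, hi = 0.0, 1.0
    quality = 1.0
    for _ in range(max_steps):
        size = encoded_size(quality)
        if target_bytes * (1 - tolerance) <= size <= target_bytes:
            break  # close enough to the target, and not over it
        if size > target_bytes:
            hi = quality  # too big: try a lower quality
        else:
            lo = quality  # too small: try a higher quality
        quality = (lo + hi) / 2.0
    return quality
```

Each step halves the search interval, so even with a tight tolerance only a handful of encodings are needed.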

Solution 3

This was my approach:

// Check if the image size is too large
if ((imageData.length/1024) >= 1024) {

    while ((imageData.length/1024) >= 1024) {
        NSLog(@"While start - The image data size is currently: %lu KB", (unsigned long)(imageData.length / 1024));

        // While the imageData is too large scale down the image

        // Get the current image size
        CGSize currentSize = CGSizeMake(image.size.width, image.size.height);

        // Resize the image
        image = [image resizedImage:CGSizeMake(roundf(((currentSize.width/100)*80)), roundf(((currentSize.height/100)*80))) interpolationQuality:kMESImageQuality];

        // Pass the NSData out again
        imageData = UIImageJPEGRepresentation(image, kMESImageQuality);

    }
}

The resize image method is as follows:

// Returns a rescaled copy of the image, taking into account its orientation
// The image will be scaled disproportionately if necessary to fit the bounds specified by the parameter
- (UIImage *)resizedImage:(CGSize)newSize interpolationQuality:(CGInterpolationQuality)quality {
    BOOL drawTransposed;

    switch (self.imageOrientation) {
        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            drawTransposed = YES;
            break;

        default:
            drawTransposed = NO;
    }

    return [self resizedImage:newSize
                    transform:[self transformForOrientation:newSize]
               drawTransposed:drawTransposed
         interpolationQuality:quality];
}

Followed by:

// Returns a copy of the image that has been transformed using the given affine transform and scaled to the new size
// The new image's orientation will be UIImageOrientationUp, regardless of the current image's orientation
// If the new size is not integral, it will be rounded up
- (UIImage *)resizedImage:(CGSize)newSize
                transform:(CGAffineTransform)transform
           drawTransposed:(BOOL)transpose
     interpolationQuality:(CGInterpolationQuality)quality {
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGRect transposedRect = CGRectMake(0, 0, newRect.size.height, newRect.size.width);
    CGImageRef imageRef = self.CGImage;

    // Build a context that's the same dimensions as the new size
    CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                newRect.size.width,
                                                newRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                0,
                                                CGImageGetColorSpace(imageRef),
                                                CGImageGetBitmapInfo(imageRef));

    // Rotate and/or flip the image if required by its orientation
    CGContextConcatCTM(bitmap, transform);

    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(bitmap, quality);

    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, transpose ? transposedRect : newRect, imageRef);

    // Get the resized image from the context and wrap it in a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];

    // Clean up
    CGContextRelease(bitmap);
    CGImageRelease(newImageRef);

    return newImage;
}

// Additional methods for reference

// Returns an affine transform that takes into account the image orientation when drawing a scaled image
- (CGAffineTransform)transformForOrientation:(CGSize)newSize {
    CGAffineTransform transform = CGAffineTransformIdentity;

    switch (self.imageOrientation) {
        case UIImageOrientationDown:           // EXIF = 3
        case UIImageOrientationDownMirrored:   // EXIF = 4
            transform = CGAffineTransformTranslate(transform, newSize.width, newSize.height);
            transform = CGAffineTransformRotate(transform, M_PI);
            break;

        case UIImageOrientationLeft:           // EXIF = 6
        case UIImageOrientationLeftMirrored:   // EXIF = 5
            transform = CGAffineTransformTranslate(transform, newSize.width, 0);
            transform = CGAffineTransformRotate(transform, M_PI_2);
            break;

        case UIImageOrientationRight:          // EXIF = 8
        case UIImageOrientationRightMirrored:  // EXIF = 7
            transform = CGAffineTransformTranslate(transform, 0, newSize.height);
            transform = CGAffineTransformRotate(transform, -M_PI_2);
            break;
        default:
            break;
    }

    switch (self.imageOrientation) {
        case UIImageOrientationUpMirrored:     // EXIF = 2
        case UIImageOrientationDownMirrored:   // EXIF = 4
            transform = CGAffineTransformTranslate(transform, newSize.width, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;

        case UIImageOrientationLeftMirrored:   // EXIF = 5
        case UIImageOrientationRightMirrored:  // EXIF = 7
            transform = CGAffineTransformTranslate(transform, newSize.height, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;
        default:
            break;
    }

    return transform;
}

Solution 4

-(NSData *)testData
{
    UIImage *imageToUpload = [UIImage imageNamed:@"images/lifestyle2.jpg"];
    NSData *imgData = UIImageJPEGRepresentation(imageToUpload, 1.0);
    float compressionRate = 10;

    // Keep lowering the JPEG quality in 0.05 steps until the data
    // fits under 1 MB, or until the quality floor (0.05) is reached
    while (imgData.length > 1024 * 1024)
    {
        if (compressionRate > 0.5)
        {
            compressionRate = compressionRate - 0.5;
            imgData = UIImageJPEGRepresentation(imageToUpload, compressionRate / 10);
        }
        else
        {
            return imgData;
        }
    }
    return imgData;
}

It keeps the image data under 1 MB while sacrificing as little quality as possible.

Call it with,

NSData *compressedImageData = [self testData];
NSLog(@"%lu KB", (unsigned long)(compressedImageData.length / 1024));

Solution 5

In your revised question, you clarified that your goal was to stay within file size limitations while uploading images. In that case, playing around with JPEG compression options is fine, as suggested by rmaddy.

The interesting question is that you have two variables to play around with, JPEG compression and image dimensions (there are others, too, but I'll keep it simple). How do you want to prioritize one over the other? For example, I don't think it makes sense to keep a full resolution, absurdly compressed image (e.g. 0.1 quality factor). Nor does it make sense to keep a tiny resolution, uncompressed image. Personally, I'd iteratively adjust quality as suggested by rmaddy, but set some reasonable floor (e.g. JPEG quality not less than, say 0.70). At that point, I might consider changing the image dimensions (and that changes file size pretty quickly, too), and altering the dimensions until the resulting NSData was an appropriate size.

Anyway, in my original answer, I focused on the memory consumption within the app (as opposed to file size). For posterity's sake, see that answer below:


If you are trying to control how much memory is used when you load the images into UIImage objects to be used in UIKit objects, then playing around with JPEG compression won't help you much, because the internal representation of the images once you load them into UIKit objects is uncompressed. Thus, in that scenario, JPEG compression options don't accomplish much (other than sacrificing image quality).

To illustrate the idea, I have an image that is 1920 x 1080. I have it in PNG format (the file is 629kb), a compressed JPEG format (217kb), and a minimally compressed JPEG format (1.1mb). But when I load those three different images into UIImageView objects (even if they have a very small frame), Instruments' "Allocations" tool shows me that they're each taking up 7.91mb:

[Screenshot: Instruments' Allocations tool showing a 7.91 MB allocation for each of the three images]

This is because when you load the image into an image view, the internal, uncompressed representation of these three images is four bytes per pixel (one byte for red, one for green, one for blue, and one for alpha). Thus my 1920 x 1080 images take up 1920 x 1080 x 4 = 8,294,400 bytes = 7.91mb each.

So, if you don't want them to take up more than 500kb in memory when loading them into image view objects, that means that you want to resize them such that the product of the width times the height will be 128,000 or less (i.e. if square, less than 358 x 358 pixels).
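The arithmetic above is easy to verify; Python is used here purely as a calculator, with the 4-bytes-per-pixel figure taken from the explanation:

```python
import math

BYTES_PER_PIXEL = 4  # one byte each for red, green, blue, alpha

# The 1920 x 1080 example: uncompressed in-memory footprint
footprint = 1920 * 1080 * BYTES_PER_PIXEL  # 8,294,400 bytes
megabytes = footprint / (1024 * 1024)      # ~7.91 MB

# Largest square image that stays under a 500 kB in-memory budget
budget = 500 * 1024
max_pixels = budget // BYTES_PER_PIXEL     # 128,000 pixels
side = math.isqrt(max_pixels)              # 357, i.e. must be < 358 x 358
```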

But, if your concern is one of network bandwidth as you upload images or persistent storage capacity, then go ahead and play around with JPEG compression values as suggested by rmaddy's excellent answer. But if you're trying to address memory consumption issues while the images are loaded into UIKit objects, then don't focus on compression, but focus on resizing the image.


Author: StuartM

Updated on June 23, 2022

Comments

  • StuartM
    StuartM almost 2 years

    I am looking to downscale a UIImage in iOS.

    I have seen other questions below and their approach on how to downscale the image by size. Resizing Images Objective-C
    How to resize the image programmatically in objective-c in iphone
    The simplest way to resize an UIImage?

    These questions are all based on re-sizing the image to a specific size. In my case I am looking to re-size/downscale the image based on a maximum size.

    As an example, I would like to set a maximum NSData size to be 500 KB. I know that I can get the size of the image like this:

    // Check the size of the image returned
    NSData *imageData = UIImageJPEGRepresentation(image, 0.5);
    
    // Log out the image size
    NSLog(@"%lu KB",(imageData.length/1024));
    

    What I would like to do is some form of loop here. If the size is greater than the maximum size that I set, I would like to scale down the image slightly, then check the size. If the size is still too large scale down again slightly then check again, until it is lower than the maximum set size.

    I am not sure what the best approach for this is. Ideally I do not want to scale down the image to a specific size all the time, but only slightly scale down the image each time. That way I can have the largest (size w/h) of the image itself and at its maximum size (bytes). If I scale down slightly only at a time, what would be the best way to accomplish this?

    EDIT To confirm, I am looking to re-size the actual image but re-size the image so that it is smaller than the maximum NSData Length. For example:
    -Check the NSData Length
    -If above the maximum I want to pass the UIImage into a method
    -Then loop through this method slightly re-sizing the actual image size each time
    -Until it is under the maximum NSData length, then return the image?

    • Rob
      Rob over 10 years
      Are you doing this to optimize persistent storage or network bandwidth? (In which case, playing around with JPEG compression options is fine.) Or are you trying to control how much memory is used when you load the images into UIImage object? (In that scenario, JPEG compression options aren't really relevant, just resulting in image quality loss without affecting how much memory is used.)
    • StuartM
      StuartM over 10 years
      @Rob - I am doing this for upload restrictions. I would like to scale the image size as opposed to compression as to keep quality. At the moment memory usage is not an issue.
    • Rob
      Rob over 10 years
      Ok, then you can disregard my answer, but I'll keep it there for future readers.
  • StuartM
    StuartM over 10 years
    I like your thought on calculating the percentage difference; that does seem like a logical approach. However, I am not looking to change the compression rate, as I do not want to lose the quality of the image. I would prefer to run through changing the size of the image until the data length is within the required size.
  • rmaddy
    rmaddy over 10 years
    Your question was unclear. You talked about all of the re-sizing questions being not what you wanted and basing it on file size. Anyway, it would be the same basic approach. Instead of adjusting the quality, adjust the image size. Start at 1024x768 (for example). Adjust that size based on how much you need to reduce the image.
  • StuartM
    StuartM over 10 years
    Apologies I was not clear on the explanation. I meant that I am not trying to directly change to a specific size, however I would like to change the size of the image to affect the size. So if the image is 1024KB in size (NSData Length) I want to re-size the image which would affect the size. Specifically I am looking for the best way to loop through this, I do not want to write a bunch of if statements (re-size), then another re-size. But rather pass the image into a method and return an image which is of appropriate size (NSData Length) could you assist with this approach?
  • Rob
    Rob over 10 years
    @StuartM Understood. The interesting question is that you have two variables to play around with, JPEG compression and image dimensions (there are others, too, but I'll keep it simple). How do you want to prioritize one over the other? For example, I don't think it makes sense to keep a full resolution, highly compressed image (e.g. 0.1 factor). Nor does it make sense to keep a tiny resolution, uncompressed image. Personally, I'd target some reasonable compression that doesn't degrade the image too much (e.g. 0.75), and then alter the size until the resulting NSData was an appropriate size.
  • StuartM
    StuartM over 10 years
    I completely agree with that approach, I would use the 0.75 compression as a base point. As a sub question, if I am passing a UIImage between ViewControllers I guess this passes the full quality image, unless I was to actually pass the NSData pointer instead? If I scale down the image using NSData compression does this actually compress the UIImage image?
  • Rob
    Rob over 10 years
    @StuartM I'm not entirely sure. I was surprised, for example, that image created via imageNamed and imageWithContentsOfFile did not consume much memory until I used it in a UIImageView. This makes me think that there are all sorts of wild optimizations going on in the background (and which may vary based upon iOS version). But in answer to your question, I do everything in my power to avoid passing around UIImage objects (certainly never maintaining arrays of UIImage objects). If dealing with lots of images, I'll generally pass around a path to the file rather than the image itself.
  • StuartM
    StuartM over 10 years
    Thanks, in my case there is only one image and no need to save on disk so I pass the pointer to the image around VCs, not the actual image... so that should not really matter.
  • Albert Renshaw
    Albert Renshaw over 10 years
    @rmaddy The percentage method you've described above is a great approach. I tried it and was getting values much lower than my target "maximum size" though... so what I did was take the current JPG-compression-parameter-float and the new percentage-multiplied-float and averaged the two, then applied that and re-iterated in a while loop until I got my target file size. Usually only takes me 2-3 iterations to achieve a file size that is within about a 2% margin of error to my target "max size" thus limiting the amount of data loss. :)
  • rmaddy
    rmaddy over 10 years
    @AlbertRenshaw Sounds good. If this helped don't forget to upvote.
  • Albert Renshaw
    Albert Renshaw over 10 years
    Included my iterative approach as the answer here: stackoverflow.com/q/20458558/2057171
  • StuartM
    StuartM over 10 years
    @Rob/@rmaddy - This was the approach that I took in the end. I use a while statement to check the imageData size and then re-size if the size is still too large.
  • StuartM
    StuartM over 10 years
    I have taken a different approach using a while statement, shown in my answer below. Thanks for your help
  • StuartM
    StuartM over 10 years
    Thanks for your detailed answer you have certainly provided food for thought on the topic. I took a simple approach of a while statement checking the NSData length and then resized the image (by size w/h), checking each time the image size is changed.
  • Rob
    Rob over 10 years
    Very good. BTW, I wouldn't recursively resize the image. Every time you resize, you lose some quality (often manifesting itself as a "softening" of the image with loss of detail, with cumulative effects). You always want to go back to original image and resize that smaller and smaller. And, if sharpness is important, I'd personally make the first resizing 1/2 size, if still too big, then 1/3 the size of the original, if still too big 1/4 of original, etc. (powers of 2 are best, actually). But recursively resizing an image can really degrade the sharpness.
  • StuartM
    StuartM over 10 years
    @Rob - that is a great point in the above I am constantly changing the image as opposed to the original itself. What a great thought, I'll update the above...
  • StuartM
    StuartM over 10 years
    @rob - Can you assist with an approach to this please? I cannot think how to rewrite my scaleDown method (while) to work with a scaled down image as opposed to the original image. I think I would have to resize to a specific size each time, whilst I was hoping I could scale down 80% at a time for best quality. I am trying to complete the same as the above but with the original image each time.
  • StuartM
    StuartM over 10 years
    Updated to accept Rob's approach, which is much better as we keep the original image intact during the resizing. Then once we have an acceptable size we change the collectedImage reference to be the new changed size image.
  • Neil Galiaskarov
    Neil Galiaskarov over 9 years
    what is transformForOrientation: method?
  • StuartM
    StuartM over 9 years
    @NeilGaliaskarov - Updated with that method also. Thanks
  • Kevin
    Kevin over 8 years
    @StuartM you should give credit to author of this resize code, Trevor Harmon
  • StuartM
    StuartM over 8 years
    Sorry this is old... Where's it from?
  • SayeedHussain
    SayeedHussain almost 8 years
    @Rob This is kind of old but the thread is interesting. I have just one doubt. No matter what you play around with, image dimension or compression factor, you have to load the original image as a UIImage because that's the input argument to either technique. If that itself is large as compared to its NSData representation, you still end up juggling a lot of memory. Is this hypothesis correct?
  • Rob
    Rob almost 8 years
    Yes, but that spike in memory usage is fleeting and can be constrained (e.g. if resizing a dozen large images to be presented in collection view, the resizing routine can resize them serially, so peak memory usage is limited to that required by a single large image).
  • Ahmed Sahib
    Ahmed Sahib almost 7 years
    what is kMESImageQuality?
  • Rob
    Rob almost 7 years
    AFAIK, this is some random constant that StuartM defined (which I then faithfully copied in my answer). You'll see it referenced in other questions by him (see stackoverflow.com/q/20483242/1271826, for example). It's not really relevant here. Use whatever you want. Personally, when doing JPEGs, I'll use values between 0.7 and 0.9, which offer decent compression, but without destroying image quality too much. And if I don't want to introduce any image loss, I'll use PNG representations (which only offer a little compression, but it's lossless and smaller than JPEG with quality of 1.0).