UIImageView only displays when I call initWithImage


Solution 1

The problem is that setting the image property on an already-initialized UIImageView doesn't update the view's frame to match the new image's size, whereas initWithImage: sizes the frame to the image automatically.

Whenever you have a problem like this, it's always worth checking out the docs in case you missed something:

Setting the image property does not change the size of a UIImageView. Call sizeToFit to adjust the size of the view to match the image.

So, add a call to sizeToFit after you've set the image property:

self.imageView.image = testImg;
[self.imageView sizeToFit];

As a side note, I'd only use the self. dot notation when writing to a property, not when reading it or calling a method. In other words, you can get away with just writing:

// we are not setting imageView itself here, only a property on it
imageView.image = testImg; 

// this is a method call, so "self." not needed
[imageView sizeToFit];     
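
Putting both points together, a version of the questioner's loadAndDisplayImage with the fix applied might look something like this (a sketch: the "Test.png" asset and the imageView ivar come from the question below, and whatever adds the view to the hierarchy is still elided):

- (void)loadAndDisplayImage {
    // Load the test image, as in the question
    UIImage *testImg = [UIImage imageNamed:@"Test.png"];

    // Setting .image leaves the view's frame untouched...
    imageView.image = testImg;
    // ...so resize the view to match the image's dimensions
    [imageView sizeToFit];

    // The frame now matches the image size instead of staying at zero
    NSLog(@"imageView frame: %@", NSStringFromCGRect(imageView.frame));
}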

Solution 2

It may be that your image view isn't resizing to fit the image, so you're loading an image into a UIImageView whose frame has zero size. Try manually setting the image view's frame to a non-zero value, along the lines of:

UIImageView *test = [[UIImageView alloc] init];
test.frame = CGRectMake(0, 0, 100, 100);
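
If you go this route, you'll usually also want a contentMode so the image scales sensibly inside the fixed frame, and the view still has to be added to the view hierarchy to show up. A minimal sketch (the 100x100 frame is arbitrary, and "Test.png" is carried over from the question):

UIImageView *test = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
// Scale the image to fit inside the fixed frame without distortion
test.contentMode = UIViewContentModeScaleAspectFit;
test.image = [UIImage imageNamed:@"Test.png"];
// A view is only drawn once it's in the hierarchy
[self.view addSubview:test];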

Comments

  • Kevin_TA almost 2 years

    For some reason, I can only display a UIImageView when I alloc/init it with a different image each iteration. What's strange is that I know the image data is being loaded, because I'm running processing on the image and the processing works as expected. In short, here are the two approaches I tried:

    // interface
    @interface ViewController : UIViewController <UIAlertViewDelegate>
    {
    
        UIImageView *imageView;
    
    }
    
    @property (nonatomic, retain) UIImageView *imageView;
    
    @end
    
    // implementation
    @implementation ViewController
    
    @synthesize imageView;
    
    //...
    
    - (void) loadAndDisplayImage {
    
        // Load testing image
        UIImage *testImg;
        testImg = [UIImage imageNamed:@"Test.png"];
    
        self.imageView = [[UIImageView alloc] initWithImage:testImg];
    
        //size of imageView rect
        CGRect frame = self.imageView.frame;
        int ivw = frame.size.width;
        int ivh = frame.size.height;
    
        //...
    
    }
    
    @end
    

    When I use self.imageView = [[UIImageView alloc] initWithImage:testImg];, ivw and ivh have valid values and the image is displayed. However, if I change the implementation to this:

    // implementation
    @implementation ViewController
    
    @synthesize imageView;
    
    //...
    
    - (void) viewDidLoad {
    
        [super viewDidLoad];
        self.imageView = [[UIImageView alloc] init];
        [self loadAndDisplayImage];
    
    }
    
    - (void) loadAndDisplayImage {
    
        // Load testing image
        UIImage *testImg;
        testImg = [UIImage imageNamed:@"Test.png"];
    
        self.imageView.image = testImg;
    
        //size of imageView rect
        CGRect frame = self.imageView.frame;
        int ivw = frame.size.width;
        int ivh = frame.size.height;
    
        //...
    
    }
    
    @end
    

    When I set the image using self.imageView.image = testImg;, the values ivw and ivh are both zero and no image is displayed, but the subsequent processing on the image is still accurate. In both cases, I am sending the image to processing using [self doRecognizeImage:self.imageView.image];. I can't figure out how this is possible. It would make a lot more sense to me if the processing failed when the image could not be shown.
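
    To illustrate, logging both right after self.imageView.image = testImg; shows the mismatch I'm describing:

    NSLog(@"image: %@", self.imageView.image);                     // a valid UIImage, so the data is loaded
    NSLog(@"frame: %@", NSStringFromCGRect(self.imageView.frame)); // {{0, 0}, {0, 0}}, so there's no area to draw into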

    Ideas? Thanks.