enum values: NSInteger or int?


Solution 1

There is now NS_ENUM, starting with Xcode 4.5:

typedef NS_ENUM(NSUInteger, NSCellType) {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
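
Because NS_ENUM fixes the underlying type, the compiler can also check a switch over the type for exhaustiveness. A minimal usage sketch, reusing the NSCellType declaration above:

NSCellType cellType = NSTextCellType;

switch (cellType) {
    case NSNullCellType:
        NSLog(@"null cell");
        break;
    case NSTextCellType:
        NSLog(@"text cell");
        break;
    case NSImageCellType:
        NSLog(@"image cell");
        break;
    // No default case: with NS_ENUM, Clang can warn about
    // unhandled enum values (-Wswitch).
}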

You can also consider NS_OPTIONS if you work with binary flags:

typedef NS_OPTIONS(NSUInteger, MyCellFlag) {
    MyTextCellFlag = 1 << 0,
    MyImageCellFlag = 1 << 1,
};
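
A quick sketch of how such flags are typically combined and tested, using the MyCellFlag declaration above:

MyCellFlag flags = MyTextCellFlag | MyImageCellFlag;

// Test a flag with bitwise AND, not with ==:
if (flags & MyTextCellFlag) {
    NSLog(@"text flag is set");
}

// Clear a flag:
flags &= ~MyImageCellFlag;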

Solution 2

I ran a test on the simulator to check the size of different integer types. For that, the result of sizeof was printed to the console. I tested these enum values:

typedef enum {
    TLEnumCero = 0,
    TLEnumOne = 1,
    TLEnumTwo = 2
} TLEnum;

typedef enum {
    TLEnumNegativeMinusOne = -1,
    TLEnumNegativeCero = 0,
    TLEnumNegativeOne = 1,
    TLEnumNegativeTwo = 2
} TLEnumNegative;

typedef NS_ENUM(NSUInteger, TLUIntegerEnum) {
    TLUIntegerEnumZero = 0,
    TLUIntegerEnumOne = 1,
    TLUIntegerEnumTwo = 2
};

typedef NS_ENUM(NSInteger, TLIntegerEnum) {
    TLIntegerEnumMinusOne = -1,
    TLIntegerEnumZero = 0,
    TLIntegerEnumOne = 1,
    TLIntegerEnumTwo = 2
};

Test Code:


    NSLog(@"sizeof enum: %ld", sizeof(TLEnum));
    NSLog(@"sizeof enum negative: %ld", sizeof(TLEnumNegative));
    NSLog(@"sizeof enum NSUInteger: %ld", sizeof(TLUIntegerEnum));
    NSLog(@"sizeof enum NSInteger: %ld", sizeof(TLIntegerEnum));

Result for iPhone Retina (4-inch) Simulator:


sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 4
sizeof enum NSInteger: 4

Result for iPhone Retina (4-inch 64 bit) Simulator:


sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 8
sizeof enum NSInteger: 8

Conclusion

A generic enum whose values fit in an int is an int or unsigned int, which is 4 bytes whether you compile for 32-bit or 64-bit. As we already know, NSUInteger and NSInteger are 4 bytes under the 32-bit compiler and 8 bytes under the 64-bit compiler for iOS.

Solution 3

These are two separate declarations. The typedef guarantees that, when you use that type, you always get an NSUInteger.

The problem with an enum is not that it isn't large enough to hold the value. In fact, the only guarantee you get for an enum is that sizeof(enum Foo) is large enough to hold whatever values you've currently defined in that enum. But its size may change if you add another constant. That's why Apple does the separate typedef: to maintain binary stability of the API.
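
To make that concrete, here is a small sketch of the pitfall (both enums are hypothetical; Clang and GCC accept enumerators wider than an int as an extension, which is the silent widening Apple's guide describes):

typedef enum {
    SmallA = 0,
    SmallB = 1
} SmallEnum;               // values fit in an int

typedef enum {
    BigA = 0,
    BigB = 0x1FFFFFFFF     // no longer fits in 32 bits
} BigEnum;                 // the underlying type silently widens

// Typically sizeof(SmallEnum) == 4 but sizeof(BigEnum) == 8.
// Any binary that stored a 4-byte enum field would break if a
// later SDK release added a constant like BigB, which is exactly
// what the separate typedef to NSUInteger guards against.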

Solution 4

The data types of the enum's constants are not guaranteed to be NSUInteger, but they are guaranteed to be cast to NSUInteger every time you use them through NSCellType.

In other words, the declaration decrees that although the enum's values would currently fit into an unsigned int, the storage reserved for them when accessed through NSCellType should be an NSUInteger.
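
A short sketch of that distinction, using the question's pre-NS_ENUM pattern (declared locally here for illustration; don't redeclare these names in a file that imports AppKit):

enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;

NSCellType cellType = NSTextCellType;

// On a 64-bit build:
NSLog(@"constant: %zu bytes", sizeof(NSTextCellType)); // 4: a constant of an anonymous enum has type int
NSLog(@"variable: %zu bytes", sizeof(cellType));       // 8: the variable's storage is NSUInteger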

Author: Michael Fey

Updated on May 02, 2020

Comments

  • Michael Fey, about 4 years ago

    tl;dr Version

    How are the data types of an enum's constants guaranteed to be NSUInteger instead of unsigned int when declaring an enum thusly:

    enum {
        NSNullCellType = 0,
        NSTextCellType = 1,
        NSImageCellType = 2
    };
    typedef NSUInteger NSCellType;
    

    The typedef to NSUInteger does not appear to be tied to the enum declaration in any way.

    Full Version

    I was reading through Apple's 64-Bit Transition Guide for Cocoa for some guidance on enum values and I came away with a question. Here's a (lengthy) quote from the Enumeration Constants section, emphasis mine:

    A problem with enumeration (enum) constants is that their data types are frequently indeterminate. In other words, enum constants are not predictably unsigned int. With conventionally constructed enumerations, the compiler actually sets the underlying type based on what it finds. The underlying type can be (signed) int or even long. Take the following example:

    typedef enum {
        MyFlagError = -1,
        MyFlagLow = 0,
        MyFlagMiddle = 1,
        MyFlagHigh = 2
    } MyFlagType;
    

    The compiler looks at this declaration and, finding a negative value assigned to one of the member constants, declares the underlying type of the enumeration int. If the range of values for the members does not fit into an int or unsigned int, then the base type silently becomes 64-bit (long). The base type of quantities defined as enumerations can thus silently change size to accord with the values in the enumeration. This can happen whether you're compiling 32-bit or 64-bit. Needless to say, this situation presents obstacles for binary compatibility.

    As a remedy for this problem, Apple has decided to be more explicit about the enumeration type in the Cocoa API. Instead of declaring arguments in terms of the enumeration, the header files now separately declare a type for the enumeration whose size can be specified. The members of the enumeration and their values are declared and assigned as before. For example, instead of this:

    typedef enum {
        NSNullCellType = 0,
        NSTextCellType = 1,
        NSImageCellType = 2
    } NSCellType;
    

    there is now this:

    enum {
        NSNullCellType = 0,
        NSTextCellType = 1,
        NSImageCellType = 2
    };
    typedef NSUInteger NSCellType;
    

    The enumeration type is defined in terms of NSInteger or NSUInteger to make the base enumeration type 64-bit capable on 64-bit architectures.

    My question is this: given that the typedef doesn't appear to be tied explicitly to the enum declaration, how does one know if their data types are unsigned int or NSUInteger?

  • jscs, over 12 years ago
    What guarantees that the created type (e.g., NSCellType) is not smaller than the size of the enum? Does it always have to be based on the widest available type, under this scheme?
  • Chuck, over 12 years ago
    @JoshCaswell: As long as the constants you declare for the enum are representable in the type that you use in the typedef, it shouldn't matter what size the enum uses. For example, short x = 1LL will correctly give you a short with the value 1 even on systems where a long long is four times as wide as a short.
  • jscs, over 12 years ago
    @Chuck: Right, that's clear. My thought was the other way around: that (a hypothetical) declaration like enum { JCWishy = (1 << 0), ..., JCWashy = (1 << 63) }; would require at least an 8-byte type to be used for typedef xxx JCShy, unless there were some kind of compiler enforcement (which I can't see a mechanism for). Otherwise, JCShy j = JCWashy; wouldn't work.
  • Chuck, over 12 years ago
    @JoshCaswell: Yes, that's what I said. You're declaring a constant that requires 8 bytes, so the type in your typedef must be at least 8 bytes wide. But it doesn't matter what size the enum type itself is.
  • jscs, over 12 years ago
    @Chuck: Oh, sorry, I thought you meant it the other way around. Thanks.
  • klefevre, about 10 years ago
    @javionegas Thanks for the test, but you have swapped the results between 32-bit and 64-bit.
  • Beltalowda, about 9 years ago
    Just make sure when you do this that you never accidentally place a negative value in the enumeration. It will cause Xcode to hang during compilation without any warnings or errors. This is easy to do if you start off with an NSUInteger for the enum type then later decide you want to give it a negative value and forget to remove the U from the type. Hopefully this saves someone some time.
  • Cœur, about 9 years ago
    The hang probably comes from the compiler trying to allocate all intermediate enum values, meaning roughly 4 billion values on 32 bits.