Convert NSInteger to NSUInteger?


Solution 1

NSInteger and NSUInteger are just typedefs for primitive integer types:

#if __LP64__ || NS_BUILD_32_LIKE_64
  typedef long NSInteger;
  typedef unsigned long NSUInteger;
#else
  typedef int NSInteger;
  typedef unsigned int NSUInteger;
#endif

As such, you don't need to "convert" between them; a simple cast is sufficient:

NSInteger myInt = 0;
NSUInteger unsignedInt = (NSUInteger)myInt;

Solution 2

Since this might be useful for others coming across this issue, here is a small table showing the actual effect of each cast. These values were taken straight from the debugger as hex. Choose accordingly; as you can see, casting does have effects. To read the table for a 32-bit build, drop the low eight hex digits (ffffffff) from each value. Also note that -1 is always 0xffffffffffffffff.

NSInteger si = NSIntegerMax;    // si   NSInteger   0x7fffffffffffffff
si = -1;                        // si   NSInteger   0xffffffffffffffff
si = (NSInteger)-1;             // si   NSInteger   0xffffffffffffffff
si = (NSUInteger)-1;            // si   NSInteger   0xffffffffffffffff
si = NSUIntegerMax;             // si   NSInteger   0xffffffffffffffff
si = (NSInteger)NSIntegerMax;   // si   NSInteger   0x7fffffffffffffff
si = (NSUInteger)NSIntegerMax;  // si   NSInteger   0x7fffffffffffffff
si = (NSInteger)NSUIntegerMax;  // si   NSInteger   0xffffffffffffffff
si = (NSUInteger)NSUIntegerMax; // si   NSInteger   0xffffffffffffffff

NSUInteger ui = NSUIntegerMax;  // ui   NSUInteger  0xffffffffffffffff
ui = -1;                        // ui   NSUInteger  0xffffffffffffffff
ui = (NSInteger)-1;             // ui   NSUInteger  0xffffffffffffffff
ui = (NSUInteger)-1;            // ui   NSUInteger  0xffffffffffffffff
ui = NSIntegerMax;              // ui   NSUInteger  0x7fffffffffffffff
ui = (NSInteger)NSIntegerMax;   // ui   NSUInteger  0x7fffffffffffffff
ui = (NSUInteger)NSIntegerMax;  // ui   NSUInteger  0x7fffffffffffffff
ui = (NSInteger)NSUIntegerMax;  // ui   NSUInteger  0xffffffffffffffff
ui = (NSUInteger)NSUIntegerMax; // ui   NSUInteger  0xffffffffffffffff

Solution 3

If you are certain that your NSInteger is greater than or equal to zero, you can simply cast it to NSUInteger. You should not cast when the NSInteger may hold a negative value: the bit pattern is reinterpreted, so the unsigned result no longer represents the same quantity. For example:

NSInteger signedInteger = -30;
NSUInteger unsignedInteger = (NSUInteger)signedInteger;

results in the following (on a 32-bit build, where NSUInteger is 32 bits):

if (unsignedInteger == signedInteger) {
    // is TRUE
}
if (signedInteger == 4294967266) {
    // is FALSE
}
if (signedInteger == -30) {
    // is TRUE
}
if (unsignedInteger == 4294967266) {
    // is TRUE
}
if (unsignedInteger == -30) {
    // is TRUE
}

Author: SimplyKiwi (Mobile Tech Lead at SimplyKiwi; Senior iOS Engineer at Rose Digital)

Updated on July 10, 2020

Comments

  • SimplyKiwi
    SimplyKiwi almost 4 years

    I am trying to convert a NSInteger to a NSUInteger and I googled it and found no real answer. How would I do this?

  • Constantino Tsarouhas
    Constantino Tsarouhas over 12 years
    Isn't casting a signed int to an unsigned int dangerous? I read that it may cause corruption, as the sign bit may be interpreted as a value bit in unsigned ints. Or does C do all the work to prevent that from happening?
  • Dmitrii Cooler
    Dmitrii Cooler about 10 years
    In NSObjCRuntime.h: #define NSUIntegerMax ULONG_MAX. Logging it on a 32-bit build with NSLog(@"ULONG_MAX: %lu", ULONG_MAX); prints ULONG_MAX: 4294967295, and 4294967295 - 30 + 1 = 4294967266 (the +1 because -1, not 0, maps to ULONG_MAX).