What is the downside of replacing size_t with unsigned long


Solution 1

What warnings? The most obvious one I can think of is for a "narrowing conversion", that is to say you're assigning size_t to unsigned int, and getting a warning that information might be lost.

The main downside of replacing size_t with unsigned long is that unsigned long is not guaranteed to be large enough to contain every possible value of size_t, and on 64-bit Windows it is not (size_t is 64 bits there, while unsigned long is only 32). So you might find that you still have warnings.
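As an illustration (the function and variable names are hypothetical, and whether a warning is actually emitted depends on the compiler and warning level), this is the kind of assignment the warning is about:

#include <cstddef>
#include <vector>

void sketch(const std::vector<int>& v)
{
    // v.size() returns size_t, which is 64 bits on a 64-bit target.
    unsigned int  a = v.size();   // possible loss of data on any 64-bit platform
    unsigned long b = v.size();   // still a possible loss of data on 64-bit Windows,
                                  // where unsigned long is only 32 bits
    (void)a; (void)b;             // silence unused-variable warnings in this sketch
}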

The proper fix is that if you assign a size_t to a variable (or data member), you should make sure that variable has a type large enough to contain any value of size_t. That's what the warning is all about. So you should not switch to unsigned long, you should switch those variables to size_t.

Conversely, if you have a variable that doesn't need to be big enough to hold any size, just big enough for unsigned int, then don't use size_t for it in the first place.
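A minimal sketch of both guidelines (the struct and member names are made up for illustration):

#include <cstddef>

// Stores a size or element count: give it a type that can hold
// any value a size_t might take.
struct Buffer {
    std::size_t length = 0;       // not unsigned int, not unsigned long
};

// Stores a small, bounded quantity: unsigned int is enough, and there is
// no need to widen it to size_t.
struct RetryPolicy {
    unsigned int max_retries = 3;
};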

Both types (size_t and unsigned int) have valid uses, so any approach that indiscriminately replaces all uses of them with some other type must be wrong :-) Actually, you could replace everything with size_t or uintmax_t, and for most programs that would be OK. The exceptions are where the code relies on using an unsigned type of exactly the same size as int (or some other specific type), such that a larger type breaks the code.

Solution 2

The standard makes few guarantees about the sizes of types like int and long. size_t is guaranteed to be large enough to hold the size of any object, and the standard containers report their sizes as size_t (strictly, as their size_type, which is size_t with the default allocators).

It's perfectly possible for a platform to define long as smaller than size_t, or have the size of long subject to compilation options, for example. To be safe, it's best to stick to size_t.
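As a hedged illustration, a static_assert like the one below would make the build fail outright on platforms such as 64-bit Windows, where unsigned long is narrower than size_t, rather than leaving the mismatch to surface as conversion warnings:

#include <cstddef>

static_assert(sizeof(unsigned long) >= sizeof(std::size_t),
              "unsigned long cannot represent every size_t value on this platform");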

Another criterion to consider is that size_t carries a meaning - "this thing is used to store a size or an index." It makes the code slightly more self-documenting.

Solution 3

If you use size_t in places where you genuinely receive a size_t, and you replace it with unsigned long, you will introduce new warnings.

example:

size_t count = some_vector.size();

Replace size_t with unsigned long, and (to the degree the two types are different) you will have introduced a new warning, because some_vector.size() returns a size_t - strictly a std::vector<something>::size_type, but in practice it evaluates to the same type.
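For comparison, the replaced line would look like this, and on a platform where size_t is wider than unsigned long (64-bit Windows, for instance) it would warn again:

unsigned long count = some_vector.size();  // possible loss of data where size_t is 64 bits and unsigned long is 32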


Comments

  • Yulia V
    Yulia V almost 2 years

    The library I am working on needs to be used on both 32-bit and 64-bit machines; I get lots of compiler warnings because on 64-bit machines unsigned int != size_t.

    Is there any downside in replacing all unsigned ints and size_ts by 'unsigned long'? I appreciate it does not look very elegant, but, in our case, memory is not too much of an issue... I am wondering if there is a possibility of any bugs/unwanted behaviour etc. created by such a replace-all operation (could you give examples)? Thanks.

  • Yulia V
    Yulia V over 11 years
    I might get more warnings, true, but can upsizing cause genuine trouble (e.g. a bug)?
  • Angew is no longer proud of SO
    Angew is no longer proud of SO over 11 years
    @YuccaV In any program which takes itself seriously, warnings are genuine trouble.
  • utnapistim
    utnapistim over 11 years
    That depends on your code. If you have - for another example - calculated differences of offsets and compared them for being less than zero, with unsigned types the negatives wrap around to values near the maximum that type can represent (0xFFFFFFFF or similar), so the check never fires; a short sketch follows below. It depends on your codebase, basically. Angew's point is valid though: your best bet is to set your compiler to have zero tolerance for warnings (for gcc that's the -Werror option).
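    A short sketch of the wraparound described above (the variable names are hypothetical):

    #include <cstddef>
    #include <iostream>

    int main()
    {
        std::size_t begin = 10, end = 5;
        std::size_t diff = end - begin;   // mathematically -5, but unsigned arithmetic
                                          // wraps it to a value near SIZE_MAX
        std::cout << diff << '\n';        // prints 18446744073709551611 on a 64-bit platform
        std::cout << (diff < 0) << '\n';  // prints 0: an unsigned value is never negative,
                                          // so a "less than zero" check silently never fires
        return 0;
    }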