std::vector resize downward


Solution 1

Calling resize() with a smaller size has no effect on the capacity of a vector. It will not free memory.

The standard idiom for freeing memory from a vector is to swap() it with an empty temporary vector: std::vector<T>().swap(vec). If you want to shrink to the current size rather than empty the vector entirely, copy the elements into a new local temporary vector and then swap that temporary with the original.
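
A minimal sketch of both variants (the types and sizes here are just for illustration):

#include <vector>

int main()
{
    std::vector<int> v(1000, 42);

    // Free all memory: swap with a default-constructed temporary.
    std::vector<int>().swap(v);    // v is now empty; capacity typically 0

    std::vector<int> big(1000, 42);
    big.resize(10);                // size 10, capacity still ~1000

    // "Shrink to fit" before C++11: copy the wanted elements into a
    // temporary, then swap the temporary with the original.
    std::vector<int>(big.begin(), big.end()).swap(big);
    // big.capacity() is now typically 10
}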

Update: C++11 added a member function shrink_to_fit() for this purpose; it is a non-binding request to reduce capacity() to size().
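
For example (the printed capacities are what typical implementations produce; the standard allows the request to be ignored):

#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v(1000, 42);
    v.resize(10);                       // size 10, capacity unchanged
    std::cout << v.capacity() << '\n';  // typically 1000

    v.shrink_to_fit();                  // non-binding request to release unused memory
    std::cout << v.capacity() << '\n';  // typically 10, but not guaranteed
}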

Solution 2

Actually, the standard does specify what should happen:

This is from vector, but the theme is the same for all the containers (list, deque, etc.).

23.2.4.2 vector capacity [lib.vector.capacity]

void resize(size_type sz, T c = T());

6) Effects:

if (sz > size())
    insert(end(), sz-size(), c);
else if (sz < size())
    erase(begin()+sz, end());
else
    ; //do nothing

That is to say: if the size specified to resize is less than the current number of elements, the excess elements at the end will be erased from the container. Regarding capacity(), what happens depends on what erase() does to it.
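
A small illustration of that wording; the destructor counter just makes the erasure visible, and the capacity shown at the end is what common implementations report rather than a guarantee:

#include <iostream>
#include <vector>

struct Noisy
{
    ~Noisy() { ++destroyed; }
    static int destroyed;
};
int Noisy::destroyed = 0;

int main()
{
    std::vector<Noisy> v(100);
    v.resize(10);   // erases the last 90 elements

    std::cout << "destroyed: " << Noisy::destroyed   // 90
              << ", size: " << v.size()              // 10
              << ", capacity: " << v.capacity()      // still 100 on typical implementations
              << '\n';
}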

I cannot locate it in the standard, but I'm pretty sure clear() is defined to be:

void clear()
{
    erase(begin(), end());
}

Therefore, the effect clear() has on capacity() is also tied to the effect erase() has on it. According to the standard:

23.2.4.3 vector modifiers [lib.vector.modifiers]

iterator erase(iterator position);
iterator erase(iterator first, iterator last);

4) Complexity: The destructor of T is called the number of times equal to the number of the elements erased....

This means that the elements will be destroyed, but the memory will remain intact. erase() has no effect on capacity(); therefore resize() and clear() have no effect on it either.
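
In practice that looks like this (the capacities printed are typical values, not mandated ones):

#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v(1000, 42);
    std::cout << v.capacity() << '\n';   // e.g. 1000

    v.erase(v.begin() + 10, v.end());    // destroys elements, keeps the buffer
    std::cout << v.capacity() << '\n';   // still 1000

    v.clear();                           // same story: size 0, buffer kept
    std::cout << v.capacity() << '\n';   // still 1000
}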

Solution 3

The capacity will never decrease. I'm not sure whether the standard states this explicitly, but it is implied: iterators and references to a vector's elements must not be invalidated by resize(n) if n < capacity().
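
A sketch of why that rules out shrinking: a pointer taken before a downward resize() must still be usable afterwards, which it could not be if the buffer had been reallocated.

#include <cassert>
#include <vector>

int main()
{
    std::vector<int> v(100, 7);
    int* p = v.data();       // points into the current buffer

    v.resize(10);            // n < capacity(): must not invalidate p
    assert(p == v.data());   // same storage

    v.resize(50, 0);         // still <= capacity(): no reallocation allowed
    assert(p == v.data());
}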


Author: Deduplicator. Updated on July 05, 2020.

Comments

  • Deduplicator
    Deduplicator almost 4 years

    The C++ standard seems to make no statement regarding side-effects on capacity by either resize(n), with n < size(), or clear().

    It does make a statement about the amortized cost of push_back and pop_back: O(1).

    I can envision an implementation that does the usual sort of capacity changes à la CLRS (Cormen, Leiserson, Rivest, Stein), e.g. doubling when enlarging and halving when the size decreases to below capacity()/4.

    Does anyone have a reference for any implementation restrictions?

  • mattnewport
    mattnewport almost 15 years
    The way I read it, he's asking about the impact on memory usage; he specifically asks what effect resize has on capacity. The standard doesn't specify the result in that case, but the only reason to ask that I can think of is a desire to free unused memory. The swap-with-temporary trick is the idiomatic way of achieving that.
  • MSalters
    MSalters almost 15 years
    The standard does specify the result by not specifying a decrease of capacity() for these operations. Therefore it can't decrease.
  • davidA
    davidA almost 12 years
    There are some environments where it is forbidden to allocate or free memory after an initial 'construction' phase. Vectors are usable in this environment as long as one can be sure they don't attempt to allocate or free memory during operations. So this question is relevant in this situation (which brought me here).
  • Ben Voigt
    Ben Voigt over 10 years
    resize downward is now documented as equivalent to a series of pop_back() calls and not erase. Does that remove the guarantee that capacity won't change? (See stackoverflow.com/q/19941601/103167)
  • Max Power
    Max Power about 2 years
    With g++ and libstdc++ 10, std::vector::shrink_to_fit does a fresh allocation: myvector.data() yields a different address every time I call shrink_to_fit().
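
A quick way to check that observation yourself (whether the buffer moves is implementation-specific, not something the standard pins down):

#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v(1000, 42);
    v.resize(10);

    const int* before = v.data();
    v.shrink_to_fit();                 // may reallocate into a smaller buffer
    const int* after = v.data();

    std::cout << (before == after ? "same buffer\n" : "reallocated\n");
}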