std::vector resize downward
Solution 1
Calling resize() with a smaller size has no effect on the capacity of a vector; it will not free memory.

The standard idiom for freeing memory from a vector is to swap() it with an empty temporary vector: std::vector<T>().swap(vec);. If you want to resize downwards, you need to copy from your original vector into a new local temporary vector and then swap the resulting vector with your original.

Updated: C++11 added a member function shrink_to_fit() for this purpose. It is a non-binding request to reduce capacity() to size().
Solution 2
Actually, the standard does specify what should happen.

This is from vector, but the theme is the same for all the containers (list, deque, etc.):

23.2.4.2 vector capacity [lib.vector.capacity]

    void resize(size_type sz, T c = T());

6) Effects:

    if (sz > size())
        insert(end(), sz-size(), c);
    else if (sz < size())
        erase(begin()+sz, end());
    else
        ; // do nothing
That is to say: if the size specified to resize is less than the number of elements, those trailing elements will be erased from the container. What happens to capacity() then depends on what erase() does to it.

I cannot locate it in the standard, but I'm pretty sure clear() is defined to be:
    void clear()
    {
        erase(begin(), end());
    }
Therefore, the effects clear() has on capacity() are also tied to the effects erase() has on it. According to the standard:

23.2.4.3 vector modifiers [lib.vector.modifiers]

    iterator erase(iterator position);
    iterator erase(iterator first, iterator last);

4) Complexity: The destructor of T is called the number of times equal to the number of the elements erased....

This means that the elements will be destructed, but the memory will remain intact: erase() has no effect on capacity, and therefore resize() and clear() have no effect on it either.
Solution 3
The capacity will never decrease. I'm not sure if the standard states this explicitly, but it is implied: iterators and references to the vector's elements must not be invalidated by resize(n) if n < capacity().
Deduplicator
Updated on July 05, 2020

Comments
-
Deduplicator almost 4 years: The C++ standard seems to make no statement regarding side-effects on capacity by either resize(n) with n < size(), or clear(). It does make a statement about the amortized cost of push_back and pop_back: O(1). I can envision an implementation that does the usual sort of capacity changes a la CLRS Algorithms (e.g., double when enlarging, halve when decreasing size to < capacity()/4). (Cormen, Leiserson, Rivest, Stein.) Does anyone have a reference for any implementation restrictions?
-
mattnewport almost 15 years: The way I read it, he's asking about the impact on memory usage; he specifically asks what the effect of resize is on capacity. The standard doesn't specify the result in that case, but the only reason to ask that I can think of is a desire to free unused memory. The swap-with-temporary trick is the idiomatic way of achieving that.
-
MSalters almost 15 years: The standard does specify the result, by not specifying a decrease of capacity() for these operations. Therefore it can't decrease.
-
davidA almost 12 years: There are some environments where it is forbidden to allocate or free memory after an initial 'construction' phase. Vectors are usable in such environments as long as one can be sure they don't attempt to allocate or free memory during operations. So this question is relevant in that situation (which brought me here).
-
Ben Voigt over 10 years: resize downward is now documented as equivalent to a series of pop_back() calls, not erase. Does that remove the guarantee that capacity won't change? (See stackoverflow.com/q/19941601/103167)
-
Max Power about 2 years: With g++ and libstdc++ 10, std::vector::shrink_to_fit does a fresh allocation: myvector.data() yields a different address every time I call shrink_to_fit().