Will a std::vector's capacity ever be reduced?


The C++14 final working draft makes the following comment about std::vector:

Storage management is handled automatically, though hints can be given to improve efficiency.

cppreference says:

The storage of the vector is handled automatically, being expanded and contracted as needed.

And the Wikipedia entry for Dynamic array says:

C++'s std::vector and Rust's std::vec::Vec are implementations of dynamic arrays

So I think that a vector should automatically reduce its capacity when its capacity is much larger than its size. I wrote the following code to check my assumption:

#include <iostream>
#include <vector>

using namespace std;

int main() {
  vector<int> v = {};

  cout << "initialization" << endl;
  cout << "  capacity: " << v.capacity() << endl;
  cout << "  size: " << v.size() << endl;

  for (int i = 1; i <= 10000; i++)
    v.push_back(i);

  cout << "after inserting a lot of elements" << endl;
  cout << "  capacity: " << v.capacity() << endl;
  cout << "  size: " << v.size() << endl;

  v.erase(v.begin() + 1, v.begin() + 10000);
  cout << "after erasing a lot of elements" << endl;
  cout << "  capacity: " << v.capacity() << endl;
  cout << "  size: " << v.size() << endl;

  v.push_back(9);

  cout << "after inserting another element" << endl;
  cout << "  capacity: " << v.capacity() << endl;
  cout << "  size: " << v.size() << endl;
}

I compiled the code with g++ -std=c++14 code.cc, and running the resulting a.out on macOS Mojave produces the following output.

initialization
  capacity: 0
  size: 0
after inserting a lot of elements
  capacity: 16384
  size: 10000
after erasing a lot of elements
  capacity: 16384
  size: 1
after inserting another element
  capacity: 16384
  size: 2

So it seems that a std::vector does not reduce its capacity even if its capacity is much larger than its size.

Will a std::vector ever reduce its capacity?
Are there conditions that trigger a reduction of its capacity?

 


So I think that a vector should reduce its capacity when its capacity is much larger than its size.

Firstly, the standard would have to specify what "capacity is much larger than its size" means, and that would constrain the reallocation strategies implementations are currently free to choose.
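
A minimal sketch to make that freedom visible (assuming nothing beyond standard C++; the exact sequence of capacities is implementation-defined, so the output differs between standard libraries):

#include <iostream>
#include <vector>

int main() {
  std::vector<int> v;
  auto last = v.capacity();
  for (int i = 0; i < 10000; ++i) {
    v.push_back(i);
    // Report each reallocation; the growth factor (commonly 1.5x or 2x)
    // is a choice the standard leaves to the implementation.
    if (v.capacity() != last) {
      last = v.capacity();
      std::cout << "size " << v.size() << " -> capacity " << last << '\n';
    }
  }
}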

Secondly, reducing the capacity would require re-allocating the buffer and moving all the remaining elements. That would mean all iterators could be invalidated by an erase, which would limit safe usage.

Currently, erase states:

Invalidates iterators and references at or after the point of the erase, including the end() iterator.
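
For instance, the following is guaranteed to work under the current wording, but would be undefined behaviour if erase were allowed to shrink the buffer (a minimal sketch, with no assumptions beyond standard C++):

#include <cassert>
#include <vector>

int main() {
  std::vector<int> v = {1, 2, 3, 4, 5};
  auto first = v.begin();           // before the point of the erase
  v.erase(v.begin() + 2, v.end());  // erase never reallocates
  assert(*first == 1);              // still valid under the current rules
}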

Thirdly, a vector is just as likely to reach its high-water mark of capacity again as it is to remain small for a long time.

Automatic shrinking would make usage worse in a number of valid scenarios, for the dubious benefit of releasing a large allocation sooner. Modern virtual memory systems cope fine with allocations that stick around longer than strictly necessary.

Are there conditions that trigger a reduction of its capacity?

Yes: shrink_to_fit is an explicit request to do exactly that. It is a non-binding request, so the implementation is free to ignore it, but if you want the vector to reallocate to a smaller buffer you can ask for it. Other usages, which would be harmed by automatic shrinking, are unaffected.
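
A minimal sketch of that request (the capacity after shrink_to_fit is implementation-specific, precisely because the request is non-binding):

#include <iostream>
#include <vector>

int main() {
  std::vector<int> v(10000, 42);
  v.erase(v.begin() + 1, v.end());
  std::cout << "before: capacity " << v.capacity() << ", size " << v.size() << '\n';

  // Non-binding request to reduce capacity() to size(); mainstream
  // implementations reallocate here, but they are not required to.
  v.shrink_to_fit();
  std::cout << "after:  capacity " << v.capacity() << ", size " << v.size() << '\n';
}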
