What is the best C++ data structure that could be used for storing and managing a collection of integers? [on hold]


This is my first Stack Overflow question, so please let me know if I didn't follow community guidelines and whether I should delete it.

I got my first ever interview question and I got rejected because of my implementation.

The question is:

Design and implement a C++ class that stores a collection of integers. On construction, the collection should be empty. The same number may be stored more than once.

Implement the following methods:

  1. Insert(int x). Insert an entry for the value “x”.

  2. Erase(int x). Remove one entry with the value “x” (if one exists) from the collection.

  3. Erase(int from, int to). Remove all the entries with a value in the range [from, to).

  4. Count(int from, int to). Count how many entries have a value in the range [from, to).

I thought a good implementation would be to use a linked list, since it uses non-contiguous memory and removing entries would not require shuffling a lot of data (unlike vectors or arrays). However, I got feedback from the company saying my implementation was O(n^2) time complexity and very inefficient, so I was rejected. I don't want to repeat the same mistake if a similar question pops up in another interview, so I'd like to know the best way to approach this question (a friend suggested using maps, but he is also unsure).

My code is:

```cpp
void IntegerCollector::insert(int x)
{
    entries.push_back(x);
}

void IntegerCollector::erase(int x)
{
    list<int>::iterator position = find(entries.begin(), entries.end(), x);
    if (position != entries.end())
        entries.erase(position);
}

void IntegerCollector::erase(int from, int to)
{
    list<int>::iterator position = entries.begin();

    while (position != entries.end())
    {
        if (*position >= from && *position <= to)
            position = entries.erase(position);
        else
            position++;
    }
}

int IntegerCollector::count(int from, int to)
{
    list<int>::iterator position = entries.begin();
    int count = 0;

    while (position != entries.end())
    {
        if (*position >= from && *position <= to)
            count++;

        position++;
    }

    return count;
}
```

The feedback mentioned that they would only hire candidates who can implement solutions with O(n log n) complexity.

 


The key consideration here is that integers of the same value are indistinguishable. Thus, all you need to do is store a count of each distinct value in the container.

Then, you can just use a std::map<int, size_t> as backing structure that maps each integer (key) to the number of times it exists in your data structure (value = count).

Inserting or erasing a single element amounts to incrementing or decrementing the count stored for that key (and, when erasing, removing the key entirely once its count reaches zero). Both are O(log d), where d is the number of distinct values in the container, since finding the key dominates.
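As a minimal sketch, those two operations could look like this (I reuse the `IntegerCollector` name from the question; the `multiplicity` accessor is my own addition, just for inspecting the state):

```cpp
#include <cstddef>
#include <map>

// Sketch: each distinct value maps to its multiplicity.
class IntegerCollector {
public:
    void insert(int x) { ++counts_[x]; }  // O(log d): find-or-create the key

    void erase(int x) {
        auto it = counts_.find(x);        // O(log d)
        if (it != counts_.end() && --it->second == 0)
            counts_.erase(it);            // drop the key once its count hits zero
    }

    // Convenience accessor (not part of the interview spec): how many
    // entries currently equal x.
    std::size_t multiplicity(int x) const {
        auto it = counts_.find(x);
        return it == counts_.end() ? 0 : it->second;
    }

private:
    std::map<int, std::size_t> counts_;
};
```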

Since std::map is ordered, you can use lower_bound and upper_bound to do binary search, so finding the keys in [from, to) is very efficient (locating the range is also O(log(distinct_values_in_container))). Erasing those keys or summing their counts then walks the range, so each range operation costs O(log d + k), where k is the number of distinct keys in [from, to).
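The two range operations might be sketched as free functions over the counts map (the `Counts` alias and function names are my own, for illustration; note that a half-open range [from, to) needs lower_bound at both endpoints):

```cpp
#include <cstddef>
#include <map>

using Counts = std::map<int, std::size_t>;

// Remove every entry with a value in [from, to): two binary searches
// plus one ranged erase. O(log d + k) for k erased keys.
void eraseRange(Counts& counts, int from, int to) {
    counts.erase(counts.lower_bound(from), counts.lower_bound(to));
}

// Sum the multiplicities of all keys in [from, to). O(log d + k).
std::size_t countRange(const Counts& counts, int from, int to) {
    std::size_t total = 0;
    auto last = counts.lower_bound(to);  // first key >= to, i.e. one past the range
    for (auto it = counts.lower_bound(from); it != last; ++it)
        total += it->second;
    return total;
}
```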


If you want to gain extra credit, it will pay to understand the limitations of asymptotic runtimes. Consider these points:

What these asymptotic runtimes mean in practice depends a lot on the usage pattern. If no duplicates are ever inserted, the number of distinct values is O(n), but you can also get arbitrarily good times (in terms of n = the number of insertions) when there are many identical elements: for example, if each key holds exponentially many entries, the number of distinct values is only O(log n). In the extreme case where all inserted integers are equal, every operation is O(1).

As an interviewee, I would also talk about whether these asymptotic runtimes are meaningful in practice. It may be that the map's tree structure (which is toxic for the cache and branch predictor) loses to a simple std::vector<std::pair<int, size_t>> (if erasure is always in bulk) or even a std::vector<size_t> (if the keys are "dense") for the given application.
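To illustrate the sorted-vector alternative mentioned above, counting a range over a std::vector<std::pair<int, size_t>> kept sorted by key could be sketched like this (names and the comparator are my own assumptions; the point is that std::lower_bound gives the same O(log d) lookups over contiguous, cache-friendly storage):

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

using Entry = std::pair<int, std::size_t>;  // (value, multiplicity)

// Count entries with a value in [from, to) over a vector sorted by key.
std::size_t countRange(const std::vector<Entry>& v, int from, int to) {
    auto cmp = [](const Entry& e, int key) { return e.first < key; };
    auto first = std::lower_bound(v.begin(), v.end(), from, cmp);
    auto last  = std::lower_bound(v.begin(), v.end(), to, cmp);
    std::size_t total = 0;
    for (auto it = first; it != last; ++it)
        total += it->second;
    return total;
}
```

The trade-off: lookups and counts are fast and cache-friendly, but inserting a new distinct key into the middle of the vector is O(d), so this wins mainly when the data is built once (or in bulk) and queried often.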


I think your main mistake (and why you were rejected) is not realizing that there is no need to store each inserted integer separately. You unfortunately also seem to have missed the possibility of keeping the list sorted, but I don't see where the O(n^2) comes from.
