Binary vs linear search for N unsorted elements

I'm trying to understand a formula for when we should sort first (e.g. with quicksort) before searching. For instance, we have an array with N = 1_000_000 elements. If we will search only once, we should use a simple linear search, but if we'll search 10 times, we should sort the array, O(n log n), and then use binary search....
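The trade-off behind that formula can be written down directly: k linear searches cost about k·n comparisons, while sorting once and then binary-searching costs about n·log n + k·log n. A rough cost model, ignoring constant factors (function names are mine, not from the question):

```python
import math

def linear_cost(n, k):
    # k independent searches, each scanning up to n elements: O(k * n)
    return k * n

def sort_then_binary_cost(n, k):
    # One O(n log n) sort, then k O(log n) binary searches
    return n * math.log2(n) + k * math.log2(n)

n = 1_000_000
for k in (1, 10, 100):
    print(k, linear_cost(n, k), sort_then_binary_cost(n, k))
```

Setting the two costs equal gives a break-even of roughly k ≈ log2(n), i.e. around 20 searches for N = 1,000,000; real constant factors shift this in either direction, which is why measuring still matters.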

Sort array elements based on their frequency

I need to sort an array of elements based on their frequency, for example: … I tried with the code below:
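Since the asker's code and example are not shown in this excerpt, here is a sketch of one common approach: count occurrences with a hash map, then sort by descending count, with ties broken by the value itself (the tie-break rule is my assumption, not from the question):

```python
from collections import Counter

def sort_by_frequency(arr):
    # Count occurrences, then sort: most frequent first,
    # ties broken by the value itself (an assumed rule).
    counts = Counter(arr)
    return sorted(arr, key=lambda x: (-counts[x], x))

print(sort_by_frequency([2, 3, 2, 4, 5, 12, 2, 3]))
# → [2, 2, 2, 3, 3, 4, 5, 12]
```

The counting pass is O(n) and the sort O(n log n), so the whole thing is O(n log n).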

Is it possible to compare two binary trees in less than O(n log n) time?

I wrote a Java routine to compare two binary trees. I am looking for better algorithms that run in less time.
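In the worst case you cannot beat Θ(n) here: a correct comparison may have to look at every node of both trees, so a simultaneous traversal is already asymptotically optimal, and anything described as O(n log n) has room to improve only down to O(n). The asker's routine was Java; this is a sketch of the standard simultaneous traversal, shown in Python:

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def same_tree(a, b):
    # Walk both trees in lockstep; stops at the first mismatch,
    # so it does O(min(n, m)) work in the worst case.
    if a is None and b is None:
        return True
    if a is None or b is None:
        return False
    return (a.val == b.val
            and same_tree(a.left, b.left)
            and same_tree(a.right, b.right))
```

Early exit on the first mismatch is the only practical speedup; the asymptotic bound stands.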

All the ways to return 3 if you get 7 and vice versa – interview question

This is a question that I was asked in an interview: implement a function that gets an integer n and does the following:
1. If n is 3, return 7.
2. Else if n is 7, return 3.
3. Otherwise, return any number you like (undefined behavior).
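The classic answers exploit arithmetic identities on the pair (3, 7): they sum to 10, XOR to 4, and multiply to 21, so each identity yields a branch-free one-liner. A few sketches:

```python
def swap_subtract(n):
    # 3 + 7 == 10, so subtracting from the sum swaps them
    return 10 - n

def swap_xor(n):
    # 3 ^ 7 == 4, so XOR with 4 swaps them
    return n ^ 4

def swap_divide(n):
    # 3 * 7 == 21, so dividing the product swaps them
    return 21 // n
```

All three are correct for 3 and 7 and fall under "return any number you like" for other inputs (except `swap_divide`, which raises on n = 0).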

Why does this O(n^2) code execute faster than O(n)? [duplicate]

I have written code for two approaches to find out the first unique character in a string on LeetCode.

Why does O(n^2) code execute faster than O(n)?

I have written code for two approaches to find the first unique character in a string on LeetCode. Problem statement: given a string, find the first non-repeating character in it and return its index. If it doesn't exist, return -1.
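The usual resolution to this puzzle is that big-O hides constant factors: the "O(n²)" solution does all of its per-character scanning inside C-implemented string methods, while the "O(n)" solution runs an interpreter-level loop, so for LeetCode-sized inputs the quadratic one can finish first. A sketch of both shapes (the function names are mine; the asker's actual code isn't shown):

```python
from collections import Counter

def first_uniq_linear(s):
    # O(n): one counting pass, then one scan in Python-level code
    counts = Counter(s)
    for i, ch in enumerate(s):
        if counts[ch] == 1:
            return i
    return -1

def first_uniq_quadratic(s):
    # O(n^2) on paper: s.count rescans the whole string per position,
    # but that scan runs in optimized C, so it often wins in practice
    for i, ch in enumerate(s):
        if s.count(ch) == 1:
            return i
    return -1
```

Both return the same answers; only the constant factors differ, and those depend on where the work happens, not on the asymptotic class.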

Data structure with O(1) search time complexity in C++

Is there a data structure in C++ that has a search time complexity of O(1)? That is, one that can check whether an element is present and, if it is, return its position or associated index/key/value?
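The standard answer is a hash table: in C++, `std::unordered_set` and `std::unordered_map` give average-case O(1) membership and key-to-value lookup (worst case O(n) under pathological hash collisions). A minimal sketch of the idea, using Python's `dict` as a stand-in for `std::unordered_map`:

```python
def build_index(values):
    # Map each value to its first position; lookups afterwards
    # are average O(1), like std::unordered_map in C++.
    index = {}
    for i, v in enumerate(values):
        index.setdefault(v, i)
    return index

def position_of(index, v):
    # Returns the stored position, or -1 if the value is absent
    return index.get(v, -1)
```

The one-time build is O(n); every subsequent presence check or position lookup is expected O(1).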

O(N log N) algorithm runs faster than O(n)… wait, what?

I am a bit confused, to be honest. I was working on one of the classical algorithm problems: given a collection of integers, determine whether there are two elements summing to a given number.
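For this two-sum check, the O(n) expected-time approach keeps a hash set of values seen so far, while the O(n log n) approach sorts and walks two pointers inward; when the sorted version benchmarks faster, the usual culprits are constant factors and cache-friendly sequential access, not asymptotics. A sketch of both, assuming plain integers (names are mine):

```python
def has_pair_hash(nums, target):
    # Expected O(n): for each x, check whether target - x was seen
    seen = set()
    for x in nums:
        if target - x in seen:
            return True
        seen.add(x)
    return False

def has_pair_sorted(nums, target):
    # O(n log n): sort, then move two pointers toward each other
    nums = sorted(nums)
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return True
        if s < target:
            lo += 1
        else:
            hi -= 1
    return False
```

Profiling both on the actual input sizes is the only way to see which constant factors dominate.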