Okay, so what is happening here has puzzled me for 2 days. Basically, as an exercise I was measuring how much time some algorithms take to execute in their worst, average, and best cases. Until now I got the expected results for all the ones I tested, except for binary search: linear search time was increasing linearly, bubble sort was increasing following N^2, etc. The average values were also as expected, so for linear search the average was almost half of the worst case, and so on.
And then came binary search. As expected, both the average and worst cases were following log(N), but the worst-case curve was below the average-case curve (?????), and I can't figure out why. Here is the code used to collect the data:
for (int i = 0; i < sample_size; i++) {
    int search_target;
    switch (mode)
    {
    case BEST_CASE:
        search_target = *(int*)vector_get(vec, (vector_size(vec)-1)/2); // Middle index (intended best case)
        break;
    case AVERAGE_CASE:
        search_target = *(int*)vector_get(vec, rand() % element_amt); // Random index
        break;
    case WORST_CASE:
        search_target = *(int*)vector_get(vec, vector_size(vec)-1); // Biggest value (last index)
        break;
    }

    double start_time = _get_timestamp();
    int itt = vector_binary_search(vec, &search_target, _diff_number); // Returns the iteration count
    double time_taken = _get_timestamp() - start_time;

    total_time += time_taken;
    total_itt += itt;
}
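In case it helps, this is roughly what the search itself does. It's a simplified, int-only sketch; the real vector_binary_search in the repo works on the vector type with a comparator (_diff_number), so the details differ, but the iteration counting is the relevant part:

    // Simplified, int-only sketch of a binary search that reports its iteration count.
    // (The real vector_binary_search in the repo is generic and takes a comparator.)
    static int binary_search_iters(const int *arr, int n, int target)
    {
        int lo = 0, hi = n - 1;
        int iterations = 0;
        while (lo <= hi) {
            iterations++;
            int mid = lo + (hi - lo) / 2;
            if (arr[mid] == target)
                return iterations;        // found: report how many probes it took
            else if (arr[mid] < target)
                lo = mid + 1;             // target is in the right half
            else
                hi = mid - 1;             // target is in the left half
        }
        return iterations;                // not found (shouldn't happen here)
    }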
So basically, for different vector sizes I would run this function with a sample size (to remove a bit of the noise) and then plot the graph (this part is done correctly; I did not plot the average as the worst case or anything like that, I checked multiple times). Also, vec is a vector with the values from 0 up to size-1.
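The outer measurement loop is essentially the following. This is a simplified stand-in, not the repo code: plain int arrays instead of the repo's vector API, binary_search_iters from the sketch above instead of vector_binary_search, and only iteration counts (no timing):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    // Simplified stand-in for the real driver: sweeps vector sizes, averages the
    // iteration counts over sample_size searches per case, and prints one row per size.
    int main(void)
    {
        const int sample_size = 1000;
        srand((unsigned)time(NULL));

        for (int element_amt = 1000; element_amt <= 100000; element_amt += 1000) {
            int *vec = malloc(element_amt * sizeof(int));
            for (int v = 0; v < element_amt; v++)
                vec[v] = v;                   // values 0 .. element_amt-1, sorted

            long avg_itt = 0, worst_itt = 0;
            for (int i = 0; i < sample_size; i++) {
                avg_itt   += binary_search_iters(vec, element_amt, rand() % element_amt);
                worst_itt += binary_search_iters(vec, element_amt, element_amt - 1);
            }

            printf("%d %.3f %.3f\n", element_amt,
                   (double)avg_itt / sample_size,    // average case: random target
                   (double)worst_itt / sample_size); // "worst" case: largest value
            free(vec);
        }
        return 0;
    }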
This is what I got, multiple times, trying them in a lot of different ways:
[Plot: x = vector size, y = time taken; green = average case, purple = worst case]
[Plot: same comparison, but y = iterations taken]
I have no idea why this is happening.
So, here is the full code, if anyone would like to give it a try:
https://github.com/rsadr0pyz/weird