Why is .NET's HashHelpers.IsPrime implemented in a non-optimal way?


Looking at the .NET System.Collections.Generic.Dictionary<TKey,TValue> implementation, I found a method, HashHelpers.IsPrime(n), which checks whether a number is prime or not. I was a little confused about why they use only a very simple optimization technique: testing odd numbers starting from 3.

(from the source code)

int limit = (int)Math.Sqrt(candidate);
for (int divisor = 3; divisor <= limit; divisor += 2)
{
    if ((candidate % divisor) == 0)
        return false;
}
return true;
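For context, this fragment sits inside HashHelpers.IsPrime, which first filters out even candidates before falling back to odd-only trial division. A minimal sketch of that overall shape (paraphrased from memory rather than quoted verbatim from the .NET source):

using System;

static bool IsPrime(int candidate)
{
    // Odd candidate: trial division by odd divisors up to sqrt(candidate).
    if ((candidate & 1) != 0)
    {
        int limit = (int)Math.Sqrt(candidate);
        for (int divisor = 3; divisor <= limit; divisor += 2)
        {
            if ((candidate % divisor) == 0)
                return false;
        }
        // Note: as written, this would report 1 as prime; in context the
        // dictionary only ever asks about candidates above a small minimum.
        return true;
    }
    // Even candidate: prime only if it is exactly 2.
    return candidate == 2;
}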

So they cut the number of checks in half, testing only odd divisors from 3 up to the limit. But it is more optimal to test only numbers of the form 6k-1 and 6k+1, which according to Wikipedia reduces the number of tests by a factor of three (see the sketch below). And I think there are even more optimal and faster solutions for primality testing.
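For illustration, here is a minimal sketch of that 6k-1/6k+1 variant (my own paraphrase of the Wikipedia approach, not code from the .NET source; the name IsPrime6k is made up):

using System;

static bool IsPrime6k(int candidate)
{
    if (candidate <= 3)
        return candidate >= 2;  // 2 and 3 are prime; anything smaller is not
    if (candidate % 2 == 0 || candidate % 3 == 0)
        return false;           // strip out multiples of 2 and 3 up front
    int limit = (int)Math.Sqrt(candidate);
    // Every prime greater than 3 has the form 6k-1 or 6k+1, so it is enough
    // to try divisors of those two forms: 5, 7, 11, 13, 17, 19, ...
    for (int divisor = 5; divisor <= limit; divisor += 6)
    {
        if (candidate % divisor == 0 || candidate % (divisor + 2) == 0)
            return false;
    }
    return true;
}

Each iteration performs two divisibility tests but advances the divisor by 6, so it touches roughly a third of the numbers the odd-only loop does, which is where the factor-of-three saving comes from.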

I understand that in this particular Dictionary<TKey,TValue> implementation it's not that important, because the method is called only for huge dictionaries, and even then only in the fairly rare case of resizing. But in general this is a framework, and a very popular one from a well-known company. Maybe there is some logic behind it, or am I missing something? Thanks.

 


I think there are even more optimal and faster solutions for primality testing.

Yep.

Maybe some logic exists or I don't see something here?

Nope. You've accurately summed up the situation.

You ask:

Why is .NET's HashHelpers.IsPrime implemented in a non-optimal way?

and then you answer your own question:

I understand that in this particular Dictionary<TKey,TValue> implementation it's not that important, because the method is called only for huge dictionaries, and even then only in the fairly rare case of resizing.

So you know the answer to your question. It's not optimized because fast enough is by definition fast enough, and the given algorithm is fast enough.

If you want to make it faster, hey, it's open source. Go implement an algorithm you like better and submit the detailed results of your carefully designed, accurate and precise empirical performance tests that clearly demonstrate that making an unnecessary change to a widely used foundational piece of functionality is justified by your superior performance algorithm in an unimportant and rare scenario.

If that sounds like a lot of work for almost no benefit then again, you know the answer to your question. Expensive work that has small benefits gets sorted to the bottom of the priority list.
