Entropy

kohmoto zbi74583 at boat.zero.ad.jp
Sat Apr 2 07:55:29 CEST 2005


    Hello, Seqfans.
    I have an idea for a scale which measures the "Intelligence" of an
algorithm.
    I think of it as a kind of "Entropy".

         Definition of E(a) :
         If a is an algorithm,

         E(a) = 1 - log{search area of a} / log{object of a}
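
         Here I take "search area" to mean the number of candidates the
         algorithm actually examines, and "object" to mean the size of the
         whole space they come from, as in the examples below. A small
         sketch in Python (the helper name entropy_scale is only for
         illustration):

            from math import log

            def entropy_scale(search_area, object_size):
                # E(a) = 1 - log(search area) / log(object)
                return 1.0 - log(search_area) / log(object_size)

            # Example 1 below: trial division up to N^(1/2) for N = 10^12
            # entropy_scale(10**6, 10**12)  ->  0.5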


         Example1: Prime test
         Divide N by k, k = 2 to N^(1/2)

         E(P) = 1 - log(N^(1/2)) / log(N) = 1 - 1/2 = 0.5
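
         A rough sketch of this test in Python (is_prime_trial is only an
         illustrative name); it examines about N^(1/2) divisors out of an
         object of size N:

            def is_prime_trial(n):
                # Divides n by k = 2 .. floor(sqrt(n)): about n^(1/2)
                # candidates are examined out of an object of size n,
                # so E(P) = 1 - 1/2 = 0.5.
                if n < 2:
                    return False
                k = 2
                while k * k <= n:
                    if n % k == 0:
                        return False
                    k += 1
                return True

            # is_prime_trial(101)  ->  True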


         Example2: Polynomial time prime test
         E(PP) = 1 - log(k*log N) / log N
                = almost 1 - log(k)/log N .... it depends on k
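
         I am not fixing one particular polynomial time test here. As one
         possible stand-in, a Miller-Rabin test with k random bases does
         about k*log(N) modular squarings, so its search area is roughly
         k*log(N) out of an object of size N. A sketch:

            from random import randrange

            def probably_prime(n, k=20):
                # Miller-Rabin with k random bases: about k*log(n) squarings,
                # so the "search area" is roughly k*log(n) for an object n.
                if n < 4:
                    return n in (2, 3)
                d, s = n - 1, 0
                while d % 2 == 0:
                    d, s = d // 2, s + 1
                for _ in range(k):
                    a = randrange(2, n - 1)
                    x = pow(a, d, n)
                    if x in (1, n - 1):
                        continue
                    for _ in range(s - 1):
                        x = pow(x, 2, n)
                        if x == n - 1:
                            break
                    else:
                        return False
                return True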


         Example3: x^3+y^3+z^3+u^3=0
         E(3rd) = 1 - log(N^2) / log(N^4) = 0.5
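
         One way to reach a search area of N^2 here (I assume a method of
         this kind is meant) is a meet-in-the-middle search: store every
         sum x^3 + y^3, then a later pair (z,u) with the same sum gives
         x^3 + y^3 + (-z)^3 + (-u)^3 = 0, while the object, all quadruples
         up to N, has size about N^4. A sketch:

            def cube_quadruples(N):
                # Store each sum x^3 + y^3 (x < y); a later pair (z, u)
                # with the same sum gives x^3 + y^3 + (-z)^3 + (-u)^3 = 0.
                # About N^2 pairs are examined; the object has about N^4
                # quadruples.
                sums = {}
                hits = []
                for x in range(1, N + 1):
                    for y in range(x + 1, N + 1):
                        s = x**3 + y**3
                        for (z, u) in sums.get(s, []):
                            hits.append((z, u, -x, -y))
                        sums.setdefault(s, []).append((x, y))
                return hits

            # cube_quadruples(12) finds (1, 12, -9, -10):
            # 1^3 + 12^3 + (-9)^3 + (-10)^3 = 0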


         Example4: ax^2+bx+c=0
         Calculate x = (-b+(b^2-4ac)^(1/2))/(2a) .... the search area is
         only one point

         E(2nd) = 1 - log(1) / log(N) = 1
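
         There is no searching at all, only a direct calculation. A sketch
         (quadratic_root is only an illustrative name; it assumes a != 0
         and a real root):

            from math import sqrt

            def quadratic_root(a, b, c):
                # Direct computation: the search area is a single point,
                # so E(2nd) = 1 - log(1)/log(N) = 1.
                return (-b + sqrt(b*b - 4*a*c)) / (2*a)

            # quadratic_root(1, -3, 2)  ->  2.0, a root of x^2 - 3x + 2 = 0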


         Example5: The record Unitary Amicable pair.
         It has 317 digits.
         I searched 10^8 candidates.

         E(UA) = 1 - 8/317 = about 0.975
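
         The same arithmetic from the digit counts quoted above (10^8
         candidates searched, a 317 digit object):

            # search area = 10^8 candidates, object = a 317-digit number
            # E(UA) = 1 - log(10^8) / log(10^317) = 1 - 8/317
            print(1 - 8.0 / 317)   # -> 0.9747...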

        Do mathematicians already know this scale?
        Or is it not so good?

    Yasutoshi
 