Re: ALFREDO RIZZI - 21st Century Breakthrough - AKS algorithm (primality test) is O(log n)



On Jun 6, 12:38 pm, pamela fluente <pamelaflue...@xxxxxxxxx> wrote:
On Jun 6, 17:16, Pubkeybreaker <pubkeybrea...@xxxxxxx> wrote:



I hope that you are being facetious...

Just sad irony. Unfortunately there isn't much to laugh about. These are
people who don't care about destroying other people's lives and destroying
real talent or dedication, in an attempt to reduce the world to their level.

Anyway, talking of real science, your statement caught my attention (if I
understood correctly) that just to set up the number for testing, we
already "burn" log n.

You have to read (or otherwise insert) the number into memory!
It has O(log n) bits to read in!!!!
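
To make that concrete, here is a minimal Python sketch (my own
illustration, not taken from any paper under discussion) showing that
just writing n down already takes about log2(n) bits:

    import math

    def input_cost_in_bits(n):
        # bits needed just to represent n in memory --
        # this is the Theta(log n) "setup" cost mentioned above
        return n.bit_length()        # == floor(log2(n)) + 1 for n >= 1

    n = 10**100 + 267                # an arbitrary 101-digit example
    print(input_cost_in_bits(n))         # 333
    print(math.floor(math.log2(n)) + 1)  # 333, same value via log2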


Computational complexity really fascinates me, but I still have some
(actually many) doubts.

About what? Perhaps I can clarify things.


Since you are here :-), one or two questions. Assume I had to write a
paper where I present a new algorithm (e.g., factoring or primality testing).

Do I have to include the time to read in the numbers in the running-time
analysis, or is it kept out of the time computations? What is the proper
way (or rule)?

For a new algorithm?? One definitely needs an analysis of the run time.

Input time does not matter. If an algorithm is O(log^2 n) and the input
time is O(log n), the latter is majorized by the former:
O(log^2 n + log n) = O(log^2 n)
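
(To spell the majorization out: once n is large enough that log n >= 1,
we have log n <= log^2 n, hence log^2 n + log n <= 2 log^2 n, and the
constant 2 is absorbed into the O(). That is exactly why the read-in time
never shows up in the final estimate.)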



Also, what is more "correct": working with n as the actual number (for
instance, the number being factored or tested), or with the number of digits?

Algorithm run time is typically given in terms of bit complexity, but it
really doesn't matter. One can express the result in terms of log(n), or
in terms of the number of digits, or in terms of the number of bits.
They are all the same. E.g., let ln denote the natural log. Then
ln(x) = O(log(x)), where log is the base-10 log. They only differ in the
implied constant in the O() estimate.
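
To make that implied constant explicit: for any fixed base b > 1,
log_b(x) = ln(x) / ln(b), so ln(x) = ln(10) * log10(x), and the fixed
ratio ln(10) ~= 2.303 is exactly the constant hidden in the O() estimate.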

May I suggest that you read a book on complexity analysis? I can
recommend some good ones.

I can imagine we can easily pass from one to the other, since the number
of digits is on the order of the log,

Yep!

but I'm just curious about what would be considered best by the field experts.

There is no "best". It is merely a matter of notational convenience
and writing style.
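
A quick numerical check one can run (plain Python, nothing specific to any
particular algorithm) to see that digits, bits and log n only differ by
constant factors:

    import math

    n = 3**400                    # an arbitrary large number (191 digits)

    digits = len(str(n))          # decimal digits
    bits   = n.bit_length()       # binary digits

    print(digits, math.floor(math.log10(n)) + 1)   # 191 191
    print(bits,   math.floor(math.log2(n)) + 1)    # 634 634
    print(bits / digits, math.log2(10))            # ~3.32 vs 3.3219...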


Also, why, when talking about arrays, can we say that a loop for i = 0,...,n
(containing, for instance, 10 multiplications) has time O(n), and we don't
care about the intermediate operations?

It depends on what operations take place within the loop. And if
multiplications are in the loop, then the length of the operands DOES
matter if one counts bit operations.

Or, one can just count 32-bit arithmetic operations, absorbing the
bit-level complexity into a 'word-level' complexity. The run time
complexity will be the same except for a constant factor that is covered
by the big-O notation anyway.
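
Here is a toy cost-accounting sketch of that distinction (my own
illustration; it assumes schoolbook multiplication, where multiplying two
k-bit operands costs on the order of k^2 bit operations):

    def word_level_cost(n, mults_per_iter=10):
        # charge 1 per multiplication: Theta(n) word operations
        return n * mults_per_iter

    def bit_level_cost(n, operand_bits, mults_per_iter=10):
        # charge ~k^2 bit operations per k-bit schoolbook multiplication
        return n * mults_per_iter * operand_bits**2

    print(word_level_cost(1000))         # 10000
    print(bit_level_cost(1000, 32))      # 10240000 -- fixed 32-bit words:
                                         # only a constant factor of 1024
    print(bit_level_cost(1000, 2048))    # much larger: operand length matters

With a fixed word size the two counts differ only by a constant, as noted
above; if the operands grow with the problem size, the bit-level count is
the honest one.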


Or, we normally say we can get a value from a hashtable in O(1)
[e.g., http://www.cs.auckland.ac.nz/~jmor159/PLDS210/hash_tables.html ].
Is this another kind of convention??

Table lookup is a different kind of operation. Table lookup cannot be done
in time O(1), because this would assume one was working on a pointer
machine [a machine where any piece of memory can be accessed in constant
time, which is clearly not true for current computers; the bigger the
table, the more memory is needed. Geometry requires that the more memory
one has, the farther away it must be physically from the CPU. Now consider
speed-of-light limitations].
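
For what it's worth, one can see the effect crudely on a real machine with
an experiment like the following (the numbers are entirely machine- and
cache-dependent, so treat it as an illustration, not a precise measurement):

    import random, time

    # average time per lookup in hash tables of increasing size; on typical
    # hardware the per-lookup cost creeps up once the table no longer fits
    # in cache, which is the physical point made above
    for size in (10**3, 10**5, 10**7):
        table = {i: i for i in range(size)}
        keys = [random.randrange(size) for _ in range(10**5)]
        t0 = time.perf_counter()
        total = 0
        for k in keys:
            total += table[k]
        dt = time.perf_counter() - t0
        print(size, dt / len(keys))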

Real-world computers are not pointer machines. Neither are Turing
machines: to access, e.g., a memory square that is at distance D from the
square under the current head requires that we traverse D locations. This
takes time O(D).

Time is not the only complexity measure of an algorithm. Space is also a
major requirement, and many algorithms can make time/space tradeoffs.



From the discussions I have followed so far, it's like there are two
different frameworks, one used by number theory experts and one used by
computer scientists (??).

Not as I see it.
