# [plt-scheme] HtDP Section 29 -- Abstract Running Time & Big-O

From: Stephen Bloch (sbloch at adelphi.edu)
Date: Mon Aug 24 17:07:42 EDT 2009

On Aug 24, 2009, at 1:47 PM, David Yrueta wrote:

> ... a quadratic function which accepts the number of elements in
> the list -- 1/2 n^2 + 1/2 n -- and outputs abstract running times
> (N=4; abstract running time = 10).  We took the abstract running
> time as "on the order of n^2" because, as Eugene says, "we ignore
> the coefficients because, as n grows, they matter less and less."
> Am I in the ballpark?

Yes, that's on the right track.  If Program A runs twice as fast (or
ten times as fast) as Program B, you can make up the difference by
running Program B on faster hardware, or a better operating system,
or with a more-optimizing compiler, or something like that.  But if
Program A runs in O(n) time and Program B in O(n^2) time, nothing you
can do will make Program B consistently faster than Program A, short
of changing the algorithm.
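To make that concrete, here's a small sketch (in Python rather than Scheme; the step counts and the 100x hardware boost are made-up illustrative numbers, not measurements):

```python
# Hypothetical step counts: Program A does n steps, Program B does n**2 steps.
# Suppose B gets a 100x constant-factor speedup from better hardware.
def cost_a(n):
    return n

def cost_b_boosted(n):
    return n**2 / 100

# For small inputs the boosted quadratic program wins ...
print(cost_a(50) > cost_b_boosted(50))       # True: 50 steps vs 25
# ... but past n = 100 the linear program wins, and the gap only grows.
print(cost_a(1000) < cost_b_boosted(1000))   # True: 1000 steps vs 10000
```

Whatever constant factor you buy with hardware, there's always an n beyond which the O(n) algorithm pulls ahead for good.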

Now suppose your goal is to study algorithmic efficiency, and you're
not interested in hardware, compiler optimizations, or operating
systems.  By ignoring constant factors, the effects of hardware,
compiler, and operating system disappear completely and you're left
with only differences between algorithms.  It's like one of those
pictures where there's a message written in blue dots, obscured by a
lot of red dots: if you look at it through a red filter, the red dots
disappear and you're left with a clear view of what you were
interested in.

One can introduce ordering operators "<", ">", "<=", ">=", and "="
into the space of functions.  (I'll keep the quotation marks, to
distinguish these things from ordinary numeric ordering operators.)
If function f, in the long run, is at most a constant factor larger
than g, then we say f "<=" g or g ">=" f.  If f "<=" g and
simultaneously g "<=" f, we say f "=" g, and that f and g have the
same order of growth.  If f "<=" g but it's NOT true that g "<=" f,
we say that f "<" g, i.e. that f grows strictly slower than g.  In
fact, these operators act pretty much the way you expect except that
they don't obey trichotomy, i.e. it's quite possible for particular
functions f and g to satisfy none of "<", "=", nor ">".
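Here is one made-up pair for which trichotomy fails (a Python sketch; the particular f and g are mine, not from the book):

```python
# f(n) = n; g(n) = n**2 on even n, but a constant on odd n.
def f(n):
    return n

def g(n):
    return n**2 if n % 2 == 0 else 1

# On odd inputs f/g is unbounded, so f "<=" g fails;
# on even inputs g/f is unbounded, so g "<=" f fails.
# Hence none of f "<" g, f "=" g, f ">" g holds.
print(f(1001) / g(1001))   # 1001.0 -- f/g grows without bound on odd n
print(g(1000) / f(1000))   # 1000.0 -- g/f grows without bound on even n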

It turns out that once you agree to ignore constant factors, you are
forced to also ignore slower-growing additive terms, because (for
example) n^2 "<=" n^2+n "<=" 2n^2 "=" n^2.  n^2+n, being pinched in
between two things that are "equal", must be "equal" to both of
them.  (This is intuitive reasoning, but it can also be made rigorous.)
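You can sanity-check the squeeze numerically (a quick Python check, not a proof):

```python
# n^2 "<=" n^2 + n "<=" 2 n^2 for all n >= 1, so the ratio
# (n^2 + n) / n^2 = 1 + 1/n stays between 1 and 2 and tends to 1.
for n in (10, 1000, 100000):
    ratio = (n**2 + n) / n**2
    assert 1 <= ratio <= 2
    print(n, ratio)   # the ratio approaches 1 as n grows
```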

Stephen Bloch