[plt-scheme] (Not) Statistics (V301.5 Speed Up)
Okay, a more general question: How do you gauge compiler performance?
Obviously, we can count reductions. If we generate code for a RAM or a
realistic machine (virtual or hardware), then there is program size,
memory usage, number of registers needed, but it seems hard to
understand how the various factors interact and how they should be
weighted. (And I haven't said a thing about the underlying hardware.)
Another, totally unrelated, question is how one goes about gauging the
performance of idiomatic code (whatever that means). I've been
following the discussion of the language shootout on and off over on the
Haskell list, and I think a common concern is that the best performing
code doesn't look at all natural or like idiomatic Haskell. I suppose
the same is true of Scheme: you want a compiler that works well with
the kind of code people really write. But who is to say what is
"natural" or "idiomatic"?
===
Gregory Woodhouse <gregory.woodhouse at sbcglobal.net>
"All truth passes through three stages: First, it is ridiculed.
Second, it is violently opposed. Third, it is accepted as
being self-evident."
--Arthur Schopenhauer