[racket-dev] Surprising timings for loading Typed Racket

From: Matthew Flatt (mflatt at cs.utah.edu)
Date: Thu Jul 12 15:20:00 EDT 2012

At Thu, 12 Jul 2012 13:08:21 -0400, Sam Tobin-Hochstadt wrote:
> On Thu, Jul 12, 2012 at 11:49 AM, Matthew Flatt <mflatt at cs.utah.edu> wrote:
> >
> > I think the difference is GC variance due to the order that modules get
> > loaded. If I try
> >
> >  time racket -W debug -l racket/base -t tnull.rkt
> >  time racket -W debug -l racket/base -t rlnull.rkt
> >
> > then the times are extremely close, and I also see very similar GC times.
> 
> I see these results too.  That's good to know, but somewhat
> frustrating for me, since there's not much I can do about this.  Is
> the difference here something that would be affected by setting the
> nursery size at startup (if Racket had a way to do that)?

The difference becomes smaller with a 50MB nursery, but a 10MB nursery
makes little difference (in fact, it's a little slower on my machine).
Change GEN0_INITIAL_SIZE in "newgc.c" if you want to play with this.

More generally: Suppose that B depends on A. If you ask for B and the
request eventually triggers A, then (parts of) B remain in the
continuation while you do A. In contrast, if you ask for A first, then
only the little plan to ask for B remains in the continuation while A
is in progress. I believe that's the main difference here.
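
To make that concrete, here's a made-up pair of modules (the names and
contents are just for illustration, not the actual test files):

  ;; "a.rkt" --- stands in for a shared library
  #lang racket/base
  (provide from-a)
  (define from-a 'a)

  ;; "b.rkt" --- depends on "a.rkt"
  #lang racket/base
  (require "a.rkt")
  (provide from-b)
  (define from-b (list 'b from-a))

  ;; Order 1: ask for B directly; while "a.rkt" is being
  ;; instantiated, the partly finished work for "b.rkt" stays
  ;; in the continuation.
  (dynamic-require "b.rkt" 'from-b)

  ;; Order 2: ask for A first; only the small plan to ask for B
  ;; remains in the continuation while A is in progress.
  (dynamic-require "a.rkt" #f)
  (dynamic-require "b.rkt" 'from-b)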

In the long run, generational GC addresses the problem of parts of B
sitting around during A, but this example is too short for a major GC
to kick in.

Also, even with a 50MB nursery, there's still some difference.
Possibly it's a matter of other continuation-dependent operations, such
as parameter lookup; those might be constant time in the long run, but
you might see competing constants in the short run.
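
As a sketch of what I mean by a continuation-dependent operation (the
parameter and the loop here are invented, just to show the shape of the
cost):

  #lang racket/base

  ;; A parameter's current value comes from the current
  ;; parameterization, which parameterize installs as a continuation
  ;; mark, so each lookup consults the current continuation.
  (define p (make-parameter 'default))

  (define (lookup-under-frames n)
    (if (zero? n)
        (p)  ; lookup happens under n extra frames
        ;; non-tail call, so frames accumulate:
        (car (list (lookup-under-frames (sub1 n))))))

  (parameterize ([p 'set])
    (lookup-under-frames 10000))  ; => 'set either way, but the
                                  ; surrounding continuation differs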


Of course, we usually optimize for the long term, and it's sometimes
tricky to balance short-term against long-term performance. Does the
difference that you see still matter when a program does something
useful?

