<table cellspacing="0" cellpadding="0" border="0" ><tr><td valign="top" style="font: inherit;">Three years ago, when I was taking an AP course at Logan High, I devised a few benchmarks to compare Scheme with my favorite language, Clean. I then decided to compare Scheme implementations, and discovered that PLT was much slower than, for instance, Bigloo or Stalin. I stored the fact that PLT is slow somewhere in the back of my brain, and whenever I needed to code something in Scheme or Lisp, I ruled out PLT.<br><br>Yesterday, I received a request to compare PLT with Bigloo, Larceny, SBCL and Gambit, so I fetched my old benchmarks. To my surprise, PLT 4.2 turned out to be quite fast. In some cases it was slightly faster than Bigloo compiled with the -Obench option, and 30% faster than Gambit or Larceny. In the worst case it was only three times slower than Bigloo; in most cases it was about 30% slower. <br><br>Let us consider the neural network
benchmark. Bigloo runs it in 1.3 seconds on a quad-core Pentium under Windows XP; PLT runs it in 1.15 s. The benchmark uses vectors and a lot of floating point calculations. What amazes me is that PLT accepted it as is, i.e., I did not use special vectors (like f64vectors) or compiler options (like -Obench, -farithmetics, -copt -O3). The neural net benchmark is attached to this email, so you can check my claims.<br><br>I would like to know what happened to PLT. Did its performance improve a lot since 2005? How can it compile floating point operations so well without any type declarations, or without special operators like fl*, fl/, etc.?<br><br>I also noticed that PLT is not so good at array-intensive computations. In one of those Lisp benchmarks designed to show that Lisp can be as fast as C, PLT (7.9 s, without f64vectors) takes almost twice as long as Bigloo (4.3 s, without f64vectors). Does the PLT team intend to improve array
processing?<br><br><pre>
(module pnet scheme

#| Given a vector to store the weights, and a
   list ws of indexes, newn builds a neuron. E.g.
     (let ([v '#(0 0 0 0)])
       (newn v '(0 1 2)))
   produces a two-input neuron that uses v[0],
   v[1] and v[2] to store its weights. |#

(define (sig x) (/ 1.0 (+ 1.0 (exp (- x)))))

(define (newn v ws)
  (lambda (xs)
    (sig (let sum ((i ws) (x (cons 1.0 xs)) (acc 0.0))
           (if (or (null? i) (null? x))
               acc
               (sum (cdr i) (cdr x)
                    (+ (* (vector-ref v (car i)) (car x)) acc)))))))

;; Given a vector vt, (gate vt) creates a neural
;; network that can learn to act like a logic gate.

(define in-1 car)
(define in-2 cadr)

(define (gate vt)
  (let ((n1 (newn vt '(4 5 6)))
        (ns (newn vt '(0 1 2 3))))
    (lambda (i)
      (if (null? i)
          vt
          (ns (list (in-1 i)
                    (n1 (list (in-1 i) (in-2 i)))
                    (in-2 i)))))))

;; Here is how to create a xor neural network:
;; (define xor (gate (vector -4 -7 14 -7 -3 8 8)))

(define xor (gate (vector 2 3 0 3 5 1 8)))

(define dx 0.01)
(define lc 0.5)

(define *nuweights* (make-vector 90))
(define *examples* #f)

(define (assertWgt vt I R)
  (vector-set! vt I R) R)

(define (egratia eg)
  (vector-ref *examples*
              (min eg (- (vector-length *examples*) 1))))

(define (setWeights vt Qs)
  (do ((i 0 (+ i 1)))
      ((>= i (vector-length vt)) vt)
    (vector-set! vt i (vector-ref Qs i))))

;; Sum of squared errors of network prt over the examples Exs.
(define (errSum prt Exs)
  (let sum ((e Exs) (acc 0.0))
    (if (null? e)
        acc
        (let* ((eg (egratia (car e)))
               (vc (prt (cdr eg)))
               (v (car eg)))
          (sum (cdr e) (+ acc (* (- vc v) (- vc v))))))))

;; The gradient is estimated by finite differences:
;; each weight is perturbed by dx in turn.
(define (updateWeights prt vt err0 ns Exs)
  (do ((i 0 (+ i 1))) ((> i ns))
    (let* ((v (vector-ref vt i))
           (v1 (assertWgt vt i (+ v dx)))
           (nerr (errSum prt Exs))
           (nv (+ v (/ (* lc (- err0 nerr)) dx))))
      (assertWgt vt i v)
      (vector-set! *nuweights* i nv)))
  (setWeights vt *nuweights*))

(define (train p exs)
  (set! *examples* exs)
  (set! *nuweights* (make-vector 90))
  (setWeights (p '()) '#(0 1 0 0 2 0 0))
  (do ((vt (p '()))
       (exs '(0 1 2 3 3 2 1 0)))
      ((< (errSum p exs) 0.001))
    (updateWeights p vt (errSum p exs)
                   (- (vector-length vt) 1) exs)))

(define *exs*
  '#( (0 1 1) (1 0 1) (1 1 0) (0 0 0) ))

(define (start args)
  (time (train xor *exs*))
  (display (list "1-1=" (xor '(1 1)))) (newline)
  (display (list "1-0=" (xor '(1 0)))) (newline)
  (display (list "0-1=" (xor '(0 1)))) (newline)
  (display (list "0-0=" (xor '(0 0)))) (newline))

(start 0)
)
;; (train xor '( (0 1 1) (1 1 0) (1 0 1) (0 0 0) ))
</pre><br></td></tr></table><br>
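<br>On the flonum question above: R6RS does provide flonum-specific operators (fl+, fl-, fl/, flexp in the (rnrs arithmetic flonums) library), which is what Bigloo-style benchmarks usually lean on for type information. A minimal sketch of the benchmark's sig rewritten with them, assuming an R6RS-capable PLT (the name sig-fl is mine, not from the attached code):<br>

```scheme
;; Sketch: sig from the attached benchmark, rewritten with the
;; R6RS flonum-specific operators so the float types are explicit.
;; Assumes (rnrs arithmetic flonums (6)) is available, as in PLT 4.x.
(import (rnrs base (6))
        (rnrs arithmetic flonums (6)))

;; (fl- x) is unary flonum negation; flexp is the flonum exponential.
(define (sig-fl x)
  (fl/ 1.0 (fl+ 1.0 (flexp (fl- x)))))
```

The plain sig in the benchmark computes the same thing with generic arithmetic; the surprise is that PLT 4.2 runs the generic version this fast without such annotations.<br>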