[plt-scheme] 3rd-8th Grade

From: Bill Wood (william.wood3 at comcast.net)
Date: Mon Mar 20 02:27:58 EST 2006

I've been following this conversation with a great deal of interest.  I
don't consider myself to be very visually oriented, and my standard
response to the idea of "visual programming" is "if the visual metaphor
is so great, why do we have to annotate flawcharts (don't blame me, it's
David Gries' term :-) with text to specify invariants and other
properties of the data being munged?".  Still, two incidents give me
pause.

First, many years ago I was part of a team that developed a GUI-based
code generator for instrument test programs.  The standard language in
use then was ATLAS, which was purported to be a DSL with nouns and
adjectives for power and signal sources and sinks and verbs for
connecting them together into a test setup.  There were also constructs
for specifying test sequencing.  As usual in our field, people found
that 100-line programs were wonderfully intuitive, and 10,000-line
programs were a maintenance nightmare.  We developed a system in
Smalltalk that provided a virtual test-bench, with a UUT (Unit Under
Test) socket block and a palette of icons for power and signal sources
and sinks.  The programmer used drag-and-drop to build a test setup
"circuit" on the screen, clicking on the icons to select electrical
characteristics -- voltage, wave-form, etc.  There was a second level of
structure, a connect-the-bubble window in which various tests were
connected together via arcs labeled with success- and failure-mode
conditions.  After building a test sequence, you clicked the "Generate"
button and voila! ATLAS code spewed out.  We built it, were awarded a
patent, and shipped it off to the test division.  I later met people who
had used it and thought it a reasonable alternative to ATLAS coding.

Much more recently I was working for a company that was developing a
sort of massively parallel chip containing hundreds of simple CPUs
connected in a grid.  A group of us were trying to figure out how to
program the thing.  Most of the software types lobbied for an
instruction or two for simple synchronization or communication
primitives in each CPU, and wanted to devise a C-like language with
more-or-less conventional multi-process capabilities.  I looked around a
little and stumbled on Dr. Edward Lee's work at Berkeley on the
heterogeneous simulation system Ptolemy (I think now Ptolemy II).  It
supported a slightly generalized synchronous data-flow model they called
"token flow".  The system included a graphical programming environment
with icons for standard function blocks, mechanisms for specifying
custom function blocks, and drag-and-drop construction of data-flow
networks.
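
To give a flavor of what thinking in token flow looks like, here is a
toy sketch in Scheme (just an illustration of the idea, not Ptolemy's
actual API): a block consumes one token from each of its input streams
per firing, produces one output token, and stops when any input runs
dry; a network is just blocks wired output-to-input.

    ;; Toy token-flow block: consume one token from each input stream
    ;; per firing, emit one output token, stop when any input is empty.
    (define (fire-block f . inputs)
      (if (ormap null? inputs)
          '()
          (cons (apply f (map car inputs))
                (apply fire-block f (map cdr inputs)))))

    ;; A two-stage network: double each token, then add one.
    (define doubled (fire-block (lambda (x) (* x 2)) '(1 2 3 4)))
    (define result  (fire-block (lambda (y) (+ y 1)) doubled))
    ;; result => (3 5 7 9)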

After some work I learned how to think in these networks a little; my
crowning achievement was implementing a 32-bit CRC with it.  One of the
most interesting things I discovered was that the software people didn't
seem to get it, but the EE's thought it perfectly natural.  One of the
EE's helped me discover that I could even design for performance!  It
turned out, for example, that short fat networks tended to have high
parallelism and short latency while long skinny networks tended to have
low parallelism and high latency.  Also, the "area" of a network
corresponded to the number of CPU's required.
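
A back-of-the-envelope model (my own sketch, nothing from Ptolemy)
makes those observations plausible: idealize a network as roughly
width x depth blocks, one block per CPU, each block firing once per
step.

    ;; Idealized rectangular network: width x depth blocks, one per CPU.
    (define (area width depth) (* width depth))  ; CPUs required
    (define (latency depth)    depth)            ; steps for a token to cross
    (define (throughput width) width)            ; tokens completed per step

    ;; Same 16-CPU budget, two shapes:
    (list (latency 2) (throughput 8))   ; short and fat   => (2 8)
    (list (latency 8) (throughput 2))   ; long and skinny => (8 2)

Crude as it is, that model gives the same trade-off: for a fixed area,
reducing depth shortens latency and raises the number of tokens
finished per step.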

I'm still not convinced that visual programming environments will work
well for the conventional imperative, relational or functional
computational paradigms.  However, they may be just the thing for other
approaches that we're not so familiar with yet.

 -- Bill Wood

Posted on the users mailing list.