[plt-scheme] The Lambda Calculus behind functional programming

From: Matthias Felleisen (matthias at ccs.neu.edu)
Date: Thu Aug 30 10:52:22 EDT 2007

On Aug 30, 2007, at 12:52 AM, Michael Vanier wrote:

> Matthias Felleisen wrote:
>> On Aug 29, 2007, at 7:10 PM, Michael Vanier wrote:
>>> I've recently been working through the book "An Introduction to  
>>> Lambda Calculi for Computer Scientists":
>>> http://www.amazon.com/Introduction-Lambda-Calculi-Computer-Scientists/dp/0954300653/ref=pd_bbs_sr_1/103-0899652-5021466?ie=UTF8&s=books&qid=1188428868&sr=8-1
>>> It's a decent book.  Not without flaws, but it gets the job done.
>> I don't know Chris's book. I do know that lazy fp people tend to  
>> have an abusively narrow view of the connection between CS and LC.  
>> So study the source, Luke, is still a good thing.
> Can you elaborate?

You can view LC from a mathematical and a CS perspective. This is
analogous to using analysis/calculus as a mathematician or as a
physicist does.

When I entered the area in the mid 1980s, 'lazy' people were obsessed
with the 'pure LC'. They somehow thought that the best PL would be
one that emulates LC to the comma. The "normal form" and "normal
order" (versus applicative order) discussions are indicative, and
allusions can be found in numerous papers and textbooks from this
time. Indeed, the "Scheme guys" (SICP) weren't immune from this
debate. I call this the math-centric view.
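For readers who haven't seen the distinction: applicative order reduces the argument before the call, so a constant function applied to a diverging term loops; normal order substitutes the argument unevaluated, so it may never be touched. A small sketch in Python, with thunks standing in for unevaluated terms (all names here are illustrative, not from any of the books discussed):

```python
def omega():
    # A diverging computation: forcing this thunk never returns.
    return omega()

def const_k_applicative(x):
    # Applicative order: the caller evaluates the argument first,
    # so const_k_applicative(omega()) would loop forever.
    return 42

def const_k_normal(thunk):
    # Normal order: the argument arrives as an unevaluated thunk;
    # the body never uses it, so omega is never forced.
    return 42

print(const_k_normal(omega))  # prints 42, even though omega diverges
```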

Plotkin in 1974 (with cbn, cbv, and lc) spelled out the LC-as-a-CS-tool
perspective in a very elegant manner (though not an easily readable
one). People just ignored the paper; you found almost no citations of
it back then. In my mind, he started the cs-centric view of research
on LC.
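The cbn/cbv distinction Plotkin studied becomes observable as soon as evaluation has effects: call-by-value evaluates an argument exactly once, before the call, while call-by-name re-evaluates it at every use. A hedged sketch in Python (thunks modeling cbn; function and variable names are illustrative only):

```python
log = []

def effectful_arg():
    # The side effect lets us count how often the argument is evaluated.
    log.append("eval")
    return 21

def double_cbv(x):
    # Call-by-value: x is already a value; the argument was evaluated once.
    return x + x

def double_cbn(thunk):
    # Call-by-name: the argument is a thunk, forced at each use site.
    return thunk() + thunk()

assert double_cbv(effectful_arg()) == 42 and log == ["eval"]
log.clear()
assert double_cbn(effectful_arg) == 42 and log == ["eval", "eval"]
```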

Abramsky in the early 90s re-discovered some of the essential ideas
for 'lazy' people, and many of them finally understood that evaluating
CBN to Values was okay. But they insisted on calling them head-normal
forms.

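The point about evaluating CBN to values can be made concrete: in the lazy LC a lambda abstraction is already a value (a weak head normal form), even when reducing under the binder would diverge, so evaluation that stops at lambdas terminates where full normalization would not. A Python sketch of this idea (illustrative, not from the papers cited):

```python
def diverge():
    # A looping computation standing in for Omega.
    return diverge()

# Building the function terminates: a lambda is a value (a weak head
# normal form) even though entering its body would loop forever.
value = lambda x: diverge()

print(callable(value))  # prints True: we reached a value without
                        # ever touching the body
```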
I had used Plotkin's paper in the meantime to spell out a program of
research on operational semantics, i.e., reduction systems for
effects. Also see the LC-plus-calling-conventions paper with Crank.
To this day, however, some 'lazy' people haven't given up on NORMAL
form and NORMAL order. Indeed, you find this mistaken math-centric
view among Schemers, especially SICP-imbued people. This is why I am
suspicious of books/texts by 'lazy' people on LC -- BUT I haven't
read Chris's book and have no evidence that his book is math-centric.

My perspective is that you use LC to express/model your favorite
problem area in PL, and then you use those methods from mathematical
LC that can help you solve it. Time and again, this approach has
helped me get results. And I believe that my modus operandi heavily
borrows from a theoretical physicist's approach to mathematics. I
don't know of a book that uses this approach properly, and I am hoping
our Redex book will do it well.

-- Matthias

Posted on the users mailing list.