Saturday, March 18, 2006


Simple lambda evaluator

I've just written a simple lambda calculus parser/evaluator in python - it has been rather fun! You can download it from laurasia. The code is rather raw, but comes with a money back guarantee ;-)

Updated: 28/March
An improved evaluator is now available. Note that the name of the evaluator code file has been changed - this was required since lambda is a reserved word in Python. I have also written a simple Lisp-ish front-end for the lambda evaluator - I'm calling this language elle. So far I've been able to calculate 3! (factorial) - all done using the Y combinator (of course!).
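Since Python has first-class functions, the same fixed-point trick can be sketched directly in Python (a toy illustration of the idea, not the elle code itself - and note it uses the applicative-order variant, sometimes called the Z combinator, because Python is strict):

```python
# Applicative-order Y combinator ("Z combinator"): the self-application
# is eta-expanded (wrapped in "lambda v: ...") to delay evaluation,
# otherwise strict evaluation would loop forever.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Factorial written as the fixed point of a non-recursive step function:
# "rec" is the function being defined, supplied by the combinator.
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))

print(fact(3))  # 3! = 6
```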

I gave a talk on Monday about the lambda calculus, and have put the PowerPoint slides online. The talk went well. The main point that I wanted to get across was that there are alternative approaches to programming beyond the world of C, C++, and Java. But the audience wasn't too interested in the λ-calculus formalism, and maybe it would have been better to give a more practical talk focusing on a real functional programming language.

Tuesday, March 14, 2006


Static analysis of K&R C code

I've got a body of C code (mostly K&R) that I'd like to quickly dip into. It would be great if there were a tool that could visualise the (static) call relationships between functions. I don't expect the tool to be able to meaningfully interpret function pointers, so it would not have to do anything smart with the following code:
int foo( int x ) { return x + 1; }
int bar( int x ) { return x + x; }

int baz( int (*qux)(int), int x )
{
  return qux( x );
}

int main( int argc, char ** argv )
{
  return baz( 1 == argc ? foo : bar, argc );
}
But the tool should be able to identify that main() calls baz(), and that baz() invokes a function pointer. It would be absolutely fantastic if it identified that baz() calls either foo() or bar(), but I won't hold my breath waiting for this functionality.

Do you know of any tools like this?

A quick search on Google did not find any tools like this.


The Rule of Least Power

Tim Berners-Lee recently wrote about The Rule of Least Power. The abstract is: When designing computer systems, one is often faced with a choice between using a more or less powerful language for publishing information, for expressing constraints, or for solving some problem. This finding explores tradeoffs relating the choice of language to reusability of information. The "Rule of Least Power" suggests choosing the least powerful language suitable for a given purpose.

In principle I agree with what is being said, but an alternative view is given in A Universal Scripting Framework, or Lambda: the ultimate "little language". Now I love playing with Lex and YACC and creating little languages - it is a lot of fun - but I'm rapidly coming around to the viewpoint advocated by Olin Shivers in that paper. Why reinvent mechanisms for basic control flow and the like when this is already a well-developed area? More directly: reinventing this stuff is a waste of my time, and it gets in the way of more interesting work.

Embedding a large language interpreter (e.g. Python, Perl, Rexx) into an application strongly dictates the structure of the little language. Indeed, without extensive reworking of the language parser, any constructs specific to the little language will always be second-class citizens in the program. Others have advocated large code-generation frameworks for managing domain-specific languages, but these typically make a lot of assumptions about the kinds of languages that can be generated. At worst there will be many arbitrary constraints on how data can be exchanged between tools, and it is not clear that the approach is structurally sound.

But there is an alternative approach, and that is to use a language with minimal syntax and a minimal number of reserved words (e.g. Lisp, Scheme, Tcl, Forth, XML).
With such an approach it is possible to invent your own control structures and integrate them into the language in a way that appears identical to the built-in primitives - the new domain-specific constructs are no longer visually second-class. XML is an extreme example: as a declarative language it has no model for imperative actions, and to 'execute' XML requires an interpreter or compiler (i.e. a lot of work!). But it is ideal for modelling data, and often that is sufficient.
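A tiny sketch makes the point concrete (my own toy, in the spirit of the elle experiment but not its actual code): in a minimal s-expression evaluator, special forms live in an ordinary table, so a user-added control structure is literally indistinguishable from a built-in one.

```python
# Minimal s-expression evaluator: strings are variables, lists are
# applications, and special forms are looked up in one table.
def evaluate(expr, env):
    if isinstance(expr, str):
        return env[expr]
    if not isinstance(expr, list):
        return expr                      # literals evaluate to themselves
    head = expr[0]
    if head in SPECIAL_FORMS:
        return SPECIAL_FORMS[head](expr[1:], env)
    fn = evaluate(head, env)
    return fn(*[evaluate(a, env) for a in expr[1:]])

SPECIAL_FORMS = {
    'if': lambda args, env: (evaluate(args[1], env)
                             if evaluate(args[0], env)
                             else evaluate(args[2], env)),
}

# Adding a domain-specific control structure is one table entry -- no
# parser changes, no second-class syntax.
SPECIAL_FORMS['unless'] = lambda args, env: (
    evaluate(args[1], env) if not evaluate(args[0], env) else None)

env = {'x': 0, 'inc': lambda n: n + 1}
print(evaluate(['unless', 'x', ['inc', 41]], env))  # -> 42
```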

History note: the term little languages was (apparently) first used by Jon Bentley in this Programming Pearls column.

Update: This paper looks interesting - added to the 'to-read' list!

Monday, March 13, 2006


References to the pi-calculus

The π-calculus is a process algebra for modelling concurrency. The π-calculus allows dynamic topologies of processes to be modelled, whereas the earlier CCS (calculus of communicating systems) and CSP (communicating sequential processes) process algebras could only model static (unchanging) topologies. The best introductory work that I've found so far is An Introduction to the pi-Calculus. It is far more approachable than the tutorial by Robin Milner (The Polyadic π-Calculus: A Tutorial). An interesting recent work is Algebraic Process Calculi: The First Twenty Five Years and Beyond, which is the proceedings from a workshop held in 2005. It appears that in the last 25 years there has been an explosion in the number of process algebras, and that there is quite a lot of research being performed in this area. So understanding the π-calculus is really just a beginning!
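To make the channel-mobility point concrete, here is a one-step reduction in standard π-calculus notation (a small example of my own, using the usual syntax):

```latex
\bar{a}\langle c \rangle . P \;\mid\; a(x) . \bar{x}\langle v \rangle . Q
\;\longrightarrow\;
P \;\mid\; \bar{c}\langle v \rangle . Q\{c/x\}
```

The left process sends the channel name c along a; after the communication, the receiver outputs on c - a channel it did not initially know about. The communication topology has changed at run time, which is exactly what CCS and CSP cannot express.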

I've started trying to model the Minix-3 system using the π-calculus. It is mainly a pedagogical exercise - my immediate aim is to become more familiar with the π-calculus, but it would be nice to eventually use the process algebra to prove behavioural properties of Minix-3.
