After posting a video of the Turing Centenary talk that Greg Michaelson and I did, I got sent a question: “I didn’t exactly get it why didn’t you consider both UTC and LC to be equivalent.”

The question is understandable. After all, are we not taught that the Lambda Calculus and the Universal Digital Computer are equivalent?

It depends on what one means by equivalent. In an ideal, or idealist, sense they are equivalent. If a function can be computed by the lambda calculus, the function can be translated into a form that a universal digital computer can also compute and vice versa.

But from a materialist standpoint they are very different.

The lambda calculus by itself can compute nothing. It is only useful when you have a lambda calculus interpreter running on an actual computer, in which case it is not the lambda calculus that actually does the computation. The LC is a virtual machine running on a real machine, and it is the real machine that does the actual work.
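To see concretely what "the LC as a virtual machine" means, here is a minimal sketch of a lambda calculus interpreter (illustrative code of my own, not anything from the talk). The rewriting rules compute nothing on their own; they only produce an answer because a real machine, here the Python runtime on physical hardware, carries out the rewrites.

```python
# Minimal normal-order lambda calculus interpreter.
# Terms: ('var', name) | ('lam', name, body) | ('app', fn, arg)
import itertools

fresh = (f"v{i}" for i in itertools.count())   # supply of fresh variable names

def free_vars(t):
    tag = t[0]
    if tag == 'var':
        return {t[1]}
    if tag == 'lam':
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])

def subst(t, name, val):
    # Capture-avoiding substitution: t[name := val]
    tag = t[0]
    if tag == 'var':
        return val if t[1] == name else t
    if tag == 'app':
        return ('app', subst(t[1], name, val), subst(t[2], name, val))
    bound, body = t[1], t[2]
    if bound == name:
        return t
    if bound in free_vars(val):               # rename to avoid capture
        new = next(fresh)
        body = subst(body, bound, ('var', new))
        bound = new
    return ('lam', bound, subst(body, name, val))

def step(t):
    # One leftmost-outermost reduction step, or None if t is in normal form.
    tag = t[0]
    if tag == 'app':
        f, a = t[1], t[2]
        if f[0] == 'lam':                      # the beta rule
            return subst(f[2], f[1], a)
        f2 = step(f)
        if f2 is not None:
            return ('app', f2, a)
        a2 = step(a)
        return ('app', f, a2) if a2 is not None else None
    if tag == 'lam':
        b2 = step(t[2])
        return ('lam', t[1], b2) if b2 is not None else None
    return None

def reduce_normal(t):
    # Rewrite until no redex remains.
    while (t2 := step(t)) is not None:
        t = t2
    return t

def church_to_int(t):
    # Read a Church numeral lam f. lam x. f (f (... x)) back as an int.
    f, x, body = t[1], t[2][1], t[2][2]
    n = 0
    while body != ('var', x):
        assert body[0] == 'app' and body[1] == ('var', f)
        n, body = n + 1, body[2]
    return n

# Church numerals: compute successor twice, entirely by rewriting.
zero = ('lam', 'f', ('lam', 'x', ('var', 'x')))
succ = ('lam', 'n', ('lam', 'f', ('lam', 'x',
        ('app', ('var', 'f'),
         ('app', ('app', ('var', 'n'), ('var', 'f')), ('var', 'x'))))))
two = reduce_normal(('app', succ, ('app', succ, zero)))
print(church_to_int(two))   # 2
```

Every reduction here is performed by the Python interpreter, which is itself executed by silicon: the LC rides on top of the material machine.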

In principle the actual computer that the LC is running on could be a person: a mathematician who has learned the rules of the LC, or who has a crib sheet giving them those rules. Indeed, when Turing was writing in the 1930s the word “computer” normally referred to a person who did computation.

The machine that Turing proposes is one directly modelled on what a person does when they compute. So Turing was from the start going in at a lower level than Church. Church relied on there being a human computer to do the actual work with his calculus. Turing asks: what does a mathematician do when they apply any calculus?

He then proposes a mechanism that mimics the essential steps that the human does. He thus comes up with something more general than Church, since once you build the machine and supply it with energy it can perform any calculation, including acting as an interpreter for the LC.
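What such a mechanism looks like can be sketched in a few lines (a toy of my own, not Turing's original formulation): a finite table of (state, symbol) → (write, move, next state) rules, a tape, and a scanning head. Reading the scanned symbol, making a mark, and shifting attention left or right are exactly the steps the human computer performs.

```python
# A table-driven Turing machine, mimicking the human computer's
# read / write / shift cycle.
from collections import defaultdict

def run(rules, tape, state='start', pos=0, blank='_', max_steps=10_000):
    cells = defaultdict(lambda: blank, enumerate(tape))
    for _ in range(max_steps):
        if state == 'halt':
            break
        write, move, state = rules[(state, cells[pos])]
        cells[pos] = write                       # mark the scanned square
        pos += {'R': 1, 'L': -1, 'N': 0}[move]   # shift attention
    lo, hi = min(cells), max(cells)
    return ''.join(cells[i] for i in range(lo, hi + 1)).strip(blank)

# Unary successor: scan right over the 1s, write one more 1, halt.
rules = {
    ('start', '1'): ('1', 'R', 'start'),
    ('start', '_'): ('1', 'N', 'halt'),
}
print(run(rules, '111'))   # '1111'
```

The table plus tape is the whole machine; give it a different table and it computes a different function, and with a suitable table it can interpret the LC itself.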

This forces us to consider computation as a material rather than an ideal process, and actually computer designers end up being preoccupied with very material considerations: minimising power use, disposing of waste heat, ventilation etc.

From the Universal Digital Computer you can move on to see that there are material limits to computation over and above the logical limits to computability originally outlined by Turing in his first paper: limits to speed set by the speed of light, and thermodynamic limits on energy dissipation (the Landauer limit). If you approach the issue from the level of the LC, none of this is apparent.
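Both limits are easy to put numbers on. The constants below are standard physics; the 3 cm die size is just an illustrative assumption of mine:

```python
# Back-of-envelope material limits on computation, invisible at the LC level.
import math

k = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                   # room temperature, K
# Landauer bound: erasing one bit irreversibly costs at least k*T*ln(2).
landauer_j_per_bit = k * T * math.log(2)
print(f"{landauer_j_per_bit:.2e} J per erased bit")   # 2.87e-21 J

c = 2.998e8                 # speed of light, m/s
die = 0.03                  # assumed 3 cm die, m
# A signal cannot cross the die faster than light allows.
print(f"{die / c * 1e9:.2f} ns to cross the die")     # 0.10 ns
```

Even that 0.1 ns is already a full clock period at 10 GHz, which is one reason clock rates stopped climbing; no amount of reasoning inside the calculus reveals constraints of this kind.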

Indeed the idealist abstractions that the LC encourages have in the past led to serious diversions of effort by platonist computer scientists. For example, during the Alvey Programme in the 1980s in the UK, the influence of the platonists was so strong that they were able to promote multi-million-pound programmes to build LC machines that would, they claimed, achieve a high level of parallelism. The high parallelism was promised because it is in principle possible to evaluate the arguments of a pure functional language in parallel.
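The underlying observation is sound as far as it goes: in a pure language the arguments of an application cannot interfere with one another, so they may be reduced in any order, including simultaneously. A sketch of the idea (illustrative code of mine, using an ordinary process pool as the "parallel machine"):

```python
# In a pure setting f(a, b, c) can evaluate its arguments in parallel,
# because none of them can affect the others.
from concurrent.futures import ProcessPoolExecutor

def slow_square_sum(n):
    # Stand-in for an expensive pure sub-expression.
    return sum(i * i for i in range(n)) % 97

def f(a, b, c):
    # The combining function applied to the evaluated arguments.
    return a + b + c

if __name__ == '__main__':
    args = [10_000, 20_000, 30_000]
    with ProcessPoolExecutor() as pool:
        # The three arguments are reduced concurrently.
        results = list(pool.map(slow_square_sum, args))
    print(f(*results))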

A machine called Alice was built using over 100 32-bit processors (Transputers). Greg and I were able to get access to it and benchmark it, and discovered that it actually performed worse when running a functional language than a single processor made by a competing firm (an HLH Orion).

Actual parallel performance is constrained by the ability to transmit data: the bandwidth of the channels and similar considerations. The ideal functional independence of the parameters of LC-like languages plays a very small part, since LC-like languages have to run as virtual machines on top of a physical machine, and it is by concentrating on the constraints of the parallel physical machine that advances were made. The whole project of functional languages as a route to parallelism died a death on this account.

It is the geometry of 3D space within which we must transmit information that imposes the fundamental limits to parallel computation.
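A toy calculation (the numbers are purely illustrative) shows why: the processors you can pack into a region grow with its volume, as r³, while the wires that can leave the region must cross its surface, which grows only as r², so the communication capacity available per processor falls as 1/r.

```python
# Toy scaling argument: compute grows with volume (r^3), but links out of
# the region cross its surface (r^2), so bandwidth per processor falls.
rows = []
for r in (1, 2, 4, 8):
    processors = r ** 3          # units packed into a cube of side r
    links = 6 * r ** 2           # one link per unit face, illustrative
    rows.append((r, processors, links, links / processors))
    print(f"r={r}: {processors} processors, {links} links, "
          f"{links / processors:.2f} links per processor")
```

Doubling the side of the cube halves the links available to each processor, which is the 3D-geometry limit in miniature.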