Styles of semantics (Programming Languages)
Denotational Semantics
In this approach, the presentation of the semantics of a programming language has two parts: in the first, one identifies a mathematical structure within which the meaning of programs will be found; and in the second part, one defines a function that maps the syntax of the language to elements of this structure. Crucially, this function is defined by structural recursion on the syntax, so that the meaning of each construct is defined in terms of the meanings of its components.
Typically, the mathematical structures that are used are partially ordered sets, with the ordering relating elements if one is more 'defined' than the other. This ordering arises naturally in considering partial functions, and the theory of monotonic functions on partial orders proves the existence of the fixpoints that are needed to give meaning to programs that involve recursion or loops.
As Strachey observed, fixing the domain within which programs in a language have their meanings tells us a great deal about the language, and is a sure guide to the design of the language. Traditional texts on Denotational Semantics have spent more effort on establishing that these domains exist than on using them to describe the features of particular programming languages, a fact that I sometimes imagine to be connected with the fact that Christopher Strachey, the great founder of the denotational approach, died in 1976, while his more theoretical colleagues lived on.
In denotational semantics, we might define the meaning exec [[S]] of a statement S to be a function from states to states, and capture the meaning of while loops with this equation:
- exec [[while B do S]] = fix (λf -> (λs -> if eval [[B]] s then f (exec [[S]] s) else s))
The crucial point here is that the value of exec [[while B do S]] is defined in terms of the values of eval [[B]] and exec [[S]] using the fixpoint operator fix. One can argue separately that such a fixpoint operator exists for the domains that are being used.
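The equation above can be sketched directly in Haskell, using fix from Data.Function. The miniature language here — a state consisting of a single integer variable, with a decrement statement and a positivity test — is invented purely for illustration:

```haskell
import Data.Function (fix)

-- A made-up miniature language: the state is one integer variable.
type State = Int

data Expr = Positive          -- tests whether the variable is > 0
data Stmt = Decr              -- decrements the variable
          | While Expr Stmt

eval :: Expr -> State -> Bool
eval Positive s = s > 0

exec :: Stmt -> State -> State
exec Decr        s = s - 1
exec (While b c) s = fix (\f s' -> if eval b s' then f (exec c s') else s') s
```

Here exec (While Positive Decr) runs the variable down to zero; fix plays exactly the role that the fixpoint operator plays in the equation, and the meanings of B and S enter only through eval b and exec c.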
Operational Semantics
In Operational Semantics, one describes the process of executing a program, and says that the meaning of a program consists of the collection of behaviours that are shown by this mechanism. Traditional operational semantics was expressed in terms of abstract machines that executed a program or evaluated an expression step by step, but more modern presentations use inference systems to achieve the same effect in a more abstract way.
For our example of the while loop, we might write E, m => v to mean that when evaluated in memory state m, the expression E has value v, and we might write S, m => m1 to mean that when started in memory state m, the statement S terminates with memory state m1. In these terms we can express the meaning of the while loop with two rules:
- If B, m => false, then (while B do S), m => m.
- If B, m => true and S, m => m1 and (while B do S), m1 => m2, then (while B do S), m => m2.
An alternative way of giving operational semantics might be to write an interpreter in a functional language: we might define
- exec [[while B do S]] m = if eval [[B]] m then (let m' = exec [[S]] m in exec [[while B do S]] m') else m
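Filling in that sketch, a complete interpreter for a hypothetical little language might read as follows; the choice of constructors and the representation of memories as finite maps are illustrative assumptions, not fixed by the text (conditions are encoded as integers, with nonzero meaning true):

```haskell
import qualified Data.Map as Map

-- Memories map variable names to integer values.
type Mem = Map.Map String Int

data Expr = Con Int | Var String | Minus Expr Expr | Gt Expr Expr
data Stmt = Assign String Expr | Seq Stmt Stmt | While Expr Stmt

eval :: Expr -> Mem -> Int
eval (Con n)     _ = n
eval (Var x)     m = Map.findWithDefault 0 x m
eval (Minus a b) m = eval a m - eval b m
eval (Gt a b)    m = if eval a m > eval b m then 1 else 0

exec :: Stmt -> Mem -> Mem
exec (Assign x e) m = Map.insert x (eval e m) m
exec (Seq s1 s2)  m = exec s2 (exec s1 m)
exec (While b s)  m =
  if eval b m /= 0 then let m' = exec s m in exec (While b s) m' else m
```

For example, the loop While (Gt (Var "x") (Con 0)) (Assign "x" (Minus (Var "x") (Con 1))), started in a memory with x = 3, terminates with x = 0.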
Axiomatic Semantics
This approach focusses on describing the set of true claims that can be made about a program. In the usual language of imperative programs, these claims can be expressed as Hoare triples {P} S {Q}, meaning 'if program S is started in a state satisfying relation P, then it is bound to terminate successfully in a state satisfying relation Q.' [Hoare's original paper used partial correctness, not the total correctness that we use here.] In Hoare's style, the meaning of a while loop is given by the following inference rule: for any invariant P and bound function T such that P ==> T >= 0, from
- {P and B and T = t0} S {P and T < t0}
one may deduce
- {P} while B do S {P and not B}.
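To see the rule in action, here is a hypothetical check (an invented example, not from the text) for the loop while x > 0 do x := x - 1, taking invariant P = (x >= 0) and bound function T = x. The code tests the rule's premise and its conclusion exhaustively over a small range of states:

```haskell
-- The loop body and guard of "while x > 0 do x := x - 1".
body :: Int -> Int
body x = x - 1

b :: Int -> Bool
b x = x > 0

-- Invariant P = (x >= 0) and bound function T = x; note P ==> T >= 0.
p :: Int -> Bool
p x = x >= 0

t :: Int -> Int
t = id

-- Premise: {P and B and T = t0} S {P and T < t0}, checked for x in [0..100].
premise :: Bool
premise = and [ p (body x) && t (body x) < t0
              | x <- [0..100], let t0 = t x, p x && b x ]

-- Conclusion: {P} while B do S {P and not B}, checked by running the loop.
loop :: Int -> Int
loop x = if b x then loop (body x) else x

conclusion :: Bool
conclusion = and [ p (loop x) && not (b (loop x)) | x <- [0..100], p x ]
```

Both premise and conclusion hold, as the rule predicts: since T decreases on every iteration but stays non-negative while P holds, the loop is bound to terminate.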
Dijkstra's weakest preconditions show that axiomatic semantics can be presented in a denotational style. Here, the meaning of a statement S is a 'predicate transformer' wp(S, -) that gives for each post-condition Q the weakest pre-condition wp(S, Q) that is sufficient to guarantee that S will terminate in a state satisfying Q. Thus {P} S {Q} holds if and only if P ==> wp(S, Q).
This weakest pre-condition can be defined by structural recursion on the syntax of S. The meaning of the while loop is given as a fixpoint:
- wp(while B do S, Q)
is the strongest relation R such that
- (B and wp(S, R)) or (not B and Q) ==> R,
that is, the strongest solution of the fixpoint equation F(R) = R, where F(R) is the LHS of the above implication. This definition has the crucial property that wp(while B do S, -) is defined in terms of B and wp(S, -). [Dijkstra's original formulation defined the semantics of While as the limit of an explicit sequence of relations, but it amounts to the same thing: a fixpoint.]
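Over a finite state space, Dijkstra's limit construction can be carried out literally: represent a predicate as the set of states satisfying it, and iterate F from the base case until the fixpoint is reached. The state space and the example loop below are invented for illustration:

```haskell
import qualified Data.Set as Set

type State = Int
type Pred  = Set.Set State   -- a predicate as the set of states satisfying it

universe :: [State]
universe = [0..20]           -- a small finite state space, for illustration

-- wp for a deterministic loop body: the states from which one step lands in R.
wpStep :: (State -> State) -> Pred -> Pred
wpStep step r = Set.fromList [ s | s <- universe, step s `Set.member` r ]

-- Dijkstra's sequence for wp(while B do S, Q):
--   H_0 = (not B and Q);  H_{k+1} = (B and wp(S, H_k)) or H_0,
-- iterated until it stabilises at the strongest fixpoint.
wpWhile :: (State -> Bool) -> (State -> State) -> Pred -> Pred
wpWhile b step q = go h0
  where
    h0   = Set.fromList [ s | s <- universe, not (b s), s `Set.member` q ]
    f r  = Set.fromList [ s | s <- universe, b s,
                              s `Set.member` wpStep step r ] `Set.union` h0
    go r = let r' = f r in if r' == r then r else go r'
```

For instance, for the loop while x > 0 do x := x - 2 with post-condition x = 0, the iteration yields exactly the even states: from an odd starting value the loop terminates with x = -1 or diverges past the post-condition, so only even states guarantee success.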
Our approach
Denotational in style. From one point of view, operational in essence, for we are saying that the meaning of a program is the collection of behaviours exhibited by our interpreter. But if we regard Haskell not as a programming language, but as a notation for writing down denotations, then the link with denotational semantics becomes stronger. And our aim is to describe enough about Haskell axiomatically that we can reason about languages and implementations in a practical way without having constantly to return to the foundations.