So lately I have been working on writing a simple compiler to better understand compiler concepts. Being a diligent reader of Stack Overflow, it seems there is a consensus that writing a compiler in a functional language is easier than in an imperative one. To that end I thought I would try to kill two birds with one stone and write a compiler in F#, learning a functional language and writing a compiler at the same time.
I have been reading through the dragon book and decided to start with a recursive-descent parser written by hand in F#. The dragon book, however, presents almost all of its code samples in an imperative style. For instance, the match-token function does a significant part of its work via side effects.
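Roughly, the style I mean looks like this (a minimal sketch of my own, not the book's actual code): the parser keeps the current lookahead token in mutable state, and matching a token consumes it as a side effect.

// Minimal sketch (my own illustration, not the dragon book's code): the parser
// holds mutable state and matchToken advances it as a side effect.
let mutable input = List.ofSeq "1+2"
let mutable lookahead = '\000'            // '\000' stands in for end-of-input

// A real parser would call advance () once up front to prime lookahead.
let advance () =
    match input with
    | c :: rest -> lookahead <- c; input <- rest
    | [] -> lookahead <- '\000'

// Consume the expected token or fail, mutating the parser state.
let matchToken expected =
    if lookahead = expected then advance ()
    else failwithf "expected %c but found %c" expected lookahead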
So my question is what would a more traditional functional approach to parsing (i.e. few side effects) look like? I know that the Haskell compiler (GHC) is written in Haskell but I would appreciate a somewhat smaller and easier to comprehend code sample.
Second, is it worthwhile to try to adopt a functional approach to parsing, or is it really in optimizations over intermediate code that functional languages shine and I just haven't gotten there yet? That is, should I muddle through the parsing in F# using an imperative style and switch to a more functional approach later on?
Answer derived from this blog article:
So my question is what would a more traditional functional approach to parsing (i.e. few side effects) look like?
Sounds like you need to separate functional programming (as in Lisp, Scheme, Standard ML, Caml, OCaml, F#) from purity (absence of side effects, as in Haskell) and from incidental language features (algebraic datatypes, pattern matching).
Thanks to algebraic datatypes, pattern matching and higher-order functions, F# is good for parsing and great for transformations and code generation, but most production parsers written in F# are not pure. Historically, the family of languages F# is mostly derived from (the MetaLanguages, or MLs) was bred specifically for this kind of metaprogramming.
Here is a very simple set of mutually recursive active patterns that parse and evaluate mathematical expressions composed of single digits, the + - * operators and bracketed subexpressions:
> let rec (|Term|_|) = function
    | Factor(e1, t) ->
        let rec aux e1 = function
          | '+'::Factor(e2, t) -> aux (e1 + e2) t
          | '-'::Factor(e2, t) -> aux (e1 - e2) t
          | t -> Some(e1, t)
        aux e1 t
    | _ -> None
  and (|Factor|_|) = function
    | '-'::Factor(e, t) -> Some(-e, t)
    | Atom(e1, '*'::Factor(e2, t)) -> Some(e1 * e2, t)
    | Atom(e, t) -> Some(e, t)
    | _ -> None
  and (|Atom|_|) = function
    | c::t when '0'<=c && c<='9' -> Some(int(string c), t)
    | '('::Term(e, ')'::t) -> Some(e, t)
    | _ -> None;;
val ( |Term|_| ) : char list -> (int * char list) option
val ( |Factor|_| ) : char list -> (int * char list) option
val ( |Atom|_| ) : char list -> (int * char list) option
Here is an example of it being used to parse and evaluate an expression:
> let (Term e) = List.ofSeq "1+2*(3-4)*-5";;
val e : int * char list = (11, [])
That's a pure solution using pattern matching over lists with F#'s active patterns. In reality, you'll want to define a type for your abstract syntax tree and return a value of that type. This is really easy in F#:
type expr =
    | Int of int
    | Neg of expr
    | Add of expr * expr
    | Sub of expr * expr
    | Mul of expr * expr

    static member (~-) f = Neg f
    static member (+) (f, g) = Add(f, g)
    static member (-) (f, g) = Sub(f, g)
    static member (*) (f, g) = Mul(f, g)

let rec (|Term|_|) = function
    | Factor(e1, t) ->
        let rec aux e1 = function
          | '+'::Factor(e2, t) -> aux (e1 + e2) t
          | '-'::Factor(e2, t) -> aux (e1 - e2) t
          | t -> Some(e1, t)
        aux e1 t
    | _ -> None
and (|Factor|_|) = function
    | '-'::Factor(e, t) -> Some(-e, t)
    | Atom(e1, '*'::Factor(e2, t)) -> Some(e1 * e2, t)
    | Atom(e, t) -> Some(e, t)
    | _ -> None
and (|Atom|_|) = function
    | c::t when '0'<=c && c<='9' -> Some(Int(int(string c)), t)
    | '('::Term(e, ')'::t) -> Some(e, t)
    | _ -> None
let (Term e) = List.ofSeq "1+2*(3-4)*-5"
Note that only one minor change to the parser was required because the AST can also be constructed using the +, - and * operators.
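As a small follow-up sketch of my own (not part of the original answer), this is where the AST pays off: the evaluation that the first parser did inline becomes a separate, pure tree walk.

// Sketch only: a pure evaluator over the expr AST produced by the parser above.
let rec eval = function
    | Int n -> n
    | Neg f -> -(eval f)
    | Add(f, g) -> eval f + eval g
    | Sub(f, g) -> eval f - eval g
    | Mul(f, g) -> eval f * eval g

// Using the binding above: fst e is the AST for "1+2*(3-4)*-5",
// and eval (fst e) gives 11, matching the evaluate-as-you-parse version.
let result = eval (fst e)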
Second, is it worthwhile to try to adopt a functional approach to parsing, or is it really in optimizations over intermediate code that functional languages shine and I just haven't gotten there yet?
You're talking about purity, not functional programming. Purity is not particularly useful in the context of parsing text and, in fact, can be a real hindrance (e.g. interning symbols is a nightmare in Haskell). However, F# has many other benefits that make it good for this set of problems. In particular, although other languages like OCaml have much better tools for parsing, I think F# is the best .NET language in this context.
That is, should I muddle through the parsing in F# using an imperative style and switch to a more functional approach later on?
Depends entirely upon what you want to make functional. I'd use fslex and fsyacc with pure code to construct ASTs in the actions but impurities for anything like hash consing or generating unique IDs.
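To illustrate the kind of small, contained impurity I mean (a sketch of my own, not fslex/fsyacc code): a counter for generating unique IDs and a memo table for hash consing expr nodes.

// Sketch only: the sort of localized side effects referred to above.
open System.Collections.Generic

// Generate a fresh integer ID each time it is called.
let freshId =
    let counter = ref 0
    fun () ->
        counter.Value <- counter.Value + 1
        counter.Value

// Hash consing: return a single shared instance for structurally equal expr nodes.
let hashCons =
    let table = Dictionary<expr, expr>(HashIdentity.Structural)
    fun (e: expr) ->
        match table.TryGetValue e with
        | true, shared -> shared
        | _ -> table.[e] <- e; e

Both look pure from the outside, which is how such impurities are usually kept contained while the AST construction itself stays pure.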
You may appreciate the following articles I have written on this subject at this blog (note paywall), including "System.Reflection.Emit" (31st August 2008).