I am trying to write an interpreter for John Tromp's binary lambda calculus (BLC).
I have written code to do the following:
What happens then?
Or am I misunderstanding how BLC works?
I would suggest using http://www.ioccc.org/2012/tromp/hint.html as your primary reference. The Wikipedia page is probably fine, but Tromp's own notes on BLC are quite good.
On the topic of input and output, he has this to say:
More specifically, it defines a universal machine, which, from an input stream of bits, parses the binary encoding of a lambda calculus term, applies that to the remainder of input (translated to a lazy list of booleans, which have a standard representation in lambda calculus), and translates the evaluated result back into a stream of bits to be output.
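To make the "parses the binary encoding" step concrete, here is a minimal sketch (in Python, purely for illustration; the function and term representation are my own, not from Tromp's code) of a parser for BLC's term encoding: 00 introduces a lambda abstraction, 01 an application, and a variable with de Bruijn index n is written as n+1 ones followed by a zero.

```python
def parse(bits, pos=0):
    """Parse one BLC term from a string of '0'/'1' characters.

    Returns (term, next_position). Terms are nested tuples:
      ('lam', body), ('app', fun, arg), ('var', de_bruijn_index).
    """
    if bits[pos] == '0':
        if bits[pos + 1] == '0':            # 00: lambda abstraction
            body, pos = parse(bits, pos + 2)
            return ('lam', body), pos
        else:                               # 01: application, fun then arg
            f, pos = parse(bits, pos + 2)
            a, pos = parse(bits, pos)
            return ('app', f, a), pos
    n = 0                                   # 1...10: de Bruijn variable
    while bits[pos] == '1':
        n += 1
        pos += 1
    return ('var', n - 1), pos + 1          # 0-based index into enclosing lambdas

# Example: the identity function, lambda x. x, is encoded as "0010"
# (00 = lambda, 10 = the innermost variable).
term, end = parse("0010")
# term == ('lam', ('var', 0)), end == 4
```

The key point for the question is what this buys you: once the parser stops (at `end` above), everything *after* that position in the input stream is not part of the program; the universal machine turns those remaining bits into the lazy list of booleans and applies the parsed term to it.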