[POPLmark] Experience report with Twelf

Dan Licata drl+ at cs.cmu.edu
Thu Aug 18 15:04:09 EDT 2005


> if Twelf hypothetically had polymorphism, we could have just written 
> something like

I meant to address this in response to the original message. If LF had
sums or polymorphism, then Twelf wouldn't be very much like what it is.  
That is, you cannot simply add sums or polymorphism to LF without
compromising the LF methodology.  

Why not?  To represent an object-language judgement in LF (judgements
are just an example; the same is true for syntax or anything else), you establish a
bijection between the derivations of the judgement and the canonical
terms of a particular LF type in contexts of a particular form.  The
theorem establishing this bijection is called "adequacy".  In the
dependent function type and base type fragment, the canonical terms of
the type representing your object-language judgement are roughly
constants and variables applied to canonical arguments.  Thus, it's
fairly easy to set these up so they correspond to the "on paper"
object-language syntax and derivations.
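As a concrete illustration (a standard example, not from the original
message), consider encoding the judgement "N is even" over the natural
numbers:

nat : type.
z : nat.
s : nat -> nat.

even : nat -> type.
even/z : even z.
even/s : even N -> even (s (s N)).

Adequacy here says that the canonical terms of type (even n) in the
empty context are in bijection with the on-paper derivations that n is
even: every such term is one of the constants even/z or even/s applied
to canonical arguments.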

If we added sums, then given a constant (in the signature) or a
variable of sum type, the canonical forms of any type C would include a
CASE on that constant or variable (as long as both branches are
inhabited).  We
would have to account for this when giving adequate encodings of an
object language, and we would have to account for it when doing
induction over canonical forms (which is what the meta-theorem apparatus
is based on).  In most cases, this would make adequate encodings and
meta-proofs difficult or impossible.
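To make this concrete, suppose LF were extended with sums, written with
hypothetical syntax +, inl, inr, and case (this is not real Twelf).
Then a signature containing

d : a + b.

would force the canonical forms of any type C to include case analyses
such as (case d of inl x => M1 | inr y => M2) whenever both branches
can be filled, in addition to constants and variables applied to
canonical arguments, and an adequacy theorem would have to account for
all of them.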

The same is true for polymorphism.  For example, the following signature:

nat : type.
z : nat.
s : nat -> nat.

b : type.
something : b.

c : all alpha. b -> alpha.

no longer gives an adequate encoding of the natural numbers as the
canonical LF terms of type nat.  That is, there is no natural number
corresponding to (c [nat] something), even though this term is
canonical.  Note, however, that from their types, the bs and cs seem to
have nothing at all to do with the nats.

Put another way, the lack of more powerful connectives is a feature, in
that it enables adequate encodings of an object language as canonical
forms of an LF type.  Using canonical forms seems important for proving
meta-theorems, as higher-order syntax and judgements aren't inductively
defined in the usual sense (they contain negative occurrences).
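For instance, in the standard higher-order encoding of the untyped
lambda calculus, the type tm occurs negatively in the argument of lam:

tm : type.
lam : (tm -> tm) -> tm.
app : tm -> tm -> tm.

So tm is not an inductive definition in the usual sense; induction is
instead over the structure of canonical forms.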

Now, I know of two ways for adding more connectives that are compatible
with the adequacy methodology.  The first is to only add well-behaved
connectives (essentially, limits).  Along these lines, Susmit Sarkar has
done the meta-theory of LF extended with dependent pairs and unit.
Linear LF includes connectives from the multiplicative fragment of
linear logic.  The second is to somehow prevent the misbehaved types
from influencing adequacy.  In Concurrent LF, the colimit types are
restricted to a monad partly for this reason.  I don't know if the
monadic approach could be usefully applied to sums or polymorphism.

Another consideration, aside from adequacy, is that the equational
theory and canonical forms of a type added to the framework must be
tractable, or the type must be isolated so that these properties are not
needed.  That is, adequacy constrains which types we'd want to add, but
even for the types we do want, we have to actually be able to work with
them in order to add them.  I think that part of the motivation for the
monad in CLF was to avoid dealing with commuting conversions.

That said, I'm not arguing in favor of writing all these trivial lemmas
that you mentioned.  I just don't know a way around them that is
compatible with the LF methodology.  And, in my experience, writing a
few trivial equality/falsehood lemmas has been less of a pain than
manually implementing alpha-conversion and capture-avoiding
substitution.
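For example, with a higher-order encoding of the untyped lambda
calculus (lam : (tm -> tm) -> tm and app : tm -> tm -> tm), the
beta-reduction rule can be written as

step : tm -> tm -> type.
step/beta : step (app (lam E) M) (E M).

Here (E M) is LF application, so capture-avoiding substitution comes
for free from the framework instead of being implemented by hand.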

-Dan

