[TYPES] Typed lambda calculi with weak conversion

Thibaut Balabonski thibaut.balabonski at free.fr
Thu Mar 1 09:09:46 EST 2018


Dear Filippo, dear Gabriel,

One reason Çağman and Hindley's notion of weak reduction is appreciated might be that it makes the weak lambda-calculus behave exactly like an orthogonal first-order term rewriting system (which basically means ‘simple and nicely confluent’).
In particular, it never hides a redex, as happens with the standard notion on a term like

  (\xy.x)((\z.z) a)

which normal order reduction reduces to the weak normal form

  \y.((\z.z) a)

where the redex that was once visible in the argument is now hidden.
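
Under Çağman and Hindley's weak reduction, by contrast, the argument (\z.z) a is closed, so it stays reducible even under the binder and the computation can still be completed:

  (\xy.x)((\z.z) a)  -->  \y.((\z.z) a)  -->  \y.a
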
I assume that when you speak of ‘normalization’ you mean ‘strong normalization’, that is, requiring that every reduction strategy reaches a normal form (in contrast to ‘weak normalization’, which only requires the existence of one, potentially non-computable, normalizing strategy). Otherwise the two variants of weak reduction definitely differ.

One proof of the equivalence is in Sec. 4 of this paper:
https://www.lri.fr/~blsk/Docs/Balabonski-FullLaziness-POPL12.pdf


Regarding the ‘extrusion’ transformation, this looks really similar to the hoisting of maximal free expressions. The transformation is present in the usual presentation of fully lazy lambda-lifting (see SPJ’s The Implementation of Functional Programming Languages). It is also presented as a program transformation that implements fully lazy reduction using a ‘simply lazy’ evaluation mechanism (see Ariola and Felleisen’s JFP version of the call-by-need lambda-calculus).
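
As a minimal illustration (not the full algorithm), hoisting the maximal free expression (\z.z) a out of the abstraction \y.((\z.z) a) from above yields

  (\w. \y. w) ((\z.z) a)

where the hoisted expression sits in reducible position for the standard notion of weak reduction.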

Thibaut.



> On 25 Feb. 2018 at 16:34, Gabriel Scherer <gabriel.scherer at gmail.com> wrote:
> 
> [ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
> 
> # Çağman and Hindley's notion of weak reduction
> 
> For those who might not be familiar with the work of Çağman and Hindley
> (1998) (it is a nice paper that is still readable today, so don't hesitate
> to have a look): they give a definition of weak reduction (as opposed to
> strong/full reduction) that is a bit stronger (and arguably better-behaved)
> than the standard one: instead of saying "don't reduce under lambdas", they
> say "only reduce closed terms" -- this is essentially the same thing, except
> that they allow themselves to reduce closed sub-terms that occur under a
> lambda. For example:
> 
>  (lam y. (lam x. x) (lam z. z))
> 
> would not reduce for the common notion of weak reduction ((lam y . []) is
> not an evaluation context), but it does reduce to
> 
>  (lam y. (lam z. z))
> 
> for their notion of weak reduction. (Luc Maranget once claimed to me that
> this is the "right" notion of weak reduction.)
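> 
> To make this concrete, here is a minimal OCaml sketch of one step of CH
> weak reduction (my own illustration, not code from their paper; the term
> representation and all names are assumptions of the sketch). A beta-redex
> may be contracted anywhere in the term, even under a lambda, provided it
> mentions none of the variables bound above it:
> 
>   type term =
>     | Var of string
>     | Lam of string * term
>     | App of term * term
> 
>   module S = Set.Make (String)
> 
>   let rec fv = function
>     | Var x -> S.singleton x
>     | Lam (x, t) -> S.remove x (fv t)
>     | App (t, u) -> S.union (fv t) (fv u)
> 
>   (* Naive substitution: capture-free whenever [u] is closed, which is
>      the case for every redex contracted below whenever the whole input
>      term is closed. *)
>   let rec subst x u = function
>     | Var y -> if String.equal x y then u else Var y
>     | Lam (y, t) -> if String.equal x y then Lam (y, t)
>                     else Lam (y, subst x u t)
>     | App (t1, t2) -> App (subst x u t1, subst x u t2)
> 
>   (* One CH-weak step: contract a beta-redex anywhere in the term, as
>      long as it uses none of the variables bound above it ([bound]). *)
>   let rec step bound t =
>     match t with
>     | App (Lam (x, body), u) when S.is_empty (S.inter (fv t) bound) ->
>         Some (subst x u body)
>     | App (t1, t2) ->
>         (match step bound t1 with
>          | Some t1' -> Some (App (t1', t2))
>          | None ->
>              (match step bound t2 with
>               | Some t2' -> Some (App (t1, t2'))
>               | None -> None))
>     | Lam (x, body) ->
>         (match step (S.add x bound) body with
>          | Some body' -> Some (Lam (x, body'))
>          | None -> None)
>     | Var _ -> None
> 
>   (* The example above: the redex sits under (lam y. []), yet it steps:
>      step S.empty (Lam ("y", App (Lam ("x", Var "x"), Lam ("z", Var "z"))))
>      = Some (Lam ("y", Lam ("z", Var "z"))) *)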
> 
> One nice idea in the paper of Çağman and Hindley is that this notion can
> be characterized exactly by using the no-variable-capture property of
> substitution (as opposed to context plugging). Strong/full reduction can
> be defined by saying that, for any head-reduction from A to B (a "head"
> reduction is when A is the redex, instead of merely containing the redex as
> a subterm; if you only have functions, this is when A is of the form ((lam
> x. C) D)):
> 
>  C[A] reduces to C[B]
> 
> for an arbitrary context C. (Just a term with a hole, no particular
> restriction to a family of evaluation contexts). Çağman and Hindley remark
> that their notion of weak reduction can be characterized by saying that,
> for any head-reduction from A to B,
> 
>  C[A/x] reduces to C[B/x]
> 
> Using (capture-avoiding) substitution instead of context plugging imposes
> that A and B do not contain a variable bound in C, which corresponds
> precisely to their notion of weak reduction.
> (Consider (lam y . z) and (lam y . y): the former can be written as (lam y
> . x)[z/x], but the latter is not equal to (lam y . x)[y/x].)
> 
> # Normalization of weak reduction
> 
> To answer Filippo's question: I don't think that enlarging the notion of
> weak reduction to reduce closed subterms changes the normalization behavior
> in any way: the two notions of weak reduction have the same normalization
> properties.
> 
> In particular, consider a term of the form
> 
>  C[A]
> 
> where A is a closed sub-term, in the sense that it is under some lambdas in
> C but does not use their bound variables. A is not reducible under the
> no-reduction-under-lambda interpretation (the common one), but it is
> reducible under the reduce-closed-subterms interpretation (Çağman and
> Hindley's -- let's call it "CH weak reduction" for short). But this term is
> beta-equivalent to
> 
>  (lam x. C[x]) A
> 
> where A is in reducible position under either notion of reduction. If
> C[A] is non-normalizable for CH weak reduction, then ((lam x. C[x]) A) is
> non-normalizable for standard weak reduction. If the standard calculus has
> a weak-normalization property, then you know that CH-reducing A in C[A]
> preserves normalization.
> 
> My intuition suggests that this can be turned into a general construction
> that extrudes all closed sub-terms of a term, producing from any term T an
> extruded term T' such that the CH-weak normal form of T is the standard
> normal form of T'. (Of course, this is only intuition; one has to do the
> work.) (This extrusion is type-preserving.) This means that any
> normalization result for standard weak reduction implies a normalization
> result for CH weak reduction.
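> 
> As a rough sketch of what that construction could look like (my own
> speculative rendering, not the construction worked out in full; it reuses
> the term type, fv and S from the OCaml snippet above, and gensym and the
> choice of what to hoist are assumptions): every maximal closed subterm
> found under a binder is replaced by a fresh variable and re-applied at
> the top, in the (lam x. C[x]) A style.
> 
>   let is_closed t = S.is_empty (fv t)
> 
>   (* Fresh-name supply; assumes source terms avoid the "_h" prefix. *)
>   let gensym =
>     let n = ref 0 in
>     fun () -> incr n; Printf.sprintf "_h%d" !n
> 
>   (* Replace each maximal closed subterm occurring under a lambda by a
>      fresh variable, collecting the (variable, subterm) bindings.
>      (A finer version would hoist only subterms containing redexes.) *)
>   let rec hoist under_lam t =
>     match t with
>     | _ when under_lam && is_closed t ->
>         let x = gensym () in
>         (Var x, [ (x, t) ])
>     | Var _ -> (t, [])
>     | Lam (x, body) ->
>         let body', bs = hoist true body in
>         (Lam (x, body'), bs)
>     | App (t1, t2) ->
>         let t1', bs1 = hoist under_lam t1 in
>         let t2', bs2 = hoist under_lam t2 in
>         (App (t1', t2'), bs1 @ bs2)
> 
>   (* Re-apply the hoisted subterms at the top: C[A] becomes
>      (lam x. C[x]) A, iterated once per binding. *)
>   let extrude t =
>     let t', bs = hoist false t in
>     List.fold_left (fun acc (x, a) -> App (Lam (x, acc), a)) t' bs
> 
>   (* extrude (lam y. (lam x. x) (lam z. z)) yields
>      (lam _h1. lam y. _h1) ((lam x. x) (lam z. z)),
>      where the redex is now reducible for standard weak reduction. *)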
> 
> 
> # Explicit substitutions
> 
> I believe that CH weak reduction may be related to calculi with explicit
> substitutions.
> CH weak reduction is "more confluent" than standard reduction, as reducing
> (lam x. A) B preserves the property that B is in reducible position. (This
> is not true of standard weak reduction: consider the case where A is (lam y
> . C).) This property, that head reduction preserves the reducibility of
> subterms, also holds for calculi with explicit substitutions.
> 
> On Sun, Feb 25, 2018 at 12:06 PM, Filippo Sestini <sestini.filippo at gmail.com> wrote:
> 
>> [ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
>> 
>> Dear all,
>> 
>> The notion of weak reduction first appeared in [1], and was later
>> developed into the definition of a weak lambda calculus in, for
>> example, [2]. The crucial property of weak reduction is that it
>> does not validate the ξ rule, but some form of computation under
>> binders is still possible.
>> 
>> I'm looking for references in the literature treating typed
>> calculi with full lambda syntax and weak conversion, in
>> particular regarding normalization. What I've found so far seems
>> to either be limited to a form of combinator syntax, or only
>> consider evaluation to weak head normal form ([3], [4]).
>> Do you have any suggestions?
>> 
>> Thank you
>> 
>> Sincerely
>> 
>> [1] W.A. Howard, Assignment of Ordinals to Terms for Primitive
>> Recursive Functionals of Finite Type, 1970
>> 
>> [2] N. Çağman, J.R. Hindley, Combinatory weak reduction in lambda
>> calculus, 1998
>> 
>> [3] P. Martin-Löf, An Intuitionistic Theory of Types: Predicative
>> Part, 1975
>> 
>> [4] T. Coquand, P. Dybjer, Intuitionistic Model Constructions and
>> Normalization Proofs, 1998
>> 
>> --
>> Filippo Sestini
>> sestini.filippo at gmail.com
>> (+39) 333 6757668
>> 
>> 


