[Provers] Re: [POPLmark] Submissions page and comments updated
Robert Harper
rwh at cs.cmu.edu
Tue May 3 11:36:00 EDT 2005
Hi Peter,
Yes, in our work here we use the technical report version as
definitive; the Milner volume paper was an excerpt written under severe
space limitations. The reason for making the control stack explicit was
to avoid the massive cheat in The Definition of Standard ML to do with
formalizing exceptions. There are, officially, many more dynamic
semantic rules there than meet the eye, whereas in the HS formulation
there are no such dodges. This is particularly important for
mechanization, even for as simple a task as simulation, as Gilles Kahn
discovered way back in the late '80s when he was using Centaur to
implement The Definition. Another lesson in the hygienic benefits of
formalization in a logical framework; pencil and paper let you get away
with all sorts of sins!
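To make the point concrete, here is a toy sketch in ML (emphatically
not the HS formulation itself, just a made-up little language with
addition, raise, and handle) of how an explicit frame stack deals with
exception propagation by one generic unwinding transition, rather than
by one implicit propagation rule per construct:

  (* Toy sketch, not the HS formulation: a frame-stack machine in which
     a raised packet is propagated by a single unwinding transition. *)

  type exp =
    | Num of int
    | Plus of exp * exp
    | Raise of exp                  (* raise a packet carrying a value *)
    | Handle of exp * exp           (* run e1; if it raises, run e2    *)

  type value = Int of int

  type frame =
    | PlusL of exp                  (* holds the right operand; left in progress *)
    | PlusR of value                (* holds the left value; right in progress   *)
    | RaiseF                        (* evaluating the packet                     *)
    | HandleF of exp                (* pending handler                           *)

  type state =
    | Eval   of frame list * exp    (* evaluate an expression          *)
    | Return of frame list * value  (* return a value to the stack     *)
    | Unwind of frame list * value  (* packet searching for a handler  *)

  let step : state -> state = function
    | Eval (k, Num n)                    -> Return (k, Int n)
    | Eval (k, Plus (e1, e2))            -> Eval (PlusL e2 :: k, e1)
    | Eval (k, Raise e)                  -> Eval (RaiseF :: k, e)
    | Eval (k, Handle (e, h))            -> Eval (HandleF h :: k, e)
    | Return (PlusL e2 :: k, v1)         -> Eval (PlusR v1 :: k, e2)
    | Return (PlusR (Int m) :: k, Int n) -> Return (k, Int (m + n))
    | Return (RaiseF :: k, p)            -> Unwind (k, p)  (* start unwinding      *)
    | Return (HandleF _ :: k, v)         -> Return (k, v)  (* no packet: drop it   *)
    | Unwind (HandleF h :: k, _)         -> Eval (k, h)    (* nearest handler wins *)
    | Unwind (_ :: k, p)                 -> Unwind (k, p)  (* the ONE unwind rule  *)
    | s                                  -> s              (* final or stuck       *)

In the big-step style of The Definition, by contrast, every rule whose
premises can evaluate to a packet officially needs companion rules
propagating that packet to the conclusion; those are precisely the
rules that never appear on the page.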
I have not thought about how best to give semantics to Acute.
The parenthetical remark at the end was an expression of my
exasperation that we haven't even gotten close in this discussion to
what I see as the real issues in our solution to the POPLmark
challenge. On reflection I realize that it is to the overall good to
hash out these preliminary matters explicitly, since they are clearly
not well understood and since our thinking is at least somewhat at odds
with yours.
The specific issue I mentioned is that when formalizing records, we
would like to mod out by re-ordering of fields, but there is no good
way to do this in LF. The LF methodology provides alpha conversion and
substitution as structural congruences, but that's it: you cannot
impose any other such congruences on terms. It would be useful to
explore whether one can relax this a bit and permit more general forms
of equivalence to be imposed (case in point, re-ordering of record
fields). Roberto Virga's work on rewriting for LF goes part of the
way, but doesn't directly address this problem. I'd be curious to see
whether we can build on that to do better. (A toy illustration of the
record problem appears below.)

There is also the issue of the "technique" we used to solve the
transitivity problem. The argument works, but it feels a bit brittle
to me. Similar issues came up in our formalization of the HS semantics
in Twelf, requiring even dirtier tricks, er, I mean more clever
techniques, to solve. I'm not at all certain what to think about this.
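To come back to the record question, here is the toy illustration I
promised, in ML rather than LF and purely hypothetical (it is not code
from our solution): because the framework will not identify record
types up to permutation of their fields for you, you end up either
stating the equivalence as an explicit judgment or normalizing to a
canonical field order, along these lines:

  (* Hypothetical illustration only: equivalence of record types up to
     re-ordering of fields has to be programmed, here by sorting fields
     into a canonical order, because the framework supplies only alpha
     conversion and substitution as built-in structural congruences. *)

  type tp =
    | Int
    | Arrow of tp * tp
    | Record of (string * tp) list   (* labelled fields, in source order *)

  (* canonical form: recursively sort fields by label *)
  let rec canon (t : tp) : tp =
    match t with
    | Int -> Int
    | Arrow (t1, t2) -> Arrow (canon t1, canon t2)
    | Record fields ->
        Record
          (fields
           |> List.map (fun (l, t) -> (l, canon t))
           |> List.sort (fun (l1, _) (l2, _) -> compare l1 l2))

  (* equivalence up to re-ordering is then plain equality of canonical forms *)
  let equiv (t1 : tp) (t2 : tp) : bool = canon t1 = canon t2

Nothing like canon is available at the level of LF terms, which is
exactly the irritation: the identification has to be carried along
explicitly in everything that touches records.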
Bob
On May 3, 2005, at 9:55 AM, Peter Sewell wrote:
>
>
> Stephan Zdancewic wrote:
>> I'm not sure I understand what you intend here. According to the
>> paper "A Type-Theoretic Interpretation of Standard ML" (Chris Stone
>> and Robert Harper, Milner Festschrift, 2000), figure 6 on page 8,
>> the operational semantics *are* described using evaluation contexts.
>> Was that formalization unnecessarily complicated by their use?
>
>
> Robert Harper wrote:
>> We used control stacks explicitly, not evaluation contexts (with
>> filling and all that rot).
>>
>> The reason why is that when you do the metatheory (by hand or
>> otherwise) you have to reproduce what the SOS formulation would've
>> given you straight away.
>
>
> Looking at the tech report (CMU-CS-97-147, p30ff) there's a different
> presentation from the Festschrift, with the same grammar of atomic
> evaluation contexts but with some 35 administrative reduction rules to
> push them on and off an explicit frame stack.
>
> For me personally, that seems notationally heavy - I'd be prepared to
> have slightly more complex metatheory proofs in order to get rid of
> those rules. I wouldn't argue that that isn't a matter of taste,
> though - certainly it isn't fundamental.
>
> A more interesting question is how to best express computation under
> _binding_ contexts, eg under pi-calculus-like new binders, or for
> semantics using instantiation under value-let bindings rather than
> substitution. If you were formalising that, would you still do so
> in the same style, with the administrative rules changing the binding
> structure??
>
> (We had a lot of this kind of thing in our Acute semantics, with
> concurrency and with computation under module definitions; it seemed
> necessary to use several different flavours of binding evaluation
> context to express it concisely.)
>
>
>
> Robert Harper wrote:
>> working on real issues in logical frameworks (such as how to
>> incorporate more interesting structural congruences, a point where a
>> *legitimate* criticism of our solution can be levelled.)
>
> meaning?
>
> Peter