From yuting at cs.umn.edu Thu Feb 18 19:12:33 2016
From: yuting at cs.umn.edu (Yuting Wang)
Date: Thu, 18 Feb 2016 18:12:33 -0600
Subject: [TYPES] HOAS Techniques in Verified Transformations on Functional Programs
Message-ID:

Verified Transformations on Functional Programs Using Higher-Order Abstract Syntax Techniques

I would like to announce the work that I have been doing as part of my doctoral thesis to bring out the benefits of higher-order abstract syntax methods in implementing and verifying compiler transformations commonly used for functional programming languages. Transformations such as closure conversion and code hoisting have to pay special attention to binding structure, an aspect that is given a meta-level treatment in systems such as Abella, Beluga, Twelf and Lambda Prolog.

In collaborative work with Gopalan Nadathur that will be presented at ESOP 2016, we have shown how the devices present in Lambda Prolog and Abella, two systems developed collaboratively by our group at the University of Minnesota and the Parsifal group at Inria-Saclay, can be used to provide succinct implementations and clear formal proofs for the correctness of these transformations in the context of a representative functional language. The implementation, proofs and accompanying paper are available at the following URL:

http://www-users.cs.umn.edu/~yuting/compilation/index.html

I would appreciate any comments you might have about the code and the proofs, and would also be happy to answer any questions that arise as you look at the material.

Best,
- Yuting

--
Yuting Wang
University of Minnesota, Dept. of Computer Science
http://www.cs.umn.edu/~yuting

From yufei.cai at uni-tuebingen.de Wed Jun 29 09:22:51 2016
From: yufei.cai at uni-tuebingen.de (Yufei Cai)
Date: Wed, 29 Jun 2016 15:22:51 +0200
Subject: [TYPES] Fwd: Denotational semantics of Fω with type- & term-recursion
In-Reply-To:
References:
Message-ID:

Is there a denotational semantics for System Fω in the literature that supports both recursive types and general recursion?

I'm looking for a model of Ralf Hinze's variant of System Fω [4]. It has a term-level fixed point combinator

  fix : ∀X. (X → X) → X

and type-level fixed point combinators, one for each kind κ:

  μ_κ : (κ → κ) → κ.

Neither the ideal model nor the interval model seems to fit. In the ideal model [1], the convertibility rule μ T ≃ T (μ T) is not sound, because not all ideal functions have fixed points. In the interval model [2], it is not clear how to make the term-abstraction and term-application rules sound. Martini's interval model for F [3] restricts types to maximal intervals, but such a restriction would leave the inhabited type ¬(∀X. X) → ¬(∀X. X) without any meaning.

[1] MacQueen, Plotkin, Sethi. An ideal model for recursive polymorphic types.
[2] Cartwright. Types as intervals.
[3] Martini. An interval model for second order lambda calculus.
[4] Hinze. Polytypic values possess polykinded types.

Yufei Cai

Reposting cstheory.stackexchange.com/q/36072
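[ Illustration: a minimal Haskell approximation of the two combinators in question. This is a sketch of the setting, not Hinze's system: Haskell mediates the type-level fixed point with explicit In/out coercions instead of validating the convertibility rule μ T ≃ T (μ T). ]

-- Term-level fixed point combinator, fix : forall X. (X -> X) -> X
fix :: (x -> x) -> x
fix f = f (fix f)

-- Type-level fixed point, Mu :: (* -> *) -> *.  The isomorphism
-- between Mu f and f (Mu f) is mediated by In/out rather than by a
-- convertibility rule.
newtype Mu f = In { out :: f (Mu f) }

-- Example: naturals as a type-level fixed point, consumed with the
-- term-level fix.
data NatF x = Zero | Succ x

toInt :: Mu NatF -> Int
toInt = fix (\rec n -> case out n of
                         Zero   -> 0
                         Succ m -> 1 + rec m)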
From moggi at disi.unige.it Wed Jun 29 15:48:12 2016
From: moggi at disi.unige.it (Eugenio Moggi)
Date: Wed, 29 Jun 2016 21:48:12 +0200
Subject: [TYPES] Fwd: Denotational semantics of Fω with type- & term-recursion
Message-ID: <87oa6jvksj.fsf@unige.it>

> Is there a denotational semantics for System Fω in the literature
> that supports both recursive types and general recursion?

Domain-theoretic models with a "type of all types" support these features, and also some form of dependent types.

R. Amadio, K.B. Bruce, G. Longo. The finitary projection model for second order lambda calculus and solutions to higher order domain equations. In A. Meyer (Ed.), Logic in Computer Science, IEEE Computer Society, Washington, DC (1986), pp. 122-130.

Another model, where types are "closures" on a universal domain, can be found in

N. McCracken. An Investigation of a Programming Language with a Polymorphic Type Structure. Doctoral dissertation, Syracuse University (1979).

More domain-theoretic models can be found in

Thierry Coquand, Carl Gunter, Glynn Winskel. Domain theoretic models of polymorphism. Information and Computation, Volume 81, Issue 2, 1989, pp. 123-167.

Best Regards
Eugenio Moggi

From aaronngray.lists at gmail.com Mon Jul 11 21:59:58 2016
From: aaronngray.lists at gmail.com (Aaron Gray)
Date: Tue, 12 Jul 2016 02:59:58 +0100
Subject: [TYPES] Founding papers on Kinds
Message-ID:

Hi,

I am looking for the founding papers on kinds, and for any other interesting papers that review them, particularly with regard to their implementation.

Many thanks in advance,

Aaron

--
Independent Software Engineer and amateur Computer Scientist

From aaronngray.lists at gmail.com Tue Jul 26 07:56:55 2016
From: aaronngray.lists at gmail.com (Aaron Gray)
Date: Tue, 26 Jul 2016 12:56:55 +0100
Subject: [TYPES] Wanted: A space of retracts, manuscript, Dana Scott 1980 Merton College, Oxford
Message-ID:

Hi,

I am looking for a copy of Dana Scott's "A space of retracts" (manuscript, Merton College, Oxford, 1980), in either paper or electronic form.

Hope you can help!

Regards,

Aaron

From monnier at iro.umontreal.ca Thu Aug 25 00:40:22 2016
From: monnier at iro.umontreal.ca (Stefan Monnier)
Date: Thu, 25 Aug 2016 00:40:22 -0400
Subject: [TYPES] Erasure and equality proofs
Message-ID:

I'm trying to get my head around the practical use of erasure/irrelevant/implicit arguments, as discussed in papers such as:

  The Implicit Calculus of Constructions as a Programming Language with Dependent Types
  Bruno Barras and Bruno Bernardo, FoSSaCS 2008

  Erasure and polymorphism in pure type systems
  Nathan Mishra-Linger and Tim Sheard

  On the strength of proof-irrelevant type theories
  Benjamin Werner, IJCAR 2006

One of the desired benefits is to erase equality proofs, but those can't be erased naively, since they're not only used in type annotations: they're fundamentally passed to the corresponding elimination function (or "match" expression). Eliminating them naively could introduce inconsistencies (if we remove them in those cases where the equality proposition is actually an uninhabited type).

Benjamin suggests we can erase those equality proofs under one condition: the elimination rule is adjusted so that it checks equality before performing the reduction. But this, in turn, requires that the two equal terms are not erasable.

Bruno Barras et al. suggest other restrictions: basically, they add a new eliminator "eq_ind2" with all arguments erasable but without a reduction rule (i.e. as a mere axiom).

So I'm looking for further info/work on actual uses of such systems, where the practical impact of such restrictions is investigated. I'm especially interested in this because I have the impression (just a gut feeling) that equality is not really special in this respect, and that many/most other places where proofs are passed end up needing special treatment as well in order for those proofs to be erasable.
IOW, while I see that erasable arguments work fine for System-F-style parametric type arguments, most of the time when I try to make a proof "erasable", I end up bumping into some place in the code where I need the proof to appear in a non-erasable position (unless I add ad-hoc special rules along the lines suggested above for the equality type).

I get the impression that we'd need erasability to be extended along the lines of Coq's special treatment of single-constructor elimination from Prop to Set, but such an extension doesn't seem to be consistent with the use of "convertibility modulo erasure".

Stefan

From andreas.abel at ifi.lmu.de Thu Aug 25 14:03:26 2016
From: andreas.abel at ifi.lmu.de (Andreas Abel)
Date: Thu, 25 Aug 2016 20:03:26 +0200
Subject: [TYPES] Erasure and equality proofs
In-Reply-To:
References:
Message-ID: <0246535e-7c0a-8d4c-c09b-17f4088834f7@ifi.lmu.de>

Hi Stefan,

Alan Jeffrey used Agda's irrelevance and added an axiom that one can still substitute with an irrelevant equality proof. He might be able to tell you more.

I am also very much interested in the interaction between irrelevance and case distinction/recursion and equality, so if you get some answers, please keep me in the loop!

Best,
Andreas

On 25.08.2016 06:40, Stefan Monnier wrote:
> I'm trying to get my head around the practical use of
> erasure/irrelevant/implicit arguments, as discussed in papers such as
> [...]
--
Andreas Abel <>< Du bist der geliebte Mensch.
Department of Computer Science and Engineering
Chalmers and Gothenburg University, Sweden
andreas.abel at gu.se
http://www2.tcs.ifi.lmu.de/~abel/

From monnier at IRO.UMontreal.CA Mon Aug 29 13:33:37 2016
From: monnier at IRO.UMontreal.CA (Stefan Monnier)
Date: Mon, 29 Aug 2016 13:33:37 -0400
Subject: [TYPES] Erasure and equality proofs
In-Reply-To: <0246535e-7c0a-8d4c-c09b-17f4088834f7@ifi.lmu.de> (Andreas Abel's message of "Thu, 25 Aug 2016 20:03:26 +0200")
References: <0246535e-7c0a-8d4c-c09b-17f4088834f7@ifi.lmu.de>
Message-ID:

> Alan Jeffrey used Agda's irrelevance and added an axiom that one can still
> substitute with an irrelevant equality proof. He might be able to tell you more.
> I am also very much interested in the interaction between irrelevance and
> case distinction/recursion and equality, so if you get some answers, please
> keep me in the loop!

No other answer so far :-(

In Benjamin Werner's article, he proposes as crucial the property that erasure and reduction commute, but I think that a way out may be to relax this requirement. Instead of having (writing [e] for the erasure of e)

  e → e'  <=>  [e] → [e']

I think intuitively it should be sufficient to have something like

  e → e'  <=  [e] → [e']

maybe combined with

  e → e'  =>  [e] → [e']  *in an empty context*

That would allow us to have more reduction rules (e.g. reduction of eq_ind2 when applied to eq_refl) on the non-erased side of the world. I think this should be safe in the sense that the lack of corresponding reductions in the erased world could simply cause the convertibility check to fail (i.e. it will reject code which it could/should accept), but it should not introduce inconsistencies. This is similar to adding eq_ind2 as an axiom, except that we can still have a reduction rule for it in the non-erased world. It does bring unexpected behaviors, though: code may be accepted or rejected depending on the order in which erasure and reduction take place.

Of course, the question remains whether adding eq_ind2 as an axiom is safe, and/or how it might interact with other axioms. E.g., as mentioned in Benjamin's article, erasable equality proofs make the K axiom unnecessary (although I was too weak to actually prove the K axiom itself; I wonder if it's just because of my being feeble-minded, or if it's really that such erasability does not entail the K axiom).

Stefan

> On 25.08.2016 06:40, Stefan Monnier wrote:
>> I'm trying to get my head around the practical use of
>> erasure/irrelevant/implicit arguments, as discussed in papers such as
>> [...]
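[ Illustration: why equality proofs resist naive erasure, rendered in Haskell's propositional equality. Haskell is not one of the dependently-typed calculi under discussion, so this only sketches the phenomenon Monnier describes. ]

{-# LANGUAGE TypeOperators #-}
import Data.Type.Equality ((:~:)(Refl))

-- Transport along an equality proof (this is how the library function
-- Data.Type.Equality.castWith is defined).  The pattern match on Refl
-- is the elimination mentioned above: the proof is genuinely consumed,
-- not just a type annotation.
cast :: a :~: b -> a -> b
cast Refl x = x

-- The naively "erased" version,
--     cast _ x = x
-- is rejected by the type checker: without forcing the proof there is
-- no evidence that a and b coincide, and a diverging "proof" of an
-- uninhabited equality such as Int :~: Bool would let us coerce
-- anything -- exactly the inconsistency described in this thread.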
From gabriel.scherer at gmail.com Thu Oct 6 14:58:39 2016
From: gabriel.scherer at gmail.com (Gabriel Scherer)
Date: Thu, 6 Oct 2016 14:58:39 -0400
Subject: [TYPES] decidability of BCCCs?
In-Reply-To: <29DC2B99-808C-4CBC-A3F2-512E0E85E405@cs.harvard.edu>
References: <29DC2B99-808C-4CBC-A3F2-512E0E85E405@cs.harvard.edu>
Message-ID:

Hi all,

I'm coming back to this old types-list thread to announce the article

  Deciding equivalence with sums and the empty type
  Gabriel Scherer, 2016
  https://arxiv.org/abs/1610.01213

which proposes a proof of decidability of contextual equivalence for the simply-typed lambda-calculus with sums and empty types, which corresponds to decidability in the free bicartesian category. Any feedback is of course warmly welcome.

# Context of the work

The proof builds on an extension of the proof-theoretic technique of focusing. Focusing is a technique for making the proofs of a logic more canonical by reasoning on the invertibility of proof rules, and imposing an ordering between invertible phases (automatic) and non-invertible phases (interesting). In previous work with Didier Rémy, we proposed a refinement of focused intuitionistic logic that is "saturated" (non-invertible phases have to perform all possible non-invertible deductions at once, modulo some restrictions to preserve termination), and gives quasi-normal forms that are quite close to the normal forms in previous work on equivalence in the presence of sums.
The goal was to answer the question, in a type system with sums but no empty types (we knew empty types are hard, in large part due to the discussion in this very types-list thread): "which types have a unique inhabitant?", which requires a type system with as few redundant (equivalent) terms as possible.

While experimenting with the implementation of an algorithm that computes an answer to the following question (given a type, explore its saturated derivations to check whether there is a unique one), we noticed that adding the empty type seemed to work perfectly, although our theory did not account for it. While wrapping up my PhD manuscript ( https://hal.inria.fr/tel-01309712/ ; it contains a detailed introduction to focusing and saturation), I tried to provide a rigorous canonicity proof for the system extended with the empty type, but could not complete one in time for submission. The above article now proposes an answer to this question.

Many thanks to the participants in this types-list thread, which got me interested in trying to extend the system with empty types. Without it I also wouldn't have found out about Arbob Ahmad's work, an (unfortunately unpublished) independent use of focusing for program equivalence which I find very interesting.

# Related work

I don't yet have a good understanding of the relation between this approach and existing work on Normalization by Evaluation (NbE). NbE requires a good understanding of how to combine/compose fragments of normal forms together while evaluating the original source terms; our normal forms have a very non-local structure that makes this difficult. (In particular, I think that it's possible to present an equivalence algorithm that compares two focused terms, computing a saturated form on the fly; but starting from un-focused terms and doing focusing on the fly looks rather difficult.) Saturated normal forms are related to the notion of "maximally multi-focused proof" (multi-focusing is an extension of focusing that has been studied in the last decade), and this corresponds to the known difficulty of formulating cut-elimination processes that preserve maximal multi-focusing. Giving a normalization-by-evaluation presentation of saturation would thus be very interesting.

It is easier to relate to the rewriting-based approaches of Neil Ghani and Sam Lindley; the normalization of focused terms into saturated terms is very close to the rewriting they propose. In rewriting-based approaches, one tries to extrude matches of the form (match n with ...) (where n is a neutral term; if n started with a constructor, this would be beta-reducible) to move them closer to the root of the term, and then to merge redundant matches. When rewriting a focused term into a saturated term, the syntax has let-bindings for neutrals of positive type (let x = n in ...), and those get extruded up in the term. Then the focusing structure imposes that the corresponding match happen as soon as possible after that (so the "merge" of redundant matches is implied by focusing). (The use of control operators in Balat, Di Cosmo, Fiore also serves the same purpose of finding the earliest possible place to introduce a neutral.) On the other hand, saturation will do forward-based proof search to synthesize all definable neutrals, instead of just permuting fragments of the two terms to compare, and this is the key ingredient that makes it gracefully extend to the empty type: one has to go out in the wild looking for proofs of 0.
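[ Illustration: a small Haskell rendering of the match extrusion described above, i.e. the classic commuting conversion for sums that such rewriting systems apply. This is an example of the rewrite, not the algorithm of the paper. ]

-- The commuting conversion extrudes a match on a neutral n outward:
--   g (match n with inl x -> f1 x | inr y -> f2 y)
--     ~  match n with inl x -> g (f1 x) | inr y -> g (f2 y)
-- Rendered with Haskell's Either:
lhs, rhs :: (c -> d) -> Either a b -> (a -> c) -> (b -> c) -> d
lhs g n f1 f2 = g (either f1 f2 n)
rhs g n f1 f2 = either (g . f1) (g . f2) n
-- lhs and rhs are beta-eta(+sums) equal but syntactically far apart;
-- an equivalence procedure must identify them, e.g. by moving the
-- match on n toward the root and merging duplicate matches on n.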
On Sat, Sep 21, 2013 at 10:25 AM, Ryan Wisnesky wrote:
> Hi Roberto,
>
> BCCCs seem to come up a lot at the intersection between database theory and programming language theory. For example, if you say that a database schema is a finitely presented category, then the category of schemas and mappings between them is a BCCC. For this particular BCCC, I believe that equality is semi-decidable, so I would like to say that its semi-decidability comes from its particular nature, and is not inherited from the free BCCC. BCCCs also come up when you try to connect higher-order logic to the nested relational calculus via their categorical (topos) semantics. I'd be happy to talk more about it offline.
>
> Thanks,
> Ryan
>
> On Sep 21, 2013, at 9:34 AM, Roberto Di Cosmo wrote:
>
>> Dear Ryan,
>> beta-eta equality in the presence of strong sums is a tricky subject that has been the object of some attention for more than 20 years, both "per se" in the rewriting community, which looked for an elegant confluent rewriting system, and as a "tool" for researchers interested in type theory and semantics, who wanted to know for example whether one can characterise object isomorphism in a BiCCC, a problem that one can try to attack by studying invertible terms in the corresponding lambda calculus with strong types.
>>
>> If one stays in the 1,*,-> fragment, all is wonderful: you get a nice confluent and normalising rewriting system for the lambda calculus based on eta expansion (see a blazingly short proof in http://www.dicosmo.org/Articles/POD.ps), and the type isomorphisms are finitely axiomatizable and exactly correspond to the equations one knows from high school for natural numbers equipped with product, unit and exponentiation, related to Tarski's High School algebra problem (a nice proof through Finite Sets is due to Sergei Soloviev in 1988; I gave one based on invertible terms in my PhD thesis back in 1992).
>>
>> You may add weak sums to the calculus, and still get a nice confluent and normalising rewriting system, which can also accommodate bounded recursion (http://www.pps.univ-paris-diderot.fr/~kesner/papers/icalp93.ps.gz).
>>
>> As soon as you add strong sums, though, things get very hairy: Neil Ghani proposed in his PhD thesis back in 1995 a decision procedure for lambda terms with strong 1,+,*,-> (no zero, see https://personal.cis.strath.ac.uk/neil.ghani/papers/yellowthesis.ps.gz): a rewriting system is used to bring terms in this calculus to a normal form which is not unique, and then it is shown that all such normal forms are equivalent modulo a sort of commuting conversions; so to decide equality of two terms, you first normalise them, and then check whether they are equivalent w.r.t. conversion (the equivalence classes being finite, this is decidable). The NbE approach of Thorsten Altenkirch (et al.) in your LICS01 reference seems to provide a more straightforward way to decide equality, even if somebody coming from the rewriting world might like a more intuitive description of what the normal forms actually look like.
>>
>> If you add the zero, you are in for more surprises: looking at the special BiCCC formed by the natural numbers equipped with 0,1,+,* and exponentiation, you discover that equality is no longer finitely axiomatisable, and yet a decision procedure does exist (see http://www.dicosmo.org/Papers/zeroisnfa.pdf), which solves in the negative, but with a positive note, Tarski's problem for this system.
>>
>> But then, in BiCCCs, the connection between numerical equalities and type isomorphisms breaks down, and the nice result above says nothing about what happens in the general case. In http://www.dicosmo.org/Papers/lics02.pdf we proved that isos are not finitely axiomatisable in BiCCCs, but we know nothing about decidability.
>>
>> As for deciding beta-eta equality, the work of Marcelo Fiore and Vincent Balat http://www.cl.cam.ac.uk/~mpf23/papers/Types/tdpe.pdf shows how to compute normal forms in full BiCCCs, and I seem to remember that we were convinced at some point that these normal forms must be unique up to some sort of commuting conversions (you can see this in the examples of the paper), but we did not prove it at that moment, and it is true that one would like to see this last step done (or a proof that it cannot be done).
>>
>> I wonder whether Marcelo, Neil and Thorsten might shed some light on this (Vincent and I started working on quite different areas a while ago).
>>
>> By the way, out of curiosity, would you mind telling us how this result would be useful for your research?
>>
>> --
>> Roberto
>>
>> P.S.: for those interested in type isomorphism, http://www.dicosmo.org/Paper/mscs-survey.pdf provides a concise overview, even if no longer up to date.
>>
>> 2013/9/20 Ryan Wisnesky
>> Hello,
>>
>> I am trying to find a reference that states whether or not the free bi-cartesian closed category is decidable. That is, I am wondering if beta-eta equality of the simply typed lambda calculus extended with strong 0,1,+,* types is decidable. (In particular, I am interested in the full calculus, including the eliminator for 0, introduction for 1, and eta for all types.)
>>
>> So far, my search through the literature says that the answer to this question is "almost" or "maybe":
>>
>> - the STLC with strong 1,+,* types is decidable (omits 0, the empty type): http://www.cs.nott.ac.uk/~txa/publ/lics01.pdf
>>
>> - the STLC with strong 0,1,+,* types has normal forms, but equivalent terms may have different such normal forms: http://www.cl.cam.ac.uk/~mpf23/papers/Types/tdpe.pdf
>>
>> - work on mechanized proofs of correctness for deciding co-product equality may or may not include the "absurd" eliminator for the empty type: http://www.cis.upenn.edu/~sweirich/wmm/wmm07/programme/licata.pdf
>>
>> I feel like the answer to this question is probably well-known, but I can't seem to find it in the literature. Any help would be appreciated.
>>
>> Thanks,
>> Ryan
>>
>> --
>> Roberto Di Cosmo
>> ------------------------------------------------------------------
>> Professeur                 En delegation a l'INRIA
>> PPS                        E-mail: roberto at dicosmo.org
>> Universite Paris Diderot   WWW : http://www.dicosmo.org
>> Case 7014                  Tel : ++33-(0)1-57 27 92 20
>> 5, Rue Thomas Mann
>> F-75205 Paris Cedex 13     Identica: http://identi.ca/rdicosmo
>> FRANCE.
>> Twitter: http://twitter.com/rdicosmo
>> ------------------------------------------------------------------

From roberto at dicosmo.org Thu Oct 6 15:32:30 2016
From: roberto at dicosmo.org (Roberto Di Cosmo)
Date: Thu, 6 Oct 2016 21:48:12 +0200
Subject: [TYPES] decidability of BCCCs?
In-Reply-To:
References: <29DC2B99-808C-4CBC-A3F2-512E0E85E405@cs.harvard.edu>
Message-ID: <20161006193230.GB508@traveler>

That's great news: thanks for sharing, Gabriel, and congratulations for this nice step forward!

On Thu, Oct 06, 2016 at 02:58:39PM -0400, Gabriel Scherer wrote:
> Hi all,
>
> I'm coming back to this old types-list thread to announce the article
>
>   Deciding equivalence with sums and the empty type
>   Gabriel Scherer, 2016
>   https://arxiv.org/abs/1610.01213
>
> [...]
--
Roberto Di Cosmo
------------------------------------------------------------------
Professeur (on leave at/détaché à INRIA Rocquencourt)
IRIF                       E-mail : roberto at dicosmo.org
Universite Paris Diderot   Web    : http://www.dicosmo.org
Case 7014                  Twitter: http://twitter.com/rdicosmo
5, Rue Thomas Mann
F-75205 Paris Cedex 13
France
------------------------------------------------------------------
GPG fingerprint 2931 20CE 3A5A 5390 98EC 8BFC FCCA C3BE 39CB 12D3

From gabriel.scherer at gmail.com Sat Oct 8 07:13:46 2016
From: gabriel.scherer at gmail.com (Gabriel Scherer)
Date: Sat, 8 Oct 2016 07:13:46 -0400
Subject: [TYPES] decidability of BCCCs?
In-Reply-To:
References: <29DC2B99-808C-4CBC-A3F2-512E0E85E405@cs.harvard.edu>
Message-ID:

> Hence your construction must exploit simple-typedness.
> Can you comment on this?

The idea of saturation is to make all possible deductions from a context, that is, to introduce all possible neutral terms of positive type (all left-focusing phases); by doing this as early as possible, we make sure that each neutral is introduced as early as it could be, and that the result of the case-analysis on it is shared by all following subterms. (One can think of the general algorithm as a mix of backward and forward search, with invertible phases doing backward goal-directed search and saturation phases doing forward context-directed search.)

Of course, in general the set of neutral terms to introduce from the context may be infinite; consider the context (z : X+, s : X+ -> X+) for example (where "X+" denotes an atomic type that is positive). If we asked for full saturation, any saturated term in this context would be infinite. We relax this requirement by allowing saturation on only a subset of the possible neutrals. But if we allowed arbitrary subsets, we would lose canonicity: if you stop saturation too early you may "miss" a proof of 0 and thus not see that two terms are actually equal. We need a compromise where we introduce enough neutrals to eventually find any possible proof of 0.

Our compromise is the following. We ask that the selected subset be complete from a provability point of view: for any type that could be introduced by saturation at some point, saturation does not stop before this type exists in the context. (In this example saturation can stop immediately, as (z : X+) is already in the context. For general positive types the notion of "exists in the context" becomes "provable by a strong neutral"; see Section 4.2 of the article.)
There, we use simple-typedness: we rely on the subformula property (in cut-free proofs, the types that appear in sub-derivations of a bigger derivation are sub-formulas of the types of the bigger derivation's judgment; this doesn't hold in the presence of polymorphism) to argue that there always exist finite subsets that satisfy this "complete for provability" restriction. In System F or with Pi-types, when instantiating a polymorphic type or a dependent abstraction, there is a potentially infinite space of possible instantiation choices that all result in different types, and we can't know in advance which will or will not be useful to the rest of the computation. On the other hand, focusing itself extends to richer type systems and logics, so I hope that some of the ideas in this work (in particular on the initial question of "which types have a unique inhabitant?") can be extended to richer systems, to get semi-decidable procedures.

> isn't the only thing you get for the empty type exactly this: that any two terms
> are equal iff their context is logically inconsistent?

Not quite: for example, in the empty (consistent) context, two terms of type (0 -> Bool) are equal. If you do goal-directed proof search, it is obvious that you should introduce a lambda-abstraction here and thus discover an inconsistent context. But a given equivalence algorithm works from the two terms to compare, and it may or may not decide to look under that function type.

Thanks for the questions!

On Sat, Oct 8, 2016 at 4:05 AM, Thorsten Altenkirch wrote:
> Hi Gabriel,
>
> this is certainly an impressive achievement: congratulations.
>
> We know that your construction also contains an algorithm to decide
> logical consistency in intuitionistic propositional logic, that is, G |-
> true = false iff G is logically inconsistent. That shows in particular
> that decidability can only hold for simple types, not for dependent
> types, because there we would be deciding logical consistency for
> intuitionistic predicate logic, which is undecidable. Hence your
> construction must exploit simple-typedness. Can you comment on this?
>
> I also wonder about the following: if you have an algorithm to decide
> equality for non-empty sums, isn't the only thing you get for the empty
> type exactly this: that any two terms are equal iff their context
> is logically inconsistent? That would provide a way to extend our
> algorithm to empty types, but I suspect proving this proposition is
> non-trivial.
>
> Cheers,
> Thorsten
>
> On 06/10/2016, 19:58, "Types-list on behalf of Gabriel Scherer" wrote:
>
>> Hi all,
>>
>> I'm coming back to this old types-list thread to announce the article
>> [...]
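[ Illustration: a hedged Haskell rendering of the (0 -> Bool) example in Gabriel's reply. Haskell's general recursion adds bottoms that the simply-typed calculus lacks, so this only sketches the idea. ]

import Data.Void (Void, absurd)

-- Two closed terms of type 0 -> Bool.  They are contextually equal:
-- a context can only distinguish them by applying them, which
-- requires a closed term of the empty type, and there is none.
f, g :: Void -> Bool
f _ = True
g v = absurd v  -- the eliminator for 0 ("absurd" in the thread above)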
>>
>> --
>> Roberto Di Cosmo
>> Professeur en délégation à l'INRIA
>> PPS, Université Paris Diderot
>> E-mail: roberto at dicosmo.org
>> WWW: http://www.dicosmo.org
>> Case 7014, 5, Rue Thomas Mann, F-75205 Paris Cedex 13, FRANCE
>> Tel: ++33-(0)1-57 27 92 20
>> Identica: http://identi.ca/rdicosmo
>> Twitter: http://twitter.com/rdicosmo
>> Attachments: MIME accepted, Word deprecated: http://www.gnu.org/philosophy/no-word-attachments.html
>> Office location: Bureau 320 (3rd floor), Bâtiment Sophie Germain, Avenue de France, Métro Bibliothèque François Mitterrand, ligne 14/RER C
>> GPG fingerprint 2931 20CE 3A5A 5390 98EC 8BFC FCCA C3BE 39CB 12D3

From mail at kenkubota.de Sat Oct 15 17:12:04 2016
From: mail at kenkubota.de (Ken Kubota)
Date: Sat, 15 Oct 2016 23:12:04 +0200
Subject: [TYPES] Foundations of Mathematics: Type Theory after Church's Simple Theory of Types (1940)
Message-ID: <1521826C-32CF-4A73-8E3F-180A9564943E@kenkubota.de>

Dear Members of the Research Community,

In an attempt to create a genealogy of the foundations of mathematics, the main approaches based upon Church's Simple Theory of Types (1940),
- Gordon's HOL (including origins like Huet's and Paulson's LCF and the descendants by Konrad Slind, John Harrison, and Larry Paulson) as well as
- Andrews' Q0 (including Tom Melham's extension of HOL drawing on the work of Andrews)
are compared online at http://dx.doi.org/10.4444/100.111

The purpose is to make design decisions transparent in order to systematically characterize and compare logistic systems, which either explicitly or implicitly try to express most or all of formal logic and mathematics. Hints, comments, corrections and critique are welcome. Other systems (e.g., Coq) shall be considered at a later time.

I wasn't sure concerning the history of Cambridge LCF, as its descriptions slightly differ. In one version, Paulson further develops the (finished) result by Huet; in another, they directly collaborate. ("Huet's Franz Lisp version of LCF was further developed at Cambridge by Larry Paulson, and became known as 'Cambridge LCF'.", http://sourceforge.net/projects/hol/files/hol/kananaskis-10/kananaskis-10-description.pdf, p. 4, vs.
"Paulson and Huet then established a collaboration and did a lot of joint development of LCF by sending each other magnetic tapes in the post.", http://www.cl.cam.ac.uk/~mjcg/papers/HolHistory.pdf, p. 3.) In the overview, it is summarized as "Cambridge LCF by Larry Paulson and G?rard Huet (around 1985)", as it seems important that both contributed, and Paulson's name later appears in the context of Isabelle, and Huet's in the context of Coq. Due to limited space, unfortunately not all people contributing to existing projects could be mentioned. For example, I am well aware that Michael Norrish maintains and contributes to HOL4. I apologize for having to restrict the presentation, mentioning only the initiators or project leaders. Kind regards, Ken Kubota ____________________ Ken Kubota doi: 10.4444/100 http://dx.doi.org/10.4444/100 From carette at mcmaster.ca Tue Nov 1 22:43:44 2016 From: carette at mcmaster.ca (Jacques Carette) Date: Tue, 1 Nov 2016 22:43:44 -0400 Subject: [TYPES] Types theories with first-class definitional extensions Message-ID: I am looking for literature on (higher-order, potentially dependent) type theories where a context (telescope) can contain not just declarations, but also definitions. Of course, from a semantic point of view, this is entirely unnecessary, as such definitions can simply be expanded away. However, from a categorical standpoint (amongst others), things are not so trivial: what is the proper meaning of morphisms (substitutions) in such a case? The motivation comes from the remark that the category of contexts for a type theory is the opposite category of that of theory presentations (aka algebraic theories, algebraic specifications, etc). By applying the 'little theories' approach, it is really the theory morphisms that carry the most interesting information. And that, in practice, one uses "conservative extensions" of theories all the time. Some concrete examples: - in Agda, in a parametric record definition, one can also define generic functions which 'depend' on the fields of the record; - in Haskell, in a typeclass definition, one can define default implementations; - in Scala, in a trait, one can similarly define default implementations. These 3 things are essentially the same idea, of a conservative extension of a theory. The problem arises when one wants to build theory morphisms (which would be typeclass-to-typeclass functions in Haskell, or trait-to-trait functions in Scala, neither of which are available AFAIK): what to do with the definitions? Again, from a semantic point-of-view, there is no intrinsic difficulty, and multiple different designs are equivalent. The question only really becomes interesting when considering the pragmatics. Some designs lead to huge combinatorial explosions of definitions once your theory library gets into the 100s of theories, never mind the 1000s, which is what a realistic library of mathematics+CS needs. This is not entirely straightforward. For example, the old theorem prover IMPS actually has 4 different ways to handle the transport of definitional extension along theory morphisms. I have tried to look into the literature for related work, but I have not been successful. Thus this query. 
From rwh at cs.cmu.edu Wed Nov 2 09:51:30 2016
From: rwh at cs.cmu.edu (Robert Harper)
Date: Wed, 2 Nov 2016 09:51:30 -0400
Subject: [TYPES] Types theories with first-class definitional extensions
In-Reply-To:
References:
Message-ID: <146AFA07-E41D-4AC6-B048-797B842BE170@cs.cmu.edu>

See my papers with Chris Stone, and his dissertation, on singleton kinds/types, and also David Aspinall's work on singletons.

Bob Harper

> On Nov 1, 2016, at 22:43, Jacques Carette wrote:
> [...]

From Michael.Greenberg at pomona.edu Wed Nov 2 11:58:42 2016
From: Michael.Greenberg at pomona.edu (Michael Greenberg)
Date: Wed, 2 Nov 2016 15:58:42 +0000
Subject: [TYPES] Types theories with first-class definitional extensions
In-Reply-To:
References:
Message-ID: <23B8ED66-D79B-432C-8127-F58F0FDAEBEB@pomona.edu>

Hi Jacques,

Most refinement type systems can do something _like_ adding equalities, using "selfified" types (Ou et al., TCS 2004) like {x:Int | x = 5}, i.e., x is an Int such that x is 5. The Sage language (Knowles et al., Scheme Workshop 2006) adds equalities to the context in a more direct way. Knowles and Flanagan have a technical report that uses existentials to tame the holes in the dyke of abstraction introduced by dependent types.

Cheers,
Michael
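A tiny sketch of what "selfification" buys, written in LiquidHaskell-style refinement annotations. The syntax here is assumed for illustration and is not taken from the papers cited above:

-- The selfified type of `five` records not just Int but the equation v = 5
-- (LiquidHaskell-style annotations, assumed syntax).
{-@ five :: {v:Int | v = 5} @-}
five :: Int
five = 5

-- A use site can exploit that equality: the refinement logic discharges
-- 5 + 1 = 6 because the selfified type of `five` is in the context.
{-@ six :: {v:Int | v = 6} @-}
six :: Int
six = five + 1

main :: IO ()
main = print (five, six)

The annotations play the same role as the equalities added to the typing context in the systems mentioned above: a binding carries its own defining equation along with its type.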
> On Nov 1, 2016, at 7:43 PM, Jacques Carette wrote:
> [...]

From wjb at williamjbowman.com Wed Nov 2 12:26:30 2016
From: wjb at williamjbowman.com (William J. Bowman)
Date: Wed, 2 Nov 2016 12:26:30 -0400
Subject: [TYPES] Types theories with first-class definitional extensions
In-Reply-To:
References:
Message-ID: <20161102162630.GB1538@williamjbowman.com>

On Tue, Nov 01, 2016 at 10:43:44PM -0400, Jacques Carette wrote:
> I am looking for literature on (higher-order, potentially dependent) type theories where a context (telescope) can contain not just declarations, but also definitions.

I may misunderstand your question, but doesn't CIC have this (and other languages with dependent let)? The typing rule for dependent let adds a definition to the context:

  Σ; Γ ⊢ e : t
  Σ; Γ, x = e : t ⊢ e' : t'
  --------------------------------
  Σ; Γ ⊢ let x = e in e' : t'[e/x]

https://coq.inria.fr/refman/Reference-Manual006.html

This also reminds me of translucency. A translucent type adds a definition to the type declaration:

  (x = e : t) -> t'
  ∃ (x = e : t). t'
The original work on translucent sums:
https://www.cs.cmu.edu/~rwh/theses/lillibridge.pdf

I've also seen translucent functions here, which has a good explanation of translucency and more citations to chase:
http://dl.acm.org/citation.cfm?id=237791

--
William J. Bowman
Northeastern University
College of Computer and Information Science

From frederic.blanqui at inria.fr Thu Nov 3 03:17:35 2016
From: frederic.blanqui at inria.fr (Frédéric Blanqui)
Date: Thu, 3 Nov 2016 08:17:35 +0100
Subject: [TYPES] Types theories with first-class definitional extensions
In-Reply-To: <20161102162630.GB1538@williamjbowman.com>
References: <20161102162630.GB1538@williamjbowman.com>
Message-ID:

Hello.

For pure type systems with definitions, see the LFCS'94 paper of Poll & Severi: http://dx.doi.org/10.1007/3-540-58140-5_30.

For a module calculus for PTSs, see the TLCA'97 paper of Courant, http://dx.doi.org/10.1007/3-540-62688-3_32, and its journal version in JFP'07: http://dx.doi.org/10.1017/S0956796806005867.

Best regards,

Frédéric.

On 02/11/2016 at 17:26, William J. Bowman wrote:
> [...]

From carette at mcmaster.ca Wed Nov 2 16:21:04 2016
From: carette at mcmaster.ca (Jacques Carette)
Date: Wed, 2 Nov 2016 16:21:04 -0400
Subject: [TYPES] Types theories with first-class definitional extensions
In-Reply-To: <20161102162630.GB1538@williamjbowman.com>
References: <20161102162630.GB1538@williamjbowman.com>
Message-ID: <581A4AB0.2080404@mcmaster.ca>

Thanks for the many answers. Some of my comments below could have been attached to other answers as well; I am not targeting CIC or translucent sums in particular!

On 2016-11-02 12:26 PM, William J. Bowman wrote:
> I may misunderstand your question, but doesn't CIC have this (and other languages with dependent let)? The typing rule for dependent let adds a definition to the context: [...]
> This also reminds me of translucency.
> A translucent type adds a definition to the type declaration: (x = e : t) -> t' and ∃ (x = e : t). t'
> [...]

This seems really close, but there is something I don't see: how are the morphisms defined in the contextual category that corresponds to the term language of Coq's CIC? The 'obvious' definition of substitution works, but has very unpleasant properties, especially if you are trying to build a nice user interface. The language of "with" for type signatures is very impoverished, and not up to the job of doing that.

It is not so much having definitional extensions that is of interest (that has been around for a long time), but having good behaviour, especially good *syntactic* behaviour, under repeated transport. For an extreme example of this, see Adrian Mathias, A Term of Length 4,523,659,424,929, Synthese 133 (2002) 75--86, https://www.dpmms.cam.ac.uk/~ardm/inefff.pdf. Buried in an answer on http://mathoverflow.net/questions/14356/bourbakis-epsilon-calculus-notation you can also find similar numbers for ZFC. Without care, the same can happen in a type theory, if definitional extensions are not treated as having important syntactic content, along with their (trivial!) semantic content.

Jacques

From herman at cs.ru.nl Thu Nov 3 15:08:27 2016
From: herman at cs.ru.nl (Herman Geuvers)
Date: Thu, 3 Nov 2016 20:08:27 +0100
Subject: [TYPES] Types theories with first-class definitional extensions
In-Reply-To:
References: <20161102162630.GB1538@williamjbowman.com>
Message-ID: <5999010e-6d19-e271-e1c2-493274fa222b@cs.ru.nl>

We define and treat a system lambda-D (a Calculus of Constructions with definitions) in our book:

Rob Nederpelt and Herman Geuvers, Type Theory and Formal Proof, An Introduction, Cambridge University Press, December 2014. See also http://www.win.tue.nl/~wsinrpn/book_type_theory.htm

Another source is:

Fairouz Kamareddine, Twan Laan and Rob Nederpelt, A Modern Perspective on Type Theory, From its Origins until Today, Kluwer Academic Publishers, Applied Logic Series, Vol. 29, 2004 -- Chapter 10: Pure Type Systems with parameters and definitions.

Best,

Herman Geuvers

On 11/03/2016 08:17 AM, Frédéric Blanqui wrote:
> [...]
From fp at cs.cmu.edu Sat Nov 12 15:56:47 2016
From: fp at cs.cmu.edu (Frank Pfenning)
Date: Sat, 12 Nov 2016 15:56:47 -0500
Subject: [TYPES] Types theories with first-class definitional extensions
In-Reply-To:
References:
Message-ID:

Since you are mentioning the pragmatics, we did quite a bit of work in Twelf to handle notational definitions efficiently, which doesn't seem to be widely known. First, there is the notion of *strict definitions*:

Frank Pfenning and Carsten Schürmann. Algorithms for equality and unification in the presence of notational definitions, TYPES'98.

which, in the implementation, is combined with a "depth index" to steer minimal incremental expansion of definitions for the purpose of equality testing and unification. This has been extended to encompass proof irrelevance:

Jason Reed. Proof Irrelevance and Strict Definitions in a Logical Framework, TR CMU-CS-02-153, June 2002.

There has also been quite a bit of work on redundancy elimination in dependent type theories, which interacts with notational definitions:

Jason Reed. Redundancy Elimination for LF, LFM'04.

- Frank

On Tue, Nov 1, 2016 at 10:43 PM, Jacques Carette wrote:
> [...]
--
Frank Pfenning, Professor and Head
Department of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213-3891
http://www.cs.cmu.edu/~fp
+1 412 268-6343
GHC 7019

From maietti at math.unipd.it Wed Dec 28 05:59:52 2016
From: maietti at math.unipd.it (Maria Emilia Maietti)
Date: Wed, 28 Dec 2016 11:59:52 +0100
Subject: [TYPES] independence of AC! or AC from CIC
Message-ID: <58639B28.6030903@math.unipd.it>

Dear all,

is it known whether the axiom of unique choice (AC!), or even just the axiom of choice (AC), both taken as formulas in Prop, is independent of the Calculus of Inductive Constructions? Any reference?

For the pure Calculus of Constructions, the independence of AC! and AC was shown in:

T. Streicher, "Independence of the induction principle and the axiom of choice in the pure calculus of constructions." TCS, 1992, vol. 103.

Many thanks for your cooperation,

Milly
(Maria Emilia Maietti)

From streicher at mathematik.tu-darmstadt.de Thu Dec 29 14:10:49 2016
From: streicher at mathematik.tu-darmstadt.de (streicher at mathematik.tu-darmstadt.de)
Date: Thu, 29 Dec 2016 20:10:49 +0100
Subject: [TYPES] independence of AC! or AC from CIC
In-Reply-To: <58639B28.6030903@math.unipd.it>
References: <58639B28.6030903@math.unipd.it>
Message-ID: <0643f30ccd48e991a8cf2f35326f2eb5.squirrel@webmail.mathematik.tu-darmstadt.de>

Dear Milly,

my model in loc. cit. interprets types as assemblies, and thus all inductive types as required by CIC. Only Prop is interpreted alternatively, in order to falsify the axiom of unique choice.

Thomas
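For reference, one standard way to state the two principles, over a relation R : A -> B -> Prop (the exact formulas intended above may differ in inessential ways):

  AC  : (forall x:A, exists y:B, R x y)  -> exists f : A -> B, forall x:A, R x (f x)
  AC! : (forall x:A, exists! y:B, R x y) -> exists f : A -> B, forall x:A, R x (f x)

AC! weakens AC by demanding that the witness be unique, so unprovability of AC! is the stronger result; the model described above falsifies even this weaker principle by reinterpreting Prop while keeping the interpretation of inductive types.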