Blog Archives
Topic Archive: proof theory
Cut Elimination
Gentzen proved a famous cut elimination theorem using constructive methods. It is also possible to prove it non-constructively, with a conceptually simpler argument, but a constructive proof is important for a number of reasons. I will present the constructive argument for cut elimination for classical propositional logic, formulated using tableaux. This is purely expository; there is nothing new. But it is a proof that every logician, or logician-to-be, should know, and I will cover the basics using a detailed set of slides, which will be made available afterward on my web site.
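For readers unfamiliar with the result, the standard statement in sequent-calculus form (the tableau formulation of the talk is dual to this) is as follows; this is textbook background, not material from the talk itself:

```latex
% The cut rule in sequent-calculus notation:
\[
\frac{\Gamma \vdash \Delta, A \qquad A, \Gamma' \vdash \Delta'}
     {\Gamma, \Gamma' \vdash \Delta, \Delta'}\;(\mathrm{cut})
\]
% Cut elimination (Gentzen's Hauptsatz): every sequent provable
% with the cut rule is already provable without it.
```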
Some abstract versions of Gödel’s Second Incompleteness Theorem based on non-classical logics
We study abstract versions of Gödel’s second incompleteness theorem and formulate generalizations of Löb’s derivability conditions that work for logics weaker than classical logic. We isolate the role of the contraction and weakening rules in Gödel’s theorem and give a (toy) example of a system, based on modal logic without contraction, that invalidates Gödel’s argument. On the other hand, Gödel’s theorem is established for a version of Peano arithmetic without contraction. (Joint work with Daniyar Shamkanov)
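For reference, the classical derivability conditions that the abstract generalizes are the standard Hilbert–Bernays–Löb conditions; in the usual notation, with □ the provability predicate of a theory T:

```latex
% Löb's derivability conditions for the provability predicate of T:
\begin{align*}
\text{D1:}\quad & T \vdash A \;\Longrightarrow\; T \vdash \Box A \\
\text{D2:}\quad & T \vdash \Box(A \to B) \to (\Box A \to \Box B) \\
\text{D3:}\quad & T \vdash \Box A \to \Box\Box A
\end{align*}
% From D1-D3 and the diagonal lemma, Gödel's second theorem follows:
% if T is consistent, then T does not prove \neg\Box\bot, i.e. Con(T).
```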
The consistency of Peano Arithmetic
In 1936, only a few years after the incompleteness theorems were proved, Gentzen proved the consistency of Peano arithmetic using transfinite induction up to the ordinal epsilon_0. I will give a short proof of the result, based on the simplification introduced by Schütte, and discuss some of its consequences.
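As standard background (not part of the abstract), the ordinal in question and the precise shape of Gentzen's result can be stated as follows:

```latex
% epsilon_0 is the least ordinal closed under exponentiation base omega:
\[
\varepsilon_0 \;=\; \sup\{\omega,\ \omega^{\omega},\ \omega^{\omega^{\omega}},\ \dots\}
\]
% Gentzen: a weak base theory plus transfinite induction up to
% epsilon_0 proves Con(PA). By Gödel's second theorem, PA itself
% proves induction up to each ordinal below epsilon_0, but not up
% to epsilon_0 itself.
```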
The proof/plan correspondence in databases
We consider answering queries over datasources (for logicians: finite relational structures) that are connected to one another by integrity constraints (logical relationships). A relation in a datasource may have multiple means of access with different required arguments, and the means of access within and across relations may differ radically in cost. We present a method for generating the lowest cost query plan, where a plan is a reformulation of the query referencing only the allowable access methods, equivalent to the original query on all datasources satisfying the integrity constraints. A solution to this problem can be applied in many database contexts, including answering queries using views, optimizing queries using constraints, and querying on top of web services or web forms. It could also be applied to give a more logic-based foundation for traditional database query planning.
Proof-driven query reformulation generates plans by applying interpolation procedures to a “proof that the query can be answered,” an approach that goes back at least to Craig’s proof of the Beth Definability theorem via interpolation. A few additional ingredients are needed to apply interpolation to database problems, including:
— variations of Beth Definability that allow us to target plans (explicit definitions) of specific forms
— an extension of interpolation that considers the “access patterns” within quantification
— specialized proof procedures for classes of integrity constraints of interest to databases
— cost-based search procedures for concurrently exploring the collection of proofs and plans
The resulting query planning system blends automated deduction, query optimization, and heuristic search.
This is joint work with Balder ten Cate, Julien Leblay and Efi Tsamoura.
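The interpolation result underlying this line of work is Craig's theorem; stated in its standard first-order form (background, not part of the abstract):

```latex
% Craig interpolation: if phi -> psi is valid, then there is an
% interpolant theta, using only non-logical symbols common to
% phi and psi, such that
\[
\models \varphi \to \theta \qquad\text{and}\qquad \models \theta \to \psi.
\]
% Beth definability follows: a relation defined implicitly by a
% theory is explicitly definable, and the explicit definition can
% be extracted from a proof of implicit definability.
```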
Reflection Principles Involving Provability and Explicit Proofs
Reflection principles are classical objects in proof theory and the areas studying Gödel’s Incompleteness. Reflection principles based on provability predicates were introduced in the 1930s by Rosser and Turing, and were later explored by Feferman, Kreisel & Lévy, Schmerl, Artemov, Beklemishev and others.
We study reflection principles of Peano Arithmetic involving both Proof and Provability predicates. We classify these principles and establish a linear ordering among them with respect to metamathematical strength.
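For orientation, the standard local reflection schema based on the provability predicate, together with its explicit-proof variant, can be written as follows (textbook notation, not taken from the talk):

```latex
% Local reflection for a theory T, via the provability predicate:
\[
\mathrm{Rfn}(T):\quad
\mathrm{Prov}_T(\ulcorner A \urcorner) \to A
\quad\text{for each sentence } A.
\]
% Explicit-proof variants instead use the proof predicate
% Proof_T(p, x), "p is a proof of x", quantifying over or
% instantiating concrete proofs p.
```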
Higher-order Reverse Mathematics: Where existence meets computation via infinitesimals
Classically, the existence of an object tells us very little about how to construct said object. We consider a nonstandard version of Ulrich Kohlenbach’s higher-order Reverse Mathematics in which there is a very elegant and direct correspondence between, on one hand, the existence of a functional computing an object and, on the other hand, the classical existence of this object with the same standard and nonstandard properties. We discuss how these results potentially contribute to the programs of finitistic and predicativist mathematics.