ExtenDD brings together two areas of intensive research that, surprisingly, have so far rarely come together: complex terms and proof theory. By complex terms we understand expressions that purport to denote objects, like proper names, but that, unlike proper names, exhibit structure that contributes to their meaning. The most important examples of such expressions are definite descriptions. Proof theory is the mathematical study of proofs themselves. ExtenDD applies the tools of proof theory to proofs that appeal to propositions containing complex terms.

A definite description is an expression of the form 'the F': the author of 'On Denoting', the author of Principia Mathematica, the present King of France. A definite description aims to denote an object by virtue of a property that only it has: 'the F' aims to denote the sole F. The first example denotes Russell, as he was the sole author of 'On Denoting'. Sometimes a definite description fails to denote, because nothing, or more than one thing, has the property F, as in the second and third examples: Principia Mathematica had two authors, Whitehead and Russell, and France is presently a republic. If a unique object has the property F, the definite description 'the F' is called proper; otherwise it is improper.

Pride of place for the first formalisation of a theory of definite descriptions as part of a mathematical investigation belongs to Frege's Grundgesetze der Arithmetik. But it was Russell who brought definite descriptions to prominence. Since his article 'On Denoting', in which Russell put forward his celebrated theory of definite descriptions, these expressions have occupied a central place in philosophical research, and many deep and detailed studies have been carried out in philosophical logic, epistemology, metaphysics and the philosophy of language. Russell was prompted by reflections on improper definite descriptions: what do sentences like 'The present King of France is bald' mean?
'The present King of France' denotes no one, so the truth or falsity of 'The present King of France is bald' is not determined by whether the present King of France is bald or not. As there is no present King of France, 'The present King of France is bald' should not be true, as otherwise we would be stating that a certain person has a certain property.
Russell accepted the principle of bivalence, that every sentence is either true or false; hence 'The present King of France is bald' should be false. But then it would appear that 'The present King of France is not bald' should be true, and the same problem arises once more: if so, we are stating that a certain person fails to have a certain property. Russell concluded that sentences of the form 'The F is G' do not have the subject-predicate form they appear to have. According to Russell, no meaning is assigned to the expression 'the F' standing alone, but only in the context of complete sentences in which it occurs. 'The F is G' means 'There is exactly one F and it is G'. Thus the complex term 'the F' disappears upon logical analysis. There are then two negations of 'The F is G': the external negation 'It is not the case that the F is G', meaning 'It is not the case that there is exactly one F and it is G', and the internal negation 'The F is not G', meaning 'There is exactly one F and it is not G'. The former is true, the latter false. Russell's problem is solved.
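In standard notation, Russell's contextual definition and the two negations can be set out as follows (writing 'ιx Fx' for 'the F'):

```latex
% Russell's analysis of 'The F is G': there is exactly one F, and it is G
G(\iota x\,Fx) \;:\equiv\; \exists x\,\bigl(Fx \wedge \forall y\,(Fy \rightarrow y = x) \wedge Gx\bigr)

% External negation, 'It is not the case that the F is G' -- true of the King of France
\neg\,\exists x\,\bigl(Fx \wedge \forall y\,(Fy \rightarrow y = x) \wedge Gx\bigr)

% Internal negation, 'The F is not G' -- false, as it still asserts that exactly one F exists
\exists x\,\bigl(Fx \wedge \forall y\,(Fy \rightarrow y = x) \wedge \neg Gx\bigr)
```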
Russell's theory of definite descriptions had an enormous influence on the development of analytic philosophy in the twentieth century, as it was applied, by Russell himself and those influenced by him, to epistemology, metaphysics and the philosophy of language. Scores of textbooks still use Russell's theory as their official theory of definite descriptions. Despite its brilliance, Russell's account has drawbacks. It appears to assign the wrong truth values to some sentences. Diogenes Laertius reports that Thales, having discovered his theorem, offered a sacrifice of oxen to the god at Delphi. According to Russell's theory, what Diogenes reports is false, as it contains an improper definite description. That sounds wrong if Thales did indeed offer a sacrifice at Delphi. Some philosophers, like Strawson, argued that 'The present King of France is bald' should be neither true nor false, as, owing to the improper definite description, it is somehow defective. The second half of the twentieth century saw the development of new approaches to definite descriptions. In Frege's and Russell's classical logic, it is assumed that every singular term denotes an object. In free logic, originating in the work of Hintikka and Lambert, this assumption is given up. Negative free logic takes a Russellian line according to which atomic formulas containing non-denoting terms are false. Neutral free logics take up Strawson's view that they are neither true nor false. Positive free logic permits them to be true. There is now a plethora of theories of definite descriptions, but they have very rarely been investigated from a proof-theoretic perspective.
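The three-way classification just described can be illustrated with a small sketch (my own illustration with made-up function names, not part of the project): it shows how each family of free logic assigns a truth value to an atomic sentence P(t) when the term t may fail to denote.

```python
from typing import Optional


def eval_atomic(denotation: Optional[str], extension: set, policy: str) -> Optional[bool]:
    """Truth value of an atomic sentence P(t), where `denotation` is the object
    denoted by t (None if t is non-denoting) and `extension` is the set of
    objects of which P holds."""
    if denotation is not None:
        # Classical case: t denotes, so P(t) is true iff its denotation falls under P.
        return denotation in extension
    if policy == "negative":
        # Negative free logic: atomic sentences with non-denoting terms are false.
        return False
    if policy == "neutral":
        # Neutral free logic: a truth-value gap, following Strawson.
        return None
    if policy == "positive":
        # Positive free logic merely *permits* such sentences to be true; real
        # semantics use devices such as outer domains. True here is a crude stand-in.
        return True
    raise ValueError(f"unknown policy: {policy}")


bald = {"Socrates"}  # toy extension of the predicate 'is bald'
print(eval_atomic("Socrates", bald, "negative"))  # denoting term: classical evaluation
print(eval_atomic(None, bald, "negative"))        # 'The present King of France is bald'
print(eval_atomic(None, bald, "neutral"))         # the Strawsonian gap
```

The point of the sketch is only that the three families diverge solely on the non-denoting case; where the term denotes, all agree with classical logic.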
Proof theory was founded by Hilbert at the beginning of the twentieth century as the mathematical study of proofs. Initially it focused on axiomatisations of mathematical theories and proofs of their consistency. It was soon extended to the study of proofs outside mathematics, in particular in formal logic, which in turn was applied to the analysis of ordinary language and arguments in general. Gödel's limitative results showed that the greatest ambition of Hilbert's programme, to prove the consistency of arithmetic by finitary means, could not be achieved as envisaged. Gentzen's groundbreaking work led to a substantial redefinition of the aims and tools of proof theory, which now occupies centre stage in modern formal logic. His sequent calculus and systems of natural deduction represent the actual proofs carried out by logicians and mathematicians much better than Hilbert's axiomatic systems, and they permit the deepest study of proofs and their properties. Following Gentzen's work, a consensus developed over what constitutes a good proof system and the form rules of inference should take. The rules for each logical constant, expressions such as 'not', 'all' and 'if-then', exhibit a similar form, a remarkable systematicity, as Gentzen notes. Each constant is governed by two kinds of rules for its use in proofs, and these rules govern only that constant. This form permits a precise study of where and how the logical constants are used and whether this use is essential or may be eschewed, and it is the basis for the elimination of cuts from proofs in the sequent calculus and the normalisation of proofs in natural deduction. These results relate to a common phenomenon of mathematics: proofs are combined to establish new results, as lemmas are used in proofs of theorems from which corollaries are inferred in turn. Sometimes a theorem or corollary is simpler than the lemmas or theorems from which it is deduced.
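The pattern Gentzen observed can be illustrated with his natural deduction rules for conjunction, sketched here in standard notation: one introduction rule and two elimination rules, all of which govern '∧' and no other constant.

```latex
\frac{A \qquad B}{A \wedge B}\,(\wedge\mathrm{I})
\qquad\qquad
\frac{A \wedge B}{A}\,(\wedge\mathrm{E}_1)
\qquad\qquad
\frac{A \wedge B}{B}\,(\wedge\mathrm{E}_2)
```

Each other constant receives rules of the same two kinds, which is the systematicity referred to above.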
Gentzen's Hauptsatz (the cut elimination theorem) establishes that wherever proofs are combined in this way, there is also a direct proof that proceeds from simple to complex 'without detours', as Gentzen says. As it is such direct proofs in terms of which the correctness of proofs is defined, Gentzen's result establishes that combining proofs in the said way is guaranteed to give a correct proof. In addition to these technical achievements, Gentzen's ideas also led to the development of the more philosophically oriented proof-theoretic semantics, where the meanings of the logical constants are defined in terms of their rules of inference.
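The detour that the Hauptsatz removes is embodied in the cut rule of the sequent calculus, which splices a proof of a formula A into a proof that uses A as an assumption, just as a lemma is used in the proof of a theorem:

```latex
\frac{\Gamma \vdash \Delta, A \qquad A, \Sigma \vdash \Pi}{\Gamma, \Sigma \vdash \Delta, \Pi}\,(\mathrm{Cut})
```

The Hauptsatz shows that every proof using this rule can be transformed into one that does not.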
ExtenDD will combine a paradigm of philosophy, the theory of definite descriptions, with a paradigm of proof theory, Gentzen's calculi. Despite the long history of research into definite descriptions, the methods developed by Gentzen have rarely been applied in research on definite descriptions, and only a small effort has so far been put into their adequate treatment in this framework. The same goes for other complex singular terms, such as set abstracts and number operators. ExtenDD fills this important gap in research. Applying the methods of proof theory to definite descriptions is profitable to both sides. Competing theories of definite descriptions, and of complex terms in general, their advantages and shortcomings, are shown in a new light. The behaviour of complex terms needs subtle syntactical analysis and requires enriching the toolkit of proof theory. ExtenDD meets both challenges: it develops formal theories of definite descriptions and modifies the machinery of proof theory to cover new areas of application. ExtenDD will also formulate new theories of definite descriptions and other terms with an eye on proof-theoretic desiderata. ExtenDD aims to have a significant impact on proof theory, the philosophy of logic and language, linguistics, computer science and automated deduction.