FOUNDATIONS OF THE FORMAL SCIENCES IV

The History of the Concept of the Formal Sciences


Rheinische Friedrich-Wilhelms-Universität Bonn
Mathematisches Institut

February 14th to 17th, 2003



Abstracts:



From the Forms of Life to the Life of Forms: Neo-Darwinism, Artificial Life, and Biosemiotics


Stefan Artmann (Jena)


Formal modelling was and still is of utmost importance for the development of a neo-Darwinian theory of evolution. For example, without the decisive advances in theoretical population genetics in the 1920s (Fisher, Haldane, Wright), the so-called "Modern Synthesis" of biological knowledge in the framework of a theory of natural selection would not have been possible during the 1930s and 40s (Dobzhansky, Huxley, Mayr). To define both the concept of inclusive fitness as the basic measure of agential evolutionary effectiveness (Hamilton) and the concept of the gene as the basic unit of natural selection (Williams), it is necessary to develop a coherent formal model of consistent biological histories. In this respect, the post-war period of fundamental research in biology was profoundly influenced by information-theoretical work on living systems. Since the 1980s, research into Artificial Life has drawn the radical consequences from this tradition. Artificial Life can be defined as the project of simulating evolutionary processes in computers, and it presupposes that the material differences between natural and artificial systems are only of secondary relevance to the definition of systems susceptible to evolution. But a full-fledged theory of life as form has to acknowledge the semiotic nature of information and must investigate formally not only the syntactic, but also the semantic and the pragmatic dimensions of living systems. A comparison of the histories of information theory and semiotics diagnoses a convergence of the two, leading in biology, since the 1990s, towards the so-called "biosemiotic" approach to life. Its usefulness can be substantiated by analysing Hamilton's concept of inclusive fitness.
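As a point of orientation, the formal core of the concept of inclusive fitness can be condensed into one inequality, Hamilton's rule (a standard formulation from the literature, supplied here rather than quoted from the abstract): an altruistic trait is favoured by selection whenever

\[ r\,b > c, \]

where $b$ is the reproductive benefit to the recipient, $c$ the reproductive cost to the actor, and $r$ the coefficient of relatedness between them.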


The Formal Character of Logic


Jean-Yves Béziau (Neuchâtel)

Logic can be considered formal in many ways. According to historians, the expression "formal logic" is due to Kant. However, logic has been formal since Aristotle in the sense that a typical feature of the syllogistic is that the validity of an argument depends on its form and not on its matter or content. And modern logic is still formal in this sense, although the characterization of what the form of reasoning is has changed.

In this talk I will discuss the evolution of this notion of logical form, from Aristotle, Kant, Boole and Frege up to the exact mathematical characterization given by J. Łoś and R. Suszko in the 1950s through the concept of a structural consequence operator. I will also discuss some trends challenging the formal character of logic: the quasi-formal logic of Destouches-Février and logics taking into account the notion of meaning, such as relevant logic.
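For readers unfamiliar with the Łoś-Suszko characterization, the standard definition runs as follows (a known textbook fact, supplied here rather than quoted from the talk): a consequence operator $C$ on a propositional language $L$ is structural iff it commutes with substitutions, i.e.

\[ \sigma(C(X)) \subseteq C(\sigma(X)) \qquad \text{for every substitution } \sigma \text{ and every set of formulas } X \subseteq L. \]

This makes precise the old idea that validity depends only on form: uniformly substituting the non-logical material preserves consequence.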


Ontology and Mathematical Practice


Jessica Carter (Odense)

In this talk I shall present a position on the ontology of mathematics that is based on a case study in part of modern mathematics.

The case study concerns the beginnings of the now established discipline of K-theory. K-theory was first presented as a theory by Sir Michael Atiyah in the late 1950s. Atiyah's work is based on the work of Alexander Grothendieck, who introduced the first K-group in connection with a generalization of the so-called Riemann-Roch theorem in 1957.

It can be seen in the case study that, inspired by developments in various branches of mathematics, Grothendieck suggested a generalization of the Riemann-Roch theorem. But to formulate and prove this generalization, it was necessary to introduce a new object, the $K$-group. This object was formed as a quotient of already accepted mathematical objects. Other objects from the case study can likewise be seen to be constructed by accepted construction methods from already existing objects.
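To indicate the kind of quotient construction at issue, here is the standard Grothendieck group construction in its topological form (supplied for orientation; the abstract itself does not spell it out): for a compact space $X$, the isomorphism classes of vector bundles over $X$ form a commutative monoid under direct sum, and one sets

\[ K(X) \;=\; F\big(\mathrm{Vect}(X)\big)\,\big/\,\big\langle\, [E] + [F] - [E \oplus F] \,\big\rangle, \]

the free abelian group on the isomorphism classes modulo the subgroup generated by the displayed relations. The $K$-group is formed as a quotient of already accepted objects, exactly as described above.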

Thus, one conclusion that can be drawn from the case study is that the mathematical objects are constructed or introduced by mathematicians.

Another conclusion is that, after a mathematical object has been introduced, it exists as an objectively accessible abstract object. An abstract object is here not to be understood in the usual sense as an object which exists outside of time and space, but rather as a conceptual object which is created by human beings. Objective accessibility means that after a mathematical object has been introduced, it is possible for other mathematicians to gain access to the object. This follows from the fact that, when introducing a new object, the mathematician is required to give a unique characterization of it.

In the talk I will sketch the outlines of this position, and argue for the above two claims with examples from the case study.


Frege as formal scientist: pro et contra


Yury Chernoskutov (St.Petersburg)

Thesis: (a) Frege is the founder of modern first-order logic, and he was the first to present what is now called a "formal system"; but (b) in historical perspective he is not a formal philosopher, but rather a traditional Neo-Kantian.

Outlines of argumentation.

Understanding the Formal Sciences


Kevin de Laplante (Ames IA)


A traditional way of classifying the sciences is as follows. Within the "natural" sciences we find disciplines like physics, chemistry, biology, geology and astronomy. These are contrasted with the "social" sciences, such as anthropology, sociology, linguistics, and economics. Certain disciplines, like psychology, have sub-fields that may more naturally fall into one or the other category (e.g., cognitive neuroscience versus social psychology). The "empirical" natural and social sciences are typically contrasted with the "formal" disciplines of logic and mathematics. Question: where do the so-called "formal" sciences - disciplines such as control theory, network theory, game theory, artificial intelligence, artificial life, self-organization theory, and so on - fit within this traditional scheme of classification? On the one hand, they appear to be essentially mathematical in nature; on the other hand, they are regularly used to study real-world physical, chemical, biological and social phenomena. Are they just branches of formal mathematics that happen to have useful applications, or are they something different, a distinctive kind of empirical science that defies traditional classification? The question is not an easy one, as it turns on assumptions concerning the nature of mathematics and its relation to physical reality. In this paper I examine and evaluate several attempts to answer the question of how to understand the formal sciences. I argue for a view that places the formal sciences on a continuum between pure mathematics and empirical science, and that emphasizes the contrasting roles of "formal" and "physical" constraints on the behavior of natural systems.


Mathematics as a General Science of Structure


Friedrich Dudda (Bochum)

In the form of a connective conceptual analysis (in Strawson's sense) I will try to show that mathematics can be understood as a general science of structure. At first, I will try to defend the traditional view that in mathematics an assertion is justified if and only if a proof of the asserted proposition is known. With regard to the notion of mathematical proof I will contradict Lakatos' account of non-formal proofs and defend the view that an argument is a mathematical proof if and only if the conclusion of the argument is derived from a satisfiable set of premises by truth-preserving steps. By a mathematical theory we may understand a satisfiable set of postulates, but obviously not every satisfiable set of postulates is a mathematical theory. In order to clarify the notion of a mathematical theory, I will introduce the terms "mathematical structure" and "instance of a mathematical structure". As in mathematics a subject is described only down to the level of isomorphic instances, it follows that not instances of structures, but structures themselves and types of structures are the subjects of mathematical research. In consequence, a mathematical theory may be defined as a satisfiable set of postulates defining a mathematical structure or a type of mathematical structures. I will argue that in mathematics the difference between two theories which contain contradictory postulates cannot be considered as a difference between a right and a wrong theory. On the contrary, every possible structure is a legitimate subject of mathematical interest. As the real world is only one out of many possible worlds, I will claim that mathematical structures applied in the empirical sciences are only a proper subset of all mathematical structures. Paraphrasing a famous statement of Descartes I will conclude that there is a general science which tries to explain all that can be known about structure, considered independently of any application to a particular subject, and that, indeed, this science has its own proper name, consecrated by long usage, to wit, mathematics. Finally, I will argue that the presented view of mathematics is incompatible with any kind of mathematical realism, including Putnam's quasi-realism.


Medieval Logic and the Modern Notion of 'Formal'


Catarina Dutilh Novaes (Leiden)

As a consequence of the mathematical turn taken by logic in the 19th century (Cf. Frege), the notion of logic as a discipline has changed drastically. Therefore, investigations that received the name 'logic' in ancient Greek philosophy, in medieval philosophy and even in modern philosophy up to Kant, hardly seem to deserve this title from our modern viewpoint. Could it be that there is virtually no common ground between systems of logic of the past and the present ones? Is it a case of mere lexical ambiguity (the same word being used for two distinct notions)?

I propose to turn to medieval logic in order to shed light on this issue. As is widely known, medieval logic was an item of the Arts curriculum: it was part of the trivium along with rhetoric and grammar. Hence logic had strong connections with investigations of language in general. In fact, many of the 'logical' inquiries undertaken by medieval logicians can nowadays at best be placed under the designation of philosophy of logic and/or philosophy of language.

However, if one bears in mind that the most fundamental trait of logic is its formal character (so that it is said to be a 'formal science'), the disparities between medieval logical systems and our accepted notion of logic seem to be attenuated. To substantiate this claim, I shall examine one specific logical system, Ockham's theory of suppositio, in the light of the concept of formal understood as abstraction from content (not necessarily symbolic). By means of a formal reconstruction, I will argue that this system is a formal theory for the analysis and interpretation of sentences, whose machinery generates all possible readings of a given (written/spoken) sentence.

So, even though medieval logic is roughly speaking much closer to the Arts than modern logic is (besides being expressed in prose instead of in symbolic language), the distinctive formal character of some medieval systems seems to justify their classification as logic, even according to our current notion thereof.


The infinite in mathematics - a regulative idea?


Johannes Emrich (Erlangen)

In 1925 David Hilbert proposed that the role of the infinite in mathematics is that of a regulative idea in the sense of Kant. Although striking similarities can indeed be found between Kant's ideas and Hilbert's ideal elements (see, e.g., Detlefsen 1993, Majer 1993), Hilbert's claim seems quite modest and can even be seen as inadequate with respect to the simplest type of mathematical infinity: for denumerably infinite sets, such as the set of all natural numbers, the operation of enumerating its elements obviously constitutes the set in its whole extension. Whereas denumerable objects thus are operationally constitutable, objects of higher infinities are not, and even properties of denumerable objects are not in general determined operationally. There are always properties of the set of all natural numbers which are not decidable in a given operational framework, but only in an extended one, as Gödel has shown. So in these cases it could make sense to take the infinite as a regulative idea guiding us in our search for adequate extensions of our current framework. But what does that mean? Is there really an idea of the infinite which guides us in doing mathematics? It seems quite unclear how we as finite beings should have any kind of access to such an idea. In the talk it will be argued that it is more appropriate to look at the denumerable infinite as something completely constituted by our possibilities of accessing it, and that what guides us in extending mathematics over and above that can indeed be seen as a regulative idea, though one grounded not in an idea of the infinite but rather in an application to mathematics of "the only ideal proper of which human reason is capable" (Kant 1787, B604): the principle of thoroughgoing determination of every existing thing, which is closely related to the idea of the sum of all possible predicates (B599ff.). The paradigmatic case of this application to the mathematical sphere, which of course has to be justified, is that to the set of all natural numbers. The idea that this set is determined in all its properties transcends every given operational framework and can thus guide us in searching for extensions. Furthermore, it provides a kind of justification of classical logic: the principle is equivalent, as Kant asserts, to the principle of all disjunctive judgements, especially the principle of excluded middle.
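Schematically, and only as a reconstruction (the abstract gives no formula): the principle of thoroughgoing determination says that of every pair of contradictorily opposed predicates $P$, $\neg P$, exactly one applies to each existing thing $a$,

\[ P(a) \vee \neg P(a), \]

and, read off for the set of all natural numbers and its properties, this amounts to the classical principle of excluded middle, which transcends any fixed operational framework.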


The Platonic universe of ideal objects in mathematics


Anthony Gardiner (Birmingham)

"Platonism in mathematics is widely held to be (a) untenable, and (b) unavoidable. There does not seem to be any shared universe where Platonic mathematical objects could be located; yet those who work in mathematics cannot easily avoid the fact that the objects they work with appear to have a life and an existence of their own.

I would like to formulate (and illustrate) the modest (and possibly not original) thesis that "Platonic objects exist as a common limit of historical and individual prototypes, which may be seen as partial, or imperfect, approximations to an ideal (never achieved) Platonic limit." That is, that the Platonic universe of ideal objects exists as an uncompleted (potentially complete) universe akin to the actually completed constructions of ideal objects in mathematics itself.


Reasoning styles in modern mathematics or how to consider the way of discovery and learning next to the epistemic virtues of justification


Norma B. Goethe (Cordoba)

In the 17th century many mathematicians showed dissatisfaction with the deductive mode of presentation of mathematical results on the grounds that this obscured the true way the results were actually obtained. It was argued that the deductive mode was not only hiding the true way of discovery, but that such a characteristic mode of textbook presentation was useless in the context of learning and cognition in general. Yet, in the early modern period such arguments were primarily based on methodological considerations.

But in recent decades, some critics have loudly opposed what may be called the "mainstream philosophy of mathematics" from a culturalist perspective. These critics show dissatisfaction with the search for more rigorous approaches to mathematics on the grounds that too much emphasis is put upon proof and justification, while many other important aspects of mathematical practice, its own history, and the way mathematics is actually learned are left unaccounted for.

In my paper I shall look at some of the issues underlying recent debates concerning mathematics as a cultural product.


History or heritage? Historians and mathematicians on the history of mathematics


Ivor Grattan-Guinness (London)


(Talk at the BIGS PhD Student Afternoon)

Mathematics shows much more durability in its attention to concepts and theories than do other sciences. For example, Galen may not be of much use to modern medicine, but one can still read and use Euclid.

One might expect that this situation would make mathematicians sympathetic to history, but quite the opposite is the case; as Philip Davis puts it, they despise history unless a priority dispute is at hand. Their normal attention to history is genealogical; that is, how did we get here? Old results are modernised in order to show their current place; but the historical context is ignored and thereby quite distorted. By contrast, the historian is concerned with what happened in the past, whatever its modern place. The difference between these two approaches will be discussed, with examples exhibited: these will include Euclid, set theory, limits, and applied mathematics in general. The implications for mathematics education will also be aired.


How it means: What do mathematical theories say when they are used in physical theories?


Ivor Grattan-Guinness (London)


Mathematics seems to play a dual role in knowledge. It has its own life, as "pure" mathematics, though enjoying applications of one branch of the subject to others; but there is also the (so-called) "applied" (better, "mixed" or "empirical") side, where it is used in some science. But how much of the detail of the mathematical theory is to be interpreted in the empirical situation? For example, Fourier series are used in heat theory; they are composed of periodic functions; does one therefore have to interpret heat diffusion as a wave-like property? Maybe, but not necessarily; Fourier himself did not do so. The situation is especially interesting when the scientific theory is refuted. Was the mathematics as such wrong, or the application made of it, or both?
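To fix the example (the formulas are standard and are supplied here for orientation): the temperature $u(x,t)$ of a rod of length $L$ with its ends held at zero satisfies the diffusion equation, and Fourier's solution expands it in periodic functions,

\[ \frac{\partial u}{\partial t} = k\,\frac{\partial^2 u}{\partial x^2}, \qquad u(x,t) = \sum_{n=1}^{\infty} b_n\, \sin\frac{n\pi x}{L}\; e^{-k (n\pi/L)^2 t}. \]

Each summand is periodic in $x$, yet the phenomenon described is diffusion; whether this periodicity deserves any physical interpretation is precisely the question raised above.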

In this lecture I shall discuss some of the philosophical issues involved, using examples from classical mechanics and mathematical physics. Consequences for other branches of mathematics will be briefly noted.


On the Concepts of Formal, Pure and Naturalistic Philosophy


Leila Haaparanta (Tampere)

In the analytic tradition, a philosopher's activities have often been described as efforts to apply formal or semiformal tools to the subfields of philosophy, efforts to solve philosophical problems by those tools or efforts to show by those means that the problems are pseudoproblems.

In the present paper the idea of philosophy as a formal science is analysed from a perspective inspired by Frege's ways of drawing the distinctions between analytic, synthetic, a priori and a posteriori truths in the Grundlagen. By paying special attention to philosophers' argumentation strategies, the present paper suggests ways of making the distinction between pure and naturalistic philosophy, on the one hand, and between formal philosophy and the rest of pure philosophy, on the other. It is argued that the distinctions involve the normative foundations of philosophical practice, that is, different answers to the question of what one is obliged or permitted to do in philosophical discussion.

It is also argued that the different ways of practising philosophy, which are labelled as formal, pure and naturalistic philosophy, must face the same problem. That is, if one wishes to express in language the metaphilosophical position to which one is committed, one needs a view from nowhere. Finally, a few remarks are made on how formal philosophy can cope with the problem.


Between Ambrose and the Arians: Augustine and the Value of Logic as a Formal Science


Stefan Heßbrüggen-Walter (Marburg)

It is certainly difficult to name all those traits which make contemporary logic a formal science. But there should be widespread agreement that one of the aspects which make logic a formal science is that it generates only 'formal truth': its inferences are truth-preserving, but do not themselves guarantee the truth of the premisses.

In ancient times, some philosophers, most of them Neo-Platonists, held a different view of what they called dialectic (which contained some aspects of what we call today formal logic but was not identical with it): They believed that this science was in a position to generate material insights into the ontological structure of the world.

One very prominent thinker of late antiquity, Augustine, started out as a Neo-Platonist (concerning the question of what dialectic can and cannot do). Later on, in his book on Christian teaching (De Doctrina Christiana), he presents an outlook much closer to our present-day understanding of the problems involved: logic preserves the truth of true premisses in valid inferences but gives us no clue as to the truth of the premisses themselves.

The paper gives a sketch of this development in order to correct some common misconceptions concerning the role of dialectic in Augustine's thought. The fact that Augustine abandons the idea that logic can generate material insights has either not been remarked at all or has been described as a "decline" of the role the science plays in Augustine's philosophical system. Besides correcting these misapprehensions, the paper wants to achieve a further end: it tries to put Augustine's reflection on dialectic in the context of early-church controversies concerning the role of pagan thought in Christian theology. This field is of particular interest because the most important heresy of the early church, Arianism, used dialectical means to further its theological intentions. It can be shown that Augustine's view developed as an answer to the wrong alternative posed by dialectical optimists (Neo-Platonists and Arians) on the one hand and anti-dialectical sceptics (like, e.g., Ambrose, bishop of Milan) on the other.


Bronze Age Formal Sciences


Jens Høyrup (Roskilde)


Writers on ethnomathematics (in the interpretation "non-literate cultures' mathematical thought") point out that "mathematics is our concept". The ancient Greeks, on the other hand, had an explicit notion of mathematics that is not too distant from ours, and at least from Aristotle onwards it was possible for them to understand mathematics as formal knowledge, even though the pragmatic expression of this insight in the more concrete terms of "measure, number and weight" prevailed into the incipient Modern age.

Where does that leave the preceding high cultures of the Bronze Age? More explicitly, are "Babylonian mathematics" and "Egyptian mathematics" also our concepts, conflations of a variety of practices that had no mutual connection in the eyes of the ancient authors, or do they refer to recognized coherent cognitive domains? And if so, can these be understood in any way as "formal"?


Being Formal For the Sole Reason of Being Formal: Conditions of Adequacy for Formal Theories


Franz Huber (Konstanz)


A theory is defined to be formal just in case at least one of its central concepts is defined by a set of functions. Independently of what the theory is about, it has - or so it is argued - to satisfy (among others) three conditions of adequacy: (1) Non-Arbitrariness, (2) Comprehensibility, and (3) Computability in the limit.

  1. A (concept C defined by a) set of functions F is arbitrary iff there are at least two functions f,g in F and at least two arguments x,y in Dom(F) such that: f(x) > f(y) and g(y) > g(x), where > is a strict order on Range(F). A formal theory is arbitrary just in case at least one of its central concepts is defined by an arbitrary set of functions.
  2. A theory is comprehensible iff all its primitive concepts are, where the concepts of logic (PL1=) and set-theory (ZF) are assumed to be comprehensible. The justification for this assumption is pragmatic in the sense that it relies on the fact that no (set of) function(s) can be defined without the notions of PL1= and ZF.
  3. A (concept C defined by a) set of functions F is computable in the limit iff for every function f in F there is a method a that stabilizes to the correct value f(x) for every x in Dom(f), where a method a stabilizes to b for c iff there is a step m such that for all later steps n: the output of a for c at step n is b. A theory is computable in the limit iff all its central concepts which are defined by a set of functions are computable in the limit. (Conditions (1) and (3) are restated symbolically after this list.)
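Compactly, and purely as a symbolic restatement of the prose definitions above (no new content is intended), conditions (1) and (3) read:

\[ F \text{ is arbitrary} \iff \exists f,g \in F\ \exists x,y \in \mathrm{Dom}(F):\ f(x) > f(y) \,\wedge\, g(y) > g(x), \]

\[ F \text{ is computable in the limit} \iff \forall f \in F\ \exists a\ \forall x \in \mathrm{Dom}(f)\ \exists m\ \forall n \geq m:\ a(x,n) = f(x), \]

where $a(x,n)$ denotes the output of the method $a$ for $x$ at step $n$.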

Arguments are presented for each of these conditions of adequacy (why just these conditions; why no stronger/weaker ones?): A major point is that violating such a condition yields a theory useless for practical purposes.

Finally, examples of formal philosophical theories are considered that (surprisingly) violate these conditions.


Dialectica est ars artium. Aspects of medieval logic as a formal science


Christoph Kann (Düsseldorf)


In the Middle Ages, logic or dialectic is usually not characterized as a scientia formalis but as a scientia sermocinalis. The scientia sermocinalis comprehends three parts - grammar, rhetoric, and logic - and is concerned with operations by means of which the soul attains to knowledge. Medieval logic as a scientia sermocinalis is not exclusively concentrated on the form or the formal elements of propositions; it also considers the materia enuntiationis. This materia or matter of propositions, made up by the relationship of the subject term and the predicate term, is of three kinds, viz. natural, contingent, and separate. This distinction is important for the way medieval logic analyses propositions of different quantities and their mutual relationships.

Medieval logic provides us with a detailed analysis of the formal elements of language, especially in the treatises of syncategoremata, a type of expression which is equivalent to the logical constants or topic-neutral expressions in modern logic. But it also offers a subtle theory of the material elements of language, i.e. the categorematic or descriptive words. The way a descriptive element of language refers depends on its context, and the theory of supposition captures these types of referring in a systematic manner.

Medieval logic also provides us with a rich theory of consequences. These were commonly divided into the consequentia formalis, which is valid by virtue of the logical form of the component sentences, and the consequentia materialis, which holds not on formal grounds alone, but by virtue of the matter or content specified by the descriptive terms of the component propositions.
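A standard pair of examples may illustrate the division (these are the textbook examples of the medieval literature, rendered anachronistically in modern notation, not taken from the abstract itself):

\[ \text{consequentia formalis:}\quad \forall x\,(Hx \to Ax),\; Hs \;\vdash\; As; \]

\[ \text{consequentia materialis:}\quad Hs \;\vdash\; As, \]

with $Hx$ for 'x is a man', $Ax$ for 'x is an animal' and $s$ for Socrates: the first holds by its form alone, the second only by virtue of the matter of the terms 'man' and 'animal'.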

Bearing this in mind, how can we still justify the characterization of medieval logic as a formal science? At first glance it represents a central stage within the history of the formal sciences. On the other hand, it overarches the genuine framework of these sciences in more than one respect. As a scientia sermocinalis, medieval logic is a comprehensive discipline of the formal and descriptive elements of natural language, all of which are relevant for meaning and argumentation. If we take it this way, medieval logic can figure as a formal science only in a deficient manner. But there is more to the formality of medieval logic than that. In the Middle Ages logic is well known as the ars artium or scientia scientiarum. It does not restrict itself to particular formal elements of language, but also uses a formal approach to the analysis of descriptive elements and of context. Thus it provides a comparative basis for all the sciences and pushes logic to the highest level of abstraction and generality.


On the Foundations of Formal Epistemology


Ladislav Kvasz (Bratislava)


One of the consequences of the success of modern logic was a radical shift in the philosophy of mathematics from approaches founded on intuition (Kant, Riemann, Poincaré) to approaches founded on logic. As a consequence of this development, geometry came to be marginalised in philosophical reflection on mathematics, and in this way philosophy developed a rather unbalanced representation of the mathematical landscape, in which arithmetic and set theory played a dominant role while geometry was rarely even mentioned.

The aim of formal epistemology is to develop formal tools for a philosophical analysis of the pictorial aspect of mathematics (see our paper in Synthese 1998 and in Philosophia mathematica 2000). Using the picture theory of meaning of Wittgenstein from the Tractatus we try to interpret the pictures contained in the mathematical texts as a special kind of formal language. This pictorial language (we call it iconic language) is a counterpart of the symbolic languages used in mathematics. The reconstruction of the changes in the syntax and semantics of these iconic languages offers a strong tool for the analysis of different questions in the philosophy of mathematics.

We have also tried to apply the approach of formal epistemology to the reconstruction of the development of the physical sciences. One interesting result of such an application was a new reconstruction of Cartesian physics. Most historians of science approach Cartesian physics with the correspondence theory of truth in mind; they find that most of Descartes' assertions are incorrect, and therefore they exclude them from the mainstream history of physics. Historians of philosophy, on the other hand, use the coherence theory of truth, and so they can show that Cartesian physics was a coherent metaphysical system. Formal epistemology makes it possible to develop a middle position between these two approaches by showing that the laws of Cartesian physics are limits of the Newtonian laws for some special choice of limit transitions. Thus from a formal point of view we can show that Cartesian physics, even if not totally correct, is still more than merely a coherent metaphysical system. It is a limit case of a theory which corresponds to the phenomena; even if it has no direct correspondence with real phenomena, there is nevertheless an indirect correspondence.


Symbolic Knowledge and Formal Science in Frege and Schröder


Javier Legris (Buenos Aires)


The idea of developing a universal scientific language played a decisive role in the origins of modern logic in the second half of the 19th century. The work of Peirce, Peano, Ernst Schröder and also Frege attests to this fact. In this context the notions of characteristica universalis and calculus ratiocinator, both stemming from Leibniz, served to characterize different features of that idea. Moreover, Jean van Heijenoort used them to draw an influential distinction between two different perspectives in modern logic. Thus, Frege described his Begriffsschrift (conceptual notation) not merely as a calculus ratiocinator, but as a lingua characterica, and Schröder considered his algebra of relatives as the "most rational designation system" and also as a form of Begriffsschrift.

Closely related to these notions is the idea - also originally conceived by Leibniz - of symbolic knowledge (cogitatio symbolica or cogitatio caeca), that is, knowledge obtained by means of symbolic manipulation within a symbolic system, where symbols must be understood as physical objects. According to this idea, symbolic systems are not merely auxiliary devices in science; they provide knowledge stricto sensu and are constitutive of knowledge, in a way opposed to knowledge obtained by intuition. Obviously, this idea of symbolic knowledge is quite general and can receive various interpretations.

In this paper, I explore the application of the notion of symbolic knowledge to assess the role of the idea of a universal scientific language in the different conceptions of formal science held by Frege and Schröder. My aim is to determine the underlying conception of symbolic knowledge in both authors. In particular, the place of the conceptual notation in Frege's logicism and the relation of Schröder's abstract algebra of relatives to mathematics will be discussed. In the discussion, the notion of symbolic knowledge will also be compared with other epistemological notions such as a priori knowledge. In general, it can be said that the paper deals with epistemological problems related to the concepts of a universal scientific language and of formal science.


The Formalization of Prediction in Antiquity


Daryn Lehoux (Halifax NS)


What is the relationship between logic and the natural sciences in antiquity? Are the two clearly distinct, or do they overlap to some extent? If so, how? To answer these questions, this paper examines the ways in which formal solutions to the problems of predictive inference developed in the ancient world, and the ways in which these formal solutions were eventually influenced by ancient physics. The interaction between the formal and natural sciences placed constraints on explanations and justifications offered in the predictive sciences, but it also imposed certain constraints, derived ultimately from physical considerations, on the framing of the predictive syllogism. This does not quite blur the distinction between the formal and natural sciences in antiquity, but it does show how the two were mutually interdependent when faced with the problems of prediction in particular.

In the Hellenistic period, prognosis and predictability were seen largely as a problem of logic, and spawned loud and lively (if quite technical) debates around the formal relationship between observed phenomena (symptoms, signs) and predicted phenomena (prognoses, omens). At one point, it was said that even the crows on the roofs of houses were cawing about the nature of conditionals. In the Mesopotamian tradition the solution had been quite different, and had consisted in the building up of massive law-like collections of particular precedents, in the form of long ordered lists of conditional statements, that could later be consulted in order to derive predictions of future outcomes from observed phenomena. These lists have recently been shown to share salient characteristics with contemporary mathematical practices.

The disjunction between the enumerative and formal approaches came to a head in Hellenistic philosophy, when Cicero dismissed enumerative arguments in favour of formal (syllogistic) solutions, but with a strong restriction on the kinds of relationships allowable between the antecedent and consequent terms of a predictive syllogism: such relationships must be physically causal. This restriction on the requirements of formal solutions applicable in the predictive sciences is essentially a two-way street: it shows how the formal science of logic was drawing on what we might anachronistically call mechanistic considerations from ancient physics, and it shows how the results of that process in turn influenced the physics of prediction.


C. S. Peirce's Lattice Theory as a Formal Framework for his Pragmatic Account of Meaning


Justus Lentsch (Bielefeld)


The American logician and philosopher C. S. Peirce (1839-1914) conceived logic to be the science of reasoning. Improving on G. Boole's symbolic algebra and A. De Morgan's logic of relations, Peirce developed mathematical logic as a tool for the understanding and improvement of reasoning, and not, like the logicists, to secure the foundations of mathematics. But how can the formal concepts developed by Peirce in the course of his work on symbolic algebra and the logic of relations really serve the understanding and improvement of reasoning and empirical knowledge acquisition? How is it that logically valid reasoning can be about the same world we both experience and act upon? This problem is treated in a series of essays in "Popular Science Monthly" titled "Illustrations of the Logic of Science" (1877/78) where Peirce suggests a methodological rule for the clarification of the meaning of concepts which later on became famous as the "Pragmatic Maxim".

In the present paper it is shown that Peirce's Pragmatic Maxim implicitly employs as formative principles the formal concepts of a lattice and a partial ordering, which Peirce invented in his papers on symbolic algebra and the logic of relations between 1870 and 1883. These concepts constitute a formal framework for both the axiomatization of his logic of relations and his pragmatic account of meaning. In Peirce's pragmatism (but also in his theory of categories and his semiotics) they gain philosophical significance as implicit formative principles for the design of theories about the actual processes connecting reasoning, experience and cognitive actions with our environment. Pragmatism, then, operationalizes these formal concepts as a methodology for clarifying the meaning of concepts by placing them in the lattice of already held beliefs about the same object, with respect to the conceivable consequences they might give rise to.
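For reference, the two formal concepts in question have their usual definitions (standard order theory, supplied here for the reader rather than quoted from Peirce): a partial ordering on a set $P$ is a relation $\leq$ that is reflexive, antisymmetric and transitive; a lattice is a partially ordered set in which every pair of elements $x, y$ has a least upper bound (join) and a greatest lower bound (meet),

\[ x \vee y = \sup\{x, y\}, \qquad x \wedge y = \inf\{x, y\}. \]

Placing a new belief "in the lattice of already held beliefs" can then be read as locating it with respect to these meets and joins.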

Pragmatism thus clarifies the meanings of concepts by establishing a relation between the formal and material properties of reasoning and empirical knowledge acquisition processes, implicitly drawing on the formal framework of order- and lattice-theoretical concepts.


The Formal Sciences from the Viewpoint of Husserl's Phenomenology


Dieter Lohmar (Köln)


Phenomenology is a philosophy oriented towards sensibility. It sees the beginning of all reference to objects in sensuous givenness. What is sensuously given is then apprehended (interpreted, as it were) as an object in a synthetic act. In such a rather empiristically oriented approach, the question of the objectual character of the formal sciences first presents itself in the form: How can mathematics (logic) be knowledge at all, and what kind of objects are known here (in contrast to real things)? Husserl was himself a professional mathematician; he studied and took his doctorate in Berlin under Weierstrass and Kronecker. His view of mathematics is already oriented towards the axiomatic-deductive reformulation of the mathematical disciplines and towards the English algebraic logic of the 19th century. Husserl interprets the formal sciences as a whole as a universal formal theory of science which precedes the concrete disciplines directed at particulars.

Husserl also thematizes the changed relationship between logic and mathematics which arose from the mathematical reformulation of logic. In the tradition, logic counts as a discipline which deals with relations of derivability between propositions, but in its algebraic reformulation it can equally be understood as an axiomatic discipline whose objects (like those of mathematics) are intended in the form of variables, as a 'something-in-general' under fixed laws of operation. Husserl therefore construes formal logic as a discipline with a double sense: on the one hand it is a formal theory of judgement (formal apophantics); on the other hand it is at the same time a formal theory of objects in general (formal ontology).


The Status of Logic in the Seventeenth Century


Jaap Maat (Amsterdam)


In the seventeenth century, logic was a well-established discipline forming a standard part of learning. In the course of the century, however, the prestige of logic rapidly declined after suffering vigorous attacks by a series of influential writers. Although logic handbooks continued to be written and studied, John Locke, towards the end of the century, voiced a then widespread opinion when he dismissed logic as 'learned ignorance'.

The paper examines how the various criticisms of logic were underpinned, and distinguishes several types of arguments. One of these types is aimed at finding fault with the way in which syllogistic logic formalizes reasoning. Arguing along these lines, van Helmont and Webster claimed that Aristotelian logic erroneously characterizes certain inferences as valid. Another type of argument stressed the limited use to be had from studying logic. Thus Bacon and many of his followers believed that the formal patterns provided by syllogistic logic, although impeccable in themselves, were useless as long as the contents on which these patterns were supposed to operate were ill-defined. And Descartes maintained that 'the natural light of reason', unaided by the precepts of logic, was quite capable of finding all the truths it might need. Descartes also put forward a third type of argument, according to which logic might pose a threat to clear thinking precisely because of its formal nature. In providing mechanical deduction procedures, Descartes said, logic allowed the mind to 'go on holiday' rather than keeping it alert. Finally, an almost unanimously accepted type of argument maintained that logic was incapable of guiding one to the discovery of new knowledge, being at most helpful in demonstrating to others what was already known.

The paper seeks to explain why none of these types of arguments, with the exception of the first, could be easily refuted by those writing in defence of logic.


Concept Script: From Logic of Language to Language of Logic


Ángel Nepomuceno-Fernández (Sevilla), Fernando Soler-Toscano (Sevilla)


Our main aim in this paper is to present Frege's Concept Script system as a symbolic system that is abstracted from natural language and leads up to modern formal methods in logic.

By means of a reconstruction of the symbolism defined in Concept Script, taking into account that the real or logical form of a proposition is always associated with its grammatical form, we obtain a symbolic language and define an axiomatic system that accounts for the class of the "judgements of the pure thought". Such a language, which treats the most relevant logical constants from a classical point of view, is not formal, since its semantics is the ordinary one; but it is free of the ambiguities so common in natural languages, and it is very suitable for expressing logical forms as the vehicle of inference necessary to study any inferential context. This can be shown by representing the square of opposition of the categorical propositions.
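For orientation, the square of opposition can be rendered in modern notation as follows (a standard rendering, supplied here for illustration rather than taken from the paper):

\[ \text{A: } \forall x\,(Sx \to Px) \qquad\quad \text{E: } \forall x\,(Sx \to \neg Px) \]

\[ \text{I: } \exists x\,(Sx \wedge Px) \qquad\quad \text{O: } \exists x\,(Sx \wedge \neg Px) \]

A and O, and likewise E and I, are contradictories; A and E are contraries; the traditional subalternations from A to I and from E to O hold only under the assumption of existential import, $\exists x\, Sx$.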

On the other hand, we prove that the symbolic system in its first-order version, taking into account Fregean semantics, a seminal doctrine about sense and reference, has two important properties: soundness and completeness. Finally, we show that the Concept Script system, taken as a second-order system with a semantics in the general sense, could have become an early framework to drive the logicist project.


Transforming concepts of science into the humanities: objective possibility in the times of the 2nd German Empire


Martin Neumann (Osnabrück)


The problem of identifying the meaning of probabilistic statements has been crucial ever since probability theory emerged. In this talk I will first present a solution to this problem developed in the times of the 2nd German Empire, and second show how this formal theory relates to the natural sciences and the humanities.

In 1886 the German physiologist Johannes v. Kries developed a concept of objective possibility in order to clarify the meaning of probabilistic statements. v. Kries was situated in a German field of discourse dominated by the philosophy of science of so-called Neo-Kantianism. Within this context deterministic causality was seen as the core of science. But if deterministic causality is fundamental, how should we interpret probability theory? To reconcile probability with deterministic causality, v. Kries worked out a concept of objective probability, called "Spielräume" (i.e. scope, range). According to v. Kries, a scientific explanation is built out of two sources: the natural laws on the one hand, and the circumstances of a single event on the other. The concept of "Spielräume" is relevant for the latter: for example, an object might be described by the word "red". Then there is an infinite range of possibilities, reaching from the case in which this notion is clearly false to the case in which it is clearly true. This range is measured by the "Spielräume".

However, this formal concept was deeply embedded in the context of the contemporary empirical sciences. In the 2nd German Empire this context was significantly different from that of Western European statistics: originally, v. Kries intended this theory to fill the explanatory gap between the reversible laws of mechanics and the irreversible 2nd law of thermodynamics. For this purpose he wanted to formalise Boltzmann's use of probability theory. Thus, it was a concept of science. But, surprisingly, it proved to be useful also in the theory of law. Finally it could even help the sociologist Max Weber to explain the operation called "Verstehen". In this way the concept of "Spielräume" spread out to the field of the humanities. Thus, this formal theory served as a tool for the transformation of concepts of science into the humanities.


What is Divine in the Divine Proportion?


Olaf Neumann (Jena)


Since the time of the Renaissance, many practitioners and theoreticians of art have emphasized the "Divine Proportion" (or "Golden Section") $(x-1):1 = 1:x$ with the ratio $x = (1 + \sqrt{5})/2 = 1.618\ldots$. It is well known that $x$ is approximated in a distinguished way by the ratios of consecutive members of the Fibonacci series 1, 1, 2, 3, 5, 8, 13, 21, 34, ... Among others, Sir Th. Cook and Le Corbusier stressed the iteration of the "Divine Proportion". In mathematical terms this comes down to taking into account the powers $x^2, x^3, \ldots$ of the ratio $x$. In the talk the approximation of those powers by ratios of integers will be discussed. In this way one obtains some arguments in favour, e.g., of Le Corbusier's "Modulor".
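The iteration of the proportion admits a compact formulation (elementary facts, supplied here for orientation): since $x^2 = x + 1$, induction gives

\[ x^n = F_n\, x + F_{n-1} \qquad (F_0 = 0,\ F_1 = 1,\ F_{n+1} = F_n + F_{n-1}), \]

and since $x = \lim_{n\to\infty} F_{n+1}/F_n$, every power is likewise approximated by ratios of Fibonacci numbers:

\[ x^p = \lim_{n\to\infty} \frac{F_{n+p}}{F_n}, \qquad \text{e.g. } x^2 \approx \frac{34}{13} \approx 2.615 \ (\text{true value } 2.618\ldots). \]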


Platonisms: Gödel, Quine and the Image of Mathematics


Terese M. O. Nielsen


In this talk, I discuss some of the very different shapes that platonism - belief in the existence of abstract, mathematical entities - can take by contrasting the platonism of Gödel with that of Quine.

Mathematics is often described as the queen and servant of the sciences. This is supposed to capture the special role of mathematics as simultaneously independent of and deeply entangled in the sciences. Any sensible philosophical account of mathematics must elucidate both this seeming independence and the applicability of mathematics. As will be demonstrated, this double role of mathematics gives rise to some of the many differences between Gödel and Quine: Gödel primarily stresses the independence of mathematics, while Quine pays significantly more heed to its applicability, and this leads to very different accounts.

Traditionally, the question of applicability has been the harder part for the platonist. The reason for this is the apparent inability of abstract entities to participate in causal processes and hence their incapacity to interact with physical objects. Presently, however, the applicability of mathematics plays a major part in the most successful argument supporting platonism, namely the so-called Quine-Putnam indispensability argument, coming out of the empiricist tradition. Surprisingly, the basic ingredients of this argument are also found in Gödel's writings, in spite of his leanings towards dualism and rationalism and his life-long contempt for empiricism.

The comparison of Gödel's and Quine's positions leads to an attempted analysis of the minimal requirements for platonism. I argue that 1) the bare statement of the existence of mathematical entities stands in need of serious expansion, if it is to count as a philosophy of mathematics at all, and 2) if there is any common feature to be found in the many views called platonism, it is the adherence to a certain form of argument, rather than a uniquely specified set of claims.


On the Notion of a Formal Object


Rainer Osswald (Hagen)


One of the most elaborate approaches to specifying formal systems and formal objects has been given by H. B. Curry. Although Curry is mainly known as a proponent of the formalist viewpoint on mathematics, his conception of a formal system turns out to show constructivist and structuralist elements as well. In particular, Curry points out that the exact nature of formal objects does not matter. This view is in accordance with a Quinean conception of structuralism, which comes along with a relative notion of ontology.

On the other hand, there are crucial differences between Curry's and Quine's ontological commitments concerning formal objects; for example, Curry regards inductively generated classes of objects as intuitively given by means of inductive specifications, whereas Quine abstains from drawing on the intuitively given when it comes to ontology. After discussing these issues in detail, we additionally reflect on the way formal objects and systems, as well as their ontological relativity, appear nowadays in computer science under the guise of abstract data types, which are typically specified by algebraic and categorical methods.
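As a small illustration of this last point, here is a sketch of an algebraically specified abstract data type (a stack), written in Python purely for concreteness; the names and the tuple representation are invented for this example and are not taken from the abstract. In Curry's spirit, only the operations and their equations matter, not the "exact nature" of the objects:

    from typing import Optional, Tuple

    # An algebraically specified stack. The concrete representation (nested
    # tuples) is deliberately arbitrary: only the operations and the equations
    # asserted below belong to the specification.

    Stack = Optional[Tuple[int, "Stack"]]  # None plays the role of the empty stack

    def empty() -> Stack:
        return None

    def push(s: Stack, x: int) -> Stack:
        return (x, s)

    def pop(s: Stack) -> Stack:
        assert s is not None, "pop of the empty stack is left unspecified"
        return s[1]

    def top(s: Stack) -> int:
        assert s is not None, "top of the empty stack is left unspecified"
        return s[0]

    # The defining equations of the specification, checked on a sample stack:
    s = push(push(empty(), 1), 2)
    assert pop(push(s, 7)) == s   # pop(push(s, x)) = s
    assert top(push(s, 7)) == 7   # top(push(s, x)) = x

Any other representation satisfying the same equations would serve equally well, which is exactly the ontological relativity at issue.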


A = B: Representation and Explanation in the Formal Sciences


Michael Otte (Bielefeld)


In logic and mathematics, the extension and the intension of mathematical terms have become independent of each other since Bolzano, Grassmann and Riemann. Frege then failed in the attempt to determine the relation between the two in an absolute, formal-logical way. In the process one ran up against the limits of the extensional point of view; think, for instance, of the logico-semantic paradoxes. The new category-theoretic approaches in foundational research in mathematics do not abandon the extensional point of view either. On the other hand, there are fundamental areas of mathematical knowledge and mathematical thought (think, for instance, of modern axiomatics) which must first of all be understood intensionally. Here, in analogy to the natural sciences, an objective foundation has been sought by appeal to certain ideas, such as that of a 'uniformity of reality' and the like. In these efforts, which place objective justification in the foreground, the great advantage of mathematics and the source of its extraordinary epistemic dynamics has been forgotten: that it provokes and makes possible a permanent change of perspective on one and the same thing. The essential thing in mathematics consists in seeing an A as a B: A = B!

Every explanation whatsoever is nothing other than a second look at the same thing. This act constitutes the fundamental difference between a mere fact and a symbolic representation aimed at explanation. Every explanation rests on a description or a recording of data, and naturally changes along with it. It is sometimes even said that the explanation itself contains no new information at all, no information that would differ from what was already contained in the description. Yet explanation is "surely of enormous importance and certainly seems to convey an increase of insight that goes beyond the content of the description", writes G. Bateson (Bateson 1982, 103). Bateson conjectures that this increase of insight is given already with a second, altered representation of the content. He then continues: "In science these two types of organization of data (description and explanation) are connected by what is technically called a tautology. Examples of tautologies range from the simplest case, the assertion 'if p is true, then p is true', to highly developed structures such as the geometry of Euclid, where one can say: 'if the axioms and postulates are true, then Pythagoras' theorem is true'" (op. cit., 104). Bateson is evidently describing here the same thing that Reichenbach had characterized as the core of the method of modern science.

For research in the natural sciences, deduction gained its outstanding importance only in connection with the mathematization of natural phenomena. At the same time, it proved in this connection to be the centrepiece of the methodology of the scientific revolution of the 17th century. Hans Reichenbach describes the "double nature of classical physics", which has fundamentally shaped our ideal of knowledge and our epistemology to this day. The scientific revolution of the 17th century, he says, begins with Newton's law of the attraction of masses. "This law, which is known under the name of the law of gravitation, has the form of a rather simple mathematical equation. Logically speaking, it represents a hypothesis which cannot be verified directly. ... Newton himself saw clearly that the confirmation of his law depends on the verification of the consequences drawn from it. In order to derive these consequences he had invented a new mathematical method, the differential calculus. But despite this brilliant achievement of mathematical deduction he remained unsatisfied. He wanted to have a quantitative empirical proof" (Reichenbach 1977, 119 f.).

This verification depended on the measurements which the astronomers supplied to him. This story "of Newton's discovery represents a convincing illustration of the modern scientific method. The observational material is the starting point of the method, but observations do not exhaust the method. It is supplemented by a mathematical explanation which goes far beyond what has been observed; the explanation is then subjected to mathematical derivations which make its various consequences explicit, and only these consequences are tested by observations. ... When one speaks of empirical science, one must not forget that observation and experiment were able to build up modern science only because they relied on mathematical deductions. Newton's physics differs profoundly from the picture of inductive science that had been drawn two generations earlier by Francis Bacon. A mere collection of observed facts, such as Bacon's tables contain, would never have enabled the scientist to discover the law of gravitation. Mathematical deduction in combination with observation is the instrument that has made the success of modern science possible" (Reichenbach op. cit., 120/121). The tautological element, which consists precisely in making this second look possible, is ascribed to pure mathematics and to the axiomatic method. It is in this sense that Reichenbach, as we have just seen, describes the decisive contribution of mathematics to the scientific revolution of the modern age.

But the point cannot be to analyse the tautology ever more precisely, although this has great economic significance, especially in view of the possibilities and promises of machine implementation; rather, the point must also be to track down objectivity within the cognitive and semiotic process itself.


The Historiography of Mathematics and Logic


Volker Peckhaus (Paderborn)


(Talk at the BIGS PhD Student Afternoon)

The lecture surveys the role of logic in the historiography of mathematics. If one of the tasks of the historiography of mathematics is seen as research on periods in the development of mathematics marked by great upheavals, the close interrelationship between logic and mathematics becomes evident. Logic comes into play in connection with experiences of a crisis in foundations and with universal programmes of general mathematics. This is shown by considering the following examples: Leibniz's programme of a universal mathematics, Ernst Schröder's algebra of logic, Frege's logicism, and the role of logic within Hilbert's foundational programme and his metamathematics. It is argued that the creation of modern mathematical logic and the development of modern structural mathematics do not run parallel merely by accident, but are intertwined. The lecture ends with some considerations on the situation of research in the history and philosophy of logic in Germany.


On the formalistic approach to the dynamics of scientific theories: the structural point of view


Tommaso Perrone (München)


Since the 1960s, the structural point of view has seemed to be the best-known and most rigorous form of the formalistic approach to scientific theories and to the logical comprehension of their dynamics; W. Stegmueller, Sneed and Ramsey are its most significant exponents. It represents essentially an answer to sceptical claims about science, which, after T. S. Kuhn's work "The Structure of Scientific Revolutions", became one of the most widespread and serious attacks on the formal foundations of science.

Many philosophers suppose that scepticism is irrefutable, that we cannot vindicate our claim to know anything. But we need not be at all impressed by this very "pessimistic" conclusion. Some shrug scepticism aside as a mere curiosity, a philosophical (and methodological) aberration. Many factors combine to explain this attitude. The practical implementation of the sceptical doctrine encounters insurmountable difficulties, and its proponents (as we intend to show here) are to be considered frivolous and insincere, above all about the formal nature of science.

Using some typical "structural" concepts and some fundamental principles of mathematical set theory, for example "theory as a set with a structure and a domain of application (T = (S, A))", "frame", and "core", we try to show that all sceptical positions against formal science at a "metascientific" and logical level are inconsistent. On the one hand, scientific theories in their own structure, as formal theories, are not touched by any sceptical refutation; on the other hand, the sceptic's challenge to weaken and annihilate the logical relations of scientific theories is properly to be dismissed.
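
For orientation, here is a minimal sketch of the standard apparatus of the Sneed-Stegmüller school (a common textbook rendering, not necessarily the notation used in the talk): a theory-element is a pair

\[ T = \langle K, I \rangle, \qquad K = \langle M_p, M, M_{pp}, C \rangle, \qquad I \subseteq M_{pp}, \]

where the frame $M_p$ is the class of potential models (set-theoretic structures of the right type), $M \subseteq M_p$ is the class of actual models (those satisfying the theory's substantive laws), $M_{pp}$ is the class of partial potential models (potential models stripped of their theoretical functions), $C$ is a constraint on admissible combinations of models, $K$ is the core, and $I$ is the set of intended applications.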



Mathematics - cultural product or epistemic exception?


Susanne Prediger (Bremen)


In contemporary philosophies of mathematics, mathematics is seen as a cultural product. On this view, however, it is not easy to explain the high degree of coherence of mathematical theories and concepts and the widespread consensus among mathematicians. Since Bettina Heintz cites these phenomena to deny the cultural relativity of mathematics and to claim an epistemic exception for it, it is worth searching for explanations. In the talk, I will offer attempted explanations by looking at the historical development of mathematical concepts and theories. In this way, the thesis that mathematics is an epistemic exception is refuted.


Formal-Scientific Thinking in Johann Heinrich Lambert


Günter Schenk (Halle)


Lambert (1728-1777) is one of those modern thinkers whose logic still makes visible the connection between logistic and philosophical questions (logic, logistic, logical calculus). He thinks philosophically when, in each of the extensions of logic, he seeks what is general and common, in order finally to grasp it logically itself. The logically grasped general is the concept. For Lambert, then, the issue is ultimately the forms of representation of the concept; here the logic of characteristic marks (the measuring-out of concepts) touches algebra (the theory of equations) and mereology (a holistic way of thinking). Language plays a special role in this: on the one hand, it serves to say and describe what things are and how they are; conversely, things can also stand for what one wants to say and describe by means of language (in Lambert, a certain parallelism of ontological and conceptual structures). In this way, the theme of knowledge representation in the modern sense is broached. Using Lambert as an example, the problem of the relation between the logical idea (logic), the language of analysis (instrument), and extra-linguistic symbolism (the logical form of language) will be discussed.


The axiomatic method in the light of 20th century philosophy


Dirk Schlimm (Pittsburgh PA)


This paper is intended as a contribution to the question of how the formal sciences have been perceived through history. The aspect of formal science that I focus upon is the "axiomatic method" (in what sense one can speak of a "method" will be touched upon briefly). The time frame of my investigation is the 20th century, and as perceivers I consider (analytic) philosophers of science and of mathematics.

Ever since its inception, Euclid's "Elements" has been heralded as a prime example of a great intellectual achievement. Its form of presentation, the axiomatic method, was long regarded as an ideal of clarity and rigor. In fact, innumerable scientists, mathematicians and natural philosophers alike, modeled their work upon the "Elements." The invention of modern logic, together with the desire for rigorous proofs without appeal to intuition, led to increased attention to the axiomatic method in the late 19th century, culminating in Hilbert's vision of the "formal" axiomatization of all mathematics and natural sciences. Today, the axiomatic method belongs to the toolbox of every scientific enquirer.

With this development as the background, I will discuss how the axiomatic method was viewed in 20th century philosophy of science (e.g. Carnap, Hempel, Hanson, Hesse, Kuhn, Lakatos, van Fraassen) and philosophy of mathematics (e.g. the Benacerraf/Putnam collection, Lakatos, Polya, Kitcher). Initially, axiomatizations were mainly thought of as providing epistemological foundations. However, in light of the possibility of different axiomatizations of the same theory, this was called into question. I will show that in the context of scientific methodology, in particular regarding its use in the context of discovery, the axiomatic method has received very little attention. This is a rather surprising result, since the axiomatic method has been employed extensively in mathematics, in science, and also by the philosophers themselves. Possible reasons for, and shortcomings of, this development in philosophy will be discussed.
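
To illustrate the point about multiple axiomatizations with a textbook case (my example, not the paper's): the theory of groups can be axiomatized over the signature $(\cdot, e, {}^{-1})$ by the equations

\[ x \cdot (y \cdot z) = (x \cdot y) \cdot z, \qquad e \cdot x = x, \qquad x^{-1} \cdot x = e, \]

or, over the bare signature $(\cdot)$, by associativity together with the solvability axioms

\[ \forall a \, \forall b \, \exists x \, (a \cdot x = b) \qquad \text{and} \qquad \forall a \, \forall b \, \exists y \, (y \cdot a = b). \]

Both axiom systems determine exactly the groups, yet they differ in their primitives and in their logical form, so neither can claim to be the unique epistemological foundation of the theory.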


Conceptual harmony and the semantics of programming languages


Charles A. Stewart (Dresden)


I will begin by examining a strand of ideas in the philosophical work of Frege, Gentzen, Tarski, Wittgenstein, Prawitz and Dummett on the relationship of logic to thought, with special attention to the idea that logic can be defined or captured formally, and to the controversy between psychologism and anti-psychologism.

I will relate this conceptual history to the emergence of proof theory and model theory as normal tools in the science of logic, especially focussing upon the relationship between logical harmony as described by Dummett and the attempt to justify the logical laws on a proof theoretic basis undertaken by Gentzen and Prawitz.

I will then show how these philosophical ideas reappear in the semantics of programming languages, emphasising the conceptual parallels and technical differences between model theory and domain theory on the one hand and proof theory and algebraic semantics on the other hand, and sketch how these lead to importantly different conceptions of what operational semantics is.
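
As a toy illustration of the two styles at issue, here is a minimal sketch (my example, not the talk's; Haskell is chosen only for brevity): a small expression language is given first a denotational semantics, which interprets syntax directly in a mathematical domain in the spirit of model theory and domain theory, and then a small-step operational semantics, which rewrites syntax by rules in the spirit of proof-theoretic and reduction-based accounts.

-- A toy expression language.
data Expr = Lit Int | Add Expr Expr | Mul Expr Expr
  deriving Show

-- Denotational semantics: an expression denotes a value outright.
denote :: Expr -> Int
denote (Lit n)   = n
denote (Add a b) = denote a + denote b
denote (Mul a b) = denote a * denote b

-- Small-step operational semantics: rewrite one redex at a time,
-- returning Nothing when the expression is already a normal form.
step :: Expr -> Maybe Expr
step (Lit _)               = Nothing
step (Add (Lit m) (Lit n)) = Just (Lit (m + n))
step (Add a b)             = case step a of
                               Just a' -> Just (Add a' b)
                               Nothing -> fmap (Add a) (step b)
step (Mul (Lit m) (Lit n)) = Just (Lit (m * n))
step (Mul a b)             = case step a of
                               Just a' -> Just (Mul a' b)
                               Nothing -> fmap (Mul a) (step b)

-- Run the rewrite relation to a normal form.
eval :: Expr -> Expr
eval e = maybe e eval (step e)

Here eval (Add (Lit 1) (Mul (Lit 2) (Lit 3))) rewrites stepwise to Lit 7, while denote computes 7 directly; the two agree on this fragment, but they embody different conceptions of what the meaning of a program is.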


Hilbert's Axiomatic Method and Its Defective Reception by Logical Empiricists


Michael Stöltzner (Bielefeld)


Only recently has David Hilbert's program to axiomatize the sciences according to the pattern of geometry emerged from the shadow of his formalist program in the foundations of mathematics. This relative neglect, which is surprising in view of the enormous efforts Hilbert himself devoted to it, was certainly influenced by the Logical Empiricists' almost exclusive focus on his contributions to the foundational debates. My paper investigates the stand of two core members of the Vienna Circle who had studied with Hilbert at Göttingen, the mathematician Hans Hahn and the theoretical physicist Philipp Frank. At the bottom of their neglect of Hilbert's axiomatic method stands their conviction that reconciling Ernst Mach's empiricist heritage with modern mathematics required drawing a rigid boundary between mathematics and physics and subscribing to logicism, according to which mathematics consists in tautologous logical transformations.

In this way, they missed the substantial difference between the logical structure of a particular axiom system and the axiomatic method as a critical study of arbitrary axiom systems. If this distinction is not properly observed (and admittedly Hilbert himself deliberately obscured it in places), a core concept of the axiomatic method, "deepening the foundations" (Tieferlegung), becomes metaphysical, because it may appear as an ontological reduction of basic physical concepts to mathematical ones rather than, as Hilbert intended, an epistemological reduction availing itself of the unity of mathematical knowledge. To be sure, the Logical Empiricists considered axiomatizing the sciences an important task, but in the way they set it up, axiomatization became much more closely tied to the success of the foundationalist program for all of mathematics than Hilbert's axiomatic method ever was.


What is it like to be formal?


Christian Thiel (Erlangen)


What is the precise meaning of "formal" when we speak of "formal sciences", of "formal systems", "formalization" and "formalism", and of logical form and abstract structure? The answers given to these questions in the historical development of logic and mathematics have been very different, exhibiting a growing understanding of the underlying problems as well as an ever more complete mastery of them through increasingly adequate and refined techniques. The paper will give a survey of the most significant steps in the historical unfolding of the aspects of form, and analyze their systematic status and importance.


On the roots of Hilbert's proof theory
Some remarks in view of the canceled 24th problem


Rüdiger Thiele (Leipzig)


By finitary reasoning, Hilbert's program (proof theory and metamathematics) attempted to secure the infinitary parts of mathematics, in particular classical analysis. The program divides into two parts: the finitary part, consisting of meaningful, true propositions and their justifying (finitary) proofs, which is to be regulated, and the part serving as regulator. In detail, the program was formulated after World War I, but it had already begun to take shape in the 1917 Zürich talk "Axiomatisches Denken". Ultimately, the underlying general ideas of the program had been formulated in Hilbert's famous Paris talk of 1900 (the general foundational problems in nos. 1, 2, and 10; the foundations of specific areas in nos. 3, 4, 5, and the (unsolved) no. 6).

Even before the development of formalisation, we can find the roots of proof theory in a further, later canceled 24th problem of Hilbert's Paris lecture. In this unknown problem, which concerns proofs, we have more than a mere glimmer of the ideas of the proof theory developed later. I will discuss some aspects of the technical development and the philosophical clarification of Hilbert's program from the viewpoint of the canceled problem.


Nitecki's Metaphorical Geometry and Unity of Knowledge


Joanne Twining (Denver CO)


The formal sciences are concerned with the abstract, that is, with intangible phenomena perceived by the mind. Library and Information Science (LIS) is concerned with abstracts of abstracts, or meta-abstractions, and has the simultaneous goals of aggregating and differentiating information- and knowledge-bearing artifacts and instances. Is a unity of knowledge and interdisciplinary access possible in a linguistic system reliant on rigid subject categorization and strictly controlled vocabularies? This discussion presents J. Z. Nitecki's (1993) Metalibrarianship, a three-dimensional, nine-faceted metaphorical geometry of the knowledge creation process, and mines it for clues about abstracting existing knowledge systems to satisfy the organization and retrieval needs of the hyperdimensional mind.


Existence, Identity, and the Algebra of Logic


Risto Vilkko (Helsinki)


One of the most interesting open problems in the history of philosophy concerns the genesis of contemporary logic epitomized by the Frege-Russell theory of quantifiers. One of the cornerstones of this theory is the distinction between the allegedly different meanings of ordinary-language verbs for being. According to the received view, such verbs are multiply ambiguous between the is of predication, the is of existence, the is of identity, and the is of subsumption. This assumption (a.k.a. Frege-Russell ambiguity thesis) is built into the notations that have been used in logic since Frege and Russell, in that the allegedly different meanings are expressed in the usual logical notations differently. It turns out that no philosopher before the 19th century assumed the Frege-Russell thesis.
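
For illustration, in standard modern notation (my gloss, not necessarily the author's): the four allegedly different meanings come apart as predication, $F(a)$; existence, $\exists x \, (x = a)$; identity, $a = b$; and subsumption, $\forall x \, (F(x) \rightarrow G(x))$.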

It can be shown that Aristotle considered the Frege-Russell distinction but rejected it. He treated existence as a part of the force of a predicate term. Some have ascribed the thesis to Kant. However, it is false to say that Kant created, or maintained, the Frege-Russell thesis. His discussion of existence is often said to include a criticism of the idea that existence is a predicate. Strictly speaking it includes a stronger criticism, viz. the rejection of the idea that existence could be even a part of the force of a predicate term. Hence, after Kant the notion of existence became an orphan, as far as the logical representation of propositions in syllogistic logic was concerned.

The next main development in logical theory was the algebra of logic that originated in England around the mid-19th century. Two ideas came to the forefront: (1) the operators corresponding to our universal and existential quantifiers were treated as duals; (2) universal quantifier expressions were taken to be relative to some universe of discourse, and universality was inevitably taken as the non-existence of exceptions in that domain. Because of the duality, existential quantifier expressions came to express existence. The orphaned notion of existence thus found a home, no longer in the predicative is but in the existential quantifier. This helps to explain the independent discovery of quantifiers by Frege and by Peirce.
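
In modern notation (again my gloss, not a formula from the paper), the two ideas combine in the duality

\[ \forall x \, F(x) \leftrightarrow \neg \exists x \, \neg F(x), \]

with the universal quantifier read as the non-existence of exceptions in the universe of discourse, so that its dual, the existential quantifier, is left to carry existential import.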

This paper concentrates on what happened to the notion of existence after Kant and before Frege. Particular attention is paid to the English developments around mid-19th century and to the work of George Boole and Augustus De Morgan in particular.


The formal aspect of the fourteenth-century theory of consequences


Stephanie Weber (Göttingen)


The theory of consequences is one of the most important developments in the logic of the Middle Ages. One of the first tracts we know of is Burleigh's De consequentiis, written about 1300. It was followed by numerous tracts of varying length and complexity, e.g. the treatises on consequences in Ockham's Summa logicae or Burleigh's De puritate artis logicae. Apart from these highly elaborate works, there are shorter tracts belonging to the British tradition and written for use at universities, such as compilations of various rules of inference, commentaries, and even detailed explanations.

In the definitions of a consequence given in these tracts, the authors distinguish between a formal consequence (consequentia formalis) and a material consequence (consequentia materialis). The question discussed in this paper is: What is meant by a "formal" consequence, or what is the "formal" aspect of a consequence as distinguished from a "material" one? The discussion will be based on a typical compilation about consequences (De consequentiis) by the British author Richard Billingham, well known for his Speculum puerorum, a famous and influential text in the 14th and 15th century logic curriculum discussing rules for proving all kinds of propositions. Apart from Billingham's school tract on consequences, the paper will consider contemporary school tracts of the British tradition as well as Ockham's and Burleigh's concepts of a formal consequence.
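
By way of orientation, a standard medieval pair of examples (familiar from Buridan and the school tradition, not drawn from Billingham's tract): "Every man is an animal; Socrates is a man; therefore Socrates is an animal" is a formal consequence, since it remains valid under every uniform replacement of the categorematic terms "man", "animal", and "Socrates"; by contrast, "A man runs; therefore an animal runs" is merely a material consequence, valid in virtue of its matter, namely the truth that every man is an animal, and convertible into a formal consequence by adding that truth as a premise.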


A neglect of semantics: What Frege might have objected to Gödel


Markus Werning (Düsseldorf)


The formalism of Hilbert's arithmetical period proposes that a system of axioms is to be treated as nothing but a network of syntactic relations, while the non-logical as well as the logical terms are emptied of any semantic value. This view is closely linked to Hilbert's programme of proving the consistency of arithmetic in a finitary, purely syntactic manner. Gödel's second incompleteness theorem is often regarded as a defeat of Hilbert's programme. This paper, though not questioning the soundness of Gödel's incompleteness theorems, argues that their acceptability might in fact depend on Hilbert's neglect of semantic issues. For this purpose, formalism is contrasted with a view, proposed by Frege, that takes semantics seriously. At its heart is the principle of semantic compositionality, according to which the semantic value of a syntactically complex term is determined by the semantic values of its syntactic parts. The paper shows that this principle conflicts with the view that Gödelization is a means to semantically denote formulae of the object language. It also points out that some theorems that presuppose semantic compositionality conflict with certain other theorems whose proofs employ Gödelization. It is concluded that the heritage of Hilbert's formalism, the neglect of semantics, may have deeper implications than one might have thought.
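
For reference, the principle of compositionality can be stated as follows (a standard formulation, not a quotation from the paper): for every $n$-ary syntactic operation $f$ and terms $t_1, \ldots, t_n$,

\[ \mu(f(t_1, \ldots, t_n)) = \mu_f(\mu(t_1), \ldots, \mu(t_n)), \]

where $\mu$ assigns semantic values to terms and $\mu_f$ is the semantic operation corresponding to $f$. The claimed tension arises, roughly, because a Gödel numeral picks out a formula via its spelling rather than via the semantic values of its parts.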


Last changed: February 12th, 2003