  Lectures 2006-07
  • March 26, 2007: José Ferreirós, "Mathematical Knowledge and the Interplay of Practices: The Case of Sets and Natural Numbers" [abstract]. (as part of PMP2007)
 

  Abstracts

  • Gabriel Sandu, "Logics of Dependence and Independence":

Ten years ago Jaakko Hintikka published The Principles of Mathematics Revisited (PMR). I will discuss some of the latest developments inspired by the book, in particular the prospects it opened for a systematic account of different notions of (formal, logical, informational, random) dependence and independence. I will also evaluate the philosophical agenda behind the book (based on results about the definability of truth in the logic introduced in PMR), with an eye on the criticisms and debates recently published in The Philosophy of Jaakko Hintikka (The Library of Living Philosophers, 2006).
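For readers unfamiliar with the logic introduced in PMR, a standard illustration of its slash notation (our example, not taken from the abstract) is the sentence

\[ \forall x\, \exists y\, \forall z\, (\exists w / \forall x)\; S(x, y, z, w), \]

in which the choice of w is informationally independent of x. On the Skolem-function reading this amounts to

\[ \exists f\, \exists g\, \forall x\, \forall z\; S(x, f(x), z, g(z)), \]

a branching (Henkin) quantifier pattern that is not expressible in ordinary first-order logic.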

  • Daniel Vanderveken, "Foundations of the Logic of Attitudes":
In his treatise Les Passions de l'âme, Descartes analyzed a large number of attitudes. His work is a major contribution to modern philosophy of mind. Contemporary logic and analytic philosophy are confined to a few paradigmatic attitudes such as belief, knowledge, desire and intention. Could we use Cartesian analysis to develop a larger theory of all kinds of attitudes directed at objects and facts? In Intentionality, Searle criticized Descartes's tendency to reduce all attitudes to beliefs and desires. On such a reduction, many different kinds of attitudes, such as fear, sadness and regret, would collapse into the same conjunctions of beliefs and desires. Moreover, our intentions are much more than a desire to do something combined with a belief that we are able to do it. Of course, all cognitive attitudes contain beliefs and all volitive attitudes desires. But we need more than the two traditional basic categories of cognition and volition in order to analyze attitudes.
I will first explicate the nature of psychological modes. On my analysis, psychological modes have components other than the basic categories of cognition and volition. Complex modes also have a proper way of believing or desiring, proper conditions on their propositional content, or proper preparatory conditions. Thanks to these other components one can distinguish stronger from weaker modes. I will recursively define the set of all modes of attitudes that human agents can have towards facts. As Descartes anticipated, the two primitive psychological modes are those of belief and desire. They are the simplest modes. Other, more complex modes are obtained from the two primitives by adding special cognitive and volitive ways, special conditions on the propositional content or special preparatory conditions.
There are attitudes more complex than propositional attitudes consisting of a psychological mode and a propositional content: denegations of other attitudes, such as discontent and disagreement; conditional attitudes, such as the intention to defend oneself in case of an attack; and conjunctions of attitudes, such as doubt and fear. I will define inductively the conditions of possession and of satisfaction of all these kinds of attitudes. To that end, I will exploit the resources of a non-standard predicative logic that distinguishes propositions which have the same truth conditions but not the same cognitive value. On that matter, see my collective volume Logic, Thought and Action (Springer, 2005). We need to consider subjective as well as objective possibilities in the logic of attitudes and action in order to account for the fact that human agents are not perfectly but only minimally rational. At the end of my talk I will present an axiomatization of my logic of attitudes.
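As a rough illustration of the recursive construction sketched in the abstract (a minimal sketch of our own, not Vanderveken's formalism; all names and components below are illustrative assumptions), complex modes can be pictured as the primitive modes of belief and desire enriched with additional components:

# Toy sketch: complex psychological modes as the primitives (belief, desire)
# enriched with special ways, propositional-content conditions and
# preparatory conditions. Illustrative only; not Vanderveken's formalism.
from dataclasses import dataclass
from typing import Callable, FrozenSet, Tuple

Proposition = str                              # stand-in for a structured content
Condition = Callable[[Proposition], bool]      # a condition on contents or background


@dataclass(frozen=True)
class Mode:
    categories: FrozenSet[str]                 # subset of {"cognition", "volition"}
    ways: FrozenSet[str] = frozenset()         # special cognitive/volitive ways
    content_conditions: Tuple[Condition, ...] = ()
    preparatory_conditions: Tuple[Condition, ...] = ()

    def strengthen(self, ways=(), content_conditions=(), preparatory_conditions=()):
        """Obtain a more complex (stronger) mode by adding components."""
        return Mode(
            self.categories,
            self.ways | frozenset(ways),
            self.content_conditions + tuple(content_conditions),
            self.preparatory_conditions + tuple(preparatory_conditions),
        )


# The two primitive (simplest) modes.
BELIEF = Mode(frozenset({"cognition"}))
DESIRE = Mode(frozenset({"volition"}))

# A purely illustrative complex mode: intention as desire strengthened with a
# committed volitive way, a future-directed content condition and a
# preparatory condition (here a placeholder for "the agent believes the
# action to be possible").
INTENTION = DESIRE.strengthen(
    ways={"committed"},
    content_conditions=[lambda p: "future" in p],
    preparatory_conditions=[lambda p: True],
)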
  • Stephan Hartmann, "Modeling in Philosophy of Science":

Models are a principal instrument of modern science. They are built, tested, compared, and revised in the laboratory, and subsequently introduced, applied and interpreted in an expansive literature. Throughout this talk, I will argue that models are also a valuable tool for the philosopher of science. In particular, I will discuss how the methodology of Bayesian Networks can elucidate two central problems in the philosophy of science. The first is the variety-of-evidence thesis, which states that the more varied the supporting evidence, the greater the degree of confirmation for a given hypothesis. However, when investigated with Bayesian methodology, this thesis turns out not to be sacrosanct. In fact, under certain conditions, a hypothesis receives more confirmation from evidence obtained from one instrument rather than from several, and from evidence that confirms one rather than several testable consequences of the hypothesis. The second challenge I will investigate is scientific theory change. This application highlights a different virtue of the modeling methodology. In particular, I will argue that Bayesian modeling illustrates how two seemingly unrelated aspects of theory change, namely the (Kuhnian) stability of (normal) science and the ability of anomalies to overturn that stability and lead to theory change, are in fact united by a single underlying principle, in this case coherence. In the end, I will argue that these two examples prompt some metatheoretical reflections on the following questions: What are the differences between modeling in science and modeling in philosophy? What is the scope of the modeling method in philosophy? And what does this imply for our understanding of Bayesianism?
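To give a feel for the kind of calculation involved (a minimal toy sketch under assumptions of our own, not the model presented in the talk), one can compare the posterior probability of a hypothesis given two positive reports when the reports come from a single instrument of uncertain reliability and when they come from two independent such instruments:

# Toy Bayesian-network calculation (illustrative assumptions only): compare the
# confirmation a hypothesis H gets from two positive reports delivered by ONE
# shared instrument versus TWO independent instruments of uncertain reliability.

H_PRIOR = 0.5   # prior probability of the hypothesis H
R_PRIOR = 0.6   # probability that an instrument is reliable
A = 0.5         # chance that an unreliable instrument reports positively anyway


def report_prob(h: bool, reliable: bool) -> float:
    """P(positive report | H, reliability): a reliable instrument reports
    positively exactly when H holds; an unreliable one randomizes."""
    return (1.0 if h else 0.0) if reliable else A


def posterior_shared() -> float:
    """P(H | two positive reports) when both reports share one reliability node."""
    num = den = 0.0
    for h in (True, False):
        for rel in (True, False):
            joint = ((H_PRIOR if h else 1 - H_PRIOR)
                     * (R_PRIOR if rel else 1 - R_PRIOR)
                     * report_prob(h, rel) ** 2)
            den += joint
            num += joint if h else 0.0
    return num / den


def posterior_independent() -> float:
    """P(H | two positive reports) when each report has its own reliability node."""
    num = den = 0.0
    for h in (True, False):
        for r1 in (True, False):
            for r2 in (True, False):
                joint = ((H_PRIOR if h else 1 - H_PRIOR)
                         * (R_PRIOR if r1 else 1 - R_PRIOR)
                         * (R_PRIOR if r2 else 1 - R_PRIOR)
                         * report_prob(h, r1) * report_prob(h, r2))
                den += joint
                num += joint if h else 0.0
    return num / den


if __name__ == "__main__":
    print("P(H | E1, E2), shared instrument      :", round(posterior_shared(), 4))
    print("P(H | E1, E2), independent instruments:", round(posterior_independent(), 4))

With the default values above the two-instrument case comes out ahead, but with, for example, A = 0.1 and R_PRIOR = 0.3 the ordering reverses; it is this kind of parameter sensitivity that makes the variety-of-evidence thesis less than sacrosanct.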

  • Øystein Linnebo, "Bad Company Disciplined":
One of the most serious problems facing the neo-logicist project of basing mathematics on abstraction principles is the so-called “bad company problem.” The problem is that a great variety of “bad” abstraction principles are mixed in among the “good” ones. A classic example of a “bad companion” is Frege’s inconsistent Basic Law V, which is logically quite similar to the neo-logicists’ favorite abstraction principle, Hume’s Principle (which is consistent and gives rise to all of ordinary arithmetic). The bad company problem shows that a deeper understanding is needed of the conditions under which abstraction is permissible.
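For orientation, the two principles just mentioned are standardly written as follows (notation is ours): Hume’s Principle says that the number of Fs equals the number of Gs just in case F and G are equinumerous, while Basic Law V says that the extension of F equals the extension of G just in case exactly the same things fall under F and G:

\[ \#F = \#G \;\leftrightarrow\; F \approx G \qquad\text{(Hume's Principle)} \]

\[ \varepsilon F = \varepsilon G \;\leftrightarrow\; \forall x\,(Fx \leftrightarrow Gx) \qquad\text{(Basic Law V)} \]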
The aim of this paper is to explore a new attempt to provide such an understanding. This attempt is based on the very general idea that the process of individuation must be well-founded. Since an abstraction principle is naturally regarded as a device for individuating one class of objects in terms of another, the idea that the process of individuation must be well-founded can be used to motivate restrictions on our theory of abstraction. I explore some different restrictions of this kind and show that they result in consistent theories. A common and very surprising feature of all of these theories is their maximally permissive line on abstraction: any form of abstraction on any concept is permitted. Paradox is instead avoided by imposing severe restrictions on what concepts there are. This creates a safe environment for abstraction where what used to be the bad companions now re-emerge as perfectly good. Moreover, I argue that this safe environment is attractive because the restrictions imposed on concept formation are in fact quite natural and intuitive.
  • José Ferreirós, "Mathematical Knowledge and the Interplay of Practices: The Case of Sets and Natural Numbers":
The aim of this talk is to describe a new approach to the analysis of mathematical knowledge, currently being developed by the author in a book provisionally entitled “The Interplay of Mathematical Practices: From numbers to sets”.
The emphasis is on mathematical practices in the plural, for it is a key thesis of this approach that several different levels of knowledge and practice coexist and that their links and interplay are crucial to mathematics. Crucial, that is, for the constitution of meaningful concepts, the determination of admissible principles, and the development of mathematical knowledge through the rise of new practices. Being an approach that emphasizes the links between diverse practices, it is naturally centred upon the mathematician as an epistemic agent who establishes those links.
The paper will explore basic features of that perspective, based on some of the simplest traits of such an account. Even if we disregard subtler aspects of a mathematical practice, such as the images of mathematics it incorporates and the values promoted by participants in the practice, we are still left with sufficient material for an interesting analysis of the constitution of mathematical knowledge. This will be shown by focusing on two exemplary cases which are in fact interwoven: the constitution of the concept of a natural number from the interplay of non-scientific practices and new symbolic practices; and the way in which previous mathematical practices (in particular arithmetical ones) have conditioned the admissible principles of set theory.
  • Jean Bricmont, "Does Quantum Mechanics invalidate realism?":
It is often said that Quantum Mechanics has refuted realism, namely the idea that there exists a physical world whose properties are independent of human beings, human perceptions and human consciousness, and are discovered through scientific investigation.
The goal of the talk will be to examine critically the arguments put forward in favour of this idea. The basic thesis to be defended is that Quantum Mechanics does not refute realism as understood here, but that it does refute a naive form of realism implicit in much of the talk about "measurement"; moreover, it refutes our local view of the universe (the absence of action at a distance).