Colloquia and Events

Fall 2011

November 17, 2011, Pure and Applied Logic Colloquium

Dana Scott, Carnegie Mellon University
Mixing Modality and Probability
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: Many models for various modal logics have been given over the years. The author saw that the complete Boolean algebra of measurable subsets of the unit interval modulo sets of measure zero retains a topological structure.  This is possible because working modulo null sets does not always identify open and closed sets.  Therefore, a non-trivial Boolean-valued model of the Lewis modal system S4 is obtained (recently proved complete for propositional logic by others).  The author has spoken about this semantics at CMU in the past, but he still feels there should be applications of the logic to basic ideas of probability.  For this presentation a simplified description of semantics for a second-order system will be given which allows some possible insights into modeling randomness in a general, logical way.

November 19-20, 2011, Center for Formal Epistemology Colloquium

Commemorative Colloquium for Horacio Arló-Costa



Cleotilde Gonzalez, Carnegie Mellon University, Social and Decision Sciences
Jeff Helzner, Columbia University
Vincent Hendricks, University of Copenhagen, Editor-in-Chief, Synthese
Eric Pacuit, University of Maryland, CFE Fellow
Rohit Parikh, CUNY
Paul Pedersen, Carnegie Mellon University
Scott Shapiro, Yale Law School
Gregory Wheeler, University of Lisbon

December 8, 2011, Philosophy Colloquium

Risto Hilpinen, University of Miami
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Past Events

November 5, 2011, Center for Formal Epistemology Workshop

In Search of Answers: The Guiding Role of Questions in Discourse and Epistemology
4625 Wean Hall



Jeroen Groenendijk, University of Amsterdam
Craige Roberts, Ohio State University
Mandy Simons, Carnegie Mellon University
Hanti Lin, Carnegie Mellon University
Kevin T. Kelly, Carnegie Mellon University

November 3, 2011, Philosophy Colloquium

Jeroen Groenendijk, University of Amsterdam
Inquisitive Semantics and Pragmatics
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: In this talk I will provide an introduction to inquisitive semantics, but I will largely do so by considering not inquisitive semantics itself, but proto-inquisitive semantics. This is what I now call the partition semantics for a first-order logical language with questions that can be found in The Logic of Interrogation, my 1999 SALT paper. In this language there are indicatives and interrogatives; only sentences in the first category can be informative, and only those in the second category can be inquisitive. In inquisitive semantics not all questions correspond to partitions, which I will briefly motivate by considering conditional questions. Also, the logical languages that the semantics is applied to are standard languages in which no syntactic distinction between indicatives and interrogatives is made. Instead, assertions and questions are characterized semantically. And not all sentences belong to one of these two categories: some sentences are hybrid, both informative and inquisitive. Plain disjunction and existential quantification deliver such hybrid sentences. For several reasons this is interesting from a linguistic point of view.
What I will do in the talk is reformulate proto-inquisitive semantics using the concepts and tools from inquisitive semantics. This is useful as such, because what results, though it remains equivalent, is a conceptually more transparent and more standard system. More importantly, having the two systems in the same format makes it easier to compare them and to detect precisely what the real differences are. This leads to a better understanding of both proto- and real inquisitive semantics. An important feature of both proto- and real inquisitive semantics is that they come with logical-pragmatic notions concerning the coherence of discourse and the relatedness of sentences to each other, beyond the standard logical relation of entailment. Answerhood is just a special case of what is covered by these notions.
I will present a new version of the logical notion of a compliant response to an initiative that the comparison of the two semantics has led to. Central to the talk is that I want to show that and how inquisitive semantics offers a new notion of meaning that is richer than that of a classical proposition modeled as a set of possible worlds, the worlds where a sentence is true. Our propositions, which cover both informative and inquisitive content, are modeled as sets of possibilities, which correspond to states of the common ground where a sentence is supported. We can now look upon a proposition as a proposal to the participants in a conversation to update the common ground in one or more ways, the alternative possibilities covered by the proposition. Arguably, this is a notion of meaning that, unlike the classical one, is inherently linked to the interactive communicative use of language. I will further motivate and sharpen this new concept of meaning in the talk.

October 13, 2011, Pure and Applied Logic Colloquium

Rick Statman, Carnegie Mellon University
A New Type Assignment for Strongly Normalizable Terms
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: Intersection types are very interesting, especially in their use for proving untyped terms to be strongly normalizable. However, we view them first as types and second, via the Curry-Howard isomorphism, as formulae. That is, we consider the "types as formulae" direction of the isomorphism. We would like to extend the "formulae as types" direction of Curry-Howard to include all strongly normalizable terms. We shall do this by considering a natural deduction formulation of a definitional extension of the intuitionistic theory of monadic predicates and show that the untyped lambda terms with types (under the Curry-Howard isomorphism) are precisely the strongly normalizable terms.

October 6, 2011, Center for Ethics and Policy Colloquium

Adina Roskies, Dartmouth College
Rethinking the threat from brain scans in the courtroom
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: Both empirical data and philosophical considerations suggest that brain scans used as evidence in the courtroom may be biasing or misleading. However, recent studies suggest this view is mistaken. In this talk I explain the reasons for the expectation that neuroimages may be misleading, and review the studies that contradict it. I offer an explanation for the totality of the seemingly contradictory evidence, and argue that this has implications for the admissibility of neuroimaging in the courtroom.

September 29, 2011, Center for Ethics and Policy Colloquium

Darrel Moellendorf, Claremont Graduate University
Climate Change and Human Rights: Assessing Some Philosophical Challenges
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: In this paper I set out an argument, invoking human rights, in defense of the duties to mitigate and provide adaptation to climate change. I look at five challenges to the human rights argument, three of which have been pressed in the literature on conceptual grounds, and two of which I develop on normative grounds. I present what I think are satisfactory responses to the three conceptual challenges but I argue that the normative challenges are more compelling. The human rights argument does not help us to understand well our duties to future generations to mitigate and provide adaptation for climate change. The problems with the human rights argument suggest that a more promising approach is to understand these duties as matters of intergenerational distributive justice.

September 28, 2011, Informal Afternoon Talk

Yasuo Deguchi, Kyoto University
Kant and Segner: Kant's Philosophy of Mathematics and 18th Century German Arithmetic
3:30-5:00 BH 150

Abstract: Kant is regarded as one of the first people to use the word 'construction' in the current philosophical sense, and therefore is deemed an originator of constructivism. In my view, what he did was to generalize the idea of 'geometrical construction' to cover other basic operations in different fields of mathematics; particularly and crucially, the 'successor operation' in arithmetic. In other words, it was a decisive step toward 'the birth of construction in Kant' to view the 'successor operation' as being analogous, in a sense, to geometrical construction. This crucial step was backed up by Kant's conception of the 'successor operation' or 'number' as a 'schema'. His concept of number-as-schema was not standard in 18th century German arithmetic, but we can find its prototype in some writings of Johann Andreas von Segner (1704-77), a German mathematician and scientist, whose 'Arithmetik' Kant referred to in a crucial passage. My talk will focus on how Segner's view of number contributed to 'the birth of construction in Kant'.

September 28, 2011, Informal Lunchtime Talk

Yukinori Takubo, Kyoto University
Modal Questions in Korean and Japanese
12:00-1:30 BH 150

Abstract: In this talk we will discuss constraints on questions with modals in Korean and Japanese. In Korean there are two expressions for the future: -keyss and -ul kes-i- (future adnominal form + thing + copula). They can be used more or less in the same way, as in (1)a,b, when they express intention.

(1)  a.  Nayil  tangsin-un  o-keyss-upni-kka?
tomorrow  you-TOP  come-keyss-HON-Q
'Are you coming tomorrow?'
b.  Nayil  tangsin-un  o-l  kes-i-pni-kka?
tomorrow  you-TOP  come-l kes-i-HON-Q
'Are you coming tomorrow?'

The differences begin to emerge when they are attached to non-volitional predicates as in (2).

(2)  a.  (Situation: the speaker greets a farmer looking up at the sky)
Nayil  pi-ka  o-keys-ssupni-kka?
tomorrow  rain-NOM  come-keyss-HON-Q
'Will it rain tomorrow?'
b.  (same situation as in a)
*Nayil  pi-ka  o-l kes-i-pni-kka?
tomorrow  rain-NOM  come-l kes-i-HON-Q
'Will it rain tomorrow?'

(2)b cannot naturally be interpreted as an epistemic question asking whether the addressee thinks it will rain, in contrast to (2)a, which can. In a very special context, (2)b can be coerced into being interpreted as a question addressed to someone who can control rain, e.g. a deity or a scientist conducting an artificial rain experiment.
The distribution of -ul kes-i- can be accounted for if we assume that it expresses epistemic necessity. As can be seen in (3), sentences expressing epistemic modality cannot be questioned in English.

(3)  a.  It {must/may} rain in the afternoon.
b.  ??{Must/May} it rain in the afternoon?

Sentences expressing epistemic modality can be made into questions by turning them into meta-questions, i.e. questions about the modal force itself: whether it is necessary, possible, or not. In Japanese, such a meta-question can be formed by adding the complementizer no. We will show that the only way -ul kes-i- can be interpreted as a meta-question is by interpreting it as expressing the predetermined future, thereby accounting for the special coerced interpretation of -ul kes-i-.

September 26, 2011, Center for Ethics and Policy Colloquium

Alex Voorhoeve, London School of Economics
Decide as You Would with Full Information! An Argument against the Ex Ante Pareto Principle
Venue: TBA

Abstract: The ex ante Pareto principle requires that if a first alternative has greater expected value for each person than a second alternative, the first alternative ought to be preferred. We examine cases in which a first alternative has greater expected value for each person, but we know that under this alternative one person will, ex post, end up worse off than others. We argue that, in such cases, the ex ante Pareto principle is of doubtful validity, because it relies on incomplete information about what is in the interests of each person. We argue that, whenever possible, it is better to rank alternatives as we *would* rank them if we had full information about how individuals will be affected.

September 22, 2011, Philosophy Colloquium

Jan-Willem Romeijn, CFE Fellow, University of Groningen
Observations and Objectivity in Statistics
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: Observations are generally agreed to be laden with theory, and hence not entirely objective. It may be thought that if the data are objective anywhere, it is in statistics. In this paper I argue against this and reveal two ways in which statistical inference is affected by the theory-ladenness of observations. The first of these concerns well-known violations of the likelihood principle, namely in hypothesis testing and optional stopping. It appears that we can represent these violations as cases in which the likelihood principle is adhered to. But to achieve this, we have to accept that the content of the observations depends on the statistical hypotheses under consideration. Another way in which statistical data may be theory-laden concerns the influence of priors on how the observations affect our judgment over the hypotheses. I will discuss two cases in which the implicit or explicit adoption of a prior has specific implications for what is concluded from the observations, one in regression analysis and one in causal modelling. Rather than seeing these results in a negative light, as damaging to the objectivity of statistical methods, I think that they invite us to rethink the role of theory-ladenness. I argue that it is exactly because of the theory-ladenness that we can learn from the data. In grand philosophical terms, I argue for a rationalist twist to the empiricist orientation of the philosophy of statistics.

September 15, 2011, Philosophy Colloquium

Eric Pacuit, CFE Fellow, University of Maryland
Dynamic Logics of Evidence-Based Beliefs
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: A rational belief must be grounded in the evidence available to an agent. However, this relation is delicate, and it raises interesting philosophical and technical issues. Modeling evidence requires richer structures than those found in standard epistemic semantics, where the accessible worlds aggregate all reliable evidence gathered so far. Even recent, more finely-grained plausibility models ordering the epistemic ranges identify too much: belief is indistinguishable from aggregated *best* evidence. In this talk, I will discuss a recent paper by Johan van Benthem and myself in which we add evidence structure to standard models of belief, in the form of families of sets of worlds. We show how these more fine-grained models support natural actions of "evidence management", ranging from update with external new information to internal rearrangement.

September 9, 2011, Informal Lunchtime Talk

Graham Priest, University of Melbourne and CUNY Graduate Center
Between the Horns of Idealism and Realism: The Middle Way of Madhyamaka
12:00-1:00 DH 4303

Abstract: Madhyamaka was one of the two major schools of Indian Mahayana Buddhist philosophy, which had an enormous impact on all subsequent Mahayana Buddhisms. An important aspect of their position was that there is no ultimate ground to reality, every thing being dependent on other things. This allows them to shape a metaphysics that is neither realist nor idealist, but "goes between the horns" of these two views. In this talk I will explain all this. (I will not presuppose any background knowledge of Asian philosophical traditions.)

September 8, 2011, Philosophy Colloquium

Graham Priest, University of Melbourne and CUNY Graduate Center
Logical Disputes and the a Priori
Reception: 4:00-4:35 DH 4301, Talk: 4:45-6:00 BH A53

Abstract: How should one decide which logic is correct? I will answer the question by giving a very general account of rational theory choice, and arguing that it applies to the choice of logic. A feature of the account is that it has no role for the a priori. This feature will come in for special discussion.

June 24-26, 2011, Episteme Conference, held at CMU

June 22, 2011, Informal Lunchtime Talk

Emmanuel J. Genot, University of Lund
A Little Semantics is a Dangerous Thing
12:00-1:00 DH 4303

Abstract: The aim of this paper is to show that the dismissal of logical models of reasoning as empirically inadequate and as relying on too strong idealizations comes from an insufficient understanding of the resources of formal semantics. I first review arguments from cognitive psychology backing the use of semantic methods. Then I propose a new semantic account of Hintikka's interrogative tableaux, capturing the reasoning of agents with very limited cognitive resources.
[Stenning & Van Lambalgen, 2007] show that the often-alleged non-logicality of human reasoning results from the assumption of the uniqueness of logic, despite the plurality of logics. From a pluralistic point of view, the choice of an appropriate logic is context-dependent. When contextual parameters are unclear, distinct empirical agents may "default" to different logics. Conversely, when they are clearly set, agents' performances become uniform, conforming to the same logic. The appropriate logical standard is in turn identified through its semantics, with the semantic structure mirroring relations between the objects reasoned about. Logical pluralism supports the objectivity of logic, and its relevance for empirical cognitive science.
Hintikka has also argued, on semantic grounds, that the Semantic Tableau Method yields a model of reasoning once extended to represent "questions" as means of information gathering (see [Hintikka et al., 1999]). The resulting Interrogative Tableaux embody reasoning about epistemic alternatives (or scenarios). Hintikka furthermore sketches a game-theoretic epistemic semantics, without however fully developing it. His ideas depart from standard game theory in assuming that agents use "local" strategic principles. A game-theoretic account of semantic entailment assuming players with limited awareness of the future history of the game, following e.g. [Halpern & Rêgo, 2006], yields a two-player game between an initial verifier of the premises, Abelard, and an initial verifier of the conclusion, Eloise, where Eloise has a winning strategy iff every model of the premises that Abelard can choose during the game verifies the conclusion. When this is so, the game reaches a fixed point, which is identifiable by the players given only the past history of the game. If Abelard wins, then the game also reaches a fixed point, which cannot however always be identified, since it depends on the future of the game. Constraining Abelard's choices by information (answers) about the underlying state of Nature captures Hintikka's interrogative reasoning. Only minimal awareness of the future history of the game is required for "local" strategies to match the interrogative tableau rules exactly. The existence and identifiability of fixed points introduce considerations pertaining to formal learning theory. I conclude that interrogative logic, endowed with this epistemic game-theoretic semantics, is at its core learning-theoretic, and fully vindicates Hintikka's claim that interrogative logic is an empirically adequate model of human reasoning.

May 14-15, 2011, CSLI-CFE-San Francisco State Workshop on Logic and Formal Epistemology, held at CSLI, Stanford

May 5, 2011, Philosophy and CMU-Pitt Program in Computational Biology Joint Colloquium

Ioannis Tsamardinos, University of Crete
Toward Integrative Causal Analysis of Heterogeneous Datasets and Prior Knowledge
4:15-6:00 BH A53

Abstract: Modern data analysis methods, for the most part, concern the analysis of a single dataset. The conclusions of an analysis are published in the scientific literature and their synthesis is left up to a human expert. Integrative Causal Analysis (INCA) aims at automating this process as much as possible. It is a new, causal-based paradigm for inducing models in the context of prior knowledge and by co-analyzing datasets that are heterogeneous in terms of measured variables, experimental conditions, or sampling methodologies. INCA is related to, but fundamentally different from, statistical meta-analysis, multi-task learning, and transfer learning. In this talk, we illustrate the enabling INCA ideas, present INCA algorithms, and give proof-of-concept empirical results. Among others, we show that the algorithms are able to predict the existence of conditional and unconditional dependencies (correlations), as well as the strength of the dependence, between two variables Y and Z never measured on the same samples, solely based on prior studies (datasets) measuring either Y or Z, but not both. The algorithms accurately predict thousands of dependencies in a wide range of domains, demonstrating the universality of the INCA idea. The novel inferences are entailed by assumptions inspired by causal and graphical modeling theories, such as the Faithfulness Condition. The results provide ample evidence that these assumptions often hold in many real systems. The long-term goal of INCA is to enable the automated large-scale integration of available data and knowledge to construct causal models involving a significant part of human concepts.

April 22, 2011, Informal Lunchtime Talk

Rachel Briggs, University of Sydney and NYU
Two Interpretations of the Ramsey Test

Abstract: According to the Ramsey test, a person should accept a conditional to the extent that she would accept the consequent on the supposition that the antecedent holds. There are two attractive ways of interpreting the Ramsey test: Adams' thesis, which states that the probability of a conditional is the conditional probability of the consequent given the antecedent, and Stalnaker semantics, which states that a conditional is true at a world w just in case its consequent is true at all closest antecedent worlds to w. Unfortunately, a well-known class of triviality theorems shows that when the two interpretations of the Ramsey test are combined, they entail seemingly absurd triviality results. Stefan Kaufmann has proposed (for reasons largely independent of the triviality theorems) a revised version of Adams' thesis, which I call Kaufmann's thesis. I prove that combining Kaufmann's thesis with Stalnaker's semantics leads to local triviality results, which seem just as absurd as the original triviality results. Luckily, Stalnaker semantics can be revised too: in particular, it can be replaced with a generalized imaging semantics. I argue that combining Kaufmann's thesis with generalized imaging semantics provides a way of defanging the local triviality results, not by undercutting the arguments for them, but by explaining why they are not as philosophically problematic as they seem.

April 21, 2011, Philosophy Colloquium

Mic Detlefsen, University of Notre Dame
Freedom in Mathematics
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A54

Abstract: The freedom proper to mathematical investigation was an important concern of nineteenth and twentieth century writings in the foundations of mathematics. I will survey some of this literature with an eye to identifying how freedom in mathematics was understood by writers of this period, what conditions were generally taken to govern its legitimate exercise and what value its preservation was commonly believed to have. I will also consider the significance of these issues for us today.

April 7, 2011, Philosophy Colloquium

Michiel van Lambalgen, University of Amsterdam
Logical Modelling of Cognitive Processes: the Case of Autism
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: People with Autism Spectrum Disorder (ASD) have very distinctive reasoning patterns in the domains of causal and counterfactual reasoning, as well as in belief attribution. The challenge is to come up with a unitary explanation for these phenomena. Since there is an element of causality in each of these domains, causal Bayes nets may seem a candidate for a unitary explanation. It is generally felt that explanations in terms of causal Bayes nets are antithetical (and superior) to logical models. However, we present theorems showing that there is a very close connection between Bayes nets as axiomatised in Chapter 3 of Spirtes, Glymour and Scheines (2001), and a non-classical logic called 'logic programming with negation as failure'. In view of this we feel free to model inference (broadly conceived) using this particular logic. We indicate how it explains performance on the false belief task and how it predicts performance on an inference task hitherto not administered to people with ASD. The results are in striking conformity with the predictions (Pijnacker et al, Neuropsychologia (2009)).

April 6, 2011, Informal Lunchtime Talk

Michiel van Lambalgen, University of Amsterdam
A Formalisation of Kant's Transcendental Logic
12:00-1:00 DH 4303

Abstract: Although Kant envisaged a prominent role for logic in the argumentative structure of his 'Critique of Pure Reason', logicians and philosophers have generally judged Kant's logic negatively. What Kant called 'general' or 'formal' logic has been dismissed as a fairly arbitrary subsystem of first-order logic, and what he called 'transcendental logic' is considered to be not a logic at all: no syntax, no semantics, no definition of validity. Against this, we argue that Kant's 'transcendental logic' is a logic in the strict formal sense, albeit with a semantics and a definition of validity that are vastly more complex than those of first-order logic. The main technical application of the formalism developed here is a formal proof that Kant's Table of Judgements in §9 of the 'Critique of Pure Reason' is indeed, as Kant claimed, complete for the kind of semantics he had in mind. This result implies that Kant's 'general' logic is after all a distinguished subsystem of first-order logic, namely what is nowadays known as geometric logic. We will discuss the relationship of this result to Avigad et al.'s axiomatisation of Euclidean geometry in terms of geometric logic (RSL 2009).

March 31, 2011, Philosophy Colloquium

Anil Gupta, University of Pittsburgh
Conditionals in Theories of Truth
Reception: 4:00-4:35 DH 4301, Talk: 4:45-6:00 BH A53

Abstract:  I will compare the treatment of conditionals in two accounts of truth, revision theory and Hartry Field's recent proposal.

March 28, 2011, Center for Ethics and Policy Colloquium

Gustaf Arrhenius, Stockholm University
The Impossibility of a Satisfactory Population Ethics
BH 136-A (Adamson Wing)

Abstract: Population axiology concerns how to evaluate populations in regard to their goodness, that is, how to order populations by the relations “is better than” and “is as good as”. This field has been riddled with paradoxes and impossibility results which seem to show that our considered beliefs are inconsistent in cases where the number of people and their welfare varies. All of these results have one thing in common, however. They all involve an adequacy condition that rules out Derek Parfit’s Repugnant Conclusion. Moreover, some theorists have argued that we should accept the Repugnant Conclusion and hence that avoidance of this conclusion is not a convincing adequacy condition for a population axiology. As I shall show in the present paper, however, one can replace avoidance of the Repugnant Conclusion with a logically weaker and intuitively more convincing condition. The resulting theorem involves, to the best of my knowledge, logically weaker and intuitively more compelling conditions than the other theorems presented in the literature. As such, it challenges the existence of a satisfactory population ethics.

March 17, 2011, Philosophy Colloquium

Hannes Leitgeb, Ludwig-Maximilians-University
A Theory of Propositions and Truth
Reception: 4:00-4:35 DH 4301, Talk: 4:45-6:00 BH A53

Abstract: Is it possible to formulate a theory of truth which solves or avoids semantic paradoxes in the same way as modern set theory solves or avoids the set-theoretic paradoxes? Starting from the assumption that the answer is "yes", we investigate what such a theory would have to look like, and we end up with an axiomatic theory of propositional functions, propositions, and truth in which a quasi-Tarskian theory of truth is being applied to a cumulative hierarchy of propositions amongst which no paradoxical propositions can be found. (This is joint work with Philip Welch.)

March 16, 2011, CFE Symposium on Uncertain Acceptance

Hannes Leitgeb, Ludwig-Maximilians-University
The Lockean Thesis Revisited
12:00-1:00 DH 4303

Abstract: At the Inaugural Conference of the CMU Center for Formal Epistemology, I presented a new theory of qualitative and quantitative belief in which a definition of qualitative belief in terms of stably high subjective probability was derived from a set of plausible postulates. In this talk I will return to that theory but now from the viewpoint of the Lockean thesis ("A is believed if and only if the subjective probability of A is high"). It will be shown that the very same theory of belief follows from the Lockean thesis together with some further plausible postulates, once the choice of the threshold value that is underlying the Lockean thesis is made fully contextual. In light of this result, I will explore some of the consequences, applications, and problems of the theory.

Horacio Arló-Costa and Paul Pedersen, Carnegie Mellon University
Reducing Belief To Degrees of Belief: A General Theory of Probability Cores
1:00-2:00 DH 4303

Abstract: A first step towards providing a paradox-free theory of probabilistic belief was offered in a seminal paper by Bas van Fraassen [7]. The theory of probability cores introduced by van Fraassen was extended and slightly modified in a series of papers by Arló-Costa ([1], [2], [3]) and Arló-Costa and Parikh [4]. The theory permitted a probabilistic characterization of the notion of full belief or certainty, as well as the notion of 'almost certainty' or expectation. In [4] Arló-Costa and Parikh applied the theory to define conditional belief in addition to belief. They then offered a characterization of the system R of Rational Logic proposed by Lehmann and Magidor in [5]. Hannes Leitgeb has recently proposed in [6] a more general account capable of characterizing a notion of 'plain' belief determined in terms of high probability. We show that there is a natural extension of the idea of a probability core that is equivalent to Leitgeb's theory. We extend Leitgeb's theory and study the problem of the dynamics of high probability cores. This study permits introducing at least two probabilistic versions of the so-called Ramsey test. One of these tests yields R and another validates the axioms of R except the axiom OR. We consider some applications of the latter logic. Finally, we consider a ratio rule for acceptance first proposed by Isaac Levi in [7]. We show that this rule also induces a core system, defined this time in a different way. Although Levi presents the rule as a probabilistic rule, it seems that this is a richer rule deploying two measures representing the often conflicting ideals of acquiring as much information as possible and avoiding the incorporation of falsehoods in inquiry. Still, we verify that a core system can also be defined for this more complicated rule. So the general theory of probability cores seems to go beyond the narrow limits of even the most sophisticated versions of probabilism available today.
We will also discuss a new Dutch Book for primitive conditional probability that gives an operational interpretation to various versions of the core structure used throughout the talk.  This part of the talk makes more direct contact with relevant issues in decision theory.  Time-permitting we will discuss similarities and differences with the account presented by Lin and Kelly in [8].

Hanti Lin and Kevin T. Kelly
Propositional Reasoning that Tracks Probabilistic Reasoning
2:00-3:00 DH 4303

Abstract: We discuss the extent to which propositional belief revision among uncertain, accepted propositions can be made consonant with probabilistic conditioning, when acceptance is governed by an inter-subjective rule.  We present two new riddles for the Lockean thesis that high probability suffices for acceptance.  We show that AGM belief revision cannot, on pain of triviality, be made consonant with Bayesian conditioning.  Then we present an approach to propositional acceptance and belief revision that does maintain perfect consonance with Bayesian conditioning.  The revision rule is due to Shoham (1987) and the acceptance rule was proposed originally by Levi (1996).  Finally, we show that the proposed approach resolves the riddles.
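The classic difficulty for the Lockean thesis mentioned in the abstract can be seen in the familiar lottery paradox (a standard illustration, not the authors' new riddles): with a threshold below 1, the set of accepted propositions need not be closed under conjunction.  A minimal sketch, with the lottery size and threshold chosen for illustration:

```python
# A fair three-ticket lottery: exactly one ticket wins.
# Identify each possible world with the winning ticket; each has probability 1/3.
worlds = [0, 1, 2]
prob = {w: 1.0 / 3.0 for w in worlds}

def p(event):
    """Probability of an event, represented as a set of worlds."""
    return sum(prob[w] for w in event)

THRESHOLD = 0.6  # Lockean rule: accept any proposition with probability > 0.6

# The proposition "ticket i loses" holds in every world where another ticket wins.
loses = {i: {w for w in worlds if w != i} for i in worlds}

# Each "ticket i loses" has probability 2/3 and clears the threshold...
accepted = [i for i in worlds if p(loses[i]) > THRESHOLD]

# ...but their conjunction ("no ticket wins") has probability 0,
# so the set of accepted propositions is not closed under conjunction.
conjunction = set.intersection(*loses.values())
print(accepted, p(conjunction))
```

Raising the threshold does not help: for any threshold below 1, a large enough lottery reproduces the same failure.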

March 3, 2011, Philosophy Colloquium

Wayne Myrvold, University of Western Ontario
Maxwell and a Third Second Law of Thermodynamics.
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: It has long been recognized that there are two distinct laws that go by the name of the Second Law of Thermodynamics.  The original says that there can be no process resulting in a net decrease in the total entropy of all bodies involved.  A consequence of the kinetic theory of heat is that this law will not be strictly true; statistical fluctuations will result in small spontaneous transfers of heat from a cooler to a warmer body.  The currently accepted version of the Second Law is probabilistic: tiny spontaneous transfers of heat from a cooler to a warmer body will be occurring all the time, and while a larger transfer is not impossible, it is improbable; there can be no process whose expected result is a net decrease in total entropy.  According to Maxwell, the Second Law has only statistical validity, and this statement is easily read as an endorsement of the probabilistic version.  I argue that a close reading of Maxwell, with attention to his use of "statistical," shows that the version of the second law endorsed by Maxwell is strictly weaker than our probabilistic version.  According to Maxwell, even the probable truth of the second law is limited to situations in which we deal with matter only in bulk and are unable to observe or manipulate individual molecules.  Maxwell's version does not rule out a device that could, predictably and reliably, transfer heat from a cooler to a warmer body without a compensating increase in entropy.  This raises the issue: do we really have good reason to believe in our version, rather than Maxwell's?

January 20, 2011, Philosophy Colloquium

Cosma Shalizi, Carnegie Mellon University
Praxis and Ideology in Bayesian Statistics
Reception: 4:00-4:35 DH 4301, Talk:  4:45-6:00 BH A53

Abstract: A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise of Bayesian statistics in applications. In this talk, I hope to persuade you that the most successful practices of Bayesian statistics do not actually support that philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism.  Drawing on the literature on the consistency of Bayesian updating and also on experience of applied work, I examine the actual role of prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. I argue that good Bayesian practice is very like good frequentist practice; that Bayesian methods are best understood as regularization devices; and that Bayesian inference is no more inductive than frequentist inference, i.e., not very.  At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not conform to their ideology.  Based on joint work with Andrew Gelman.

December 2, 2010, Informal Lunchtime Talk

Michael Shaffer, St. Cloud State University
Knowledge, Rationality, and Scientific Idealization

Abstract:  Recently, the relationship between knowledge, rationality and practical interests has been an active area of exploration in epistemology.  A number of variously motivated philosophers have argued for something like the claim that S knows that p iff it is rational for S to act on the basis of p.  I argue that the extant proposals concerning the encroachment of pragmatic factors into the concept of knowledge are flawed because they fail to take into account the fact that it is often rational to act on the basis of claims that are not strictly true and so cannot be known.  These sorts of claims importantly include idealizations, and they are especially important in the course of rational scientific activity.  So we need a well-grounded theory of idealizations as a part of the account of the methodology of science that reflects the relationship between knowledge and practical interests.  To this end, I treat idealizations as counterfactual conditionals, but this raises the difficult issue of how we confirm the truth of such claims.  This is particularly problematic because such counterfactuals appear to be contingent claims that have empirical content, but which are about non-actual states of affairs.

December 1, 2010, Center for Formal Epistemology Workshop

Experience, Heuristics, and Choice:  Prospects for Bounded Rationality
Schedule and abstracts


Ralph Hertwig, University of Basel
Jonathan Leland, National Science Foundation
Paul Pedersen, Carnegie Mellon University
Elke Weber, Columbia University
Jerome Busemeyer, Indiana University at Bloomington
Coty Gonzalez, Carnegie Mellon University
Tim Pleskac, Michigan State University
J.D. Trout, Loyola University
Horacio Arlo-Costa, Carnegie Mellon University


David Danks, Carnegie Mellon University
Kevin Kelly, Carnegie Mellon University
Richard Samuels, Ohio State University

November 22, 2010, Center for Ethics and Policy (CEP) Colloquium

Russell Powell, Oxford University
In Genes We Trust: Germ-Line Modification and the Preservation of Human Good

Abstract: Prominent proponents of genetic engineering argue that human germ-line modification is morally desirable because it will result in a net improvement in human wellbeing. I argue here in favor of a more fundamental point, namely that genetic engineering will be necessary merely to sustain the levels of health and wellbeing that humans currently enjoy. I show that a large-scale program of genetic intervention will be necessary to preserve existing levels of human wellbeing due to the population-genetic consequences of relaxed selection pressures in human populations caused by the increasing efficacy and availability of conventional medicine and other health-related institutional resources. I defend the counterintuitive claim that the greater the effectiveness of conventional medicine, the greater the need for germ-line modification, since the former in the absence of the latter will lead undesirably to an increasing reliance on medical technology for the development of normal human capacities. Although this conclusion follows from the structure of evolutionary theory, it has been overlooked in medicine and bioethics due to various misconceptions about human evolution, which I attempt to rectify, as well as the sordid history of Darwinian approaches to medicine and social policy, which I carefully distinguish from the present argument. I propose that human genetic engineering is a prima facie moral imperative grounded in principles relating to the fair and efficient allocation of limited health care resources across generations.

November 18, 2010, Philosophy Colloquium

Timothy Williamson, Wykeham Professor of Logic, Oxford University
Improbable Knowledge

Abstract:  The paper argues that failures of the KK principle (that if one knows something, one knows that one knows it) can occur in extreme form: one can know p even though it is almost certain on one’s evidence that one does not know p. For related reasons, p can be part of one’s evidence even though it is almost certain on one’s evidence that p is not part of one’s evidence, and it can be rational for one to do A even though it is almost certain on one’s evidence that it is not rational for one to do A. The argument will use formal models from epistemic logic but will not assume previous knowledge of epistemic logic. It will be argued that models of the relevant type are adequate approximations to realistic, rather everyday situations.

November 17, 2010, Informal Lunchtime Talk

Timothy Williamson, Wykeham Professor of Logic, Oxford University
Necessitism, Contingentism and Plural Quantification

Abstract: Necessitism is the view that necessarily everything is necessarily something; contingentism is the negation of necessitism. The dispute between them is reminiscent of, but clearer than, the more familiar one between possibilism and actualism. A mapping often used to ‘translate’ actualist discourse into possibilist discourse is adapted to map every sentence of a first-order modal language to a sentence the contingentist (but not the necessitist) may regard as equivalent to it but which is neutral in the dispute. This mapping enables the necessitist to extract a ‘cash value’ from what the contingentist says. Similarly, a mapping often used to ‘translate’ possibilist discourse into actualist discourse is adapted to map every sentence of the language to a sentence the necessitist (but not the contingentist) may regard as equivalent to it but which is neutral in the dispute. This mapping enables the contingentist to extract a ‘cash value’ from what the necessitist says. Neither mapping is a translation in the usual sense, since necessitists and contingentists use the same language with the same meanings. The former mapping is extended to a second-order modal language under a plural interpretation of the second-order variables. It is proved that the latter mapping cannot be. Thus although the necessitist can extract a ‘cash value’ from what the contingentist says in the second-order language, the contingentist cannot extract a ‘cash value’ from some of what the necessitist says, even when it raises significant questions. This poses contingentism a serious challenge. The talk will concentrate more on the technical than on the metaphysical aspects of the topic.

October 19,  2010, Nagel Lecture

Brian Skyrms, U.C. Irvine
Naturalizing the Social Contract

Abstract: All social contracts that exist, or that could come to exist, must arise by some kind of natural process. This talk is about using game theory and evolutionary dynamics as tools for a naturalistic investigation of the social contract.

October 21,  2010, Nagel Lecture

Brian Skyrms, U.C. Irvine
Signals: Evolution, Learning and Information

Abstract: Signaling plays a vital role at all levels of biological organization. I discuss how meaningful signaling can arise spontaneously by evolutionary or learning dynamics in the simplest sender-receiver games. Then these games are generalized to simple signaling networks which implement information processing and teamwork.
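The spontaneous emergence of meaning that the abstract describes can be seen in the simplest Lewis sender-receiver game under simple (Roth-Erev style) reinforcement learning.  The following is a minimal sketch, not Skyrms's own model code; the urn weights, round count, and payoff scheme are illustrative assumptions:

```python
import random

random.seed(0)

STATES = SIGNALS = ACTS = [0, 1]

# Urn learning: the sender keeps an urn of signals for each state,
# the receiver an urn of acts for each signal.  All weights start equal,
# so initially signals carry no information.
sender = {s: {m: 1.0 for m in SIGNALS} for s in STATES}
receiver = {m: {a: 1.0 for a in ACTS} for m in SIGNALS}

def draw(urn):
    """Draw an option from an urn with probability proportional to its weight."""
    options = list(urn)
    return random.choices(options, weights=[urn[o] for o in options])[0]

ROUNDS = 20000
outcomes = []
for _ in range(ROUNDS):
    state = random.choice(STATES)      # nature picks a state
    signal = draw(sender[state])       # sender signals
    act = draw(receiver[signal])       # receiver acts
    success = (act == state)
    if success:                        # reinforce both successful choices
        sender[state][signal] += 1.0
        receiver[signal][act] += 1.0
    outcomes.append(success)

# Late in the run, play should be close to a signaling system.
recent_rate = sum(outcomes[-1000:]) / 1000
print(f"success rate over last 1000 rounds: {recent_rate:.2f}")
```

In the two-state, two-signal game this dynamics is known to converge to one of the two signaling systems, so the late success rate approaches 1; which convention emerges depends on the random history.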

October 22,  2010, Nagel Lecture

Brian Skyrms, U.C. Irvine
On the Dynamics of Signaling

October 7, 2010, Philosophy Colloquium

Kevin Knuth, SUNY Albany
Quantification and the Origin of Physical Law:  The Feynman Complex Formalism of Quantum Mechanics

Abstract: It is a commonly held belief that physical laws reflect an underlying order in the universe.  If this is the case, then it is possible that significant aspects of physical law can be derived from a precise specification of that order.  In this talk, I show that this is indeed the case.  Given a set of elements, which comprise the topic of discourse, and a binary ordering relation, one can construct a partially ordered set and often a lattice.  The symmetries possessed by the lattice structure induced by the ordering relation impose strong constraints on any quantification scheme designed to map elements to real numbers or real number pairs.  These constraint equations give rise to what we recognize as physical laws.   In this talk I will demonstrate these ideas by considering sequences of quantum measurements as elements.  Such sequences can be ordered by describing them in terms of sequences obtained by performing measurements in series and parallel.  By quantifying sequences of quantum measurements by pairs of real numbers, I show that pairs combine according to the complex sum and product rules in accordance with the Feynman formulation of quantum mechanics.  In addition, we simultaneously derive the Born rule that maps pairs to real numbers, which represents the probability of a measurement sequence.   In addition to the specific application to quantum mechanics, these techniques have been used to derive probability theory, information theory, and special relativity.

1. Goyal P., Knuth K.H., Skilling J. 2010. Why Quantum Theory is Complex. Phys. Rev. A 81, 022109. arXiv:0907.0909v3 [quant-ph]
2. Knuth K.H., Skilling J. 2010. Foundations of Inference. arXiv:1008.4831v1 [math.PR]
3. Knuth K.H., Bahreyni N. 2010. A Derivation of Special Relativity from Causal Sets. arXiv:1005.4172v2 [math-ph]

September 30, 2010, Distinguished Alumnus Lecture

Robert Malkin, Senior Software Engineer, Google
Predicting Bounce Rates in Sponsored Search

Abstract: A lot of effort has been spent on learning how to predict clickthrough rates for sponsored search advertising.  Relatively little attention has been spent on predicting the quality of the post-click experience, which is vital both for long-term user happiness and for continued advertiser engagement.  Bounce rate is a recent metric which attempts to capture one aspect of the post-click experience.  I present research in which we attempt to automatically predict bounce rates for sponsored search with the ultimate goal of improving ad utility for users and advertisers.

September 24, 2010, Informal Lunchtime Talk

Yasuo Deguchi, University of Kyoto
Three Prisoners Problem and Likelihoodism

Abstract: The three prisoners' problem (or Monty Hall problem) is a well known puzzle for Bayesianism. But it remains problematic whether the Bayesian solution to the problem is genuinely intuitive. In contrast, I will introduce a modified version of the problem, and claim that Bayesianism cannot avoid giving it a counterintuitive answer, the three prisoners' paradox, as I call it. Then, by showing that a version of likelihoodism can avoid the paradox, I argue for likelihoodism over Bayesianism.
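The standard Bayesian answer to the Monty Hall formulation (switching wins with probability 2/3) can be checked by simulation.  A minimal sketch of the classic version only; the modified problem discussed in the talk is not reproduced here:

```python
import random

random.seed(42)

def trial(switch):
    """One Monty Hall round; returns True if the player wins the prize."""
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the prize.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

N = 100_000
stay_rate = sum(trial(switch=False) for _ in range(N)) / N
switch_rate = sum(trial(switch=True) for _ in range(N)) / N
print(f"stay: {stay_rate:.3f}, switch: {switch_rate:.3f}")
```

The frequencies settle near 1/3 for staying and 2/3 for switching, matching the Bayesian calculation: conditioning on which door the host opens leaves the original pick's probability at 1/3.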

September 2, 2010, Colloquium

Robert Batterman, University of Pittsburgh
Explaining Regularities:  The Need for Singular Behavior

Abstract:  This paper discusses the explanatory role played by mathematical singularities in explaining regularities of a certain type in physical systems. The singularities result from certain mathematical idealizations of physical phenomena. I discuss this with reference to the renormalization group explanation of the universality of critical phenomena.  I argue that the explanatory role played by such mathematical singularities has consequences for a proper philosophy of applied mathematics.  Related paper

June 26-27, 2010, CFE Opening Celebration



Baker Hall A53:  enter Baker Hall from the uphill end.  Go down the stairway in the Dean's office wing.  A53 is directly to the right at the end of the stairway.

Doherty Hall 4301:  Enter Doherty Hall through the doorway below the banners illustrated in this site's masthead.  The 4300 wing is at the very top of the first stairway you encounter.  Alternatively, pass the stairway into an alcove and take the left-hand elevator to the 4th floor (the right-hand elevator does not go all the way to the 4th floor).  When you exit the elevator you are in 4300.

Doherty Hall 4303:  access the 4300 wing according to the preceding instructions.  4303 is the seminar room at the end of the 4300 wing's hallway.