Lodging and transportation information. Note: participants interested in the on-campus lodging option must reserve by noon on June 18.

Schedule.

All interested parties are welcome to attend.

The conference is timed to coincide with the completion of the 2010 Summer School for Formal Epistemology. Summer school students are cordially invited to stay for the conference and meet top figures in the field.

In the future, the CFE will run focused specialty workshops in order to build new research groups within the larger FEW community. But for this initial celebration, we thought it more appropriate to illustrate the full reach of the subject by inviting major figures from all branches of the area to present what they like best, without any pressure from an imposed theme. Nevertheless, the talks organized themselves beautifully into a day on logic, language, and causation and a day on probability and credence.

Reference: Johan van Benthem (2010), *Logical Dynamics of Information and Interaction*, Cambridge University Press.

Abstract: In this paper, we investigate a semantic framework originally defined by van Rooij (2010) to account for the idea that vague predicates are tolerant, namely for the principle that if x is P, then y should be P whenever y is similar enough to x. The semantics, which makes use of indifference relations in order to capture the notion of similarity relevant for the application of vague predicates, rests on the interaction of three notions of truth: the classical notion, and two dual notions simultaneously defined in terms of it, which we call tolerant truth and strict truth. Basically, a vague predicate P is said to hold tolerantly of an object x provided there is an object y similar enough to x in the relevant respects that satisfies P classically. Dually, a vague predicate is said to hold strictly of an object provided it holds classically of all objects that are similar enough in the relevant respects.

In the first part of the paper, we explain how the semantics allows us to validate the tolerance principle and to solve the sorites paradox. We characterize, in particular, the space of consequence relations definable on the basis of the notions of classical, strict and tolerant truth at hand. We present and discuss some correspondences and differences between our approach and other approaches used to deal with vagueness (in particular supervaluationism, subvaluationism, and three-valued logics).

A distinctive feature of the notion of tolerant truth we get is that it is paraconsistent: in particular, it implies that borderline cases are tolerantly P and not P (while they are neither strictly P nor strictly not P). We argue for the plausibility of this conception. In particular, we discuss how the framework can be used to accommodate the recent experimental data by Ripley (2009) and Alxatib and Pelletier (2010) regarding the way subjects respond to classical contradictions and tautologies for borderline cases.
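The tolerant/strict definitions above are simple enough to sketch in a few lines. The following is my own toy illustration, not the paper's formal semantics: the domain, the height cutoff, and the similarity margin are all assumptions chosen for the example.

```python
# Toy sketch of tolerant and strict truth for a vague predicate ("tall"),
# using an explicit indifference (similarity) relation over heights in cm.
# The cutoff and margin are illustrative assumptions, not from the paper.

def similar(x, y, margin=2):
    """Indifference relation: x and y are indiscriminable within `margin`."""
    return abs(x - y) <= margin

def classical(x, cutoff=180):
    """Classical truth: x counts as tall iff it meets the cutoff."""
    return x >= cutoff

def tolerant(x, domain, margin=2, cutoff=180):
    """Tolerantly tall: some similar object is classically tall."""
    return any(classical(y, cutoff) for y in domain if similar(x, y, margin))

def strict(x, domain, margin=2, cutoff=180):
    """Strictly tall: every similar object is classically tall."""
    return all(classical(y, cutoff) for y in domain if similar(x, y, margin))

domain = list(range(170, 191))
# A borderline case (179 cm) is tolerantly tall but not strictly tall:
print(tolerant(179, domain), strict(179, domain))  # True False
# It is also tolerantly *not* tall (some similar object fails classically),
# which is the paraconsistent behavior discussed in the abstract.
```

Note how the sorites step is blocked: tolerance holds at each link, but tolerant truth is not preserved under chaining, because "similar enough" does not compose.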

Abstract: The problem of irrelevant conjunction (a.k.a. the "tacking problem") was originally one which plagued Hypothetico-Deductive accounts of confirmation. More recently, Bayesians have offered various sorts of analyses of the problem. I will survey the (early) Bayesian analyses, and explain how they are deficient in several crucial ways. Then, I will present what I take to be a more probative Bayesian analysis (due to myself and Jim Hawthorne). Finally, I will address some recent criticisms of our approach (due to Maher, and Crupi & Tentori).
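The tacking phenomenon itself is easy to exhibit numerically. The numbers below are my own construction, not from the talk: they show that when evidence E confirms H, E also confirms the conjunction of H with a probabilistically irrelevant conjunct X.

```python
# Toy illustration of "tacking" in Bayesian confirmation (my own numbers):
# E confirms H, and E also confirms H & X for an irrelevant conjunct X
# that is probabilistically independent of both H and E.

p_h = 0.3          # prior for hypothesis H
p_x = 0.5          # prior for irrelevant conjunct X (independent of H and E)
p_e_h = 0.9        # likelihood P(E | H)
p_e_noth = 0.2     # likelihood P(E | not-H)

p_e = p_e_h * p_h + p_e_noth * (1 - p_h)   # total probability of E = 0.41
p_h_e = p_e_h * p_h / p_e                  # posterior P(H | E) ~ 0.66

# By independence of X, P(H & X | E) = P(H | E) * P(X).
p_hx = p_h * p_x
p_hx_e = p_h_e * p_x

print(p_h_e > p_h)      # True: E confirms H
print(p_hx_e > p_hx)    # True: E also confirms the tacked conjunction H & X
```

On a difference or ratio measure, the tacked conjunction is confirmed whenever H is; the Bayesian analyses surveyed in the talk differ over how to soften this verdict, not over the arithmetic.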

Abstract: Various scientific theories stand in a reductive relation to each other. In a recent article (F. Dizadji-Bahmani, R. Frigg and S. Hartmann: Who is Afraid of Nagelian Reduction? To appear in *Erkenntnis*), we have argued that a generalized version of the Nagel-Schaffner model (GNS) is the right account of this relation. In this talk, we present a Bayesian analysis of how GNS impacts on confirmation. We formalize the relation between the reducing and the reduced theory before and after the reduction using Bayesian networks, and thereby show that, post-reduction, the two theories are confirmatory of each other. We then ask when a purported reduction should be accepted on epistemic grounds. To do so, we compare the prior and posterior probabilities of the conjunction of both theories before and after the reduction and ask how well each is confirmed by the available evidence. (This talk is based on a joint paper, forthcoming in *Synthese*, with Foad Dizadji-Bahmani and Roman Frigg.)
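The post-reduction claim can be illustrated with toy numbers of my own (far simpler than the authors' Bayesian networks): once a reduction links the reduced theory T1 to the reducing theory T2, learning T1 raises the probability of T2.

```python
# Toy sketch (my own numbers, not the authors' networks). Pre-reduction,
# T1 and T2 would be modeled as independent, so learning T1 leaves P(T2)
# unchanged. Post-reduction, a bridge-law connection makes them dependent.

p_t2 = 0.5                 # prior for the reducing theory T2
p_t1_given_t2 = 0.9        # post-reduction: T1 is likely if T2 holds
p_t1_given_not_t2 = 0.3

# Total probability of T1 under the post-reduction network:
p_t1 = p_t1_given_t2 * p_t2 + p_t1_given_not_t2 * (1 - p_t2)  # 0.6

# Bayes' theorem: P(T2 | T1) = P(T1 | T2) * P(T2) / P(T1) = 0.45/0.6 = 0.75
p_t2_given_t1 = p_t1_given_t2 * p_t2 / p_t1

print(p_t2_given_t1 > p_t2)  # True: post-reduction, confirming T1 confirms T2
```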


Abstract: An old, wise, and widely held attitude in Statistics is that modest intervention in the design of an experiment followed by simple statistical analysis may yield much more of value than using very sophisticated statistical analysis on a poorly designed existing data set. In this sense, good inductive learning is active and forward looking, not passive and focused exclusively on analyzing what is already known. In this talk I review three different approaches for how a decision maker might actively frame her/his probability space rather than being passive in that phase of decision making.

Method-1: Assess precise/determinate probabilities only for the set of random variables that define the decision problem at hand. Do not include other "nuisance" variables in the space of possibilities. In this sense, over-refining the space of possibilities may make assessing probabilities infeasible for good decision making.

Method-2: With respect to a particular decision problem, choose wisely a set of events E that you can assess with precise/determinate probabilities.

Method-3: Your probabilistic assessments may be incoherent so that you may be exposed to a sure-loss in your decision making about some specific quantities. Nonetheless, you may be able to use familiar algorithms (e.g., Bayes' theorem) to update your views with new data and to improve your incoherent assessments about these quantities. That is, you may be able to reduce your degree of incoherence about these quantities by active, Bayesian-styled learning. Specifically, by framing your probability space so that incoherence is concentrated in your "prior," you may use Bayesian algorithms to update to a less-incoherent "posterior."

I illustrate these three methods with several problems, including how to sidestep what others have claimed to be some computational limitations of Bayesian inference.
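The mechanism behind Method-3 can be sketched in a degenerate toy case of my own construction (the talk's setting is more general, where incoherence decreases rather than vanishes): when incoherence is concentrated in the "prior," the normalization step of the Bayes algorithm enforces additivity on the posterior.

```python
# Toy sketch of Method-3 (my own construction): an incoherent "prior" over an
# exhaustive partition of hypotheses, pushed through a Bayes-style update,
# yields a less incoherent "posterior" because of the normalization step.

priors = {"h1": 0.5, "h2": 0.4, "h3": 0.3}       # sums to 1.2: incoherent
likelihoods = {"h1": 0.8, "h2": 0.1, "h3": 0.1}  # P(data | h), coherent

def incoherence(assessment):
    """Crude measure: distance of the total from 1 for an exhaustive partition."""
    return abs(sum(assessment.values()) - 1.0)

evidence = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

print(round(incoherence(priors), 6))      # 0.2  (incoherent prior)
print(round(incoherence(posteriors), 6))  # 0.0  (coherent, in this toy case)
```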

The theoretical investigations are complemented by experiments with the proof search algorithm AProS. Its strategically guided search has been extended beyond pure logic to elementary parts of set theory. Here, the subtle interaction between understanding and reasoning, i.e., between introducing concepts and proving theorems, is crucial and suggests principles for structuring proofs conceptually.

These strands are part of a fascinating intellectual fabric and connect classical themes with contemporary problems, reaching from proof theory and the foundations of mathematics through computer science to cognitive science and back.

Abstract: Ranking theory is a dynamic theory of belief and as such of comparable fundamentality and generality to Bayesianism. Ranks are also degrees of belief, ranging from minus to plus infinity, behaving in many ways similarly to, and in characteristic ways unlike, probabilities; in particular, positive ranks adequately represent belief and distinguish more or less firm beliefs (negative ranks correspondingly express disbelief). Moreover, in contrast to other theories of belief, ranking theory can fully account for the dynamics of belief, a task equivalent to answering the problem of induction. The talk will explain some basic features and some surprising applications of ranking theory.
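The basic objects of the theory can be sketched concretely. The following is my own toy construction, assuming Spohn-style negative ranking functions over a small set of possible worlds: a proposition's two-sided rank is positive iff it is believed, negative iff disbelieved, and its magnitude grades firmness.

```python
# Minimal sketch (my own toy, assuming Spohn-style ranking functions):
# a negative ranking function kappa grades disbelief in worlds; the
# two-sided rank tau of a proposition is positive iff it is believed.

INF = float("inf")

# kappa(w) = degree of disbelief in world w; at least one world has rank 0.
kappa_worlds = {"w1": 0, "w2": 1, "w3": 2, "w4": 3}

def kappa(prop):
    """Rank of a proposition (a set of worlds): the minimum rank of its worlds."""
    return min((kappa_worlds[w] for w in prop), default=INF)

def complement(prop):
    return set(kappa_worlds) - set(prop)

def tau(prop):
    """Two-sided rank: kappa(not-A) - kappa(A); positive iff A is believed."""
    return kappa(complement(prop)) - kappa(prop)

a = {"w1", "w2"}           # a proposition A
print(tau(a))              # 2: A is believed, with firmness 2
print(tau(complement(a)))  # -2: not-A is correspondingly disbelieved
```

Unlike probabilities, ranks combine by minimum rather than by sum, which is one of the "characteristic ways" they behave unlike probabilities.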

## James Woodward

### Causal Learning and Judgment: Covariation and Contact Mechanics

**Abstract:** Contemporary philosophical and psychological accounts of causation may be grouped into two broad categories, which may seem to draw on fundamentally different ideas about causation. Difference-making theories (including "interventionist" accounts and accounts of causal learning and representation in terms of Bayes nets) rely on the guiding idea that causes make a difference to their effects. Such theories emphasize the role of information about contingencies or covariation in causal learning. They contrast with a second family of contact-mechanical theories that focus on the role of connecting processes and mechanical relationships in causal learning and in the characterization of causal relationships. Within psychology, such approaches stress the role of causal representations in terms of "forces" and "generative mechanisms" and often assign a central role to the so-called perception of causation. This talk will explore the relationship between these two ways of thinking about causation (and the capacities for learning and understanding associated with them). One issue: adult humans apparently seamlessly integrate information about causal relationships derived from causal perception and contact mechanics with contingency information, including information resulting from interventions. In contrast, although there is some dispute about this, infants are often claimed to show sensitivity to contact-mechanical relationships, as revealed in looking-time studies, before they are able to exploit such information in action. This raises the question of when (and how) these two sources of information and two ways of thinking about causal relationships are integrated in human development. I will discuss some conceptual and empirical considerations bearing on this question.

## Contact

Kevin T. Kelly, Director
Horacio Arlo-Costa, Associate Director
Mauren Antkowski, Administrator