Opening Celebration Conference
June 26-27

To celebrate its recent founding, the Center is hosting a gala Opening Celebration Conference on June 26-27 in the Adamson Wing in Baker Hall, on the Carnegie Mellon campus in Pittsburgh, PA.

Lodging and transportation information. Note: participants interested in the on-campus lodging option must reserve by noon on June 18.

Schedule.

All interested parties are welcome to attend.
The conference is timed to coincide with the completion of the 2010 Summer School for Formal Epistemology. Summer school students are cordially invited to stay for the conference and meet top figures in the field.

In the future, the CFE will run focused, specialty workshops in order to build fresh new research groups within the larger FEW community. But for this initial celebration, we thought it more appropriate to illustrate the full reach of the subject by inviting major figures from all branches of the area to present what they like best, without any pressure from an imposed theme. Nevertheless, the talks organized themselves beautifully into a day on logic, language, and causation and a day on probability and credence.
Talk Abstracts
Johan van Benthem
Logical Dynamics of Information and Interaction

Abstract: Information and knowledge can only be understood well in the setting of events that drive rational agency. I will show how this dynamics can be made an explicit part of logic, by merging ideas from the philosophical and computational traditions. I end by exploring the resulting transformation of the role of logic in epistemology, game theory, and a range of related fields.
Ref. Johan van Benthem, 2010, "Logical Dynamics of
Information and Interaction", Cambridge University Press.
Paul Egre
Vagueness: Tolerant, Classical, Strict
Joint work with Pablo Cobreros (U. of Navarra), David Ripley (IJN), and Robert van Rooij (ILLC, Amsterdam)
Abstract: In this paper, we investigate a semantic framework originally defined by van Rooij (2010) to account for the idea that vague predicates are tolerant, namely for the principle that if x is P, then y should be P whenever y is similar enough to x. The semantics, which makes use of indifference relations in order to capture the notion of similarity relevant for the application of vague predicates, rests on the interaction of three notions of truth: the classical notion, and two dual notions simultaneously defined in terms of it, which we call tolerant truth and strict truth. Basically, a vague predicate P is said to hold tolerantly of an object x provided there is an object y similar enough to x in the relevant respects that satisfies P classically. Dually, a vague predicate is said to hold strictly of an object provided it holds classically of all objects that are similar enough in the relevant respects.
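The two dual notions just described can be written compactly. In the following (our notation, chosen for illustration), $\sim_P$ is the indifference relation encoding "similar enough in the respects relevant to $P$", and $\models$ is classical satisfaction:

```latex
% x is tolerantly P: some sufficiently similar object is classically P
x \models^{t} P \;\iff\; \exists y \,\bigl( y \sim_P x \;\wedge\; y \models P \bigr)
% x is strictly P: every sufficiently similar object is classically P
x \models^{s} P \;\iff\; \forall y \,\bigl( y \sim_P x \;\rightarrow\; y \models P \bigr)
```

On this reading, a borderline case can be tolerantly P and tolerantly not-P at once, while being strictly neither.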
In the first part of the paper, we explain how the semantics allows us
to validate the tolerance principle and to solve the sorites paradox.
We characterize, in particular, the space of consequence relations
definable on the basis of the notions of classical, strict and tolerant
truth at hand. We present and discuss some correspondences and
differences between our approach and other approaches used to deal with
vagueness (in particular supervaluationism, subvaluationism, and
three-valued logics).
A distinctive feature of the notion of tolerant truth we obtain is that it is
paraconsistent: in particular, it implies that borderline cases are
tolerantly P and not P (while they are neither strictly P nor strictly
not P). We argue for the plausibility of this conception. In
particular, we discuss how the framework can be used to accommodate the
recent experimental data by Ripley (2009) and Alxatib and Pelletier
(2010) regarding the way subjects respond to classical
contradictions and tautologies for borderline cases.
Branden Fitelson
The Problem of Irrelevant Conjunction --- Revisited
Joint work with James Hawthorne

Abstract: The problem of irrelevant conjunction (aka the "tacking problem") was originally one that plagued Hypothetico-Deductive accounts of confirmation. More recently, Bayesians have offered various sorts of analyses of the problem. I will survey the (early) Bayesian analyses and explain how they are deficient in several crucial ways. Then I will present what I take to be a more probative Bayesian analysis (due to myself and Jim Hawthorne). Finally, I will address some recent criticisms of our approach (due to Maher, and to Crupi & Tentori).
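For readers new to the tacking problem, the standard statement (background to the talk, not its novel analysis) is as follows. If $H$ entails $E$, then so does $H \wedge X$ for any $X$ whatsoever, so $E$ hypothetico-deductively confirms the conjunction of $H$ with an arbitrary irrelevant $X$. The naive Bayesian relevance notion inherits a version of the problem:

```latex
% If H \wedge X \models E, with 0 < P(E) < 1 and P(H \wedge X) > 0, then
P(H \wedge X \mid E) \;=\; \frac{P(H \wedge X)}{P(E)} \;>\; P(H \wedge X),
% so E still confirms (raises the probability of) the tacked conjunction.
```

The early Bayesian responses turn on comparing the degree of this confirmation with that accrued by $H$ alone.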
Stephan Hartmann
Confirmation and Reduction: A Bayesian Account
Abstract: Various scientific theories stand in a reductive relation to each other. In a recent article (F. Dizadji-Bahmani, R. Frigg and S. Hartmann: "Who is Afraid of Nagelian Reduction?", to appear in Erkenntnis), we have argued that a generalized version of the Nagel-Schaffner model (GNS) is the right account of this relation. In this talk, we present a Bayesian analysis of how GNS impacts on confirmation. We formalize the relation between the reducing and the reduced theory before and after the reduction using Bayesian networks, and thereby show that, post-reduction, the two theories are confirmatory of each other. We then ask when a purported reduction should be accepted on epistemic grounds. To do so, we compare the prior and posterior probabilities of the conjunction of both theories before and after the reduction and ask how well each is confirmed by the available evidence. (This talk is based on a joint paper, forthcoming in Synthese, with Foad Dizadji-Bahmani and Roman Frigg.)
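As a toy illustration of the kind of claim at issue (the network and numbers below are our own invented example, not the model of the paper): once a reductive link connects a reduced theory T2 to a reducing theory T1, evidence bearing directly only on T1 also raises the probability of T2.

```python
from itertools import product

# Toy Bayesian network over binary variables (illustrative numbers only):
# T1 -> T2 (the post-reduction link), T1 -> E (evidence bears directly on T1).
P_T1 = 0.5
P_T2_given = {True: 0.9, False: 0.3}   # P(T2 | T1)
P_E_given = {True: 0.8, False: 0.2}    # P(E | T1); E independent of T2 given T1

def joint(t1, t2, e):
    """Probability of one assignment, factored along the network."""
    p = P_T1 if t1 else 1 - P_T1
    p *= P_T2_given[t1] if t2 else 1 - P_T2_given[t1]
    p *= P_E_given[t1] if e else 1 - P_E_given[t1]
    return p

def prob(pred):
    """Probability of the event picked out by pred, by enumeration."""
    return sum(joint(*w) for w in product([True, False], repeat=3) if pred(*w))

p_t2 = prob(lambda t1, t2, e: t2)                                  # prior P(T2) = 0.60
p_t2_given_e = prob(lambda t1, t2, e: t2 and e) / prob(lambda t1, t2, e: e)
print(p_t2, p_t2_given_e)   # P(T2 | E) = 0.78 > 0.60: E confirms T2 via the link
```

Without the T1-to-T2 link, E would leave P(T2) untouched; the link is what lets confirmation flow between the theories.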
James Joyce
TBA
Hans Kamp
Back and Forth between Language and Thought
Abstract:
What can we learn about the structure of thought from studying
the structure of language and the ways in which it is used? The
question is ancient and it has given rise to a long
défilé of speculations, hypotheses and claims.
I will begin by looking at one aspect of this question: What can we
learn about the structure of mental representations from close
attention to the ways in which linguistic form maps onto utterance
meaning and the meaning of extended discourse and text?
After rehearsing some plausible conclusions that can be drawn from such considerations about the structure of the mental representations which result from language interpretation, I will argue for the hypothesis that similar representations also serve as inputs to language production.
These assumptions lead to a picture of verbal communication according
to which communicative success consists in the similarity between the
representation constructed by the interpreter of an utterance and the
representation that led the speaker/author to choose the words
contained in the utterance she produced.
In order that such similarity (and thus communicative success) can be
achieved on a regular basis, discourse participants must share their
understanding of the relations between words and thoughts. (Which
entails that they must share the syntactic, semantic and pragmatic
rules of the language they use to communicate.) But more often
than not, just sharing the rules of the language isn’t enough.
Most speech acts presuppose that, in addition, speaker and interpreter
share certain bits of background information - about particular
entities and facts as well as general principles to which the world
conforms (with few exceptions if any).
What has been argued earlier about the input to language production and
output from language interpretation also applies to such background
information: the form in which the information is represented is as
important as its propositional content. The final part of the talk will
focus on this particular aspect of what renders verbal communication
possible and effective.
Hannes Leitgeb
Reducing Belief Simpliciter to Degrees of Belief
Abstract: We prove that given
quite reasonable assumptions, it is possible to give an explicit
definition of belief simpliciter in terms of subjective probability,
such that it is neither the case that belief is stripped of any of its
usual logical properties, nor is it the case that believed propositions
are bound to have probability 1. Belief simpliciter is not to be
eliminated in favour of degrees of belief, rather, by reducing it to
assignments of consistently high degrees of belief, both quantitative
and qualitative belief turn out to be governed by one unified theory.
Turning to possible applications and extensions of the theory, we
suggest that this will allow us to see: how the Bayesian approach in
general philosophy of science can be reconciled with the deductive or
semantic conception of scientific theories and theory change; how the
assertability of conditionals can become an all-or-nothing affair in
the face of non-trivial subjective conditional probabilities; how
knowledge entails a high degree of belief but not necessarily certainty;
how primitive conditional probability functions (Popper functions)
arise from conditionalizing absolute probability measures on maximally
strong believed propositions with respect to different cautiousness
thresholds; and how conditional chances may become the truthmakers of
counterfactuals.
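Although the precise construction is the subject of the talk, the shape of the target can be indicated (the following is our gloss on the stability idea, offered for orientation; the talk's own definitions may differ in detail). Belief is given a Lockean form, with closure secured by a stability requirement on the strongest believed proposition:

```latex
% Lockean form: believe A iff its subjective probability clears a threshold r < 1
\mathrm{Bel}(A) \;\iff\; P(A) \ge r, \qquad \tfrac{1}{2} < r < 1
% Stability (our gloss): the strongest believed proposition B_W remains probable
% under conditionalization on any compatible evidence,
P(B_W \mid C) > \tfrac{1}{2} \quad \text{for all } C \text{ with } C \cap B_W \neq \emptyset \text{ and } P(C) > 0,
% which is what makes qualitative belief closed under conjunction and entailment
% without forcing P(B_W) = 1.
```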
Rohit Parikh
Behavior and Belief

Abstract: One common interpretation of states of belief is that they are sets of propositions, that these propositions are sets of worlds, represented by sentences, and that communication changes one’s belief state by replacing one such set by another. The assumption of logical closure is also common, and appealed to as early as Plato’s dialogue Meno. Such a representation does allow various formal theories to come into play. The AGM theory uses this representation of belief, and so does the KD45 representation by means of Kripke structures. However, there are severe difficulties in the way of accepting such an account of belief (and hence of knowledge, which typically presupposes belief). We want to present a “propositionless” account of belief and change in belief relying more on automata-theoretic models. Beliefs can change not only by hearing sentences but also by witnessing events and via purely internal processes, such as deduction. Moreover, beliefs come in two flavors: beliefs inferred from action, our usual method with infants and animals, and beliefs expressed in words. With Ramsey and Savage, beliefs are revealed by the actions one takes and the choices one makes between various options. These beliefs, which we call e-beliefs, are a little like Gendler’s “aliefs.” The other variety of beliefs, which are our usual “vanilla” beliefs, are expressed in words. These are vulnerable to “morning star, evening star” and “Pierre” examples. We explain the two notions and how they relate to each other.
Selected References:

Carlos Alchourron, Peter Gardenfors and David Makinson, "On the logic of theory change: partial meet contraction and revision functions", J. Symbolic Logic 50 (1985), 510-530.

Tamar Gendler (2009), "Alief in Action (and Reaction)", Mind & Language 23 (5), 552-585.

[Pa94] R. Parikh, "Vagueness and Utility: the Semantics of Common Nouns", Linguistics and Philosophy 17 (1994), 521-535.

[Pa09] R. Parikh, "From language games to social software", in Reduction, Abstraction, Analysis, proceedings of the 31st International Ludwig Wittgenstein Symposium in Kirchberg, edited by Alexander Hieke and Hannes Leitgeb, Ontos Verlag 2009, pp. 365-376.

[Pa08] R. Parikh, "Sentences, belief and logical omniscience, or what does deduction tell us?", Review of Symbolic Logic 1 (2008), 459-476.

[Plato] Plato, Meno.

[Ramsey] F. P. Ramsey, "Truth and Probability", in The Foundations of Mathematics, Routledge 1931.

[Savage] Leonard Savage, The Foundations of Statistics, Wiley 1954.
Teddy Seidenfeld
Getting to Know Your Probabilities: Three Ways to Frame Personal Probabilities for Decision Making
Abstract: An old, wise, and widely held attitude in Statistics is that modest intervention in the design of an experiment followed by simple statistical analysis may yield much more of value than using very sophisticated statistical analysis on a poorly designed existing data set. In this sense, good inductive learning is active and forward-looking, not passive and focused exclusively on analyzing what is already known. In this talk I review three different approaches for how a decision maker might actively frame her/his probability space rather than being passive in that phase of decision making.

Method-1: Assess precise/determinate probabilities only for the set of random variables that define the decision problem at hand. Do not include other "nuisance" variables in the space of possibilities. In this sense, over-refining the space of possibilities may make assessing probabilities infeasible for good decision making.

Method-2: With respect to a particular decision problem, choose wisely a set of events E that you can assess with precise/determinate probabilities. Coherence (as in de Finetti's theory) requires that you extend these probabilities to the linear span generated by E, which may be a smaller and simpler set than the Boolean algebra generated by E. If E is wisely chosen, the decision problem at hand may be solved by the assessments over the smaller space.
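A minimal sketch of the point in Method-2 (the events and numbers below are our own illustrative choices): coherence fixes the prevision (expectation) of any gamble in the linear span of the assessed events by linearity alone, so a joint assessment such as P(A and B), which lives in the Boolean algebra but not in the span, is never needed.

```python
# Assessed events and their precise probabilities (illustrative numbers).
assessed = {"A": 0.3, "B": 0.5}

def prevision(gamble):
    """Prevision of a gamble c0 + sum_i c_i * 1_{E_i} lying in the linear
    span of the assessed events, fixed by coherence (linearity) alone."""
    constant, coeffs = gamble
    return constant + sum(c * assessed[e] for e, c in coeffs.items())

# Gamble X = 4 + 2*1_A - 1*1_B: pays 4, plus 2 if A occurs, minus 1 if B occurs.
x = (4.0, {"A": 2.0, "B": -1.0})
print(prevision(x))  # 4 + 2*0.3 - 0.5 = 4.1, with P(A and B) never consulted
```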
Method-3: Your probabilistic assessments may be incoherent, so that you may be exposed to a sure loss in your decision making about some specific quantities. Nonetheless, you may be able to use familiar algorithms (e.g., Bayes' theorem) to update your views with new data and to improve your incoherent assessments about these quantities. That is, you may be able to reduce your degree of incoherence about these quantities by active, Bayesian-styled learning. Specifically, by framing your probability space so that incoherence is concentrated in your "prior," you may use Bayesian algorithms to update to a less-incoherent "posterior."

I illustrate these three methods with several problems, including how to sidestep what others have claimed to be some computational limitations of Bayesian inference.
Wilfried Sieg
Structural Proof Theory: Uncovering Aspects of the Mathematical Mind

Abstract: What shapes mathematical arguments into intelligible proofs (that can be found efficiently)? This is the informal question I address by investigating, on the one hand, the abstract ways of the axiomatic method in modern mathematics and, on the other hand, the concrete ways of proof construction suggested by modern proof theory.
The theoretical investigations are complemented by experiments with the
proof search algorithm AProS. Its strategically guided search has
been extended beyond pure logic to elementary parts of set theory.
Here, the subtle interaction between understanding and reasoning, i.e.,
between introducing concepts and proving theorems, is crucial and
suggests principles for structuring proofs conceptually.
These strands are part of a fascinating intellectual fabric and connect
classical themes with contemporary problems - reaching from proof
theory and the foundations of mathematics through computer science to
cognitive science and back.
Brian Skyrms
The Concepts of Information and Deception in Signaling Games

Abstract: I discuss the concepts of information in signals - both quantity of information and informational content - in the context of signaling games. The analysis is meant to apply at all levels of biological organization.
Wolfgang Spohn
A Guided Tour through the Cosmos of Ranking Theory

Abstract: Ranking theory is a dynamic theory of belief and, as such, of comparable fundamentality and generality to Bayesianism. Ranks are also degrees of belief, ranging from minus to plus infinity, behaving in many ways like probabilities and in characteristic ways unlike them; in particular, positive ranks adequately represent belief and distinguish more or less firm beliefs (negative ranks correspondingly express disbelief). Moreover, in contrast to other theories of belief, ranking theory can fully account for the dynamics of belief, a task equivalent to answering the problem of induction. The talk will explain some basic features and some surprising applications of ranking theory.
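For orientation, the standard apparatus of the ranking-theory literature (the talk may present it differently) is a negative ranking function grading disbelief, from which the two-sided ranks running from minus to plus infinity are derived:

```latex
% Negative ranking function: kappa(A) is the degree of disbelief in A
\kappa : \mathcal{A} \to \mathbb{N} \cup \{\infty\}, \qquad
\kappa(W) = 0, \quad \kappa(\emptyset) = \infty, \quad
\kappa(A \cup B) = \min\{\kappa(A), \kappa(B)\}
% Two-sided rank: positive tau(A) expresses belief in A, negative tau(A) disbelief
\tau(A) = \kappa(\overline{A}) - \kappa(A), \qquad
A \text{ is believed} \;\iff\; \tau(A) > 0 \;\iff\; \kappa(\overline{A}) > 0
```

The law for unions is what allows positive ranks to represent belief closed under conjunction, with the size of a rank marking how firm the belief is.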
James Woodward
Causal Learning and Judgment: Covariation and Contact Mechanics

Abstract: Contemporary philosophical and psychological accounts of causation may be grouped into two broad categories, which may seem to draw on fundamentally different ideas about causation. Difference-making theories (including “interventionist” accounts and accounts of causal learning and representation in terms of Bayes nets) rely on the guiding idea that causes make a difference to their effects. Such theories emphasize the role of information about contingencies or covariation in causal learning. They contrast with a second family of contact-mechanical theories that focus on the role of connecting processes and mechanical relationships in causal learning and in the characterization of causal relationships. Within psychology, such approaches stress the role of causal representations in terms of “forces” and “generative mechanisms” and often assign a central role to the so-called perception of causation. This talk will explore the relationship between these two ways of thinking about causation (and the capacities for learning and understanding associated with them). One issue: adult humans apparently seamlessly integrate information about causal relationships derived from causal perception and contact mechanics with contingency information, including information resulting from interventions. In contrast, although there is some dispute about this, infants are often claimed to show sensitivity to contact-mechanical relationships, as revealed in looking-time studies, before they are able to exploit such information in action. This raises the question of when (and how) these two sources of information and two ways of thinking about causal relationships are integrated in human development. I will discuss some conceptual and empirical considerations bearing on this question.
Contact
Kevin T. Kelly, Director
Horacio Arlo-Costa, Associate Director
Mauren Antkowski, Administrator