Workshop on the Logic of Simplicity

June 7-9 2013

Jointly sponsored by Studia Logica and the John Templeton Foundation.

Rationale: Ockham's razor is the characteristic bias toward simple hypotheses that has guided scientific inquiry since Copernicus.  But what is it, exactly?  This workshop aims to revisit that question from a fresh logical perspective.  Potential candidates for the simplicity order include dimensionality, Kolmogorov complexity, and VC dimension.  Candidates for Ockham's razor itself include logical theories for revising belief in light of such an order in the deterministic case, and a host of model selection methods on the side of statistics and machine learning.   This interdisciplinary workshop will begin to explore a number of new and interesting questions at the interface of logic and scientific method.  Which orders are simplicity orders?   Is simplicity absolute, or relative to questions and other framing effects?  How should a simplicity order be modified in light of new information?  What may one believe in light of a simplicity order and given information?  What should one do if the simplicity order branches?   Are the essential features of a simplicity order preserved by the associated belief revision rule?  Are standard belief revision principles descriptively plausible in scientific applications?  Is there any normative reason to revise according to simplicity rather than some other principle?  Addressing these fundamental questions promises both to sharpen our conception of scientific method and to broaden our ideas about the logic of belief revision.

Schedule:

Friday, June 7, 2013:

9:00-9:30 Jacek Malinowski (Editor-in-Chief of Studia Logica)

Opening remarks

9:30-10:45 Peter Spirtes (Carnegie Mellon University).

Searching for Causal Graphs

11:00-12:15 Oliver Schulte (Simon Fraser University Computer Science).

Topological Simplicity and Inductive Inference

12:15-2:30 Lunch.

2:30-3:45 Eric Martin (University of New South Wales Computer Science).

The Four Notions of Complexity of Parametric Logic

4:00-5:15 Kevin T. Kelly and Hanti Lin (Carnegie Mellon University, Philosophy).

Empirical Simplicity, Efficient Inquiry, and Ockham's Razor

5:30-6:00 Discussion

6:30 Dinner

Saturday, June 8, 2013:

9:00-10:15 James Delgrande (Simon Fraser University Computer Science).

Believing the Simplest Course of Events

10:30-11:45 Sven Ove Hansson (University of Stockholm, Philosophy).

Belief change for finite minds

12:00-2:00 Lunch.

2:00-3:15 Sonja Smets (ILLC Amsterdam)

Epistemic Topology: problem-solving, belief-revision and simplicity-based priors

3:30-4:45 Alexandru Baltag (ILLC Amsterdam)

Conditioning as a Universal Learning Method: qualitative, probabilistic and computable updates

5:00-6:15 Nina Gierasimczuk (ILLC Amsterdam).

Computability and Ockham's Razor 

6:30 Dinner

Sunday, June 9, 2013:

9:30-11:00 Round Table Discussion.

Abstracts:

Friday, June 7, 2013:

9:00-9:30 Jacek Malinowski (Editor-in-Chief of Studia Logica)

Opening remarks

9:30-10:45 Peter Spirtes (Carnegie Mellon University).

Searching for Causal Graphs

Over the last 25 years, a number of algorithms that search for causal models from non-experimental data have been developed. These search algorithms typically represent causal models as directed acyclic graphs, and take as a fundamental assumption the Causal Markov Assumption, which states that each variable is independent of its non-effects conditional on its immediate causes. The Causal Markov Assumption associates with each causal graph G a set of probability distributions P(G) compatible with G, and imposes a natural simplicity partial order on the causal graphs. In an idealized form, these search algorithms take as input a probability distribution and output a causal graph or set of causal graphs compatible with the distribution. In this talk I will describe how the simplicity partial order can be used to guide the order in which the search is performed, and different possible responses to cases where there is more than one maximally simple graph G compatible with a given input distribution.
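The compatibility relation between graphs and distributions can be made concrete in a small sketch. The following is a minimal illustration (my own toy code, not the search algorithms discussed in the talk): it implements d-separation for tiny DAGs, which, under the Causal Markov (and Faithfulness) assumptions, determines the conditional independencies a graph implies; a search can then prefer the sparsest graphs whose implied independencies match those of the input distribution.

```python
# Toy d-separation test: a DAG is given as {node: set_of_children}.

def descendants(dag, v):
    """All nodes reachable from v along directed edges."""
    seen, stack = set(), [v]
    while stack:
        for child in dag.get(stack.pop(), ()):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

def undirected_paths(dag, x, y):
    """All simple paths between x and y, ignoring edge direction."""
    adj = {}
    for parent, children in dag.items():
        for child in children:
            adj.setdefault(parent, set()).add(child)
            adj.setdefault(child, set()).add(parent)
    def extend(path):
        if path[-1] == y:
            yield path
            return
        for nxt in adj.get(path[-1], ()):
            if nxt not in path:
                yield from extend(path + [nxt])
    yield from extend([x])

def d_separated(dag, x, y, z):
    """True iff every path between x and y is blocked given conditioning set z."""
    z = set(z)
    for path in undirected_paths(dag, x, y):
        blocked = False
        for i in range(1, len(path) - 1):
            a, b, c = path[i - 1], path[i], path[i + 1]
            collider = b in dag.get(a, ()) and b in dag.get(c, ())
            if collider:
                # a collider blocks unless it (or a descendant) is conditioned on
                if b not in z and not (descendants(dag, b) & z):
                    blocked = True
                    break
            elif b in z:  # a non-collider blocks when conditioned on
                blocked = True
                break
        if not blocked:
            return False  # found an active path
    return True

chain    = {'X': {'Y'}, 'Y': {'Z'}}   # X -> Y -> Z
collider = {'X': {'Y'}, 'Z': {'Y'}}   # X -> Y <- Z
```

The chain implies X is independent of Z given Y but not unconditionally; the collider implies the reverse pattern, which is how two equally sparse graphs can still be distinguished by data.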

11:00-12:15 Oliver Schulte (Simon Fraser University Computer Science).

Topological Simplicity and Inductive Inference

In this talk I will show how a geometric notion of simplicity from the 19th century can be applied to modern problems of inductive inference, including: 1) Various Goodmanian problems of induction, and 2) Inferring the causal structure of a domain from observed significant correlations (constraint-based Bayes net learning). Cantor proposed a classic measure of the complexity of a set of points in a topological space based on the set's boundary. While the geometry of boundaries may seem far from questions of belief and induction, Kelly has shown that point-set topology becomes a theory of learning when we view a space of possible hypotheses as a topological space. In a learning topology, Cantor's measure assigns a complexity rank to each hypothesis. Using Occam's Razor to minimize topological complexity selects the natural projection rule in Goodmanian Riddles of Induction, and defines a plausible method for learning causal structure.
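As a toy illustration of the projection rule (my own sketch, not the speaker's formulation): encode a Goodman-style riddle with the hypotheses "all emeralds are green" and, for each n, "green up to stage n, then blue". Conjecturing the uniform hypothesis until a blue observation refutes it is the natural projection rule, and it changes its mind at most once on any data stream.

```python
def natural_projection(observations):
    """Projection learner for a toy Goodman riddle.
    observations: list of 'green'/'blue' colors seen so far."""
    if 'blue' in observations:
        n = observations.index('blue')
        return f'green up to stage {n}, then blue'
    return 'always green'

def conjecture_stream(observations):
    """The learner's successive conjectures along a data stream."""
    return [natural_projection(observations[:i])
            for i in range(len(observations) + 1)]

def mind_changes(observations):
    """Count how often the learner switches conjectures."""
    cs = conjecture_stream(observations)
    return sum(1 for a, b in zip(cs, cs[1:]) if a != b)
```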

12:15-2:30 Lunch.

2:30-3:45 Eric Martin (University of New South Wales Computer Science).

The Four Notions of Complexity of Parametric Logic

We present Parametric logic, a logical framework where a generalised notion of logical consequence is defined as a function of a number of parameters (possible worlds, possible axioms, possible data) set to particular values. Depending on the values of the parameters, this generalised notion of logical consequence might or might not be compact. Given an ordinal beta, a notion of beta-compactness is defined, with classical compactness corresponding to the case where beta = 0. We argue that deduction is captured by 0-compactness and induction by 1-compactness. We show that the notion of beta-compactness is closely related to the notion of learning with fewer than beta mind changes, and also sheds new light on nonmonotonic reasoning. Classical logic is a particular instance of this framework where the generalised notion of logical consequence is the classical, compact, notion of logical consequence, hence where "logical consequence" and "deductive consequence" are the same notion. But when the generalised notion of logical consequence is not compact, the beta-consequences of a theory, with beta ranging over the ordinals, form a hierarchy which is closely related to a particular topological space and the stratification of its Delta-2 sets by the difference hierarchy. A proof theory is presented which necessitates the introduction of a specific syntactic normal form. Putting everything together, four notions of complexity, model-theoretic (generalised compactness), topological (level of the difference hierarchy), syntactic (kind of normal form) and learning-theoretic (number of mind changes), bear strong relationships to one another and characterise forms of logical inference which are more and more uncertain, more and more subject to potential revision.
In this framework, induction is a particular form of inference alongside deduction: an inductive consequence can be computed from a set of premises by applying rules of inference, just as deductive consequences can. If the "game of deduction" is to find interesting deductive consequences, we may ask whether the "game of induction" should be any different. A deep result which can be proved deductively can have a simple expression; still, what is sought after is usually not simplicity, but how much the result reveals about the structures which are the object of scientific inquiry. It is legitimate to ask whether results which can be proved inductively are any different.
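The bound "fewer than beta mind changes" has a simple finite illustration (my own sketch, not part of the talk): a learner trying to identify how many 1s a binary stream contains in total can always conjecture the count seen so far, and it revises that conjecture exactly once per 1 it encounters, so the true count bounds its mind changes.

```python
def count_conjecture(prefix):
    """Conjecture for 'the stream contains exactly k ones': the count so far."""
    return sum(prefix)

def mind_changes(stream):
    """How often the conjecture changes as the stream is read."""
    conjectures = [count_conjecture(stream[:i]) for i in range(len(stream) + 1)]
    return sum(1 for a, b in zip(conjectures, conjectures[1:]) if a != b)
```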

4:00-5:15 Kevin T. Kelly and Hanti Lin (Carnegie Mellon University, Philosophy).

Ockham's Razor as Belief Revision With Respect to Empirical Simplicity

We present an axiomatic theory of empirical complexity relative to an empirical problem, which specifies possible information states and possible answers to an empirical question. The theory generalizes topological difference complexity and recovers uniquely the standard, branching search order for causal networks. Next, we define worst-case efficiency of empirical inquiry relative to empirical simplicity in a way that does not beg the question of whether the actual world is simple. Ockham's razor is then a belief revision strategy relative to the simplicity order. But which one? That depends in an interesting and important way on the strategic losses that are assumed in the underlying notion of efficient inquiry, as we will illustrate. A particularly natural version of Ockham's razor corresponds to the aim of minimizing losses in true content en route to convergence to the true answer to the empirical question.

5:30-6:00 Discussion

6:30 Dinner

Saturday, June 8, 2013:

9:00-10:15 James Delgrande (Simon Fraser University Computer Science).

Believing the Simplest Course of Events

This talk describes a theory in which an agent may execute actions and sense its environment. The agent may have incomplete or incorrect beliefs; as well, actions may fail, have unintended consequences, or have outcomes that are impossible to predict. The goal is to maintain an agent's belief state so that the agent's beliefs accord with the simplest, or most plausible, sequence of actions consistent with its prior beliefs and sensing actions. Hence, for example, an agent that senses that a light is on, and again later senses that it is on, will believe that the light has remained on and not that it was toggled some (even) number of times. Similarly, the agent may subsequently revise its beliefs if it learns that a particular sequence of actions could not have been the case. This account is based on an epistemic extension to the situation calculus, that is, a first-order theory of reasoning about action that accommodates sensing actions. Our position is that nondeterminism is an epistemic phenomenon, arising from an agent's limited awareness and perception of a deterministic world. The account offers several advantages: an agent has a set of categorical (as opposed to probabilistic) beliefs, yet can deal with equally likely outcomes (such as flipping a fair coin) or with outcomes of differing plausibility (such as an action that may on occasion fail). It maintains as its set of contingent beliefs the most plausible, or simplest, picture of the world consistent with its beliefs and the actions it believes it executed; yet it may modify these in light of later information.
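The light example admits a minimal computational reading (my own sketch, with an invented plausibility measure, not the situation-calculus formalization of the talk): among all action sequences consistent with the two sensing results, prefer one with the fewest toggle actions. Sensing "on" twice then yields the zero-toggle explanation.

```python
from itertools import product

def final_state(start_on, actions):
    """Apply a sequence of actions; 1 = toggle the light, 0 = do nothing."""
    state = start_on
    for a in actions:
        state = (not state) if a else state
    return state

def simplest_explanation(n_steps, sensed_start, sensed_end):
    """A fewest-toggle action sequence consistent with both sensing results."""
    best = None
    for seq in product([0, 1], repeat=n_steps):
        if final_state(sensed_start, seq) == sensed_end:
            if best is None or sum(seq) < sum(best):
                best = seq
    return best
```

Exhaustive enumeration is only feasible for toy horizons, but it makes the "simplest course of events" preference explicit: consistency filters the candidates, and plausibility (here, toggle count) picks among them.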

10:30-11:45 Sven Ove Hansson (University of Stockholm, Philosophy).

Belief change for finite minds

Standard models of belief change such as partial meet contraction operate by making choices among cognitively inaccessible objects such as possible worlds or maximal consistent subsets that lack a finite representation. Finite belief bases avoid that difficulty, but bring in others. An alternative approach is presented in which changes take place on finitely representable belief sets but no distinction is made between different belief bases for the same belief set. Reference to infinite objects is avoided by changing the level of selection. Choice functions can operate directly on the set of possible outcomes (the credible and reachable finite-based belief sets) rather than on infinite and cognitively inaccessible objects. 

12:00-2:00 Lunch.

2:00-3:15 Sonja Smets (ILLC Amsterdam)

Epistemic Topology: problem-solving, belief-revision and simplicity-based priors

3:30-4:45 Alexandru Baltag (ILLC Amsterdam)

Conditioning as a Universal Learning Method: qualitative, probabilistic and computable updates

5:00-6:15 Nina Gierasimczuk (ILLC Amsterdam).

Computability and Ockham's Razor 

In this talk I will show how the principle of Ockham's Razor, understood as the requirement of conservativity and efficiency of learning functions, interacts with the uniform decidability of epistemic spaces in the context of computable learners. 

6:30 Dinner

Sunday, June 9, 2013:

9:30-11:00 Round Table Discussion.