Abstracts for Thursday Philosophy of Physics seminars

MICHAELMAS TERM 2024

Week 1 (17 Oct) David Wallace ‘The local quantum vacuum as the Past Hypothesis’

Abstract: The ‘Past Hypothesis’, as advocated by David Albert and Barry Loewer, is the hypothesis that the world came into being in whatever particular low-entropy highly-condensed big-bang sort of macrocondition it is that the normal inferential procedures of cosmology will eventually present to us. I consider some hypotheses about what that macrocondition is likely to be given what cosmology has already presented to us, and explore the consequences of these hypotheses for the broader (‘Mentaculus’) project of grounding physics and the special sciences in the Past Hypothesis. My main conclusion is that current cosmology suggests a unique, pure quantum state (the local quantum vacuum, or ‘Bunch-Davies vacuum’) for the initial state of the Universe, in which case statistical-mechanical probabilities emerge from quantum probabilities without any need for an intervening statistical postulate.

Week 2 (24 Oct) Peter Morgan ‘A Dataset & Signal Analysis Interpretation of Quantum Field Theory’

Abstract: Elementary signal analysis, with a dependence only on time, can be thought of as a 1+0-dimensional classical field theory, for which Fourier and other integral transforms and the use of Hilbert spaces are familiar. Signal analysis is less constrained than traditional classical mechanics insofar as its relationship with the available datasets does not assume a priori that the data is about object properties, making it a helpful intermediary for thinking about quantum field theory. The presence of several distinct kinds of noise requires statistical methods and, to accommodate the intervention and causal modelling aspect of signal analysis in a Hilbert space formalism, generalized probability theory.
I will show how we can use the Poisson bracket to extend classical mechanics to be as unconstrained as signal analysis, giving what I have called ‘CM+’. The greater generality of CM+ includes measurement incompatibility, so that it has a measurement problem, which allows us to rethink the measurement problem as we have it for quantum mechanics. If we further require a CM+ model to differentiate between thermal noise and quantum noise (spoiler: it’s about Lorentz invariance), then we must work in at least 1+1 dimensions and we can make even closer contact with quantum field theory.
I will also show that the nonlinear response to applied modulations that we expect in a signal analysis perspective suggests that renormalization can be rethought as a surreptitious way to introduce nonlinearity into an axiomatic quantum field theory. Signal analysis thus gives us ways to rethink both the measurement problem and the renormalization ‘problem’.

Week 6 (21 Nov) Lucy Mason ‘Measurement, Metrology, and Perspectives on the Quantum State’

Abstract: There are two main styles of interpreting the quantum state: one focuses on the fundamentality of the quantum state itself, the other on how projection operators represent observable properties. Rather than being incompatible, I argue that these correspond to taking a 3rd person and a 1st person perspective respectively. I also argue that the 1st person perspective is ineliminable from the way that the metrology literature characterises measurement through the properties of a system. Metrology can help us define what the 1st person perspective is and identify what concepts are necessary to provide a model of measurement.

Week 7 (28 Nov, BLOC seminar): Simon Saunders ‘Quantum kinematics as intrinsic probability’

Abstract: I examine the concept of interval or imprecise probability as applied to any admissible ensemble of microstates, in which the probability of a projector P in an ensemble is bounded below by the frequency of +1 eigenstates and above by one minus the frequency of 0 eigenstates.

Given a Hilbert space H and quantum state |psi>, an admissible ensemble is defined by (any) equi-amplitude decomposition of the state (so the ensembles are necessarily finite). It is a simple matter to see that all such interval probabilities for fixed |psi>, H, and P are mutually consistent, and that in the limit of large ensembles, they approximate the Born-rule quantity ⟨psi|P|psi⟩.
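
A toy example may help fix ideas (my illustration, not taken from the talk). Suppose the equi-amplitude decomposition has four orthonormal branches,

    \[ |\psi\rangle = \tfrac{1}{2}\big(|a_1\rangle + |a_2\rangle + |a_3\rangle + |a_4\rangle\big), \]

with P|a_1> = |a_1>, P|a_2> = |a_2>, P|a_3> = 0, |a_4> neither a +1 nor a 0 eigenstate, and all cross terms ⟨a_i|P|a_j⟩ (i ≠ j) vanishing. Then

    \[ \langle\psi|P|\psi\rangle = \tfrac{1}{4}\big(1 + 1 + 0 + \langle a_4|P|a_4\rangle\big) \in \big[\tfrac{1}{2}, \tfrac{3}{4}\big], \]

so this ensemble assigns P the interval probability [1/2, 3/4]: the lower bound is the frequency of +1 eigenstates (2 out of 4), the upper bound one minus the frequency of 0 eigenstates (1 − 1/4).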

This account of probability is synchronic, so independent of the dynamics, and independent of the basis, and of whether any experiment is performed, or any evaluation by any agent. So it plausibly counts as ‘intrinsic’. I also consider its implications for the Everett interpretation, and its connection to Gibbs’ analysis of probability, which it closely follows.

Week 8 (5 Dec): Richard Healey (Arizona) ‘How to be a single-world quantum relativist’

Abstract: As Timotheus Riedel notes in a recent paper, over the past few years, a flurry of related no-go results in extended Wigner’s friend scenarios has been taken to place strong constraints on the possibility of absolute facts about the outcomes of quantum measurements. In my pragmatist view a system’s quantum state, and the outcome of a measurement on it, are each relative—not to “the observer” but to something physical. I shall explain what this means, how my view differs from Rovelli’s relational quantum mechanics, and why this perspective on quantum theory is not refuted by arguments based on extended Wigner’s friend scenarios, including Riedel’s.

TRINITY TERM 2024

Week 1 (25th April): Simon Saunders (Oxford)

Title: Finite frequentism explains quantum probability (or: Gibbs meets Everett)

Abstract: Gibbs explained probability in terms of frequentism using the notion of an ensemble equipped with a ‘density-in-phase’ function (a non-negative real-valued function on phase space). The same procedure can be applied to a decoherent history space, with parameter space M replacing phase space, equipped with a quantum state: the density-in-phase is replaced by the modulus squared of the associated wave-function (a non-negative real-valued function on M). So long as M includes at least one continuous variable, I show that probability is similarly explained in terms of frequentism. The ensembles consist of finite numbers of equi-amplitude decohering microstates, whose superposition is the quantum state.
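
In symbols, the recipe is roughly this (my gloss; the talk's own formulation may differ):

    \[ \Pr(A) = \int_A \rho(q,p)\,\mathrm{d}q\,\mathrm{d}p \quad\longrightarrow\quad \Pr(A) = \int_A |\psi(m)|^2\,\mathrm{d}m, \]

with Gibbs' density-in-phase ρ on phase space replaced by the modulus squared of the wave-function on the parameter space M.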

There is a natural alternative, following Boltzmann, which defines the microstates as equi-volume microstates of non-zero amplitude (for a given volume measure on M; for Boltzmann, the Liouville measure). It has long been known that probabilities similarly defined (but in terms of macrostates) are diachronically inconsistent; I show that on the Boltzmann variant they are synchronically inconsistent as well.

The Gibbs method, as applied to the Everett interpretation, is then a form of actual frequentism, as contrasted with hypothetical frequentism, since all the microstates exist in a superposition. No limiting procedure is needed. This contrasts with my earlier work on this topic in https://arxiv.org/abs/2201.06087, which involved diachronic probabilities. The present talk is based on http://arxiv.org/abs/2404.12954, just posted, and is purely synchronic. As such, it can be extended to contexts independent of decoherence altogether; if the basis is allowed to vary, the constraint on M can be dropped as well.

Week 2 (2 May): Natasha Oughton (National Quantum Computing Centre)

Title: Why quantum theory? Understanding and explanation through reconstruction

Abstract: It is sometimes claimed that information-theoretic reconstructions of quantum theory provide interpretational insight. I’ll argue that such claims misrepresent the contribution of information-theoretic reconstructions: we should not expect reconstructions to shed light on the inner workings of quantum theory by locating its boundaries in the context of more general theories, nor to reveal what quantum theory is about, since the features and principles privileged through each reconstruction are, at least to some extent, dependent on what one chooses to focus on. Despite this, though, I maintain that information-theoretic reconstructions genuinely can play an epistemic role. By adopting a pragmatic, pluralist account of explanation inspired by van Fraassen and proposing an extension of it to an account of understanding, I’ll argue that reconstructions can be seen to provide both explanation and understanding through providing answers, distinct from those about interpretation, to the question “why quantum theory” in the salient contexts.

Week 3 (9 May): NO SEMINAR

Week 4 (16 May): Jingyi Wu (LSE)

Title: Between a Stone and a Hausdorff space

Week 5 (23 May): Oliver Pooley (Oxford)

Title: Dynamical versus geometrical approaches to spacetime structure

Week 6 (30 May): Daniel Grimmer (Oxford)

Title: In search of new spacetimes: topological redescription via the ISE method

Week 7 (6 June): NO SEMINAR

Week 8 (13 June): Giovanni Valente (Milan)

Title: On the quantum Boltzmann equation: what is the source, if any, of irreversibility?

HILARY TERM 2024

Week 1 (18th January) NO SEMINAR

Week 2 (25th January) Caspar Jacobs (Leiden)

Title: Stating Maths-First Realism, or How to Say Things with Models

Abstract:
The aim of philosophy of physics, broadly speaking, is to interpret physical theories. Since those theories are expressed mathematically, this means either extracting meaning from their mathematical models or endowing those models with meaning. The traditional means by which we interpret theories is language. But there seems to be a mismatch between the content of those models and that of their linguistic interpretations; hence the desire to avoid linguistic means in the process of interpretation altogether. Wallace’s ‘maths-first realism’ is a recent expression of this desire. It is still unclear, however, how mathematical models can mean anything in the absence of a linguistically-provided interpretation. In this talk I will survey a range of options and their complications. I will conclude that if language-free interpretation is possible at all, it would radically alter the face of philosophy of physics.

Week 3: (1 Feb) NO SEMINAR

Week 4: (8th Feb) Adam Caulton (Oxford)

Title: Reduction and Equivalence: Some mild suggestions.

Abstract: In this talk, I consider a topic in the philosophy of science and physics that has been dominant since the days of logical empiricism: namely, inter-theoretic reduction and theoretical equivalence. I explore the reason why (something like) Nagel’s model still survives to this day, and suggest that an essential idea in that model is that it makes the recovery of the reduced theory from the reducing theory inevitable in some strong sense. Taking that idea seriously, I argue, places two constraints on reduction (and equivalence) that are often underplayed or ignored: (i) that so-called “bridge laws” are not only needed, but they need to be something approximating analytic truths; and (ii) that some suitably strict account of construction needs to be articulated, or at least obeyed. Both (i) and (ii) might be thought to be wedded to the syntactic account of theories; I will argue otherwise. Finally, and related to (i) and (ii), a more faithful consideration of what physical theories are actually like suggests that we cannot focus purely on a theory’s dynamical solutions (or “DPMs”). I hope to illustrate these points with some examples, including Boltzmann’s combinatorial argument in deriving the Maxwell-Boltzmann distribution in kinetic theory, and John Winnie’s 1977 attempt to reduce the structure of Minkowski spacetime to the causal connectibility relation.

Week 5 (15th Feb): NO SEMINAR

Week 6: (22nd Feb) Nick Ormrod (Oxford, Computer Science)

Title: Quantum Influences and Event Relativity

Abstract: We develop a new interpretation of quantum theory by combining insights from extended Wigner’s friend scenarios and quantum causal modelling. In this interpretation, which synthesizes ideas from relational quantum mechanics and consistent histories, events obtain relative to a set of systems, and correspond to projectors that are picked out by causal structure. We articulate these ideas using a precise mathematical formalism. Using this formalism, we show through specific examples and general constructions how quantum phenomena can be modelled and paradoxes avoided; how different scenarios may be classified and the framework of quantum causal models extended; and how one can approach decoherence and emergent classicality without relying on quantum states.

Based on a paper with Jonathan Barrett https://arxiv.org/pdf/2401.18005.pdf

Week 7 (29th Feb): NO SEMINAR

Week 8 (7th March) James Read (Oxford)

Title: The Non-Relativistic Geometric Trinity of Gravity

Abstract: Three topics by now well-established in the canon of the philosophy of space and time are: (i) the non-relativistic limit, (ii) geometrisation and recovery in non-relativistic gravity, and (iii) Maxwell gravitation. In addition, other topics are becoming increasingly well-known to philosophers of space and time—e.g., the existence of a ‘geometric trinity’ of theories of relativistic gravity (of which GR is but one node). Thus far, however, these topics have remained largely isolated from one another. In this talk, I’ll connect the dots, by (a) taking the non-relativistic limit of the geometric trinity, thereby constructing a novel non-relativistic geometric trinity of gravity which subsumes known geometrisation/recovery results, and (b) thinking about the common core of the relativistic trinity as opposed to the non-relativistic trinity—in the former case, this turns out to be GR itself, but in the latter case it is precisely Maxwell gravitation, which is distinct from the original three ‘nodes’ of the non-relativistic trinity. Overall, this work not only helps us to understand much better the ‘space of spacetime theories’, but also raises many interesting philosophical questions for future exploration.

MICHAELMAS TERM 2023

Week 1 (12th October) David Wallace (Pittsburgh)

Title: Thermodynamics with and without reversibility

Abstract: Working inside the control-theoretic framework for understanding thermodynamics, I develop a systematic way to characterize thermodynamic theories via their compatibility with various notions of coarse-graining, which can be thought of as parametrizing an agent’s degree of control of a system’s degrees of freedom, and explore the features of those theories. Phenomenological thermodynamics is reconstructed via the ‘equilibration’ coarse-graining where a system is coarse-grained to a canonical distribution; finer-grained forms of thermodynamics differ from phenomenological thermodynamics only in that some states of a system possess a free energy that can be extracted by reversibly transforming the system (as close as possible) to a canonical distribution. Exceeding the limits of phenomenological thermodynamics thus requires both finer-grained control of a system and finer-grained information about its state. I consider the status of the Second Law in this framework, and distinguish two versions: the principle that entropy does not decrease, and the Kelvin/Clausius statements about the impossibility of transforming heat to work, or moving heat from a cold body to a hotter body, in a cyclic process. The former should be understood as relative to a coarse-graining, and can be violated given finer control than that coarse-graining permits; the latter is absolute, and binds any thermodynamic theory compatible with the laws of physics, even the entirely reversible limit where no coarse-graining is appealed to at all. I illustrate these points via a discussion of Maxwell’s demon.
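
For readers who want the extractable free energy claim in formulas, the standard identity behind it (a textbook fact, not quoted from the abstract) is: for a system with Hamiltonian H in contact with a bath at temperature T, the non-equilibrium free energy of a state ρ exceeds that of the canonical distribution γ = e^{−H/kT}/Z by

    \[ F(\rho) - F(\gamma) = kT\, S(\rho\,\|\,\gamma) \ge 0, \qquad F(\rho) = \langle H\rangle_\rho - kT\,S(\rho), \]

where S(ρ‖γ) is the relative entropy; this difference is the work recoverable, in principle, by reversibly transforming ρ into γ.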

Week 2 (19 October) Nicholas Teh (Notre Dame)

Title: Understanding the Geroch–Jang argument

Abstract: In some corners of the philosophy of physics, Geroch and Jang’s “theorem” of 1975 has been held up as a model for how geodesic motion is to be explained in General Relativity. This raises the question of the extent to which the conceptual and mathematical structure of the Geroch-Jang (GJ) argument has been interrogated by the community. In this talk, based on joint work with Dominic Dold (University of Notre Dame and MPIWG), I will argue that the original GJ argument has been insufficiently understood and contains important lacunae. Furthermore, filling in these lacunae will illuminate several important themes, viz. (i) the sense in which part of GJ’s result is Special Relativistic; (ii) the sense in which this Special Relativistic result is “extended” to General Relativity; and (iii) the relationship between such an extension and the “equivalence principle” as understood by Linnemann, Read and Teh in https://arxiv.org/abs/2305.01534. The resulting picture will also shed light on the extent to which the GJ argument uses the dynamical content of General Relativity, as well as the reasons for which this style of argumentation has played a relatively minor role in contemporary work on the geodesic principle within mathematical physics.

Week 3 (26th October) Sabine Hossenfelder (MCMP)

Title: Superdeterminism – the forgotten solution

Abstract: What is a measurement? This, it turns out, is the most difficult question in physics today. In this talk, I will explain why the measurement problem is important and why all attempts to solve it so far have failed. I will then discuss the obvious solution to the problem that was, unfortunately, discarded half a century ago without ever being seriously considered: Superdeterminism. After addressing some common objections to this idea, I will summarize the existing approaches to develop a theory for it.

Week 4: (2nd November) NO SEMINAR

Week 5 (9th November): Tim Palmer (Oxford) and Chris Timpson (Oxford)

Title: Superdeterminism and non-conspiracy revisited: A Debate

Week 6 (16 November) Jonathan Halliwell (Imperial)

Title: Aspects of Leggett-Garg Tests for macrorealism

Abstract: The Leggett-Garg (LG) inequalities were introduced, as a temporal parallel of the Bell inequalities, to test macroscopic realism (MR) — the world view that a macroscopic system evolving in time possesses definite properties which can be determined without disturbing the future or past state. If the inequalities are violated, such tests indicate the presence of superposition states for macroscopic systems.
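
For orientation (the inequality itself is not stated in the abstract): in its simplest form, for a dichotomic variable Q(t) = ±1 measured in pairs at three times t_1 < t_2 < t_3, macrorealism plus non-invasive measurability implies

    \[ K_3 = C_{12} + C_{23} - C_{13} \le 1, \qquad C_{ij} = \langle Q(t_i)\,Q(t_j)\rangle, \]

whereas quantum mechanics allows K_3 as large as 3/2 for a two-level system.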

In this talk I give an overview of developments in this area over the last few years. This includes the following:

– The development of necessary and sufficient conditions for macrorealism and their generalization to multi-time measurements, many-variable systems, and data sets with higher-order correlators.
– Modifications of the usual non-invasive measurement protocols required in LG tests, which provide improved checks that the measurements really are non-invasive.
– Recently discussed LG-type tests which involve only single-time measurements (e.g. the Tsirelson inequality).
– LG tests for interferometric experiments and for measurements of sign(x) in the harmonic oscillator and other continuous variable systems, which provide a useful setting for genuinely macroscopic tests. The physical origin of the LG violations in the SHO case links to the phenomenon of diffraction in time first highlighted by Moshinsky and is conveniently illustrated using Bohm trajectories.

Some relevant papers:

J. J. Halliwell and C. Mawby, Phys. Rev. A 100, 042103 (2019), Fine’s theorem for Leggett-Garg tests with an arbitrary number of measurement times
J. J. Halliwell, Phys. Rev. A 99, 022119 (2019), Leggett-Garg tests of macrorealism: checks for non-invasiveness and generalizations to higher-order correlators
J. J. Halliwell and C. Mawby, Phys. Rev. A 102, 012209 (2020), Conditions for macrorealism for systems described by many-valued variables
C. Mawby and J. J. Halliwell, Phys. Rev. A 107, 032216 (2023), Leggett-Garg violations for continuous variable systems with Gaussian states

Week 7 (23 November) Neil Dewar (Cambridge)

Title: The hole argument and mathematical practice

Abstract: Weatherall’s (2018) claim that the Earman-Norton Hole Argument was based on a misconception of mathematical practice—and therefore did not need the attention of philosophers—has resulted in a renewed wave of philosophical attention to that very argument. In this talk, I seek to do three things. The first is to disentangle some of the recent back-and-forth between those sympathetic to Weatherall’s approach, and those who think the Hole Argument requires metaphysical commitments for its resolution—focusing, especially, on a recent exchange between Halvorson & Manchak (forthcoming) and Menon & Read (unpublished). The second is to draw out the implications of this exchange for the issue of determinism in General Relativity and other spacetime theories. The third is to consider what, on Weatherall’s view of mathematical practice, a legitimate version of the Hole Argument might look like.

Week 8 (30 November) Bryan Roberts (LSE)

Title: How black holes are really hot

Abstract: When is a system “really” thermal? Such philosophical questions are not so easy given the many varieties of black hole radiation, acoustic horizon radiation, and other oddities of modern physics. In this talk, I try to add some clarity by defending one precise definition of what it means to be “thermal” in the sense of a model of thermodynamics, which adopts a framework inspired by Gibbs and now known as Geometric Thermodynamics. In this framework, it is possible to give one clear sense in which black holes really are hot, although their thermodynamic properties are radically different from those of the usual earthly models of thermodynamics, like a box of gas in equilibrium.

TRINITY TERM 2023

WEEK 1 (27th April): Sebastian de Haro (Amsterdam)

Title: Dualities and quasi-dualities: on solitons and phases of quantum field theories

Abstract: In physics, a duality is an isomorphism between two theories (here called ‘models’). A quasi-duality is a map between two models that falls short of being a full isomorphism: usually through its being a partial or an approximate isomorphism. In quantum field theory, dualities often exchange particle and soliton states (where a soliton is a solution of the non-linear field equations with finite energy). I take bosonization duality and Seiberg-Witten quasi-duality as case studies to illustrate two aspects of dualities that bear on the interpretation of theories. The first aspect is the conception of a common core theory “behind two duals”. The second aspect is the role of solitons and quasi-dualities in exploring the physical content and phase structure of quantum field theories.

WEEK 2: Johannes Fankhauser (Oxford/Innsbruck)

Title: Quantum Uncertainty and Empirical Completeness

Abstract: I formally define and address the question of whether quantum uncertainty could be fundamental, or whether post-quantum theories could have predictive advantage whilst conforming to the Born rule on average. This notion of what I call ‘empirical completeness’ refers to actual prediction-making beyond the Born probabilities, and thus the framework delineates this operational notion of predictability from the ‘hidden variable’ programme in quantum theory. I study how empirical completeness is connected to signal-locality, and argue that, based on existing results, a partial proof for the impossibility of predictive advantage can be established for bi-partite quantum systems. The relevant results demonstrate signal-locality as a sufficient principle that might explain the fundamental chanciness in present and future quantum theories. This in turn reconciles us to many quantum features as aspects of limits on Nature’s predictability. I then propose an extended Wigner’s friend experiment combining these ideas.

WEEK 3: Renate Loll (Radboud University, Nijmegen)

Title: Questions on Quantum Gravity

Abstract: My talk will give a broad-brush account of the field of quantum gravity, with an emphasis on structural issues. I will sketch motivations, ambitions and challenges of quantum gravity, how we got to where we are, and discuss promising current and future directions, focusing on nonperturbative quantum field-theoretic approaches. I will describe available tools, why it has taken a long time to put gravity on the lattice correctly (à la CDT), and highlight two prominent nonperturbative results, dimensional reduction at the Planck scale and spacetime emergence. My presentation will be based loosely on “Quantum Gravity in 30 Questions”, arXiv:2206.06762.

WEEK 4: Paul Skokowski (Stanford)

Title: Superpositions and beliefs about superpositions

Abstract: David Albert and Jeff Barrett have discussed a simple Everettian interpretation of quantum mechanics they call the bare theory. Under the bare theory, experiments may have superpositional rather than determinate outcomes. What, in such a world, would an observer’s beliefs about these superpositional results be like? In this talk I’ll challenge some claims about how to evaluate the belief states of observers in a bare, no-collapse world, and in so doing consider how we should interpret these states.

WEEK 5: Fedele Lizzi (Naples)

Title: Quantum Observers for Quantum Spacetime

Abstract: I will discuss how some quantum spacetimes, described by a noncommutative geometry, require, for a consistent and sensible interpretation, that the observers themselves be quantum objects. I will in particular concentrate on the case of kappa-Minkowski spacetime.

WEEK 6: Doreen Fraser (Waterloo)

Title: Philosophical implications of measurement in QFT

Abstract: Measurement theory from non-relativistic quantum mechanics (NRQM) cannot be straightforwardly carried over to relativistic quantum field theory (QFT). The ‘impossible measurement’ scenarios presented by Sorkin and Borsten, Jubb & Kells vividly illustrate that superluminal signalling is a hazard of prima facie reasonable attempts to apply measurement theory from NRQM to relativistic quantum theory. This is one motivation for the recent flurry of research activity devoted to formulating models of measurement for QFT. I will draw out some of the philosophically important implications of this recent work. For concreteness, I will focus on the measurement framework for algebraic QFT recently proposed by Fewster and Verch. The Fewster-Verch measurement framework departs from the traditional operationalist interpretation of AQFT in significant respects. In particular, the new state update rules in the Fewster-Verch measurement framework cannot be interpreted as representing a change of state that happens in any region of spacetime. This feature is shared with other recent proposals for modeling measurement in QFT, which suggests that it is a general moral about measurement theories that are suited to relativistic spacetime. This has direct consequences for the form that the Measurement Problem takes in QFT with its measurement theory (compared to NRQM with its measurement theory).

WEEK 7: Dennis Lehmkuhl (Bonn)

Title: TBA

WEEK 8: Natasha Oughton (Oxford)

Title: Why quantum theory? Understanding and explanation through reconstruction

Abstract: It is sometimes claimed that information-theoretic reconstructions of quantum theory provide interpretational insight. I’ll argue that such claims misrepresent the contribution of reconstructions: we should not expect to gain knowledge of the inner workings of quantum theory by locating its boundaries in the context of more general theories, nor to learn what quantum theory is about, since the features and principles privileged through reconstruction are, at least to some extent, dependent on what one chooses to focus on. Despite this, though, I maintain that information-theoretic reconstructions genuinely can play an epistemic role. By adopting a pragmatic, pluralist account of explanation inspired by van Fraassen and proposing an extension to an account of understanding, I’ll argue that reconstructions can be seen to provide both explanation and understanding through providing answers, distinct from those about interpretation, to the question “why quantum theory” in the salient contexts.

MICHAELMAS TERM 2022

Week 1 (13 October): Caspar Jacobs (Oxford, philosophy)

Title: The model is not the territory: on quality and isomorphism

Abstract: It is often claimed (especially by Oxford philosophers of physics!) that isomorphic models represent qualitatively identical possibilities, that is, possibilities that differ at most haecceitistically. On the strictest interpretation, this claim is false, as I will show with counterexamples. I also discuss some weaker versions of the claim: that it holds true for models when interpreted ‘naturally’ or ‘literally’. But these weaker versions are either incoherent or fail to account for the use of the isomorphism-claim in discussions of symmetries. The ‘Oxford consensus’ fails. Instead, I offer a different construal of that claim, on which it functions as a guide towards the qualitative structure of the world.

Week 2 (20 October): Lu Chen (Koç University)

Title: A discrete case for dynamicism

Abstract: Dynamicism is the view that dynamical laws are more fundamental than spacetime geometry, with the opposite view called geometricism. I argue in favor of dynamicism through the case of discrete spacetime. I show that, surprisingly, only dynamicism naturally allows the possibility that approximately Euclidean geometry emerges from a discrete space lattice, which is an important advantage for the view. I demonstrate this claim through two rigorous toy examples: the simple physics of a random walk and discrete quantum mechanics.
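
A minimal numerical illustration of the random-walk example (my sketch, not the speaker's code): individual steps on a square lattice are confined to the lattice axes, yet the large-scale distribution of walkers is an isotropic Gaussian, with no memory of those axes.

    import numpy as np

    rng = np.random.default_rng(0)
    steps = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])  # moves along lattice axes only

    n_walkers, n_steps = 10_000, 400
    choices = rng.integers(0, 4, size=(n_walkers, n_steps))
    endpoints = steps[choices].sum(axis=1)                # (n_walkers, 2) final positions

    # The per-step covariance comes out close to 0.5 * identity: an isotropic
    # (rotation-invariant) Gaussian, despite the anisotropy of the microscopic steps.
    print(np.cov(endpoints.T) / n_steps)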

Week 3 (27 October): Christian Wüthrich (Geneva, Philosophy)

Title: Laws beyond spacetime

Abstract: Quantum gravity’s suggestion that spacetime may be emergent and so only exist contingently would force a radical reconception of extant analyses of laws of nature. Humeanism presupposes a spatiotemporal mosaic of particular matters of fact on which laws supervene; primitivism and dispositionalism conceive of the action of primitive laws or of dispositions as a process of ‘nomic production’ unfolding over time. We show how the Humean supervenience basis of non-modal facts and primitivist or dispositionalist accounts of nomic production can be reconceived, avoiding a reliance on fundamental spacetime. However, it is unclear that naturalistic forms of Humeanism can maintain their commitment to there being no necessary connections among distinct entities. Furthermore, non-temporal conceptions of production render this central concept more elusive than before. In fact, the challenges run so deep that the survival of the investigated analyses into the era of quantum gravity is questionable.

Week 4 (3 November): NO SEMINAR

Week 5 (10 November): Ted Jacobson (Maryland and Cambridge, Physics)

Title: Diffeomorphism invariance and the black hole information paradox

Abstract: I will argue that the only compelling reason to expect that black holes do not cause information loss stems from the diffeomorphism invariance of general relativity, and that the resolution of the information paradox must therefore hinge on diffeomorphism invariance. As with other famous paradoxes of physics, I believe, the resolution is to be found in paying close attention to what is a meaningful statement within the theory. Key to the resolution is the recognition that the paradox to be resolved exists even when no black hole is present. I will propose how the resolution works in principle, and will review recent progress (by other authors) illustrating how it works even in perturbation theory.

Week 6 (17 November): Patrick Dürr (Oxford and the Hebrew University of Jerusalem, Philosophy)

Title: Conventionalism – a sophisticated philosophy for our (space)times

Abstract: In the talk, I flesh out and advocate conventionalism about physical geometry as a nuanced response to the challenges of empirical underdetermination of geometry, one that takes into account the specific role and peculiarities of geometry in physical theorising. Drawing on and updating ideas in Poincaré, I highlight the advantages of conventionalism over rival realist responses (explanationism, structural realism, entity realism). A cornucopia of examples from both relativistic and non-relativistic gravitational physics can be adduced to which a conventionalist can point to make her case. These examples illustrate that the coexistence of empirically equivalent theories that postulate mutually incompatible underlying geometries (and that, on a natural stance towards theory individuation, count as distinct theories rather than reformulations of the same theory) is a real challenge at the heart of modern gravitational physics — a challenge for which conventionalism, as I conceive of it, proffers a convincing and sophisticated response.

Week 7 (24 November): Natasha Oughton (University of Oxford)

Title: Why quantum theory? Understanding and explanation through reconstruction

Abstract: Driven by the successes of work in quantum foundations and quantum information theory in recent decades, a number of proposals have been made to reconstruct quantum theory, or something relevantly similar, from constraints on the possibility of various information-theoretic tasks. Such proposals are often accompanied by the claim that, in virtue of their description of quantum phenomena through principles, they offer particular advantages over quantum theory as it is usually presented. In this talk, I’ll discuss some ways in which principle theories might be taken to provide an advantage over their constructive counterparts, as well as flagging some disanalogies between principle-theoretic reconstructions of quantum theory (PTRQTs), and paradigmatic examples of principle theories. I’ll suggest that whilst PTRQTs cannot answer interpretative questions (and nor should we expect them to), they can provide epistemological insight. I will develop a novel account of understanding by extending van Fraassen’s account of explanation, and argue thereby that PTRQTs can be seen to provide both explanation and understanding through providing answers, distinct from those about interpretation, to the question “Why quantum theory?” in the salient contexts.

Week 8 (1 December): BLOC seminar, King’s College London: Sam Fletcher (Minnesota and London)

Title: The representation and determinable structure of quantum properties

TRINITY TERM 2022

Week 1, 28 April — Ard Louis (Oxford, Physics)

Title: An algorithmic version of Occam’s razor in machine learning and biological evolution

Abstract: In algorithmic information theory (AIT), Levin’s coding theorem (which should be much more widely taught in Physics!) predicts that, upon uniform random sampling of programmes, a universal Turing machine (UTM) will be exponentially biased towards outputs with low Kolmogorov complexity. In this talk I will provide evidence for a similar exponential bias towards descriptional simplicity (low Kolmogorov complexity) in biological evolution. A similar Occam’s razor-like bias helps explain why deep neural networks generalize well in the overparameterised regime, where classical learning theory predicts they should badly overfit. I will discuss how these principles from AIT fit into a wider discussion about the use (and abuse) of Occam’s razor in science.
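
For reference, the theorem in question (standard algorithmic information theory, not specific to the talk) says that the probability m(x) that a universal prefix machine U outputs x when fed uniformly random bits is controlled by the Kolmogorov complexity K(x):

    \[ m(x) = \sum_{p\,:\,U(p)=x} 2^{-|p|}, \qquad -\log_2 m(x) = K(x) + O(1), \]

so descriptionally simple outputs (low K(x)) are exponentially favoured over complex ones.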

Week 2, 5 May — Julian Barbour (Oxford, Independent)

Title: Complexity as time

Abstract: The realisation of Mach’s principle relies on group theory, which makes it possible to construct consistent theories in which all absolute elements are eliminated. However, the direct removal of scale leads to a theory that cannot describe the observed growth of structure in the universe. I will suggest that the problem can only be resolved if a group invariant called complexity, which measures the variety of the universe, is identified with time itself. This radical step, which eliminates all absolutes because the complexity is also an intrinsic scale, permits a highly predictive, consistent description of classical Newtonian universes and may do the same for Einsteinian universes. Such a theory also includes intriguing quantum hints.

Week 3, 12 May — Tomasz Bigaj (Warsaw, Philosophy)

Title: Entanglement and discernibility of identical particles

Abstract: I defend an unorthodox interpretation of quantum states of many particles of the same type, according to which the individuation of the components of a composite system of identical particles is done not with the help of unphysical labels (indices) but with physically meaningful projection operators. This unorthodox conception requires a modification of the standard notion of entanglement, in order to exclude states whose non-factorizability comes entirely from the (anti-)symmetrization of a product state. I will report several facts regarding the connections of the new concept of entanglement with the issue of discernibility. I will also discuss recent experiments involving measurement-induced entanglement, and I will point out that they do not threaten the cogency of the new concept of entanglement applied to identical particles. I will argue that the non-local correlations observed in these experiments are explainable not by the entanglement of the initial state but by the creation of a new, genuinely entangled state by means of a pre-measurement selection.

Week 4, 19 May — Jim Al-Khalili (Surrey, Physics)

Title: Life on the edge: the dawn of quantum biology

Abstract: How do living organisms maintain their highly ordered low entropy states? And might quantum mechanics play a role in this, as hinted at by Schrödinger in his celebrated book, What is Life? Quantum biology is an exciting new field of interdisciplinary research, bringing together theoretical physics, computational chemistry and molecular biology, but it remains speculative and, some might say, even controversial. However, growing evidence is showing that non-trivial quantum effects, such as long-lived coherence, quantum entanglement and tunnelling may well play a functionally important role inside living cells. For example, enzymes utilise quantum tunnelling to accelerate biochemical reactions, while plants and bacteria make use of quantum coherence in photosynthesis to determine the most efficient route for photons from sunlight to reach the reaction centre where they can be converted into chemical energy. More intriguingly, it appears that some animals use quantum entanglement – what Einstein called “spooky action at a distance” – to ‘see’ the earth’s magnetic field for directional information.

In this talk I trace the origins of the field back to the 1930s, and examine how fragile quantum mechanical mechanisms previously thought to be confined to highly rarefied laboratory environments at temperatures close to absolute zero, might manage to play a role in the wet, warm biological world. I will also report on our latest results showing the importance of proton tunnelling in DNA and how this can lead to genetic mutations.

Week 5, 26 May – Nick Huggett (Illinois, Philosophy)

Title: Quantum gravity in a laboratory

Abstract: The characteristic – Planck – energy scale of quantum gravity is utterly beyond current technology, making experimental access to the relevant physics apparently impossible. Nevertheless, low energy experiments linking gravity and the quantum have been undertaken: the Page and Geilker quantum Cavendish experiment, and the Colella-Overhauser-Werner neutron interferometry experiment, for instance. However, neither probes states in which gravity remains in a coherent quantum superposition, unlike — it is claimed — recent proposals that have created considerable interest among physicists. In essence, if two initially unentangled subsystems interacting solely via gravity become entangled, then a simple theorem of quantum mechanics shows that gravity cannot be a classical subsystem. There are formidable challenges to creating such a system, but remarkably, tabletop technology for probing the gravitational fields of very small bodies has advanced to the point that such an experiment might be feasible in the next several years. In this talk I will explain the proposal and what it aims to show, highlighting the important ways in which its interpretation is theory-laden. (Drawn from joint work with Niels Linnemann and Mike Schneider.)
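
Schematically, the theorem alluded to runs as follows (my paraphrase of a standard quantum-information fact, not the speaker's formulation): if the mediating field is classical, the induced evolution of the two subsystems has the form

    \[ \rho \;\mapsto\; \sum_k p_k\, (\mathcal{E}_k \otimes \mathcal{F}_k)(\rho), \]

i.e. local operations coordinated only by a classical variable k, and maps of this form send separable states to separable states. So if an initially unentangled pair becomes entangled through the mediator alone, the mediator cannot have been classical.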

Week 6, 2 June – Simon Saunders (Oxford, Philosophy)

Title: Probability and branch-counting in the decoherence-based Everett interpretation of quantum mechanics

Abstract: Frequentism in the philosophy of probability is the view that the relative frequency of an event in an ensemble is the probability of the event relative to the ensemble. It has had a perennial appeal down the ages yet suffers from seemingly insuperable difficulties; it has few if any defenders today. However, the relevant ensembles have hitherto always been defined by repetition of trials, either trials at different times or in different places, in a single world. If instead the ensemble is defined by the branching structure of the quantum state at a single time and place, using Boltzmann’s method for counting microstates, the usual difficulties of frequentism disappear. The relative frequency of an event in the ensemble thus defined is in agreement with the Born rule.
I shall consider the legitimacy of this representation of the quantum state, its relation to the Deutsch-Wallace Born-rule theorem, and its uses in Bayesian updating. I also offer an analysis of uncertainty in the face of branching that has nothing to do with lack of knowledge, indexical or otherwise.
For further background, see https://arxiv.org/abs/2201.06087.
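
A toy version of the branch-counting move (my illustration, assuming rational Born weights): split the state into equi-amplitude branches over a common denominator, and the relative frequency of an outcome across branches just is its Born weight.

    from fractions import Fraction
    from math import lcm

    # Born weights |c_i|^2 for each outcome (rational, summing to 1)
    weights = {"up": Fraction(3, 4), "down": Fraction(1, 4)}

    N = lcm(*(w.denominator for w in weights.values()))  # size of the branch ensemble
    branches = [o for o, w in weights.items() for _ in range(int(w * N))]

    for outcome, w in weights.items():
        freq = Fraction(branches.count(outcome), N)      # relative frequency in the ensemble
        print(outcome, freq, freq == w)                  # the frequency is the Born weight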

Week 7, 9th June — Fay Dowker (Imperial, Physics)

Title: Recovering General Relativity from a Planck scale discrete theory of quantum gravity

Abstract: An argument is presented that if a theory of quantum gravity is physically discrete at the Planck scale and the theory recovers General Relativity as an approximation, then, at the current stage of our knowledge, causal sets must arise within the theory, even if they are not its basis.
We show in particular that an apparent alternative to causal sets, viz. a certain sort of discrete Lorentzian simplicial complex, cannot recover General Relativistic spacetimes in the appropriately unique way. For it cannot discriminate between Minkowski spacetime and a spacetime with a certain sort of gravitational wave burst.
This talk is based on joint work with Jeremy Butterfield, available at https://arxiv.org/abs/2106.01297

Week 8, 16th June — Oliver Pooley (Oxford, Philosophy)

Abstract: Roughly speaking, a theory is deterministic if it decrees that each of its possible instantaneous states has a unique future continuation. A theory is indeterministic if it permits distinct future continuations of at least some of its possible instantaneous states. Dynamical collapse modifications of non-relativistic quantum mechanics provide examples. From the standpoint of an instantaneous state countenanced by such a theory, what is the metaphysical status of its possible future continuations? In this talk, I will give an opinionated review of some recent work in philosophy that is germane to this question. I will then ask whether and how the machinery employed in such work might be adapted to relativistic physics.

HILARY TERM 2022

Week 1, 20 January — Tushar Menon (Cambridge, Philosophy)

Title: Dynamical substantivalism

Abstract: I set up and argue for a novel view in the metaphysics of spacetime: dynamical substantivalism. Dynamical substantivalism results from treating as orthogonal two easily conflated questions. First, the question of whether there are immaterial spacetime points. Second, the question of what grounds spatiotemporal geometric structure. Dynamical substantivalism sides with the substantivalist on the former question, and with the dynamical theorist on the latter. I argue that dynamical substantivalism should be treated as a serious contender in the metaphysics of space and time.

Week 2, 27 January — Claudio Calosi (Geneva)

Title: Wavefunction Monism

Abstract: Wavefunction Monism is a peculiar combination of monism and realism about the wavefunction. I first point out a tension within such a combination and suggest different ways to solve it and evaluate their costs. I then consider the consequences of such a view for mereology and location.

Week 3, 3 February — Valeriya Chasova (Salzburg / Strasbourg / Louvain-la-Neuve)

Title: Local symmetries have direct empirical status.

Abstract: Empirical statuses are ways for theoretical symmetries to be physically significant. Direct empirical status (DES) is one such way, consisting in a correspondence between theoretical symmetries and empirical symmetries in the world.

An example of empirical symmetry is Galileo’s ship – the fact that phenomena within a ship are observably invariant under the boost of the ship with respect to the shore. This empirical symmetry can be matched with theoretical boost symmetries, which therefore have DES with respect to it.

Boosts are global symmetries in the sense that they are applied in a uniform way across spacetime. It has long been thought that only global symmetries can be matched with empirical symmetries and so have DES (e.g. Brading & Brown 2004).

However, Greaves and Wallace (2014) have famously argued that local (i.e., non-uniform) symmetries, like the gauge symmetries of classical electromagnetism, also have DES. Their article has elicited many reactions, but not much support.

In my talk I will argue, contra the orthodox view, that local symmetries do have DES, though in a way opposite to Greaves and Wallace’s. I will also explain how to reconcile that claim with the view that gauge symmetries do not have DES.

Week 4, 10 February — Samuel Fletcher (Minnesota)

Title: Relativistic Spacetime: Dependence and Ontology

Abstract: Longstanding debates about the ontology of space and metaphysics of motion can receive a new inflection in general relativity as concerning alternative proposals for the dependence relations between the components of a relativistic spacetime. Some of these, such as versions of Brown’s “dynamical” approach, take features of matter to explain or determine features of spacetime structure such as the metric. Others, such as those of DiSalle and Knox, take inertial frames or the affine connection to determine such features. One can even interpret the notorious Hole Argument, usually phrased as concerning spacetime ontology, as bearing on what spacetime structures determine others and the identity conditions for relativistic spacetimes. I clarify and assess the challenges these views face, and describe how their important insights can be incorporated into a more standard “geometrical” approach.

Week 5, 17 Feb — James Read (Oxford)

Title: Curvature coupling, electromagnetic wave propagation, and the consistency of the geometrical optics limit

Abstract: We study the propagation of Maxwellian electromagnetic waves in curved spacetimes in terms of the appropriate geometrical optics limit, notions of signal speed, and the minimal coupling prescription from Maxwellian theory in flat spacetime. In the course of this, we counter a recent major claim by Asenjo and Hojman (2017) to the effect that the geometrical optics limit is partly ill-defined in Gödel spacetime; we thereby dissolve the present tension concerning established results on wave propagation and the optical limit. (Based upon joint work with Niels Linnemann; CQG 2021.)
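
For readers unfamiliar with the geometrical optics limit, the standard WKB set-up (textbook form; the talk concerns when this expansion is consistent) writes the potential as a rapidly oscillating wave,

    \[ A_a = \mathrm{Re}\!\left(a_a\, e^{iS/\varepsilon}\right), \]

and at leading order in ε Maxwell's equations force the wave vector k_a = ∇_a S to be null, g^{ab} k_a k_b = 0, with the rays following null geodesics.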

Week 8, 10 March — Katherine Brading (Duke)

Title: Du Châtelet, Euler, d’Alembert: constructive and principle approaches to philosophical mechanics

Abstract: This is a talk about the foundations of physics and classical mechanics in the 18th century, and more specifically the problem of bodies and the theory of constrained motions. At the beginning of the 18th century, physics and rational mechanics were distinct disciplines but they shared a common object of theorizing: bodies. I argue that the search for an adequate theory of bodies in motion was a foundational problem for both, one whose solution demanded a “philosophical mechanics”. The first half of the 18th century was dominated by constructive approaches to this problem, as exemplified by Du Châtelet in her Foundations of Physics. However, new work on constrained motion in the middle of the century changed the problem space: I explain why, and with what consequences. Within this context, I present d’Alembert’s Treatise on Dynamics as offering a principle approach to the problem. This serves two philosophical purposes: it opens up new questions about d’Alembert’s attempted axiomatization of mechanics, and it shows the significance of his Treatise for the problem of bodies.

MICHAELMAS TERM 2021

Week 1, 14th Oct — Sebastián Murgueitio-Ramirez (Oxford)

Title: On Symmetries, Models and Representation

Abstract: The principle (RT) that symmetry transformations always relate solutions that are representationally equivalent has been widely endorsed in the philosophy of symmetries. In recent years, however, it has been argued that this principle is false. The most well-known argument in this respect is developed by Gordon Belot in “Symmetry and Equivalence,” where he presents several cases of symmetry transformations of differential equations that map solutions onto other solutions that do not seem to represent physically equivalent states of the system. For example, there are symmetries of the harmonic oscillator that map a state of the spring characterized by certain positive amplitude A onto a different state where the amplitude is zero. As Belot puts it, “an approach to understanding physical theories that leaves us unable to see these distinctions is not something we can live with.” In this talk, I show that the cases Belot presents do not succeed once we are explicit about how the mathematical equations in our model are being used to represent the concrete physical systems in question. More precisely, I show that once we are explicit about (a) the concrete system that we want to model, (b) the kind of model we want to use, and (c) the types of measurements that we want to consider, a natural interpretation of the symmetries in question arises that is compatible with (RT). I end the talk by connecting my proposal to David Wallace’s recent work on the philosophy of symmetries.
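
One concrete way such an amplitude-destroying symmetry can arise (my illustration; Belot's own examples are more general): because the harmonic oscillator equation \ddot{x} = -\omega^2 x is linear and homogeneous, subtracting a fixed solution maps solutions to solutions, so

    \[ x(t) \;\mapsto\; x(t) - A\cos(\omega t) \]

is a symmetry of the equation, yet it sends the oscillation A cos(ωt), of amplitude A, to the rest solution x ≡ 0.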

Week 2, 21st Oct — Kasia Rejzner (York)

Title: Symmetries, anomalies and quantum Noether theorem

Abstract: In this talk I will present how symmetries are treated in perturbative algebraic quantum field theory and how this informs a recent non-perturbative formulation provided in [2108.13336]. One of the main results of that paper was a non-perturbative formulation of what we called the “anomalous quantum Noether theorem”. It can be seen as the natural counterpart of the classical Noether theorem, which holds in quantum theory even in the presence of anomalies.

Week 3, 28th Oct — [no seminar due to clash with Foundations]

Week 4, 4th Nov — David Wallace (Pittsburgh)

Title: The sky is blue, and other reasons physics needs the Everett interpretation

Abstract: The quantum measurement problem is often described as a standoff or a case of underdetermination – perhaps between Everett, Bohm and GRW, perhaps between Everett, Copenhagen and QBism. (It depends on your audience.) The background assumption is that these various alternatives are all compatible with the quantum formalism, and so any question as to which is preferred turns on second-order issues: distaste for action at a distance, worries about probability, competing intuitions about simplicity. I argue, by contrast, that very large swathes of modern physics, from the exotic to the mundane, rely on the Everett interpretation or something very much like it. Specifically, they rely on something like: unitarity, the eliminability of collapse except as an approximation, decoherence-type approaches as an explanation of the macroscopic, the ability to use quantum theory far outside the classic predict-evolve-measure paradigm, and the applicability of the theory to many systems at many levels, not just to a supposed ‘fundamental’ level – at which point we’re most of the way to Everett. (In particular, I will argue that the de Broglie-Bohm theory – especially but not only in its popular ‘primitive-ontology’ form – does not at present solve the measurement problem, even if we disregard the esoterica of high-energy particle physics). Other interpretative strategies might point the way to exciting physics in the future, but only Everett-style approaches can make full sense of the physics of today.

Week 5, 11th Nov — Milena Ivanova (Cambridge)

Title: What is a Beautiful Experiment?

Abstract: In this talk I explore the aesthetic dimensions of scientific experimentation, addressing specifically the question of how aesthetic features enter the construction, evaluation and reception of an experiment. I highlight the relationship between experiments and artistic acts in the early years of the Royal Society where experiments do not serve only epistemic aims, but also aim to generate feelings of awe and pleasure. I turn to analysing which aspects of experiments are appreciated aesthetically, identifying several contenders, from the ability of an experiment to uncover nature’s beauty, to encapsulating original designs and human creativity. Following this analysis, I focus on the notion of beauty: what makes an experiment beautiful? Several common qualities are explored, from the simplicity and economy of the experiment, to the significance of the experimental results.

Week 6, 18th Nov — Tomoko Kitagawa (Oxford/Science Museum)

Title: Moscow, Oxford, or Princeton: Emmy Noether’s Move from Göttingen (1933)

Abstract: Emmy Noether (1882–1935) received the notification of dismissal from her university post in April 1933 and had to look for a university outside of Germany where she could continue her mathematical research. By the end of the year, she moved to Bryn Mawr College in the United States, and started to give guest lectures at the Institute for Advanced Study, Princeton in February 1934. Her move was successful, but Noether initially considered going to Moscow and Oxford. She was enthusiastic about both options, and at one point she even accepted an offer from Somerville College, Oxford. This talk recovers the documents left in the Weston Library and Somerville College, the University of Oxford, and Bryn Mawr College, and recounts the efforts of Pavel Sergeyevich Alexandroff (1896–1982) and Helen Darbishire (1881–1961), who wished to help Noether and her academic career when she was forced to leave Göttingen.

Week 7, 25th Nov — Emily Qureshi-Hurst (Oxford)

Title: The Many Worries of Many Worlds: Exploring Some Possible Implications of Everettian Quantum Mechanics

Abstract: This paper sets out, in more detail than has hitherto been achieved, some philosophical and theological implications of Hugh Everett III’s Many Worlds Interpretation of Quantum Mechanics. In particular, this paper is concerned with personal identity, the problem of evil, and the Christian doctrine of salvation. Theological engagement with Quantum Mechanics has been dominated by the Copenhagen interpretation, leaving a significant gap in the literature with regards to other interpretations. As the Many Worlds Interpretation’s credibility grows, it is imperative that metaphysicians and theologians engage with its ideas and explore its implications. This paper does just that. It argues that this fascinating interpretation of Quantum Mechanics and its seemingly radical implications must be taken seriously, and that taking these seriously means facing at least three major worries pertaining to personal identity, the problem of evil, and salvation. The paper concludes by calling philosophers and theologians to address these worries, in order that these matters of theological importance remain both credible and coherent if Many Worlds turns out to be correct.

Week 8, 2nd Dec — Neil Dewar (Cambridge) – CANCELLED due to strike action.

Title: Symmetries, Quiddities, and Higher-Order Structure

Abstract: According to the “Symmetry Principle”, two models related by a symmetry transformation represent the same possibility. According to “Anti-Quidditism”, two worlds related by an exchange of properties represent the same possibility. It is natural to think that these two principles are related; in this talk, I discuss what that relationship is.

Week 9, 9th Dec – Katie Robertson (Birmingham) [BLOC seminar]

Title: On the status of thermodynamics: the village witch’s trial.

Abstract: Thermodynamics is an unusual physical theory; del Rio et al. describe thermodynamics as the ‘village witch’ of physics and say that “The other theories find her somewhat odd, somehow different in nature from the rest, yet everyone comes to her for advice, and no one dares to contradict her”. And the philosophical status of thermodynamics is disputed: is it reduced? Autonomous? Anthropocentric? Universal?

In this talk, I tackle two of these questions. First, I discuss the arguments that thermodynamics is not objective, or is anthropocentric. I argue, contra Myrvold’s Maxwellian view of thermodynamics, that thermodynamics is not anthropocentric. I then block another road to subjectivity by arguing that the introduction of probability need not be justified by our ignorance of the exact microstate, as Jaynes argued. Instead, in agreement with Chen and Wallace, I argue that the statistical mechanical probabilities can be understood as quantum mechanical probabilities. But my account differs from Chen’s quantum Mentaculus in several respects. Finally, I consider the autonomy of thermodynamics. How can other physical theories ‘come to thermodynamics for advice’ — as we see with black hole thermodynamics guiding the search for a theory of quantum gravity — if thermodynamics is autonomous of, and so floats free of, any underlying fundamental theory?

TRINITY TERM 2021

Week 1, 29 April: Caspar Jacobs (Oxford)

Title: Comparativist Theories and Conspiracy Theories: the no miracles argument against comparativism

Abstract: In order to avoid symmetry-related underdetermination, many philosophers have opted for comparativist theories. The fundamental quantities of such theories are comparative: for example, mass ratios or spatial distances. However, there are certain structural facts about the instantiation of those quantities which the comparativist cannot explain. Several examples of such ‘comparativist conspiracies’ are known; for example, distances famously obey the Triangle Inequality. I argue that these conspiracies are a general problem for a wide class of comparativist theories, including Leibnizian relationism and mass comparativism. Absolutism, on the other hand, does not face this issue, and so, all else being equal, we should prefer it.
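
For concreteness, the conspiracy cited above is the familiar constraint that, for any three points x, y and z,

\[ d(x,z) \leq d(x,y) + d(y,z), \]

a structural fact about the fundamental distance relations themselves, which the comparativist must apparently take as brute.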

Week 2, 6 May: Nora Boyd (Siena College, NY)

Title: Is Laboratory Astrophysics Astrophysics?

Abstract: The distinction between empirical and virtual data makes an epistemic difference for empiricists. To learn about the natural world, we must put ourselves in the right sort of contact with it. Speculating or simulating alone will not do—empirical constraints on our theorizing are necessary. But what is the ‘right sort’ of contact? Philosophers of science who engage with astrophysics often portray it as a distinctively observational science, since its targets are too distant to experiment upon. While physical manipulation of a target system is neither a necessary nor sufficient condition for good scientific investigation of that target, I have argued that to count as properly empirical, evidence needs to have been derived from a causal chain with one end anchored in the worldly target of interest. On this view, the fact that astrophysical targets cannot be poked and prodded does not in itself undermine our capacity to learn about them through empirical research. However, this view does raise interesting questions in the context of terrestrial laboratory astrophysics experiments. Is the evidence produced in such experiments properly astrophysical evidence? In this talk, I apply my view in a case study of National Ignition Facility research on the effect of high energy flux conditions on the structure of the Rayleigh-Taylor hydrodynamical instability in young supernova remnants. This case illuminates both the arguments needed to justify the epistemic leap from the laboratory to the stars, and also conditions under which those arguments break down.

Week 3, 13 May: Alyssa Ney (UC Davis)

Title: Three Arguments for Wave Function Realism

Abstract: Wave function realism is an interpretative framework for quantum theories which recommends taking the central ontology of these theories to consist of the quantum wave function, understood as a field on a high-dimensional space. I will present and evaluate three arguments for wave function realism and clarify the sort of ontological framework these arguments support.

Week 4, 20 May: Gordon Belot (Michigan, Ann Arbor)

Title: The Mach–Einstein Principle of 1917–1918

Abstract: In 1917 and 1918, Einstein maintained that in general relativity the spacetime metric is fully determined by the distribution of matter. I’ll worry about what he meant by this, whether it makes any sense, and whether any claim in the neighbourhood is true.

Week 5, 27 May: Mike Miller (Toronto)

Title: Cluster decomposition and entanglement

Abstract: Cluster decomposition is a locality constraint. Roughly, it holds that scattering experiments conducted at large spatial remove from one another should give independent results. At this level of specificity nearly everyone seems to be in agreement. There is significant variation, however, in how the principle is mathematically expressed and in how its adoption is motivated. Moreover, some articulations of the principle seem to stand in tension with the phenomena of quantum entanglement. In this talk, I will present some preliminary efforts to articulate what precisely we should take the content of the principle to be, and how to square it with the presence of entanglement in quantum field theory. This is work done in collaboration with John Dougherty (LMU) and Porter Williams (USC).

Week 6, 3 June: Jacob Barandes (Harvard)

Title: Why We Shouldn’t Believe in Hilbert Spaces Anymore, and the Case for Platonic Quantum Theory

Abstract: In this talk, I will argue that the historical focus on Hilbert spaces and their ingredients, from wave functions to Hamiltonians, has obscured the right way to think about interpreting quantum theory. Along these lines, I will explain why instead of attempting to reify Hilbert-space pictures, it may be better to regard them as analogous to gauge formulations of classical field theories. I will then argue that giving up on Hilbert-space pictures as fundamental ingredients of reality does not mean losing our grip on finding ontologies for quantum systems. To the contrary, I will present the case for a new interpretive program – the Platonic interpretation – that starts with ontological posits from the beginning and builds up the mathematical formalism from there. As I will show, Platonic quantum theory is a conservative approach that separates the ontic ingredients of quantum theory from its epistemic-statistical-nomic ingredients while also validating many of the ways that we traditionally talk about the theory, in addition to shedding new light on the measurement problem.

Week 7, 10 June: David Albert (Columbia)

Title: How to Teach Quantum Mechanics

Abstract: Presentations of Quantum Mechanics traditionally start out by pointing to various empirical phenomena – the emission spectrum of hydrogen, electron diffraction, the double-slit experiment, and what have you – which suggest that something is profoundly wrong with the classical picture of the world. I want to come at it from another angle here. I want to start out by pointing to a certain ambiguity in the classical conception of space – and to show how the business of merely dragging that ambiguity out into the open makes quantum mechanics look like something more natural, and more understandable, and more to be expected, than has usually been thought.

Week 8, 17 June: Tim Maudlin (NYU)

Title: The PBR Theorem, Quantum State Realism, and Statistical Independence

Abstract: The PBR Theorem has been characterized as the most important result in the foundations of quantum theory since Bell’s Theorem, and indeed the two theorems have several characteristics in common. Both are founded on straightforward empirical postulates that can be checked in the lab. Both have been ascribed significance in quite various ways by different people. And both rely, apart from the empirical postulate, on a statistical independence assumption that has been questioned.

I will give an elementary presentation of the simplest instance of the theorem, and then discuss what it proves and the standing of the statistical independence assumption.

HILARY TERM 2021

January 21st
Martin Lesourd, Black Holes Initiative, Harvard.

Penrose crosses the street and what’s happened since

Abstract: Penrose brought about a revolution in our understanding of general relativity starting in the mid-1960s with his emphasis on conformal and causal structure, and with his introduction of the concept of a trapped surface (which in turn led to his 1965 singularity theorem, the result that appears in his Nobel citation). Much has happened since, and in this talk I’ll describe some (a fraction!) of Penrose’s seminal contributions as well as some of the modern outlook and exciting recent developments. Although I will mention some of my own results in this area, I will also speak more broadly about the various important contributions being made in the field today.

January 28th

Eddy Chen, Philosophy, UCSD

The Wentaculus: Density Matrix Realism Meets the Arrow of Time

Abstract: In this talk, I explain my understanding of how to be a realist about the fundamental density matrix of the universe, and discuss the ramifications of such realism for understanding the arrow of time. Since the resultant theory is inspired by the “Mentaculus Vision” of David Albert and Barry Loewer, I call the overall package “The Wentaculus,” where “W” stands for the fundamental density matrix. Unlike the Mentaculus, the Wentaculus says that there is only one (nomologically) possible initial quantum state of the universe. In this way, the Wentaculus eliminates the need for a probability distribution over initial quantum states. I then apply the theory to some problems in the foundations of physics and the philosophy of science, including the nature of the quantum state, the status of the Past Hypothesis, and the issue of ‘nomic vagueness.’

Feb 4th

Guy Hetzroni (Philosophy, Open University, Israel):

Symmetry Arguments, Methodological Equivalence and Relational Quantities

Abstract: Despite the “century of symmetry” during which symmetry considerations became one of the strongest tools of theoretical physics, we have not yet achieved a satisfactory understanding of why they repeatedly turn out to be helpful in constructing, unifying and interpreting physical theories. In this talk I will suggest a unified approach to the question of the relationship between symmetries and interactions, and will demonstrate it in three different cases: Newtonian gravity, relativistic gravity and gauge symmetries. The approach is based on a suggested “methodological equivalence principle” according to which the non-invariance of the laws of a given theory under passive transformations prescribes the introduction of a new interaction. The most natural way to understand the success of this methodology in different cases, I shall argue, is by adopting an assumption à la Rovelli concerning the relational nature of dynamical quantities. I shall conclude by discussing the epistemological implications of this view: symmetry arguments are neither deductive nor strictly inductive, but are best understood as guesswork based on analogical reasoning, which I suggest understanding along the lines of Norton’s material analogy.

Feb 11th

David Wallace (Philosophy, Pittsburgh):

Title: Quantum gravity at low energies, or Yes, Virginia, there really is a cosmological constant problem

Abstract: “Quantum gravity” is usually defined as the program of reconciling classical general relativity with quantum mechanics, so that a ‘quantum theory of gravity’ is any theory that reduces to general relativity and the standard model of particle physics in appropriate limits. And it is often claimed that (a) we have no such theory at present, only a variety of research programs; (b) there is no, or virtually no, empirical data to inform the search for such a theory. I will argue that both claims are false. We have a perfectly viable quantum theory of gravity – the ordinary quantum-field-theoretic version of general relativity, defined and interpreted in modern ways – and there is actually quite a lot of evidence – comprising a large part of astrophysics and cosmology – in support of that theory. I will develop some implications of this attitude to quantum gravity: in particular, I will argue that what high-energy physicists mean by ‘quantum gravity’ is a high-energy completion of our existing quantum theory of gravity, and that the cosmological constant problem is best understood as a Kuhnian anomaly in that theory.

Feb 18th

John Norton (Philosophy, Pittsburgh):

Title: How to make possibility safe for empiricists.

Abstract: What is possible, according to the empiricist conception, is what our evidence positively allows; and what is necessary is what it compels. These notions, along with logical possibility, are the only defensible notions of possibility and necessity. In so far as nomic and metaphysical possibilities are defensible, they fall within empirical possibility. These empirical conceptions are incompatible with traditional possible world semantics. Empirically necessary propositions cannot be defined as those true in all possible worlds. There can be empirical possibilities without empirical necessities. The duality of possibility and necessity can be degenerate and can even be falsified.

Feb 25th

Nick Huggett (Philosophy, Illinois):

Title: Subjectivists about quantum probabilities should be realist about quantum states.

Abstract: I will argue that the arguments for realism about quantum states go through when the probabilities involved are taken to be subjective, if the conclusion is about the agent’s beliefs: an agent whose credences conform to quantum probabilities should believe that preparation procedures with which she associates distinct pure quantum states produce distinct states of reality. The conclusion can be avoided only by stipulation of limitations on the agent’s theorizing about the world, limitations that are not warranted by the empirical success of quantum mechanics or any other empirical considerations. Subjectivists about quantum probabilities should be realists about quantum states. This talk is based on http://philsci-archive.pitt.edu/16656/, which is a sequel to http://philsci-archive.pitt.edu/16655/.

MICHAELMAS TERM 2020

October 15th

Kian Salimkhani, University of Cologne
The Dynamical Approach to Spin-2 Gravity

Abstract: In this presentation I study how the spin-2 approach to gravity helps to strengthen Brown and Pooley’s dynamical approach to general relativity. In particular, I investigate the ontological status of the metric field and the status of the equivalence principle.

October 22nd

Jeremy Steeger, University of Washington
One World Is (Probably) Just as Good as Many

Abstract: One of our most sophisticated accounts of objective chance in quantum theories involves the Deutsch-Wallace theorem, which uses symmetries of the quantum state space to justify agents’ use of the Born rule when the quantum state is known. But Wallace (2003, 2012) argues that this theorem requires an Everettian approach to measurement. I find this argument to be unsound, and I demonstrate a counter-example by applying the Deutsch-Wallace theorem to Bohmian mechanics.
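
For reference, the rule at issue is the Born rule: for a system in a known pure state |ψ⟩, the probability of the measurement outcome associated with a projector P_a is

\[ \Pr(a) = \langle \psi | P_a | \psi \rangle . \]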

October 29th

Porter Williams, University of Southern California
Identifying Causal Directions in Quantum Theories via Entanglement

Abstract:

November 5th

Sarita Rosenstock, Australian National University

Title: A Category Theoretic Framework for Physical Representation

Abstract: It is increasingly popular for philosophers of physics to use category theory, the mathematical theory of structure, to adjudicate debates about the (in)equivalence of formal physical theories. In this talk, I discuss the theoretical foundations of this strategy. I introduce the concept of a “representation diagram” as a way to scaffold narrative accounts of how mathematical gadgets represent target systems, and demonstrate how their content can be effectively summarised by what I call a “structure category”. I argue that the narrative accounts contain the real content of an act of physical representation, and the category theoretic methodology serves only to make that content precise and conducive to further analysis. In particular, one can use tools from category theory to assess whether one physical formalism thus presented has more “properties”, “structure”, or “stuff” than another according to a given narrative about how they both purport to represent the same physical systems.

November 12th

Trevor Teitel, University of Toronto

Title: How to be a Spacetime Substantivalist

Abstract: The consensus among spacetime substantivalists is to respond to Leibniz’s classic shift arguments, and their contemporary incarnation in the form of the hole argument, by pruning the allegedly problematic surplus metaphysical possibilities. Some substantivalists do so by directly appealing to a modal doctrine akin to anti-haecceitism; others do so by appealing to an underlying hyperintensional doctrine that implies some such modal doctrine. My first aim in this talk is to undermine all extant forms of this consensus position. My second aim is to show what form substantivalism must take in order to uphold the consensus while addressing my challenge from part one. In so doing, I’ll discuss some related issues about the interaction of modality and vagueness. I’ll then argue against the resulting substantivalist metaphysic on independent grounds. I’ll conclude by discussing the way forward for substantivalists once we reject the consensus position.

November 19th

Nicolas Menicucci, RMIT University

Title: Sonic Relativity and the Sound Postulate

Abstract: Sound propagation within certain non-relativistic condensed matter models obeys a relativistic wave equation despite such systems admitting entirely non-relativistic descriptions. A natural question that arises upon consideration of this is: do devices exist that will experience the relativity in these systems? We describe a thought experiment in which ‘acoustic observers’ possess devices called sound clocks that can be connected to form chains. Careful investigation shows that appropriately constructed chains of stationary and moving sound clocks are perceived by observers on the other chain as undergoing the relativistic phenomena of length contraction and time dilation by the Lorentz factor, γ, with the speed of sound playing the role of c. Sound clocks within moving chains actually tick less frequently than stationary ones and must be separated by a shorter distance than when stationary to satisfy simultaneity conditions. Stationary sound clocks appear to be length contracted and time dilated to moving observers because of the moving observers’ misunderstanding of their own state of motion with respect to the laboratory. Observers restricted to using sound clocks describe a universe kinematically consistent with the theory of special relativity, despite their universe having a preferred frame: that of the laboratory. Such devices show promise for further probing analogue relativity models, for example in investigating phenomena that require careful consideration of the proper time elapsed for observers.
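
The Lorentz factor invoked here takes its usual form, with the speed of sound playing the role of c:

\[ \gamma = 1/\sqrt{1 - v^2/c^2} . \]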

Nov 19th (BLOC Seminar)

Emily Adlam

Title: Spooky Action at a Temporal Distance

Abstract: Since the discovery of Bell’s theorem, the physics community has come to take seriously the possibility that the universe might contain physical processes which are spatially nonlocal, but there has been no such revolution with regard to the possibility of temporally nonlocal processes. In this talk, I argue that the assumption of temporal locality is actively limiting progress in the field of quantum foundations. I investigate the origins of the assumption, arguing that it has arisen for historical and pragmatic reasons rather than good scientific ones, then explain why temporal locality is in tension with relativity and review some recent results which cast doubt on its validity.

November 26th

Martin Lipman, Leiden University

Title: Realism About Relative Facts and Special Relativity

Abstract: In this talk I will set out a non-standard metaphysical framework, according to which there can be genuine facts regarding matters that are normally said to obtain only relative to something. The framework is in the spirit of Kit Fine’s fragmentalism, though different in formulation (also from the version of fragmentalism that I defended in earlier work). After sketching the metaphysical principles and the bit of logic needed to make sense of the view, I will discuss its application to the special theory of relativity. According to the proposed interpretation of the special theory of relativity, there are genuine facts regarding absolute simultaneity, temporal duration and length. If time permits, I will discuss one or two objections.

December 3rd

Baptiste Le Bihan, University of Geneva

Title: What Does the World Look Like According to Superdeterminism?

Abstract:

TRINITY TERM 2020

Week 1 (30 April):

Sir Roger Penrose (Maths, Oxford):

Title: CCC and Hawking Points in the Microwave Sky.

Abstract: The theory of conformal cyclic cosmology (CCC), which I originally put forward in 2005, proposes that our Big Bang was the conformal continuation of the exponentially expanding remote future of a previous cosmic “aeon”. Conformal space-time geometry is the geometry defined by the light cones, but in which the metric loses its fundamental role. It is the geometry respected by massless particles and fields, most particularly Maxwell’s electromagnetism. The huge, cold, and rarefied future of the previous aeon thereby identifies with our tiny, hot, and dense Big Bang. Moreover, with CCC, the cycle of aeons continues indefinitely.

CCC has taken a long time to be taken seriously by the cosmological community, despite its being the only scheme I know of that properly explains the source of the 2nd law of thermodynamics in the form that we find it, in addition to predicting certain observed features in the cosmic microwave background (CMB). More specifically, recent analysis of the CMB, in both the WMAP and Planck satellites’ data, has revealed numerous previously unobserved, remarkably energetic, anomalous spots in the CMB, such spots being implications of CCC. They would be the effects of the ultimate Hawking evaporation of supermassive black holes in the previous aeon, whose entire mass-energy would come through the crossover into our aeon at points referred to as “Hawking points”, which, following the first 380,000 years of our own aeon’s expansion, would produce spots like those actually observed in our CMB sky.

Week 2 (7 May):

David Wallace (Philosophy, Pittsburgh)

Title: Isolated systems and their symmetries

Abstract: I defend the view that the metaphysical and epistemic implications of a theory’s symmetries can be understood via a formal conception of those symmetries, understood as transformations which preserve the form of the equations of motion (contra recent work by, e.g., Belot, Dasgupta, and Moller-Nielsen). A key concept here is the extendibility of a symmetry: whether or not a symmetry of a system remains a symmetry when that system is coupled to other systems (most notably measurement devices). This in turn requires us to interpret (most) physical theories as describing idealised isolated subsystems of a larger universe, not (as is common in philosophy of physics) under the fiction that they describe an entire Universe. I provide a detailed framework for doing so and for extracting consequences for symmetry extendibility: the core concept is subsystem-recursivity, whereby interpretative conclusions about a sector of a theory can be deduced from considering subsystems of other models of the same theory.

Background reading (not assumed):
“Observability, redundancy and modality for dynamical symmetry transformations” http://philsci-archive.pitt.edu/16622/
“Isolated Systems and their Symmetries, Part I: General Framework and Particle-Mechanics” http://philsci-archive.pitt.edu/16623/
“Isolated systems and their symmetries, part II: local and global symmetries of field theories” http://philsci-archive.pitt.edu/16624/

Week 3 (14 May):

Erik Curiel (MCMP):

Title: On the Cogency of Quantum Field Theory on Curved Spacetime and Semi-Classical Gravity

Abstract: Quantum field theory on curved spacetime (QFT-CST), and semi-classical gravity (SCG) more generally, is the framework within which our current theories about quantum effects around black holes are formulated. The results of their study, including most famously the Hawking effect and its infamous spawn the information-loss paradox, have revealed several surprises that threaten to overturn the views of space, time, and matter that general relativity and quantum field theory each on its own suggests. In particular, they appear to point to a deep and hitherto unsuspected connection among our three most fundamental theories: general relativity, quantum field theory and thermodynamics. As such, work in SCG today constitutes some of the most important, central, and fruitful fields of study in theoretical physics, bringing together workers from a variety of fields such as cosmology, general relativity, quantum field theory, particle physics, fluid dynamics, condensed matter, and quantum gravity, and providing bridges that now closely connect disciplines once seen as largely independent. The framework, however, has serious mathematical, physical and conceptual problems, which I survey. One might think that treating SCG as merely an effective theory would ameliorate these problems. I argue that the issue is not straightforward. Thus, SCG presents us with problems that are foundational in a serious sense: they must be addressed in order to make sense of contemporary theoretical physics.

Week 4 (21 May):

James Read (Philosophy, Oxford):

Title: Newtonian Equivalence Principles

Abstract: I present a unified framework for understanding equivalence principles in spacetime theories, applicable to both relativistic and Newtonian contexts. This builds on prior work by Knox (2014) and Lehmkuhl (forthcoming).

Week 5 (28 May):

David Baker (Philosophy, Michigan)

Title: What Are Symmetries?

Abstract: I advance a stipulational account of symmetries, according to which symmetries are part of the content of theories. For a theory to have a certain symmetry is for the theory to stipulate that models related by the symmetry represent the same possibility. I show that the stipulational account compares favourably with alternatives, including Dasgupta’s epistemic account of symmetry, Moller-Nielsen’s motivational account, and so-called formal and ontic accounts. In particular, the stipulational account avoids the problems Belot and Dasgupta have raised against formal and ontic accounts of symmetry while retaining many of the advantages of these otherwise-attractive frameworks. It also fits naturally into an appealing account of how we ought to interpret effective theories as opposed to fundamental ones.

Week 6 (4 June):

Marij van Strien (Wuppertal)

Title: Bohm’s theory of quantum mechanics and the notion of classicality

Abstract: When David Bohm published his alternative theory of quantum mechanics in 1952, it was not received well; a recurring criticism was that it formed a reactionary attempt to return to classical physics. In response, Bohm emphasized the progressiveness of his approach, and even turned the accusation of classicality around by arguing that he wanted to move beyond classical elements still inherent in orthodox quantum mechanics. In later years, he moved increasingly in speculative and mystical directions.
In this talk I will aim to explain this discrepancy between the ways in which Bohm’s work on quantum mechanics has been received and the way in which Bohm himself presented it. I reject the idea that Bohm’s early work can be described as mechanist, determinist, and realist, in contrast to his later writings, and argue that there is in fact a strong continuity between his work on quantum mechanics from the early 1950s and his later, more speculative writings. In particular, I argue that Bohm was never strongly committed to determinism and was a realist in some ways but not in others. A closer look at Bohm’s philosophical commitments highlights the ways in which his theory of quantum mechanics is non-classical and does not offer a way to avoid all ‘quantum weirdness’.

HILARY TERM 2020

Week 1 (23 January):

Anders Sandberg (Oxford)

Title: Physical eschatology: how much can we say about the far future of the universe, and how much does it matter?

Abstract: Historically, science has been reluctant to make long-term predictions about the future. One interesting exception is astronomy, where the combination of relatively low complexity, low-noise environments, large timespans, and strong theories has allowed the field of physical eschatology to emerge. This talk will outline the development of physical eschatology, discuss the current main models, and try to analyse the methodological challenges of such extreme long-range predictions, especially in light of the increasing interest from longtermist ethics in some of these results as potentially relevant for deciding near-term strategies.

Week 2 (30 January). Mauro Dorato (University of Rome 3): Overcoming dynamical explanations with structural explanations

Abstract: By briefly reviewing three well-known scientific revolutions in spacetime physics (the discovery of inertia, of special relativity and of general relativity), I claim that problems that were supposed to be crying out for a dynamical explanation in the old paradigm ended up receiving a structural explanation in the new one. This claim is meant to give more substance to Kuhn’s claim that revolutions are accompanied by a shift in what needs to be explained, while suggesting at the same time the existence of a pattern that is common to all three of the above case-studies and that involves the overcoming of central assumptions of the manifest image. In the last part I discuss the question of whether entanglement, too, can be given a purely structural, non-dynamical explanation.

Week 3 (6–7 February) The first Oxford Philosophy of Physics Graduate Conference.
https://philphysgradconference.com

Week 4 (13 February). John Dougherty (Munich): Why ghosts are real and “surplus structure” isn’t

Abstract: Gauge theories are often thought to pose an interpretive puzzle. On the one hand, it seems that some of the mathematical structure of a gauge theory is surplus—that is, it does not reflect any structure in the world. Interpreting this structure as surplus is meant to be especially important to the process of quantization. On the other hand, it has proven difficult to eliminate this putatively surplus structure without losing important features of the theory like empirical adequacy. In this talk I argue that this puzzle is ill-posed, because there is no notion of “surplus structure” on which gauge theories have it. The standard conception of surplus structure presumes an account of mathematical structure that excludes the mathematics of gauge theory, so gauge theories neither have nor lack surplus structure on this conception. And on analyses of “surplus structure” that do apply to gauge theories it’s easy to see that they don’t have it.

Week 5 (20 February). J. Brian Pitts (Cambridge): Constraints, Gauge, Change and Observables in Hamiltonian General Relativity

Abstract: Since the mid-1950s it has seemed that change is somehow missing in Hamiltonian General Relativity. How did this problem arise? How compelling are the axioms on which it rests? What of the 1980s+ reforming literature that has aimed to recover the mathematical Hamiltonian-Lagrangian equivalence that was given up in the mid-1950s, a reforming literature that is visible in journals but scarce in books? What should one mean by Hamiltonian gauge transformations and observables, and how can one decide? The absence of change in observables can be traced to (1) a pragmatic conjecture (initially by Peter Bergmann and his student Schiller and later by Dirac) that gauge transformations come not merely from a tuned sum of “first-class constraints” (the Rosenfeld-Anderson-Bergmann gauge generator), but also from each first-class constraint separately, and (2) an assumption that the internal gauge symmetry of electromagnetism is an adequate precedent for the external/space-time coordinate symmetry of General Relativity. Requiring that gauge transformations preserve Hamilton’s equations or that equivalent theory formulations yield equivalent observables shows that change is right where it should be in Hamiltonian General Relativity including observables, namely, essential time dependence (e.g., lack of a time-like Killing vector field) in coordinate-covariant quantities. A genuine problem of missing change might exist at the quantum level, however.

Week 6 (27 February). Simon Saunders (Oxford): Particle trajectories, indistinguishable particles, and the discovery of the photon.

Abstract: It is widely thought that particles with trajectories cannot be indistinguishable, in the quantum mechanical sense – wrongly. In this talk I shall explain how this doctrine first arose in Dirac’s 1926 treatment, and why it has proved so enduring. As a result, historians and philosophers of physics have neglected the obvious precursor of the indistinguishability concept in Gibbs’ concept of generic phase, applicable to classical particles. It was neglected by the discoverers of quantum mechanics as well, not least by Einstein, whose 1905 argument for the light quantum was based on the concept of the ‘mutual independence’ of non-interacting particles. Yet indistinguishable particles in Gibbs’ sense are not mutually independent, in Einstein’s: the fluctuation he considered, for thermal radiation in the Wien regime, does not discriminate between them.
Trajectories are not just compatible with the indistinguishability concept: in an important sense, they are needed for it. This makes clearer the difference between diffeomorphism invariance and permutation invariance, and highlights an important thread in the history of quantum physics, and specifically Bose’s contribution: for Bose showed how to derive the Planck black-body distribution precisely by endowing the light quantum with a state space of its own – in effect, allowing that light quanta may have trajectories. This was the breakthrough to the concept of the photon, as opposed to the light quantum concept.
A final ingredient, needed to redress this history, is the recognition that the kind of entanglement introduced by symmetrisation is essentially trivial – and cannot, for example, lead to the violation of any Bell inequality, an observation recently made by Adam Caulton. I conclude that the failure of statistical independence, as it arises in Bose-Einstein statistics away from the Wien regime, is unrelated to quantum non-locality, and to entanglement.

Week 7 (5 March) – no seminar

Week 8 (12 March). Emily Adlam, BLOC Seminar at King’s College, London: TBC.

MICHAELMAS TERM 2019

Week 1 (17th October): Patricia Palacios (Philosophy, University of Salzburg).

Title: Re-defining equilibrium for long-range interacting systems

Abstract: Long-range interacting systems are systems in which the interaction potential decays slowly for large inter-particle distance. Typical examples of long-range interactions are the gravitational and Coulomb forces. The philosophical interest in studying these kinds of systems has to do with the fact that they exhibit properties that escape traditional definitions of equilibrium based on ensemble averages. Some of those properties are ensemble inequivalence, negative specific heat, negative susceptibility and ergodicity breaking. Focusing on long-range interacting systems thus has the potential to lead one to an entirely different conception of equilibrium or, at least, to a revision of traditional definitions of it. But how should we define equilibrium for long-range interacting systems?

In this talk, I address this question and argue that the problem of defining equilibrium in terms of ensemble averages is due to the lack of a time-scale in the statistical mechanical treatment. In consequence, I argue that adding a specific time-scale to the statistical treatment can give us a satisfactory definition of equilibrium in terms of metastable states. I point out that such a time-scale depends on the number of particles in the system, as happens when phase transitions occur, including in the more usual context of short-range interacting systems such as those of condensed matter. I then discuss the analogies and dissimilarities between the case of long-range systems and that of phase transitions, and argue that these analogies, which should be interpreted as liberal formal analogies, can have an important heuristic role in the development of statistical mechanics for long-range interacting systems.

Week 2 (24th October): Francesca Chadha-Day (Physics, University of Cambridge).

Title: Dark Matter: Understanding the gravity of the situation

Abstract: The existence of Dark Matter – matter that is unaccounted for by the Standard Model of particle physics – is supported by a staggering quantity and variety of astrophysical observations. A plethora of Dark Matter candidates have been proposed. Dark matter may be cold, warm or fuzzy. It may be composed of right-handed neutrinos, supersymmetric particles, axions or primordial black holes. I will give an overview of Dark Matter candidates and how we can understand the phenomenological differences between them in the framework of quantum theory. I will discuss the difficulties faced by modified gravity theories in explaining our observations, and their relation to Dark Matter.

Week 3 (31st October): NO SEMINAR

Week 4 (7th November): Jamee Elder (Philosophy, University of Notre Dame/University of Bonn).

Title: The epistemology of LIGO

Abstract: In this talk, I examine the methodology and epistemology of LIGO, with a focus on the role of models and simulations in the experimental process. This includes post-Newtonian approximations, models generated through the effective one-body formalism, and numerical relativity simulations, as well as hybrid models that incorporate aspects of all three approaches. I then present an apparent puzzle concerning the validation of these models: how can we successfully validate these models and simulations through our observations of black holes, given that our observations rely on our having valid models of the systems being observed? I argue that there is a problematic circularity here in how we make inferences about the properties of compact binaries. The problem is particularly acute when we consider these experiments as empirical tests of general relativity. I then consider strategies for responding to this challenge.

Week 5 (14th November): Adam Caulton (Philosophy, University of Oxford).

Title: Is a particle an irreducible representation of the Poincaré group?

Abstract: Ever since investigations into the group representation theory of spacetime symmetries, chiefly due to Wigner and Bargmann in the 1930s and ‘40s, it has become something of a mantra in particle physics that a particle is an irreducible representation of the Poincaré group (the symmetry group of Minkowski spacetime). Call this ‘Wigner’s identification’. One may ask, in a philosophical spirit, whether Wigner’s identification could serve as something like a real definition (as opposed to a nominal definition) of ‘particle’—at least for the purposes of relativistic quantum field theory. In this talk, I aim to show that, while Wigner’s identification is materially adequate for many purposes—principally scattering theory—it does not provide a serviceable definition. The main problem, or so I shall argue, is that the regime of legitimate particle talk surpasses the constraints put on it by Wigner’s identification. I aim further to show that, at least in the case of particles with mass, a promising rival definition is available. This promising rival emerges from investigations due to Foldy in the 1950s, which I will outline. The broad upshot is that the definition of ‘particle’ may well be the same in both the relativistic and non-relativistic contexts, and draws upon not the Poincaré group (or any other spacetime symmetry group) but rather the familiar Heisenberg relations.

Week 6 (21st November): Radin Dardashti (Philosophy, University of Wuppertal).

Title: Understanding Problems in Physics

Abstract: In current fundamental physics, empirical data is scarce, and it may take several decades before the hypothesised solution to a scientific problem can be tested. So scientists need to be careful in assessing what constitutes a scientific problem in the first place, for there is a danger of providing a solution to a non-existent problem. Relying on and extending previous work by Larry Laudan and Thomas Nickles, I apply the philosophical discussion of scientific problems to modern particle physics.

Week 7 (28th November): NO SEMINAR

Week 8 (5th December): Karim Thébault (Philosophy, University of Bristol).

Title: Time and Background Independence

Abstract: We showcase a new framework for the analysis of the symmetries and spatiotemporal structure of a physical theory via its application to the problem of differentiating intuitively background dependent theories from intuitively background independent theories. This problem has been rendered a particularly pressing one by the magisterial analysis of Pooley (2017), who convincingly demonstrates, via the comparison between diffeomorphism invariant special relativity and general relativity, that diffeomorphism invariance cannot be equated with background independence.

Our framework is built upon the analysis of the transformation behaviour of nomic and temporal structures under kinematical transformations (defined as endomorphisms on the space of kinematically possible models). We define the sub-regions within the space of kinematical transformations corresponding to where a structure is absolute (does not vary) and relative (does vary), and then classify temporal structures via the intersection of their absolute and relative regions with those of nomic structures. Of particular relevance is the case where there is non-trivial overlap between the relative region of some temporal structure and both the absolute and relative regions of the nomic structure. We classify such structure as dynamical surplus structure.

Finally, based upon the analysis of temporal foliation structure, we provide a new account of background independence. On our account background independence manifestly fails for diffeomorphism invariant special relativity (since the temporal foliation structure is non-dynamical surplus structure) and obtains for general relativity (since the temporal foliation structure is dynamical surplus structure). This formalises the intuitive idea of the contingent independence of the dynamical models of general relativity from a spatiotemporal background.

Week 9 (12th December): Barry Loewer (Philosophy, Rutgers) Title: The package deal account of fundamental laws

Abstract: In my talk I will describe an account of the metaphysics of fundamental laws I call “the Package Deal Account” (PDA), which is a descendant of Lewis’ BSA but differs in a number of ways. First, it does not require the truth of a thesis Lewis calls “Humean Supervenience” (HS), and so can accommodate relations and structures found in contemporary physics that conflict with HS. Second, it is not committed to Humeanism, since it is compatible with there being fundamental necessary connections in nature. Third, it greatly develops the criteria for what counts in favor of a candidate system to determine laws. Fourth, and most significantly, unlike the BSA the PDA does not presuppose metaphysically primitive elite properties/quantities that Lewis calls “perfectly natural properties/quantities”.

TRINITY TERM 2019

Week 1 (Thursday May 2) Olivier Darrigol (Paris): Ludwig Boltzmann: Atoms, mechanics, and probability

Statistical mechanics owes much more to Ludwig Boltzmann than is usually believed. In his attempts to derive thermodynamic and transport phenomena from deeper microphysical assumptions, he explored at least five different approaches: one based on mechanical analogies (with periodic mechanical systems or with statistical ensembles), one based on Maxwell’s collision formula, one based on the ergodic hypothesis, one based on combinatorial probabilities, and one based on the existence of thermodynamic equilibrium. I will sketch these various approaches and show how Boltzmann judged them and interconnected them. I will also argue that in general Boltzmann was more concerned with constructive efficiency than with precise conceptual foundations. Basic questions about the reality of atoms or the nature of probabilities played only a secondary role in his theoretical enterprise.

Week 2 (Thursday May 9) No seminar

Week 3 (Thursday May 16) Jeremy Butterfield (Cambridge): On realism and functionalism about space and time

(Joint work with Henrique Gomes.) In this talk I will set the recent literature on spacetime functionalism in context, by discussing two traditions that form its background. First: functionalism in general, as a species of inter-theoretic reduction. Second: relationism about space and time.

Week 4 (Thursday May 23): No seminar

Week 5 (Thursday May 30) Henrique Gomes (Perimeter and Cambridge): Gauge, boundaries, and the connection form

Forces such as electromagnetism and gravity reach across the Universe; they are the long-range forces in current physics. And yet, in many applications—theoretical and otherwise—we only have access to finite domains of the world. For instance, in computations of entanglement entropy, e.g. for black holes or cosmic horizons, we raise boundaries to separate the known from the unknown. In this talk, I will argue that we do not understand gauge theory as well as we think we do when boundaries are present.

For example: it is agreed by all that we should aim to construct variables that have a one-to-one relationship to the theory’s physical content within bounded regions. But puzzles arise if we try to combine definitions of strictly physical variables in different parts of the world. This is most clearly gleaned by first employing the simplest tool for unique physical representation—gauge fixings—and then proceeding to stumble on its shortcomings. Whereas fixing the gauge can often shave off unwanted redundancies, the coupling of different bounded regions requires the use of gauge-variant elements. Therefore, the coupling of regional observables is inimical to gauge-fixing, as usually understood. This resistance to gauge-fixing has led some to declare the coupling of subsystems to be the raison d’être of gauge [Rov14].

Here I will explicate the problems mentioned above and illustrate a possible resolution. The resolution was introduced in a recent series of papers [Gomes & Riello JHEP ’17, Gomes & Riello PRD ’18, Gomes, Hopfmüller & Riello NPB ’19]. It requires the notion of a connection-form in the field-space of gauge theories. Using this tool, a modified version of symplectic geometry—here called ‘horizontal’—is possible. Independently of boundary conditions, this formalism bestows on each region a physically salient, relational notion of charge: the horizontal Noether charge. It is relational in the sense that it only uses the different fields already at play and relationships between them; no new “edge-mode” degrees of freedom are required.

The guiding requirement for the construction of the relational connection-form is simply a harmonious melding of regional and global observables. I show that the ensuing notions of regional relationalism are different from other attempts at resolving the problem posed by gauge symmetries for bounded regions. The distinguishing criterion is what I consider to be the ‘acid test’ of local gauge theories in bounded regions: does the theory license only those regional charges which depend solely on the original field content? In a satisfactory theory, the answer should be “yes”. Lastly, I will introduce explicit examples of relational connection-forms, and show that the ensuing horizontal symplectic geometry passes this ‘acid test’.

Week 7 (Thursday June 13) Harvey Brown (Oxford): Aspects of probabilistic reasoning in physics

Week 8 (Thursday June 20) Martin Lesourd (Oxford): The epistemic constraints that observers face in General Relativistic spacetimes

What can observers know about their future and their own spacetime on the basis of their past lightcones? Important contributions to this question were made by Earman, Geroch, Glymour, Malament and, more recently, Manchak. Building on the work of Malament, Manchak (2009/10/11/14) has been able to prove what seem to be general and far-reaching results to the effect that observers in general relativistic spacetimes face severe epistemic constraints. Here, after reviewing these results, I shall present a number of new results which grant observers a more positive epistemic status. So in short: if Malament and Manchak’s results were cause for a form of epistemic pessimism, then the ones presented here strive for a more optimistic outlook.

HILARY TERM 2019

Week 1 (17 Jan) Laurenz Hudetz (LSE): The conceptual-schemas account of interpretation

This talk addresses the question of what it is to interpret a formalism. It aims to provide a general framework for talking about interpretations (of various kinds) in a rigorous way. First, I clarify what I mean by a formalism. Second, I give an account of what it is to establish a link between a formalism and data. For this purpose, I draw on the theory of relational databases to explicate what data schemas and collections of data are. I introduce the notion of an interpretive link using tools from mathematical logic. Third, I address the question of how a formalism can be interpreted in a way that goes beyond a connection to data. The basic idea is that one extends a data schema to an ontological conceptual schema and links the formalism to this extended schema. I illustrate this account of interpretation by means of simple examples and highlight how it can be fruitfully applied to address conceptual problems in philosophy of science.

Week 2 (24 Jan)  Patrick Dürr (Oxford): Philosophy of the Dead: Nordström Gravity

The talk revisits Nordström Gravity (NoG) – arguably the most plausible relativistic scalar theory of gravity before the advent of GR. In Nordström’s original formulation (1913), NoG1, it appears to describe a scalar gravitational field on Minkowski spacetime. In 1914, Fokker and Einstein showed that NoG is mathematically equivalent to a purely metric theory, NoG2 – strikingly similar to the Einstein Equations. Like GR, NoG2 is plausibly construed as a geometrised theory of gravity: In NoG2, gravitational effects are reduced to manifestations of non-Minkowskian spacetime structure. Both variants of NoG, and their claimed physical equivalence, give rise to three conundrums that we will explore.

(P1) The (Weak) Equivalence Principle appears to be violated in NoG1 – but holds in NoG2. (P2) In NoG1, it appears unproblematic to ascribe the gravitational scalar an energy-momentum tensor. In trying to define gravitational energy in NoG2, by contrast, one faces problems akin to those in GR. (P3) In NoG1, total (i.e. gravitational plus non-gravitational) energy-momentum appears to be conserved, whereas in NoG2, no obvious candidate for gravitational energy is available, and furthermore it seems unclear whether non-gravitational energy is conserved.

Insofar as NoG1 and NoG2 are equivalent formulations of the same theory, (P1)–(P3) appear paradoxical. For a resolution, I will proffer a metaphysically perspicuous articulation of NoG’s ontology that explicates the equivalence, and propose an instructive reformulation.

Week 3 (31 Jan)  Yang-Hui He (City, London): Exceptional and Sporadic

I give an overview of a host of seemingly unrelated classification problems in mathematics which turn out to be intimately connected through some deep Correspondences.

Some of these relations were uncovered by focusing on so-called exceptional structures which abound: in geometry, there are the Platonic solids; in algebra, there are the exceptional Lie algebras; in group theory, there are the sporadic groups, to name but a few. A champion for such Correspondences is Prof. John McKay.

I also present how these correspondences have subsequently been harnessed by theoretical physicists. My goal is to take a casual promenade in this land of ‘exceptionology’, reviewing some classic results and presenting some new ones based on joint work with Prof. McKay.

Week 4 (7 Feb)  Casey McCoy (Stockholm): Why is h-bar a universal constant?

Some constants are relevant for all physical phenomena: the speed of light pertains to the causal structure of spacetime and hence all physical processes. Others are relevant only to particular interactions, for example the fine structure constant. Why is Planck’s constant one of the former? I motivate the possibility that there could have been multiple, interaction-specific ‘Planck constants’. Although there are indeed good reasons to eschew this possibility, it suggests a further question: what is the actual conceptual significance of Planck’s constant in quantum physics? I argue that it lies principally in relating classical and quantum physics, and I draw out two main perspectives on this relation, represented by the views of Landsman and Lévy-Leblond.
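
One standard illustration of this universality (the abstract itself does not single out a formula) is the canonical commutation relation, in which the same constant appears for every degree of freedom, whatever interactions it participates in:

\[ [\hat{x}, \hat{p}] = i\hbar . \]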

Week 5 (14 Feb)  Joanna Luc (Cambridge): Generalised manifolds as basic objects of General Relativity

In the definition of a differential manifold in General Relativity (GR) the Hausdorff condition is typically assumed. In my talk I will investigate the consequences of dropping this condition. I will argue that there are good reasons to regard non-Hausdorff manifolds as basic objects of GR, together with Hausdorff manifolds. However, it is not clear whether they can be regarded as physically reasonable basic objects of GR. I will argue that some of the objections to their physical reasonability can be refuted if we understand them as representing a bundle of alternative spacetimes. This interpretation is supported by a theorem stating that every non-Hausdorff manifold can be seen as a result of gluing together some Hausdorff manifolds.

Week 6 (21 Feb)  Katie Robertson (Birmingham): Reducing the second law of thermodynamics: the demons and difficulties.

In this talk I consider how to reduce the second law of thermodynamics. I first discuss what I mean by ‘reduction’, and emphasise how functionalism can be helpful in securing reductions. Then I articulate the second law, and discuss what the ramifications of Maxwell’s demon are for the status of the second law. Should we take Maxwell’s means-relative approach? I argue no: the second law is not a relic of our inability to manipulate individual molecules in the manner of the nimble-fingered demon. When articulating the second law, I take care to distinguish it from the minus first law (Brown and Uffink 2001); the latter concerns the spontaneous approach to equilibrium, whereas the former concerns the thermodynamic entropy change between equilibrium states, especially in quasi-static processes. Distinguishing these laws alters the reductive project (Luczak 2018): locating what Callender (1999) calls the Holy Grail – a non-decreasing statistical mechanical quantity to call entropy – is neither necessary nor sufficient. Instead, we must find a quantity that plays the right role, viz. to be constant in adiabatic quasi-static processes and increasing in non-quasi-static processes, and I argue that the Gibbs entropy plays this role.
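
For reference, the Gibbs entropy of a probability distribution ρ over phase space is the standard functional

\[ S_G[\rho] = -k_B \int \rho \ln \rho \, d\Gamma . \]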

Week 7 (28 Feb)  Alex Franklin (KCL): On the Effectiveness of Effective Field Theories

Effective Quantum Field Theories (EFTs) are effective insofar as they apply within a prescribed range of length-scales, but within that range they predict and describe with extremely high accuracy and precision. I will argue that the effectiveness of EFTs is best explained in terms of the scaling behaviour of the parameters. The explanation relies on distinguishing autonomy with respect to changes in microstates (autonomy_ms), from autonomy with respect to changes in microlaws (autonomy_ml), and relating these, respectively, to renormalisability and naturalness. It is claimed, pace Williams (2016), that the effectiveness of EFTs is a consequence of each theory’s renormalisability rather than its naturalness. This serves to undermine an important argument in favour of the view that only natural theories are kosher. It has been claimed in a number of recent papers that low-energy EFTs are emergent from their high-energy counterparts, see e.g. Bain (2013) and Butterfield (2014). Building on the foregoing analysis, I will argue that the emergence of EFTs may be understood in terms of the framework developed in Franklin and Knox (2018).

Week 8 (7 Mar)  Karen Crowther (Geneva): As Below, So Before: Synchronic and Diachronic Conceptions of Emergence in Quantum Gravity

The emergence of spacetime from quantum gravity appears to be a striking case-study of emergent phenomena in physics (albeit one that is speculative at present). There are, in fact, two different cases of emergent spacetime in quantum gravity: a “synchronic” conception, applying between different levels of description, and a “diachronic” conception, from the universe “before” and after the “Big Bang” in quantum cosmology. The purpose of this paper is to explore these two different senses of spacetime emergence; and to see whether, and how, they can be understood in the context of specific extant accounts of emergence in physics.

MICHAELMAS TERM 2018

Week 1 (11 Oct): David Wallace (USC): Spontaneous symmetry breaking in finite quantum systems: a decoherent-histories approach.

Abstract: Spontaneous symmetry breaking (SSB) in quantum systems, such as ferromagnets, is normally described as (or as arising from) degeneracy of the ground state; however, it is well established that this degeneracy only occurs in spatially infinite systems, and even better established that ferromagnets are not spatially infinite. I review this well-known paradox, and consider a popular solution where the symmetry is explicitly broken by some external field which goes to zero in the infinite-volume limit; although this is formally satisfactory, I argue that it must be rejected as a physical explanation of SSB since it fails to reproduce some important features of the phenomenology. Motivated by considerations from the analogous classical system, I argue that SSB in finite systems should be understood in terms of the approximate decoupling of the system’s state space into dynamically-isolated sectors, related by a symmetry transformation; I use the formalism of decoherent histories to make this more precise and to quantify the effect, showing that it is more than sufficient to explain SSB in realistic systems and that it goes over in a smooth and natural way to the infinite limit.

Week 2 (18 Oct): Simon Saunders (Oxford): Understanding indistinguishability.

Abstract: Indistinguishable entities are usually thought to be exactly alike, but not so in quantum mechanics — nor need the concept be restricted to the quantum domain. The concept, properly understood, can be applied in any context in which the only dynamically-salient state-independent properties are the same (so a fortiori in classical statistical mechanics).

The connection with the Gibbs paradox, and the reasons why the concept of classical indistinguishable particles has been so long resisted, are also discussed. The latter involves some background in the early history of quantum mechanics. This work builds on a recent publication, ‘The Gibbs Paradox’, Entropy (2018) 20(8), 552.
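
To illustrate the point about classical indistinguishability (an editorial gloss in standard textbook terms): treating N classical particles as indistinguishable amounts to dividing the partition function by N!,
\[ Z_N = \frac{1}{N!\, h^{3N}} \int e^{-\beta H(q,p)}\, d^{3N}q\, d^{3N}p, \]
which is precisely the factor that renders the entropy extensive and removes the spurious entropy of mixing for samples of the same gas in the Gibbs paradox.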

Week 3 (25 Oct): NO SEMINAR

Week 4 (1 Nov): NO SEMINAR

Week 5 (8 Nov): Tushar Menon (Oxford): Rotating spacetimes and the relativistic null hypothesis

Abstract: Recent work in the physics literature demonstrates that, in particular classes of rotating spacetimes, physical light rays do not, in general, traverse null geodesics. In this talk, I discuss its philosophical significance, both for the clock hypothesis (in particular, for Sam Fletcher’s recent purported proof thereof for light clocks), and for the operational meaning of the metric field in GR. (This talk is based on joint work with James Read and Niels Linnemann)

Week 6 (15 Nov): Jonathan Barrett (Oxford): Quantum causal models

Abstract: From a discussion of how to generalise Reichenbach’s Principle of the Common Cause to the case of quantum systems, I will develop a formalism to describe any set of quantum systems that have specified causal relationships between them. This formalism is the nearest quantum analogue to the classical causal models of Judea Pearl and others. I will illustrate the formalism with some simple examples, and, if time permits, describe the quantum analogue of a well-known classical theorem that relates the causal relationships between random variables to conditional independences in their joint probability distribution. I will end with some more speculative remarks concerning the significance of the work for the foundations of quantum theory.
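
The classical theorem alluded to is presumably the causal Markov factorisation (a textbook fact, added for orientation): in a causal model à la Pearl, the joint distribution factorises according to the directed acyclic graph,
\[ p(x_1, \ldots, x_n) = \prod_{i=1}^{n} p\big(x_i \mid \mathrm{pa}(x_i)\big), \]
where \( \mathrm{pa}(x_i) \) are the parents of \( x_i \) in the graph; the quantum analogue must say what replaces these conditional probabilities for quantum systems.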

Week 7 (22 Nov): James Nguyen (IoP/UCL): Interpreting Models: A Suggestion and its Payoffs

Abstract: I suggest that the representational content of a scientific model is determined by a `key’ associated with it. A key allows the model’s users to draw inferences about its target system. Crucially, these inferences need not be a matter of proposed similarity (structural or otherwise) to its target but can allow for much more conventional associations between model features and features to be exported. Although this is a simple suggestion, it has broad ramifications. I point out that it allows us to re-conceptualise what we mean by `idealisation’: just because a model is a distortion of its target (in the relevant respects, and even essentially so), this does not entail that it is a misrepresentation. I show how, once we think about idealisation in this way, various puzzles in the philosophy of science dissolve (the role of fictional models in science; the non-factivity of understanding; the problem of inconsistent models; and others).

TRINITY TERM 2018

Week 1 (26 Apr) Doreen Fraser (University of Waterloo): Renormalization and scaling transformations in quantum field theory

Abstract: Renormalization is a mathematical operation that needs to be carried out to make empirical sense of quantum field theories (QFTs). How should renormalized QFTs be physically interpreted? A prominent non-perturbative strategy for renormalizing QFTs is to draw on formal analogies with classical statistical mechanical models for critical phenomena. This strategy is implemented in both the Wilsonian renormalization group approach and the Euclidean approach to constructing models of the Wightman axioms. Each approach features a scaling transformation, but the scaling transformations are given different interpretations. I will analyze the two interpretations and argue that the approaches offer compatible and complementary perspectives on renormalization.

Week 2 (3 May) Jeremy Butterfield (Cambridge): On Dualities and Equivalences Between Physical Theories

Abstract: The main aim of this paper is to make a remark about the relation between (i) dualities between theories, as `duality’ is understood in physics, and (ii) equivalence of theories, as `equivalence’ is understood in logic and philosophy. The remark is that in physics, two theories can be dual, and accordingly get called `the same theory’, though we interpret them as disagreeing — so that they are certainly not equivalent, as `equivalent’ is normally understood. So the remark is simple: but, I shall argue, worth stressing — since it is often neglected. My argument for this is based on the account of duality by De Haro, which is illustrated here with several examples, from both elementary physics and string theory. Thus I argue that in some examples, including in string theory, two dual theories disagree in their claims about the world. I also spell out how this remark implies a limitation of proposals (both traditional and recent) to understand theoretical equivalence as either logical equivalence or a weakening of it.

Week 3 (10 May) Matt Farr (Cambridge): The C Theory of Time

Abstract: Does time have a direction? Intuitively, it does. After all, our experiences, our thoughts, even our scientific explanations of phenomena are time-directed; things evolve from earlier to later, and it would seem unnecessary and indeed odd to try to expunge such talk from our philosophical lexicon. Nevertheless, in this talk I will make the case for what I call the C theory of time: in short, the thesis that time does not have a direction. I will do so by making the theory as palatable as possible, and this will involve giving an account of why it is permissible and indeed useful to talk in time-directed terms, what role time-directed explanations play in science, and why neither of these should commit us to the claim that reality is fundamentally directed in time. On the positive side, I will make the case that the C theory’s deflationism about the direction of time offers a better account of time asymmetries in physics than rival time-direction-realist accounts do.

Week 4 (17 May) Seth Lloyd (MIT): The future of Quantum Computing.

Abstract: Technologies for performing quantum computation have progressed rapidly over the past few years. This talk reviews recent advances in constructing quantum computers, and discusses applications for the kinds of quantum computers that are likely to be available in the near future. While full-blown error-corrected quantum computers capable of factoring large numbers are some way away, quantum computers with 100-1000 qubits should be available soon. Such devices should be able to solve problems in quantum simulation and quantum machine learning that are beyond the reach of the most powerful classical computers. The talk will also discuss social aspects of quantum information, including the proliferation of start-ups and the integration of quantum technologies in industry.

Week 5 (24 May) Emily Thomas (Durham): John Locke: Newtonian Absolutist about Time?

Abstract: John Locke’s metaphysics of time is relatively neglected, but he discussed time throughout his career, from his unpublished 1670s writings to his 1690 Essay Concerning Human Understanding, and beyond. The vast majority of scholars who have written on Locke’s metaphysics of time argue that his views underwent an evolution: from relationism, the view that time and space are relations holding between bodies; to Newtonian absolutism, on which time and space are real, substance-like entities that are associated with God’s eternal duration and infinite immensity. Against this majority reading, I argue that Locke remained a relationist in the Essay, and throughout his subsequent career.

Week 6 (31 May) Minhyong Kim (Oxford): Three Dualities

Abstract: This talk will present a few contemporary points of view on geometry, with particular emphasis on dualities. Most of the talk will be concerned with mathematical practice, but will be interspersed with brief and superficial allusions to physics.

Week 7 (7 Jun) Owen Maroney (Oxford): TBC

Abstract:TBC

Week 8 (14 Jun) Tushar Menon (Oxford): TBC

Abstract: TBC

HILARY TERM 2018

Week 2 (25 Jan) Giulio Chiribella (Oxford): The Purification Principle

Abstract: Over the past decades there has been intense work aiming at the reconstruction of quantum theory from principles that can be formulated without the mathematical framework of Hilbert spaces and operator algebras. The motivation was that these principles could provide a new angle from which to understand the counterintuitive quantum laws, that they could reveal connections between different quantum features, and that they could provide guidance for constructing new quantum algorithms and for extending quantum theory to new physical scenarios.

In this talk I will discuss one such principle, called the Purification Principle. Informally, the idea of the Purification Principle is that it is always possible to combine the incomplete information gathered by an observer with a maximally informative picture of the physical world. This idea resonates with Schrödinger’s famous quote that “[in quantum theory] the best possible knowledge of a whole does not necessarily imply the best possible knowledge of its parts”, a property that he called “not one, but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought.”

References for this talk:
GC, GM D’Ariano, and P Perinotti, Probabilistic Theories with Purification, Phys. Rev. A 81, 062348 (2010)
GC, GM D’Ariano, and P Perinotti, Informational Derivation of Quantum Theory, Phys. Rev. A 84, 012311 (2011)
GM D’Ariano, GC, and P Perinotti, Quantum Theory From First Principles, Cambridge University Press (2017).

Week 3 (1 Feb) Christopher Timpson (Oxford): Concepts of fundamentality: the case of information

Abstract: A familiar – perhaps traditional – conception of fundamental physics is as follows: physics presents the world as being populated at the basic level (or the pro tem basic level) by various fields and/or particles, and it furnishes equations describing how these items evolve and interact with one another over time, equations couched primarily in terms of such properties as energy, mass, and various species of charge. This evolution may be conceived to take place against a fixed background spatiotemporal arena of some kind, or alternatively, it may be conceived that the arena has a metrical structure which should also be treated as a particular kind of field, itself subject to dynamical equations (as in General Relativity). But in recent years, stemming primarily from developments in quantum information theory and related thinking in quantum theory itself, an alternative conception has been gaining momentum, one which sees the concept of information playing a much more fundamental role in physics than this traditional picture would allow. This alternative conception urges that information must be recognised as a fundamental physical quantity, a quantity which in some sense should be conceived as being on a par with energy, mass, or charge. Perhaps even, according to strong versions of the conception, information should be seen as the new subject-matter for physics, displacing the traditional conception of material particles and fields as being the fundamental subject matter.

These are bold and interesting claims on the part of information, and regarding what is said to follow from the successes of quantum information theory. Are they well-motivated? Are they true? I will explore these issues by attempting, first of all, to delineate various ways in which something (object, structure, property, or concept) might be thought to be physically fundamental. On at least one prima facie plausible carving of the notion of fundamentality, one should distinguish between the logically independent notions of ontological fundamentality, nomological fundamentality, and explanatory fundamentality. Something is ontologically fundamental if it is posited by the most fundamental description of the world; it is nomologically fundamental if reference to it is necessary to state the physical laws in some domain; and it is explanatorily fundamental if positing it is necessary in explanation and understanding. It is straightforward to show that the concept of information is not ontologically fundamental (or more cagily put: that there is nothing at all about the successes of quantum information theory that would warrant thinking it to be ontologically fundamental), but the questions of nomological and explanatory fundamentality of information are harder to settle so succinctly.

Week 4 (8 Feb) David Wallace (USC): Why black hole information loss is paradoxical

Abstract: I distinguish between two versions of the black hole information-loss paradox. The first arises from apparent failure of unitarity on the spacetime of a completely evaporating black hole, which appears to be non-globally-hyperbolic; this is the most commonly discussed version of the paradox in the foundational and semipopular literature, and the case for calling it “paradoxical” is less than compelling. But the second arises from a clash between a fully-statistical-mechanical interpretation of black hole evaporation and the quantum-field-theoretic description used in derivations of the Hawking effect. This version of the paradox arises long before a black hole completely evaporates, seems to be the version that has played a central role in quantum gravity, and is genuinely paradoxical. After explicating the paradox, I discuss the implications of more recent work on AdS/CFT duality and on the ‘Firewall paradox’, and conclude that the paradox is if anything now sharper.

Week 5 (15 Feb) Dennis Lehmkuhl (Caltech): The History and Interpretation of Black Hole Solutions

Abstract: The history and philosophy of physics community has spent decades grappling with the interpretation of the Einstein field equations and their central mathematical object, the metric tensor. However, the community has not undertaken a detailed study of the solutions to these equations. This is all the more surprising as this is where the meat is in terms of the physics: the confirmation of general relativity through the 1919 observation of light being bent by the sun, as well as the derivation of the perihelion advance of Mercury, both depend much more on the use of the Schwarzschild solution than on the actual field equations. Indeed, Einstein had not yet found the final version of the field equations when he derived the perihelion advance of Mercury. The same is true with respect to the recently detected black holes and gravitational waves: they are, arguably, tests of particular solutions to the Einstein equations and of how these solutions are applied to certain observations. Indeed, what is particularly striking is that all the solutions just mentioned are solutions to the vacuum Einstein equations rather than to the full Einstein equations. This is surprising given that black holes are among the most massive objects in the universe, and yet they are adequately represented by solutions to the vacuum field equations.

In this talk, I shall discuss the history and the diverse interpretations and applications of three of the most important (classes of) black hole solutions: I will address especially how the free parameters in these solutions were identified as representing the mass, charge and angular momentum of isolated objects, and what kind of coordinate conditions made it possible to apply the solutions in order to represent point particles, stars, and black holes.

Week 6 (22 Feb) No seminar

Week 7 (1 Mar) Carina Prunkl (Oxford): Black Hole Entropy is Entropy and not (necessarily) Information

Abstract: The comparison of geometrical properties of black holes with classical thermodynamic variables reveals surprising parallels between the laws of black hole mechanics and the laws of thermodynamics. Since Hawking’s discovery that black holes, when coupled to quantum matter fields, emit radiation at a temperature proportional to their surface gravity, the idea that black holes are genuine thermodynamic objects with a well-defined thermodynamic entropy has become more and more popular. Surprisingly, arguments that justify this assumption are both sparse and rarely convincing. Most of them rely on an information-theoretic interpretation of entropy, which is itself a highly debated topic in the philosophy of physics. Given the amount of disagreement about the nature of entropy and the second law on the one hand, and the growing importance of black hole thermodynamics for the foundations of physics on the other, it is desirable to achieve a deeper understanding of the notion of entropy in the context of black hole mechanics. I discuss some of the pertinent arguments that aim at establishing the identity of black hole surface area (times a constant) and thermodynamic entropy, and show why these arguments are not satisfactory. I then present a simple model of a black hole Carnot cycle to establish that black hole entropy is genuine thermodynamic entropy, which does not require an information-theoretic interpretation.
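
For orientation, the standard expressions at issue (textbook formulae, not specific to the talk) are the Hawking temperature, set by the surface gravity \( \kappa \), and the Bekenstein–Hawking entropy, set by the horizon area A:
\[ T_H = \frac{\hbar \kappa}{2\pi c\, k_B}, \qquad S_{BH} = \frac{k_B c^3 A}{4 G \hbar}. \]
The talk’s question is whether the second expression earns the name ‘entropy’ on thermodynamic grounds alone.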

Week 8 (8 Mar) Nicolas Teh (Notre Dame): TBC

MICHAELMAS TERM 2017

Week 2 (October 19) Henrique Gomes, Perimeter Institute, Waterloo.

“New vistas from the many-instant landscape”

Abstract: Quantum gravity has many conceptual problems. Amongst the most well-known is the “Problem of Time”: gravitational observables are global in time, while we would really like to obtain probabilities for processes taking us from an observable at one time to another, later one. Tackling these questions using relationalism will be the preferred strategy during this talk. The ‘relationalist’ approach leads us to shed much redundant information and enables us to identify a reduced configuration space as the arena on which physics unfolds, a goal still beyond our reach in general relativity. Moreover, basing our ontology on this space has far-reaching consequences. One is that it suggests a natural interpretation of quantum mechanics; it is a form of ‘Many-Worlds’ which I have called Many-Instant Bayesianism. Another is that the gravitational reduced configuration space has a rich, highly asymmetric structure which singles out preferred, non-singular and homogeneous initial conditions for a wave-function of the universe, which is yet to be explored.

Week 3 (October 26) Jonathan Halliwell, Imperial College, London

“Comparing conditions for macrorealism: Leggett-Garg inequalities vs no-signalling in time”

Abstract: Macrorealism is the view that a macroscopic system evolving in time possesses definite properties which can be determined without disturbing the future or past state.
I discuss two different types of conditions which were proposed to test macrorealism in the context of a system described by a single dichotomic variable Q.  The Leggett-Garg (LG) inequalities, the most commonly-studied test, are only necessary conditions for macrorealism, but I show that when the four three-time LG inequalities are augmented with a certain set of two-time inequalities also of the LG form, Fine’s theorem applies and these augmented conditions are then both necessary and sufficient. A comparison is carried out with a very different set of necessary and sufficient conditions for macrorealism, namely the no-signaling in time (NSIT) conditions proposed by Brukner, Clemente, Kofler and others, which ensure that all probabilities for Q at one and two times are independent of whether earlier or intermediate measurements are made in a given run, and do not involve (but imply) the LG inequalities. I argue that tests based on the LG inequalities have the form of very weak classicality conditions and can be satisfied, in quantum mechanics, in the face of moderate interference effects, but those based on NSIT conditions have the form of much stronger coherence witness conditions, satisfied only for zero interference. The two tests differ in their implementation of non-invasive measurability so are testing different notions of macrorealism. The augmented LG tests are indirect, entailing a combination of the results of different experiments with only compatible quantities measured in each experimental run, in close analogy with Bell tests, and are primarily tests for macrorealism per se. By contrast the NSIT tests entail sequential measurements of incompatible quantities and are primarily tests for non-invasiveness.

Based on the two papers J. J. Halliwell, Phys. Rev. A 93, 022123 (2016) and Phys. Rev. A 96, 012121 (2017).
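
For readers who want the conditions in front of them (a standard presentation; details are in the papers above): for a dichotomic variable Q(t) = ±1 measured at times t1 < t2 < t3, with two-time correlators C_ij = ⟨Q(t_i)Q(t_j)⟩, the four three-time LG inequalities take the form
\[ 1 + s_1 C_{12} + s_2 C_{23} + s_1 s_2 C_{13} \geq 0, \qquad s_1, s_2 = \pm 1. \]
Quantum mechanics can violate these up to \( C_{12} + C_{23} - C_{13} = 3/2 \) for a two-level system (the Lüders bound).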

Week 4 (November 2) Sam Fletcher, Dept of Philosophy, University of Minnesota.

“Emergence and scale’s labyrinth”

I give precise formal definitions of a hierarchy of emergence concepts for properties described in models of physical theories, showing how some of these concepts are compatible with reductive (but not strictly deductive) relationships between these theories. Besides applying fruitfully to a variety of physical examples, these concepts do not in general track autonomy or novelty along a single simple dimensional scale such as energy, length, or time, but can instead involve labyrinthine balancing relationships between these scales. This complicates the usual view of emergence as relating linearly (or even partially) ordered levels.

Week 5 (November 9) James Ladyman, Dept of Philosophy, University of Bristol

“Why interpret quantum mechanics?”

Abstract: I discuss recent arguments that QM needs no interpretation, and that it should be understood as not representational. I consider how the interpretation of quantum mechanics relates to various kinds of realism, and the fact that the theory is known not to be a complete theory of the world. I tentatively suggest a position that is sceptical about the way the interpretation of quantum mechanics is often undertaken, in particular of the idea of the ontology of the wavefunction, but stops short of regarding quantum states as not representational.

Week 6 (November 16) Hasok Chang, Department of History and Philosophy of Science, University of Cambridge.

“Beyond truth-as-correspondence: Realism for realistic people”

Abstract: In this paper I present arguments against the epistemological ideal of “correspondence”, namely the deeply entrenched notion that empirical truth consists in the match between our theories and the world. The correspondence ideal of knowledge is not something we can actually pursue, for two reasons: it is difficult to discern a coherent sense in which statements correspond to language-independent facts, and we do not have the kind of independent access to the “external world” that would allow us to check the alleged statement–world correspondence. The widespread intuition that correspondence is a pursuable ideal is based on an indefensible kind of externalist referential semantics. The idea that a scientific theory “represents” or “corresponds to” the external world is a metaphor grounded in other human epistemic activities that are actually representational. This metaphor constitutes a serious and well-entrenched obstacle in our attempt to understand scientific practices, and overcoming it will require some disciplined thinking and hard work. On the one hand, we need to continue with real practices of representation in which correspondence can actually be judged; on the other hand, we should stop the illegitimate transfer of intuitions from those practices over to realms in which there are no representations being made and no correspondence to check.

Week 7 (November 23) Alison Fernandes, Department of Philosophy, University of Warwick.

“The temporal asymmetry of chance”

Abstract: The Second Law of Thermodynamics can be derived from the fact that an isolated system at non-maximal entropy is overwhelmingly likely to increase in entropy over time. Such derivations seem to make ineliminable use of objective worldly probabilities (chances). But some have argued that if the fundamental laws are deterministic, there can be no non-trivial chances (Popper, Lewis, Schaffer): statistical-mechanical probabilities are merely epistemic, or otherwise less real than ‘dynamical’ chances. Many have also thought that chance is intrinsically temporally asymmetric: it is part of the nature of chance that the past is ‘fixed’, and that all non-trivial chances must concern future events. I’ll argue that it is no coincidence that many have held both views: the rejection of deterministic chance is driven by an asymmetric picture of chance in which the past produces the future. I’ll articulate a more deflationary view, according to which more limited temporal asymmetries of chance reflect contingent asymmetries of precisely the kind reflected in the Second Law. The past can be chancy after all.
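
The statistical-mechanical fact in the first sentence can be stated with Boltzmann’s entropy (an editorial reminder, not part of the abstract): for a macrostate M occupying phase-space volume \( |\Gamma_M| \),
\[ S_B = k_B \ln |\Gamma_M|, \]
and the overwhelming likelihood of entropy increase is just the overwhelming dominance of higher-entropy macrostate volumes, which is where the disputed chances enter.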

Week 8 (November 30) Nancy Cartwright, Department of Philosophy, University of Durham and University of California, San Diego

“What are pragmatic trials in medicine good for?”

Abstract: There is widespread call for increasing use of pragmatic trials in both medicine and social science nowadays. These are randomised controlled trials (RCTs) that are administered in ‘more realistic’ circumstances than standard, i.e. with more realistic treatment/programme delivery (e.g. busier, less well-trained doctors/social workers) and a wider range of recipients (e.g. ones that self select into treatment or have ‘co-morbidities’ or are already subject to a number of other interventions that might interfere with the treatment). Pragmatic trial results are supposed to be more readily ‘generalisable’ than results from those with more rigid protocols.
We argue that this is a mistake. Trials, pragmatic or otherwise, can only provide results about those individuals enrolled in the trial. Anything else requires assumptions from elsewhere, and generally strong ones. Based on a common understanding of what causal principles look like in these domains, this talk explains what results can be well warranted by an RCT and warns against the common advice to take the criteria for admission to a trial to be indicative of where else its results may be expected to hold.
Joint work with Sarah Wieten.

TRINITY TERM 2017

15th June 2017 Harvey Brown (Oxford), “QBism: the ineffable reality behind ‘participatory realism’”

Abstract: The recent philosophy of Quantum Bayesianism, or QBism, represents an attempt to solve the traditional puzzles in the foundations of quantum theory by denying the objective reality of the quantum state. Einstein had hoped to remove the spectre of nonlocality in the theory by also assigning an epistemic status to the quantum state, but his version of this doctrine was recently proved to be inconsistent with the predictions of quantum mechanics. In this talk, I present plausibility arguments, old and new, for the reality of the quantum state, and expose what I think are weaknesses in QBism as a philosophy of science.

8th June 2017 David Jackson (Independent), “How to build a unified field theory from one dimension”

Abstract: Motivated in part by Kant’s work on the a priori nature of space and time, and in part by the conceptual basis of general relativity, a physical theory deriving from a single temporal dimension will be presented. We describe how the basic arithmetic composition of the real line, representing the one dimension of time, itself incorporates structures that can be interpreted as underpinning both the geometrical form of space and the physical form of matter. This unification scheme has a number of features in common with a range of physical theories based on ‘extra dimensions’ of space, while being heavily constrained in deriving from a single dimension of time. A proposal for combining general relativity with quantum theory in the context of this approach will be summarised, along with the connections made with empirical observations. In addition to extracts from Kant, further references to sources in the philosophical literature will be cited, in particular with regard to the relation between mathematical objects and physical structures.

1st June 2017 Jo E. Wolff (KCL), “Quantities – Metaphysical Choicepoints”

Abstract: Beginning from the assumption that quantities are (rich) relational structures, I ask what kind of ontology arises from attributing this sort of structure to physical attributes. There are three natural questions to ask about relational structures: What are the relations, what are the relata, and what is the relationship between relata and relations? I argue that for quantities, the choicepoints available in response to these questions are:
1) intrinsicalism vs. structuralism
2) substantivalism vs. anti-substantivalism
3) absolutism vs. comparativism
In the remainder of the talk I sketch which of these choices make for coherent candidate ontologies for quantities.

18th May 2017 Paul Tappenden (Independent), “Quantum fission”.

Abstract: Sixty years on there is still deep division about Everett’s proposal. Some very well informed critics take the whole idea to be unintelligible whilst there are important disagreements amongst supporters. I argue that Everett’s fundamental and radical idea is to do with metaphysics rather than physics: it is to abolish the physically possible/actual dichotomy. I show that the idea is intelligible via a thought experiment involving a novel version of the mind-body relation which I have already used in the defence of semantic internalism.
The argument leads to a fission interpretation of branching rather than a “divergence” interpretation of the sort first suggested by David Deutsch in 1985 and more recently developed in different ways by Simon Saunders, David Wallace and Alastair Wilson. I discuss the two metaphysical problems which fission faces: transtemporal identity and the identification of probability with relative branch measure. And I claim that the Born rule applies transparently if the alternative mind-body relation is accepted. The upshot is that what Wallace calls the Radical View replaces his preferred Conservative View, with the result that there are some disturbing consequences such as inevitable personal survival in quantum Russian roulette scenarios and David Lewis’s suggestion that Everettians should “shake in their shoes”.

11th May 2017 Michela Massimi (Edinburgh), “Perspectival models in contemporary high-energy physics”.

Abstract: In recent times perspectivism has come under attack. Critics have argued that when it comes to modelling, perspectivism is either redundant, or, worse, it leads to a plurality of incompatible or even inconsistent models about the same target system. In this paper, I attend to two tasks. First, I try to get clear about the charge of metaphysical inconsistency that has been levelled against perspectivism and identify some key assumptions behind it. Second, I propose a more positive role for perspectivism in some modelling practices by identifying a class of models, which I call “perspectival models”. I illustrate this class of models with examples from contemporary LHC physics.

4th May 2017 Tushar Menon (Oxford), “Affine Balance: Algebraic functionalism and the ontology of spacetime”.

Abstract: Our two most empirically successful theories, quantum mechanics and general relativity, are at odds with each other when it comes to several foundational issues. The deepest of these issues is also, perhaps, the easiest to grasp intuitively: what is spacetime? Most attempts at theories of quantum gravity do not make it obvious which degrees of freedom are spatiotemporal. In non-general relativistic theories, the matter/spacetime distinction is adequately tracked by the dynamical/non-dynamical object distinction. General relativity is different, because spacetime, if taken to be jointly, but with some redundancy, represented by a smooth manifold and a metric tensor field, is not an immutable, inert, external spectator. Our dynamical/non-dynamical distinction appears no longer to do the work for us; we appear to need something else. In the first part of this talk, I push back against the idea that the dynamical/non-dynamical distinction is doomed. I motivate a more general algebraic characterisation of spacetime based on Eleanor Knox’s spacetime functionalism, and the Helmholtzian notion of free mobility. I argue that spacetime is most usefully characterised by its (local) affine structure.

In the second part of this talk, I consider the debate between Brown and Pooley on the one hand and Janssen and Balashov on the other, about the direction of the arrow of explanation in special relativity. Characterising spacetime using algebraic functionalism, I demonstrate that only Brown’s position is neutral on the substantivalism–relationalism debate. This neutrality may prove to be highly desirable in an interpretation of spacetime that one hopes will generalise to theories of quantum gravity—it seems like poor practice to impose restrictions on an acceptable quantum theory of spacetime based on metaphysical prejudices or approximately true effective field theories. The flexibility of Brown’s approach affords us a theory-dependent a posteriori identification of spacetime, and arguably counts in its favour. I conclude by gesturing towards how this construction might be useful in extending Brown’s view to theories of quantum gravity.

27th April 2017 Peter Hylton (UIC) “Analyticity, yet again”.

Abstract: Although Quine became famous for having rejected the analytic-synthetic distinction, he actually accepted it for the last quarter century of his philosophical career. Yet his doing so makes no difference to his other views. In this talk, I press the question ‘Why not?’, in the hope of gaining insight into Quine’s views, and especially his differences with Carnap. I contrast Quine’s position not only with Carnap’s but also with those of Putnam, as represented in his paper ‘The Analytic and the Synthetic’. Putnam there puts forward an answer to the ‘Why not?’ question which is, I think, fairly widely accepted, and perhaps taken to be Quine’s answer as well—wrongly so taken, I claim.

9th March 2017 Michael Hicks (Physics, Oxford), “Explanatory (a)symmetries and Humean laws”.

Abstract: Recently, Lange (2009) has argued that some physical principles are explanatorily prior to others. Lange’s main examples are symmetry principles, which he argues explain both conservation laws – through Noether’s theorem – and features of dynamical laws – for example, the Lorentz invariance of QFT. Lange calls these “meta-laws” and claims that his account of laws, which is built around the counterfactual stability of groups of statements, can capture the fact that they govern or constrain first-order laws, whereas other views, principally Humean views, can’t. After reviewing the problem Lange presents, I’ll show how the explanatory asymmetry between laws that he describes follows naturally on a Humean understanding of what laws are – particularly informative summaries. The Humean should agree with Lange that symmetry principles are explanatorily prior to both conservation laws and dynamical theories like QFT; however, I’ll argue that Lange is wrong to consider these principles “meta-laws” which in some way govern first-order laws, and I’ll show that on the Humean view, the explanation of these two sorts of laws from symmetry principles is importantly different.
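
To fix the central example (a textbook statement, added for orientation): Noether’s theorem associates to every continuous symmetry of the action a conserved current,
\[ \delta S = 0 \ \text{under the symmetry} \quad \Longrightarrow \quad \partial_\mu j^\mu = 0, \qquad Q = \int d^3x\, j^0 = \text{const}, \]
and the dispute above concerns whether the symmetry thereby governs the conservation law or merely figures with it in a particularly informative summary.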

2nd March 2017 Ronnie Hermens (Philosophy, Groningen), “How ψ-ontic are ψ-ontic models?”.

Abstract: Ψ-ontology theorems show that in any ontic model that is able to reproduce the predictions of quantum mechanics, the quantum state must be encoded by the ontic state. Since the ontic state determines what is real, and it determines the quantum state, the quantum state must be real. But how does this precisely work in detail, and what does the result imply for the status of the quantum state in ψ-ontic models? As a test case scenario I will look at the ontic models of Meyer, Kent and Clifton. Since these models are able to reproduce the predictions of quantum mechanics, they must be ψ-ontic. On the other hand, quantum states play no role whatsoever in the construction of these models. Thus finding out which ontic state belongs to which quantum state is a non-trivial task. But once that is done, we can ask: does the quantum state play any explanatory role in these models, or is the fact that they are ψ-ontic a mere mathematical nicety?
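
As background (the standard Harrigan–Spekkens definition, presupposed by the abstract): an ontic model assigns to each quantum state \( \psi \) a distribution \( \mu_\psi(\lambda) \) over ontic states \( \lambda \), and the model is ψ-ontic when distinct pure states receive non-overlapping distributions,
\[ \psi \neq \phi \quad \Longrightarrow \quad \mu_\psi(\lambda)\, \mu_\phi(\lambda) = 0 \ \text{for almost all } \lambda, \]
so that the ontic state \( \lambda \) fixes the quantum state.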

23rd February 2017 Simon Saunders (Philosophy, Oxford), “Quantum monads”.

Abstract: The notion of object (and with it ontology) in the foundations of quantum mechanics has been made both too easy and too hard: too easy, because particle distinguishability, and with it the use of proper names, is routinely assumed; too hard, because a number of metaphysical demands have been made of it (for example, in the notion of ‘primitive ontology’ in the writings of Shelly Goldstein and his collaborators). The measurement problem is also wrapped up with it. I shall first give an account of quantum objects adequate to the thin sense required of quantification theory (in the tradition of Frege and Quine); I then consider an alternative, much thicker notion that is strongly reminiscent of Leibniz’s monadology. Both apply to the Everett interpretation and to dynamical collapse theories (sans primitive ontology).

16th February 2017 Steven Balbus (Physics, Oxford), “An anthropic explanation for the nearly equal angular diameters of the Sun and Moon”.

Abstract: The very similar angular sizes of the Sun and Moon as subtended at the Earth are generally portrayed as coincidental. In fact, close angular size agreement is a direct and inevitable mathematical consequence of even roughly comparable lunar and solar tidal amplitudes. I will argue that the latter was a biological imperative for the evolution of land vertebrates and can be understood on the basis of anthropic arguments. Comparable tidal amplitudes from two astronomical sources, with close but distinct frequencies, lead to strongly modulated forcing: in essence, spring and neap tides. The appearance of this surely very rare tidal pattern must be understood in the context of the paleogeography and biology of the Late Devonian period. Two great land masses were separated by a broad opening tapering to a very narrow, shallow-sea strait. The combination of this geography and modulated tidal forces would have been conducive to forming a rich inland network of shallow but transient (and therefore isolating) tidal pools at an epoch when fishy tetrapods were evolving and acquiring land navigational skills. I will discuss the recent fossil evidence showing that important transitional species lived in habitats strongly influenced by intermittent tides. It may be that any planet capable of harbouring a contemplative species displays a moon in its sky very close in angular diameter to that of its sun.
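
The claimed mathematical consequence can be sketched in two lines (my reconstruction of the standard argument): the tidal acceleration across the Earth due to a body of mass M, radius R and density \( \rho \) at distance d is
\[ a_{\rm tide} \sim \frac{2 G M R_\oplus}{d^3} = \frac{8\pi}{3}\, G \rho R_\oplus \left(\frac{R}{d}\right)^{3} \propto \rho\, \theta^3, \]
with \( \theta \approx R/d \) the angular radius. Comparable lunar and solar tidal amplitudes therefore force the angular sizes to agree up to the cube root of the (order-unity) density ratio.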

9th February 2017 Alastair Wilson (Philosophy, Birmingham), “How multiverses might undercut the fine-tuning argument”.

Abstract: In the context of the probabilistic fine-tuning argument that moves from the fragility of cosmological parameters with respect to life to the existence of a divine designer, appealing to the existence of a multiverse has in general seemed problematically ad hoc. The situation looks rather different, though, if there is independent evidence from physics for a multiverse. I will argue that independently-motivated multiverses can be undercutting defeaters for the fine-tuning argument; but whether the argument is indeed undercut still depends on open questions in fundamental physics and cosmology. I will also argue that Everettian quantum mechanics opens up new routes to undercutting the fine-tuning argument, although by itself it is insufficient to do so.

26th January 2017 Antony Eagle (Philosophy, Adelaide), “Quantum location”.

Abstract: Many metaphysicians are committed to the existence of a location relation between material objects and spacetime, useful in characterising debates in the metaphysics of persistence and time, particularly in the context of trying to map ordinary objects into models of relativity theory. Relatively little attention has been paid to location in quantum mechanics, despite the existence of a position observable in QM being one of the few things metaphysicians know about it. I want to explore how the location relation(s) postulated by metaphysicians might be mapped onto the framework of QM, with particular reference to the idea that there might be such a thing as being indeterminately located.

19th January 2017 Emily Adlam (DAMTP, Cambridge), “Quantum mechanics and global determinism”.

Abstract: We propose that the information-theoretic features of quantum mechanics are perspectival effects which arise because experiments on local variables can only uncover a certain subset of the correlations exhibited by an underlying deterministic theory. We show that the no-signalling principle, information causality, and strong subadditivity can be derived in this way; we then use our approach to propose a new resolution of the black hole information paradox.
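
For concreteness (a standard definition, added editorially): in a bipartite setting with inputs x, y and outputs a, b, the no-signalling principle requires each party’s marginal statistics to be independent of the other’s choice of measurement,
\[ \sum_b p(a, b \mid x, y) = \sum_b p(a, b \mid x, y') \quad \text{for all } a, x, y, y', \]
and symmetrically for the other party; the proposal above is that such constraints fall out of the limited experimental access to an underlying deterministic theory.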

24 Nov 2016 David Glick (Philosophy, Oxford), “Swapping Something Real: Entanglement Swapping and Entanglement Realism”.

Abstract: Experiments demonstrating entanglement swapping have been alleged to challenge realism about entanglement. Seevinck (2006) claims that entanglement “cannot be considered ontologically robust” while Healey (2012) claims that entanglement swapping “undermines the idea that ascribing an entangled state to quantum systems is a way of representing some new, non-classical, physical relation between them.” My aim in this paper is to show that realism is not threatened by the possibility of entanglement swapping, but rather, should be informed by the phenomenon. I argue—expanding the argument of Timpson and Brown (2010)—that ordinary entanglement swapping cases present no new challenges for the realist. With respect to the delayed-choice variant discussed by Healey, I claim that there are two options available to the realist: (a) deny these are cases of genuine swapping (following Egg (2013)) or (b) allow for the existence of entanglement relations between timelike separated regions. This latter option, while radical, is not incoherent and has been suggested in quite different contexts. While I stop short of claiming that the realist must take this option, doing so allows one to avoid certain costs associated with Egg’s account. I conclude by noting several important implications of entanglement swapping for how one thinks of entanglement relations more generally.

17 Nov 2016 Jim Weatherall (UC Irvine), “On Stuff: The Field Concept in Classical Physics”.

Abstract: Discussions of physical ontology often come down to two basic options. Either the basic physical entities are particles, or else they are fields. I will argue that, in fact, it is not at all clear what it would mean to say that the world consists of fields. Speaking classically (i.e., non-quantum-ly), there are many different sorts of thing that go by the name “field”, each with different representational roles. Even among those that have some claim to being “fundamental” in the appropriate sense, it does not seem that a single interpretational strategy could apply in all cases. I will end by suggesting that standard strategies for constructing quantum theories of fields are not sensitive to the different roles that “fields” can play in classical physics, which adds a further difficulty to interpreting quantum field theory. Along the way, I will say something about an old debate in the foundations of relativity theory, concerning whether the spacetime metric is a “geometrical” or “physical” field. The view I will defend is that the metric is much like the electromagnetic field: geometrical!

10 Nov 2016 Lina Jansson (Nottingham), ‘Newton’s Methodology Meets Humean Supervenience about Laws of Nature’.

Abstract: Earman and Roberts [2005a,b] have argued for Humean supervenience about laws of nature based on an argument from epistemic access. In rough outline, their argument relies on the claim that if Humean supervenience is false, then we cannot have any empirical evidence in favour of taking a proposition to be a law of nature as opposed to merely accidentally true. I argue that Newton’s methodology in the Principia provides a counterexample to their claim. In particular, I argue that the success or failure of chains of subjunctive reasoning is empirically accessible, and that this provides a way of gaining empirical evidence for or against a proposition being a law of nature (even under the assumption that Humean supervenience fails).

27 Oct 2016 Ryan Samaroo (Bristol), “The Principle of Equivalence is a Criterion of Identity”.

Abstract: In 1907 Einstein had an insight into gravitation that he would later refer to as ‘the happiest thought of my life’. This is the hypothesis, roughly speaking, that bodies in free fall do not ‘feel’ their own weight. This is what is formalized in ‘the equivalence principle’. The principle motivated a critical analysis of the Newtonian and 1905 inertial frame concepts, and it was indispensable to Einstein’s argument for a new concept of inertial motion. A great deal has been written about the equivalence principle. Nearly all of this work has focused on the content of the principle, but its methodological role has been largely neglected. A methodological analysis asks the following questions: what kind of principle is the equivalence principle? What is its role in the conceptual framework of gravitation theory? I maintain that the existing answers are unsatisfactory and I offer new answers.

20 Oct 2016 Niels Martens (Oxford, Philosophy), “Comparativism about Mass in Newtonian Gravity”.

Abstract: Absolutism about mass asserts that facts about mass ratios are true in virtue of intrinsic masses. Comparativism about mass denies this. I present and dismiss Dasgupta’s (2013) analysis of his recent empirical adequacy argument in favour of comparativism, in the context of Newtonian Gravity. I develop and criticise two new versions of comparativism. Regularity Comparativism forms a liberalisation of Huggett’s Regularity Relationalism (2006), which uses the Mill-Ramsey-Lewis Best Systems Account to respond to Newton’s bucket argument in the analogous relationalism-substantivalism debate. To the extent that this approach works at all, I argue that it works too well: it throws away the massive baby with the bathwater. A Machian-flavoured version of comparativism is more promising. Although it faces no knock-down objection, it is not without its own problems.

13 Oct 2016 David Wallace (USC, Philosophy), “Fundamental and emergent geometry in Newtonian gravity”.

Abstract: Using as a starting point recent and apparently incompatible conclusions by Simon Saunders (Philosophy of Science 80 (2013) pp.22-48) and Eleanor Knox (British Journal for the Philosophy of Science 65 (2014) pp.863-880), I revisit the question of the correct spacetime setting for Newtonian physics. I argue that understood correctly, these two theories make the same claims both about the background geometry required to define the theory, and about the inertial structure of the theory. In doing so I illustrate and explore in detail the view — espoused by Knox, and also by Harvey Brown (Physical Relativity, OUP 2005) — that inertial structure is defined by the dynamics governing subsystems of a larger system. This clarifies some interesting features of Newtonian physics, notably (i) the distinction between using the theory to model subsystems of a larger whole and using it to model complete Universes, and (ii) the scale-relativity of spacetime structure.

19 May 2016 Eleanor Knox (KCL, Philosophy), “Novel Explanation and the Emergence of Phonons”.

Abstract: Discussions of emergence in the philosophy of physics literature often emphasise the role of asymptotic limits in understanding the novelty of emergent phenomena while leaving the nature of the novelty in question unexplored. I’ll put forward an account of explanatory novelty that can accommodate examples involving asymptotic limits, but also applies in other cases. The emergence of phonons in a crystal lattice will provide an example of a description with novel explanatory power that does not depend on asymptotic limits for its novelty. The talk is based on joint work with Alex Franklin.

12th May 2016 Yvonne Geyer (Oxford, Maths), “Rethinking Quantum Field Theory: Traces of String Theory in Yang-Mills and Gravity”.

Abstract: A multitude of recent developments point towards the need for a different understanding of Quantum Field Theories. After a general introduction, I will focus on one specific example involving one of the most natural and fundamental observables: the scattering amplitude. In Yang-Mills theory and Einstein gravity, scattering amplitudes exhibit a simplicity that is completely obscured by the traditional approach to Quantum Field Theories, and that is remarkably reminiscent of the worldsheet models describing string theory. In particular, this implies that – without additional input – the theories describing our universe, Yang-Mills theory and gravity, exhibit traces of string theory.

28th April 2016 Roman Frigg (LSE, Philosophy), “Further Rethinking Equilibrium”.

Abstract: In a recent paper we proposed a new definition of Boltzmannian equilibrium and showed that in the case of deterministic dynamical systems the new definition implies the standard characterisation but without suffering from its well-known problems and limitations. We now generalise this result to stochastic systems and show that the same implication holds. We then discuss an existence theorem for equilibrium states and illustrate with a number of examples how the theorem works. Finally, first steps towards understanding the relation between Boltzmannian and Gibbsian equilibrium are made.

25 Feb 2016 Stephen J. Blundell (Oxford, Physics), ‘Emergence, causation and storytelling: condensed matter physics and the limitations of the human mind’

Abstract: The physics of matter in the condensed state is concerned with problems in which the number of constituent particles is vastly greater than can be comprehended by the human mind. The physical limitations of the human mind are fundamental and restrict the way in which we can interact with and learn about the universe. This presents challenges for developing scientific explanations that are met by emergent narratives, concepts and arguments that have a non-trivial relationship to the underlying microphysics. By examining examples within condensed matter physics, and also from cellular automata, I show how such emergent narratives efficiently describe elements of reality.

18 Feb 2016 Jean-Pierre Llored (University of Clermont-Ferrand), ‘From quantum physics to quantum chemistry’.

Abstract: The first part, which is mainly anthropological, summarizes the results of a survey that we carried out in several research laboratories in 2010. Our aims were to understand what quantum chemists currently do, what kind of questions they ask, and what kind of problems they have to face when creating new theoretical tools both for understanding chemical reactivity and predicting chemical transformations.

The second part, which is mainly historical, highlights the philosophical underpinnings that structure the development of quantum chemistry from 1920 to the present day. In so doing, we will discuss chemical modelling in quantum chemistry, and the different strategies used to define molecular features using atomic ones and the molecular surroundings at the same time. We will show how computers and new laboratories emerged simultaneously and reshaped the culture of quantum chemistry. This part goes on to describe how the debate between ab initio and semi-empirical methods turned out to be highly controversial because of underlying scientific and metaphysical assumptions about, for instance, the nature of the relationships between science and the possibility for human knowledge to reach a complete description of the world.

The third and last part is about the philosophical implications for the study of quantum chemistry and that of ‘quantum sciences’ at large. It insists on the fact that the history of quantum chemistry is also a history of chemists’ attempts to establish the autonomy of their theories and methods with respect to physical, mathematical, and biological theories. According to this line of argument, chemists gradually proposed new concepts in order to circumvent the impossibility of performing full analytical calculations and to make the language of classical structural chemistry compatible with that of quantum chemistry. Among other topics, we will query the meaning of a chemical bond, the impossibility of deducing a molecular shape from the Schrödinger equation, the way quantum chemistry is invoked in order to explain the periodic table, and the possibility of going beyond the Born-Oppenheimer approximation. We would like to show that quantum chemistry is neither physics nor chemistry nor applied mathematics, and that philosophical debates which turned out to be relevant in quantum physics are not necessarily so in quantum chemistry, whereas other philosophical questions arise…
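
One of the items queried above can be made concrete (an editorial gloss in standard terms): the Born-Oppenheimer approximation factorises the molecular wave function into an electronic part, solved with the nuclei clamped at configuration R, and a nuclear part,
\[ \Psi(r, R) \approx \psi_{\rm el}(r; R)\, \chi(R), \]
so the classical notion of molecular shape enters through the parametric dependence on R rather than being deduced from the full molecular Schrödinger equation.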

11th Feb 2016 David Wallace (Oxford, Philosophy) , ‘Who’s afraid of coordinate systems?’.

Abstract: Coordinate-based approaches to physical theories remain standard in mainstream physics but are largely eschewed in foundational discussion in favour of coordinate-free differential-geometric approaches. I defend the conceptual and mathematical legitimacy of the coordinate-based approach for foundational work. In doing so, I provide an account of the Kleinian conception of geometry as a theory of invariance under symmetry groups; I argue that this conception continues to play a very substantial role in contemporary mathematical physics and indeed that supposedly ‘coordinate-free’ differential geometry relies centrally on this conception of geometry. I discuss some foundational and pedagogical advantages of the coordinate-based formulation and briefly connect it to some remarks of Norton on the historical development of geometry in physics during the establishment of the general theory of relativity.

21 Jan 2016 Philipp Roser (Clemson), ‘Time and York time in quantum theory’.

Abstract: Classical general relativity has no notion of a physically meaningful time parameter and one is free to choose one’s coordinates at will. However, when attempting to quantise the theory this freedom leads to difficulties, the notorious ‘problem of time’ of canonical quantum gravity. One way to overcome this obstacle is the identification of a physically fundamental time parameter. Interestingly, although purely aesthetic at the classical level, different choices of time parameter may in principle lead to different quantum phenomenologies, as I will illustrate with a simple model. This means that an underlying physically fundamental notion of time may (to some extent) be detectable via quantum effects.

For various theoretical reasons one promising candidate for a physical time parameter is ‘York time’, named after James York and his work on the initial-value problem of general relativity, where its importance first became apparent. I will derive the classical and quantum dynamics with respect to York time for certain cosmological models and discuss some of the unconventional structural features of the resulting quantum theory.
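
For orientation (and up to sign and normalisation conventions, which vary between authors): York time is proportional to the trace of the extrinsic curvature $K_{ab}$ of a spatial slice,

$$ T \;\propto\; K \;=\; g^{ab} K_{ab}, $$

so that surfaces of constant York time are surfaces of constant mean curvature.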

3 Dec 2015 Thomas Moller-Nielsen (Oxford), “Symmetry and the Interpretation of Physical Theories”

Abstract: In this talk I examine two (putative) ways in which symmetries can be used as tools for physical theory interpretation. First, I examine the extent to which symmetries can be used as a guide to a theory’s ideology: that is, as a means of determining which quantities are real, according to the theory. Second, I examine the extent to which symmetries can be used as a guide to a theory’s ontology: that is, as a means of determining which objects are real, according to the theory. I argue that symmetries can only legitimately be used in the first, but not the second, sense.

26 Nov 2015 Ellen Clarke (All Souls), “Biological Ontology”.

Abstract: All sciences invent kind concepts: names for categories that gather particulars together according to their possession of some scientifically interesting properties. But kind concepts must be well-motivated: they need to do some sort of work for us. I show how to define one sort of scientific concept – that of the biological individual, or organism – so that it does plenty of work for biology. My view understands biological individuals as defined by the process of evolution by natural selection. I will engage in some speculation about how the situation compares in regard to other items of scientific ontology.

19 November 2015 Dan Bedingham (Oxford) “Dynamical Collapse of the Wavefunction and Relativity”.

Abstract: When a collapse of the wave function takes place it has an instantaneous effect over all space. One might then assume that a covariant description is not possible, since a collapse whose effects are simultaneous in one frame of reference would not have simultaneous effects in a boosted frame. I will show, however, that a consistent covariant picture in fact emerges, in which the collapsing wave function depends on the choice of foliation of spacetime, but suitably defined local properties are unaffected by this choice. The formulation of a covariant description is important for models attempting to describe the collapse of the wave function as a dynamical process. This is a very direct approach to solving the quantum measurement problem: it involves simply giving the wave function the stochastic dynamics that it has in practice. We present some proposals for relativistic versions of dynamical collapse models.

12 November 2015 Karim Thébault (Bristol) “Regarding the ‘Hole Argument’ and the ‘Problem of Time’”

Abstract: The canonical formalism of general relativity affords a particularly interesting characterisation of the infamous hole argument. It also provides a natural formalism in which to relate the hole argument to the problem of time in classical and quantum gravity. In this paper I will examine the connection between these two much discussed problems in the foundations of spacetime theory along two interrelated lines. First, from a formal perspective, I will consider the extent to which the two problems can and cannot be precisely and distinctly characterised. Second, from a philosophical perspective, I will consider the implications of various responses to the problems, with a particular focus upon the viability of a ‘deflationary’ attitude to the relationalist/substantivalist debate regarding the ontology of space-time. Conceptual and formal inadequacies within the representative language of canonical gravity will be shown to be at the heart of both the canonical hole argument and the problem of time. Interesting and fruitful work at the interface of physics and philosophy relates to the challenge of resolving such inadequacies.

5 November 2015 Joseph Melia (Oxford) “Haecceitism, Identity and Indiscernibility: (Mis-)Uses of Modality in the Philosophy of Physics”

Abstract: I examine a number of arguments involving modality and identity in the Philosophy of Physics. In particular, (a) Wilson’s use of Leibniz’ law to argue for emergent entities; (b) the implications of anti-haecceitism for the Hole argument in GR and QM; (c) the proposal to “define” or “ground” or “account” for identity via some version of Principle of the Identity of Indiscernibles or the Hilbert-Bernays formula.

Against (a) I argue that familiar problems with applications of Leibniz’ law in modal contexts block the argument for the existence of emergent entities;

On (b), I argue that (i) there are multiple and incompatible definitions of haecceitism at play in the literature; (ii) that, properly understood, haecceitism *is* a plausible position; indeed, even supposedly mysterious haecceities do not warrant the criticism of obscurity they have received; (iii) we do better to solve the Hole argument by other means than a thesis about the range and variety of possibilities.

On (c), I argue that recent attempts to formulate a version of the PII fit to serve as a definition of identity are either trivially true or must draw distinctions between different kinds of properties that are problematic: better to accept identity as primitive.
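
For reference, the Hilbert-Bernays formula mentioned in (c): for a language with finitely many primitive predicates (say, one monadic $F$ and one dyadic $R$), identity is defined by the conjunction of the corresponding indiscernibility conditions,

$$ x = y \;\leftrightarrow\; (Fx \leftrightarrow Fy) \,\wedge\, \forall z\,\bigl[(Rxz \leftrightarrow Ryz) \wedge (Rzx \leftrightarrow Rzy)\bigr], $$

with one conjunct for each primitive predicate of the language.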

Some relevant papers/helpful reading (I will not, of course, assume familiarity with these papers):
J. Ladyman, ‘On the Identity and Diversity of Objects in a Structure’, Proceedings of the Aristotelian Society, Supplementary Volume (2007).
D. Lewis, On the Plurality of Worlds, ch. 4 (1986).
O. Pooley, ‘Points, Particles and Structural Realism’, in Rickles, French and Saatsi (eds), The Structural Foundations of Quantum Gravity (2006).
S. Saunders, ‘Are Quantum Particles Objects?’, Analysis (2006).
J. Wilson, ‘Non-Reductive Physicalism and Degrees of Freedom’, BJPS (2010).

29 October 2015 Chiara Marletto (Oxford, Materials), “Constructor theory of information (and its implications for our understanding of quantum theory)”.

Abstract: Constructor Theory is a radically new mode of explanation in fundamental physics. It demands a local, deterministic description of physical reality – expressed exclusively in terms of statements about which tasks are possible, which are impossible, and why. This mode of explanation has recently been applied to provide physical foundations for the theory of information – expressing, as conjectured physical principles, the regularities of the laws of physics necessary for there to be what has so far been informally called ‘information’. In constructor theory, one can also express exactly the relation between classical information and so-called ‘quantum information’ – showing how the properties of the latter arise from a single constructor-theoretic constraint. This provides a unified conceptual basis for the quantum theory of information (which previously lacked one qua theory of information). Moreover, the emergence of quantum-information-like properties in a deterministic, local framework also has implications for the understanding of quantum theory and of its successors.

22 October 2015 Bryan Roberts (LSE) “The future of the weakly interacting arrow of time”.

Abstract: This talk discusses the evidence for time asymmetry in fundamental physics. The main aim is to propose some general templates characterising how time asymmetry can be detected among weakly interacting particles. We will then step back and evaluate how this evidence bears on time asymmetry in future physical theories beyond the standard model.

15 October 2015 Oscar Dahlsten (Oxford Physics) “The role of information in work extraction”.

Abstract: Since Maxwell’s demon it has been known that extra information can give more work. I will discuss how this can be made concrete and quantified. I will focus on so-called single-shot statistical mechanics, in which one can derive expressions for the maximum work one can extract from a system given one’s information. Only one property of the state one assigns to the system matters: the entropy. There are subtleties, including which entropy to use. I will also discuss the relation to fluctuation theorems, and our recent paper on realising a photonic Maxwell’s demon.
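
To give the flavour of the single-shot expressions at issue (a schematic form, not a verbatim statement of any theorem in the papers below): for an $n$-bit system in state $\rho$, the work extractable with failure probability at most $\epsilon$ is governed by a smoothed max-entropy rather than the von Neumann entropy,

$$ W^{\epsilon}_{\max} \;\approx\; k_B T \ln 2\,\bigl(n - H^{\epsilon}_{\max}(\rho)\bigr). $$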

Some references (I will certainly not assume you have looked at them):
arXiv:0908.0424 The work value of information, Dahlsten, Renner, Rieper and Vedral
arXiv:1009.1630 The thermodynamic meaning of negative entropy, del Rio, Aaberg, Renner, Dahlsten and Vedral
arXiv:1207.0434 A measure of majorisation emerging from single-shot statistical mechanics, Egloff, Dahlsten, Renner, Vedral
arXiv:1409.3878 Introducing one-shot work into fluctuation relations, Yunger Halpern, Garner, Dahlsten, Vedral
arXiv:1504.05152 Equality for worst-case work at any protocol speed, Dahlsten, Choi, Braun, Garner, Yunger Halpern, Vedral
arXiv:1510.02164 Photonic Maxwell’s demon, Vidrighin, Dahlsten, Barbieri, Kim, Vedral and Walmsley

11 June 2015 Tim Pashby (University of Southern California)
‘Schroedinger’s Cat: It’s About Time (Not Measurement)’

Abstract: I argue for a novel resolution of Schroedinger’s cat paradox by paying particular attention to the role of time and tense in setting up the problem. The quantum system at the heart of the paradoxical situation is an unstable atom, primed for indeterministic decay at some unknown time. The conventional account gives probabilities for the results of instantaneous measurements and leads to the unacceptable conclusion that the cat can be considered neither alive nor dead until the moment the box is opened (at a time of the experimenter’s choosing). To resolve the paradox I reject the status of the instantaneous quantum state as ‘truthmaker’ and show how a quantum description of the situation can be given instead in terms of time-dependent chance propositions concerning the time of decay, without reference to measurement.

The conclusions reached in the case of Schroedinger’s cat may be generalized throughout quantum mechanics by means of event time observables (interpreted as conditional probabilities), which play the role of the time of decay for an arbitrary system. Conventional quantum logic restricts its attention to the lattice of projections, taken to represent possible properties of the system. I argue that event time observables provide a compelling reason to look beyond the lattice of projections to the algebra of effects, and suggest an interpretation in which propositions are made true by events rather than properties. This provides the means to resolve the Wigner’s friend paradox along similar lines.

4th June 2015 Neil Dewar (Oxford)
‘Symmetry and Interpretation: or, Translations and Translations’

Abstract: There has been much discussion of whether we should take (exact) symmetries of a physical theory to relate physically equivalent states of affairs, and – if so – what it is that justifies us in so doing. I argue that we can understand the propriety of this move in essentially semantic terms: namely, by thinking of a symmetry transformation as a means of translating a physical theory into itself. To explain why symmetry transformations have this character, I’ll first look at how notions of translation and definition are dealt with in model theory. Then, I’ll set up some analogies between the model-theoretic formalism and the formalism of differential equations, and show how the relevant analogue of self-translation is a symmetry transformation. I conclude with some remarks on how this argument bears on debates over theoretical equivalence.

28th May 2015 George Ellis (Cape Town)
‘On the crucial role of top-down causation in complex systems’

Abstract: It will be suggested that causal influences in the real world occurring on evolutionary, developmental, and functional timescales are characterized by a combination of bottom-up and top-down effects. Digital computers give very clear exemplars of how this happens. There are five distinct classes of top-down effects, the key one for the existence of complex systems being adaptive selection. The issue of how there can be causal openness at the bottom allowing this to occur will be discussed. The case will be made that while bottom-up self-assembly can attain a certain degree of complexity, truly complex systems such as life can only come into being if top-down processes come into play in addition to bottom-up processes. They allow genuine emergence to occur, based in the multiple realisability at lower levels of higher-level structures and functions.

21 May 2015 Francesca Vidotto (Radboud University, Nijmegen), “Relational ontology from General Relativity and Quantum Mechanics”.

Abstract: Our current most reliable physical theories, General Relativity and Quantum Mechanics, both point towards a relational description of reality. General Relativity builds up the spacetime structure from the notion of contiguity between dynamical objects. Quantum Mechanics describes how physical systems affect one another in the course of interactions. Only local interactions define what exists, and there is no meaning in talking about entities except in terms of local interactions.

14 May 2015 Harvey Brown (Philosophy, Oxford) and Chris Timpson (Philosophy, Oxford), “Bell on Bell’s theorem: the changing face of nonlocality”.

Between 1964 and 1990, the notion of nonlocality in Bell’s papers underwent a profound change as his nonlocality theorem gradually became detached from quantum mechanics, and referred to wider probabilistic theories involving correlations between separated beables. The proposition that standard quantum mechanics is itself nonlocal (more precisely, that it violates ‘local causality’) became divorced from the Bell theorem per se from 1976 on, although this important point is widely overlooked in the literature. In 1990, the year of his death, Bell would express serious misgivings about the mathematical form of the local causality condition, and leave ill-defined the issue of the consistency between special relativity and violation of the Bell-type inequality. In our view, the significance of the Bell theorem, both in its deterministic and stochastic forms, can only be fully understood by taking into account the fact that a fully Lorentz-covariant version of quantum theory, free of action-at-a-distance, can be articulated in the Everett interpretation.
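
For reference, the local causality condition at issue is standardly formalised as a factorisation requirement: given a sufficiently complete specification $\lambda$ of the beables in the common past, the probabilities of outcomes $A$ and $B$ of measurements with settings $a$ and $b$ in spacelike separated regions satisfy

$$ P(A, B \mid a, b, \lambda) \;=\; P(A \mid a, \lambda)\,P(B \mid b, \lambda). $$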

7 May 2015 Mauro Dorato (Rome) “The passage of time between physics and psychology”.

Abstract: The three main aims of my paper are: (1) to defend a minimalistic theory of objective becoming that takes STR and GTR at face value; (2) to bring relevant neuro-psychological data to bear in support of (1); and (3) to combine (1) and (2) to try to explain, with as little metaphysics as possible, three key features of our experience of passage, namely:
1. our untutored belief in a cosmic extension of the now (leading to the postulation of privileged frames and presentism);
2. the becoming more past of the past (leading to Skow’s 2009 moving spotlight and to branching spacetimes);
3. the fact that our actions clearly seem to bring new events into being (Broad 1923, Tooley 1997, Ellis 2014).

26 February 2015 James Ladyman (Bristol), “Do local symmetries have ‘direct empirical consequences’?”

Abstract: Hilary Greaves and David Wallace argue that, contrary to the widespread view of philosophers of physics, local symmetries have direct empirical consequences. They do this by showing that there are ‘Galileo’s Ship Scenarios’ in theories with local symmetries. In this paper I will argue that the notion of ‘direct empirical consequences’ is ambiguous and admits of two kinds of precisification. Greaves and Wallace do not purport to show that local symmetries have empirical consequences in the stronger of the two senses, but I will argue that it is the salient one. I will then argue that they are right to focus on Galileo’s Ship Scenarios, and I will offer a characterisation of the form of such arguments from symmetries to empirical consequences. I will then discuss how various examples relate to this template. I will then offer a new argument in defence of the orthodoxy that direct empirical consequences do not depend on local symmetries.

19 February 2015 David Wallace (Oxford): “Fields as Bodies: a unified treatment of spacetime and gauge symmetry”.

Abstract: Using the parametrised representation of field theory (in which the location in spacetime of a part of a field is itself represented by a map from the base manifold to Minkowski spacetime) I demonstrate that in both local and global cases, internal (Yang-Mills-type) and spacetime (Poincaré) symmetries can be treated precisely on a par, so that gravitational theories may be regarded as gauge theories in a completely standard sense.

12 February 2015 Erik Curiel (Munich), “Problems with the interpretation of energy conditions in general relativity”.

Abstract: An energy condition, in the context of a wide class of spacetime theories (including general relativity), is, crudely speaking, a relation one demands the stress-energy tensor of matter satisfy in order to try to capture the idea that “energy should be positive”. The remarkable fact I will discuss is that such simple, general, almost trivial-seeming propositions have profound and far-reaching import for our understanding of the structure of relativistic spacetimes. It is therefore especially surprising when one also learns that we have no clear understanding of the nature of these conditions: what theoretical status they have with respect to fundamental physics, what epistemic status they may have, when we should and should not expect them to be satisfied, and even, in many cases, how they and their consequences should be interpreted physically. Or so I shall argue, by a detailed analysis of the technical and conceptual character of all the standard conditions used in physics today, including examination of their consequences and the circumstances in which they are believed to be violated in the actual universe.
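
By way of illustration, the simplest of the standard conditions, the weak energy condition, demands

$$ T_{ab}\,\xi^a \xi^b \;\ge\; 0 \quad \text{for all timelike } \xi^a, $$

i.e. that every observer measures a non-negative energy density; the null, strong, and dominant energy conditions are variations on the same theme.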

22nd January 2015 Jonathan Halliwell (Imperial College London): “Negative Probabilities, Fine’s Theorem and Quantum Histories”.

Abstract: Many situations in quantum theory and other areas of physics lead to quasi-probabilities which seem to be physically useful but can be negative. The interpretation of such objects is not at all clear. I argue that quasi-probabilities naturally fall into two qualitatively different types, according to whether their non-negative marginals can or cannot be matched to a non-negative probability. The former type, which we call viable, are qualitatively similar to true probabilities, but the latter type, which we call non-viable, may not have a sensible interpretation. Determining the existence of a probability matching given marginals is a non-trivial question in general. In simple examples, Fine’s theorem indicates that inequalities of the Bell and CHSH type provide criteria for its existence. A simple proof of Fine’s theorem is given. The results have consequences for the linear positivity condition of Goldstein and Page in the context of the histories approach to quantum theory. Although it is a very weak condition for the assignment of probabilities, it fails in some important cases where our results indicate that probabilities clearly exist. Some implications for the histories approach to quantum theory are discussed.
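
For reference, the CHSH-type criterion in question: for two-valued observables $A, A'$ on one system and $B, B'$ on another, with pairwise correlations $E(\cdot,\cdot)$, Fine’s theorem says that a non-negative joint probability distribution matching the marginals exists if and only if the inequality

$$ \bigl|E(A,B) + E(A,B') + E(A',B) - E(A',B')\bigr| \;\le\; 2 $$

and its permutations are all satisfied.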

4 December 2014: Tony Sudbery (Maths, York), “The logic of the future in the Everett-Wheeler understanding of quantum theory”

Abstract: I discuss the problems of probability and the future in the Everett-Wheeler understanding of quantum theory. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. I construct a lattice of tensed propositions, with truth values in the interval [0, 1], and derive logical properties of the truth values given by the usual quantum-mechanical formula for the probability of histories. I argue that with this understanding, Everett-Wheeler quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.
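
For reference, the usual quantum-mechanical formula for the probability of a history $\alpha$, specified by Heisenberg-picture projectors $P_{\alpha_1}(t_1), \dots, P_{\alpha_n}(t_n)$ at successive times, is

$$ p(\alpha) \;=\; \bigl\| P_{\alpha_n}(t_n) \cdots P_{\alpha_1}(t_1)\,|\psi\rangle \bigr\|^2 , $$

and it is these values that serve as the truth values of the corresponding future-tense propositions.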

27 November 2014 : Owen Maroney (Philosophy, Oxford), “How epistemic can a quantum state be?”

Abstract: The “psi-epistemic” view is that the quantum state does not represent a state of the world, but a state of knowledge about the world. It draws its motivation, in part, from the observation of qualitative similarities between the characteristic properties of non-orthogonal quantum wavefunctions and those of overlapping classical probability distributions. It might be suggested that it gives a natural explanation for these properties, which seem puzzling for the alternative “psi-ontic” view. However, for two key similarities, quantum state overlap and quantum state discrimination, it turns out that the psi-epistemic view cannot account for the values shown by quantum theory, and for a wide range of quantum states must rely on the same supposedly puzzling explanations as the “psi-ontic” view.

20 November 2014 : Boris Zilber (Maths, Oxford), “The semantics of the canonical commutation relations”

Abstract: I will argue that the canonical commutation relations, and the way of calculating with them discovered in the 1920s, are in essence a syntactic reflection of a world whose semantics is still to be reconstructed. The same can be said about the calculus of Feynman integrals. Similar developments have been taking place in pure mathematics since the 1950s, in the form of Grothendieck’s schemes and the formalism of non-commutative geometry. I will report on some progress in reconstructing the missing semantics. In particular, for the canonical commutation relations it leads to a theory of representation in finite-dimensional “algebraic Hilbert spaces” which in the limit look rather similar to, although not the same as, conventional Hilbert spaces.

13 November 2014 1st BLOC Seminar, KCL, London : Huw Price (Philosophy, Cambridge), “Two Paths to the Paris Interpretation”

Abstract: In 1953 de Broglie’s student, Olivier Costa de Beauregard, raised what he took to be an objection to the EPR argument. He pointed out that the EPR assumption of Locality might fail, without action-at-a-distance, so long as the influence in question is allowed to take a zigzag path, via the past lightcones of the particles concerned. (He argued that considerations of time-symmetry counted in favour of this proposal.) As later writers pointed out, the same idea provides a loophole in Bell’s Theorem, allowing a hidden variable theory to account for the Bell correlations, without irreducible spacelike influence. (The trick depends on the fact that retrocausal models reject an independence assumption on which Bell’s Theorem depends, thereby blocking the derivation of Bell’s Inequality.) Until recently, however, it seems to have gone unnoticed that there is a simple argument that shows that the quantum world must be retrocausal, if we accept three assumptions (one of them time-symmetry) that would have all seemed independently plausible to many physicists in the years following Einstein’s 1905 discovery of the quantisation of light. While it is true that later developments in quantum theory provide ways of challenging these assumptions – different ways of challenging them, for different views of the ontology of the quantum world – it is interesting to ask whether this new argument provides a reason to re-examine Costa de Beauregard’s ‘Paris interpretation’.

6 November 2014 : Vlatko Vedral (Physics, Oxford), “Macroscopicity”

Abstract: We have a good framework for how to quantify entanglement, based, broadly speaking, on two different ideas. One is the fact that local operations and classical communication (LOCC) do not increase entanglement and hence introduce a natural ordering on the set of entangled states. The other is inspired by mean-field theory and quantifies the entanglement of a state by how difficult it is to approximate it with disentangled states (the two, while not identical, frequently lead to the same measures). Interestingly, neither of these captures the notion of “macroscopicity”, which asks which states are very quantum and macroscopic at the same time. Here the GHZ states win as the ones with the highest macroscopicity; however, they are not highly entangled from either the LOCC or the mean-field point of view. I discuss different ways of quantifying macroscopicity and exemplify them with a range of quantum experiments producing different many-body states (GHZ and general GHZ states, cluster states, topological states). And the winner for producing the highest degree of macroscopicity is…
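
For reference, the $n$-party GHZ state is

$$ |\mathrm{GHZ}_n\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(|0\rangle^{\otimes n} + |1\rangle^{\otimes n}\bigr), $$

a superposition of two macroscopically distinct product states; across any bipartition it carries only one ebit of entanglement, which is why it scores low on the entanglement measures above while maximising macroscopicity.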

30 October 2014 : David Wallace (Philosophy, Oxford), “How not to do the metaphysics of quantum mechanics”

Abstract: Recent years have seen an increasing interest in the metaphysics of quantum theory. While welcome, this trend has an unwelcome side effect: an inappropriate (and often unknowing) identification of quantum theory in general with one particular brand of quantum theory, namely the nonrelativistic mechanics of finitely many point particles. In this talk I’ll explain just why this is problematic, partly by analogy with questions about the metaphysics of classical mechanics.

23 October 2014 : Daniel Bedingham (Philosophy, Oxford), “Time reversal symmetry and collapse models”

Abstract: Collapse models are modifications of quantum theory where the wave function is treated as physically real and collapse of the wave function is a physical process. This introduces a time reversal asymmetry into the dynamics of the wave function since the collapses affect only the future state. However, it is shown that if the physically real part of the model is reduced to the set of points in space and time about which the collapses occur then a collapsing wave function picture can be given both forward and backward in time, in each case satisfying the Born rule (under certain conditions). This implies that if the collapse locations can serve as an ontology then these models can in fact have time reversal symmetry.

16 October 2014 : Dennis Lehmkuhl, “Einstein, Cartan, Weyl, Jordan: The neighborhood of General Relativity in the space of spacetime theories”.

Abstract: Recent years have seen a renewed interest in Newton-Cartan theory (NCT), i.e. Newtonian gravitation theory reformulated in the language of differential geometry. The comparison of this theory with the general theory of relativity (GR) has been particularly interesting, among other reasons, because it allows us to ask how ‘special’ GR really is, as compared to other theories of gravity. Indeed, the literature so far has focused on the similarities between the two theories, for example on the fact that both theories describe gravity in terms of curvature, and the paths of free particles as geodesics. However, the question of how ‘special’ GR is can only be properly answered if we highlight differences as much as similarities, and there are plenty of differences between NCT and GR. Furthermore, I will argue that it is not enough to compare GR to simpler theories like NCT; we also have to compare it to more complicated theories, more complicated in terms of geometrical structure and gravitational degrees of freedom. While NCT is the most natural degenerate limit of GR, gravitational theory defined on a Weyl geometry (to be distinguished from a unified field theory based on Weyl geometry) and gravitational scalar-tensor theories (like Jordan-Brans-Dicke theory) are two of the most natural generalisations of GR. Thus, in this talk I will compare Newton-Cartan, GR, Weyl and Jordan-Brans-Dicke theory, to see how special GR really is as compared to its immediate neighborhood in the ‘space of spacetime theories’.

19 June 2014 : Antony Valentini (Physics, Clemson), “Hidden variables in the early universe II: towards an explanation for large-scale cosmic anomalies”

Abstract: Following on from Part I, we discuss the large-scale anomalies that have been reported in measurements of the cosmic microwave background (CMB) by the Planck satellite. We consider how the anomalies might be explained as the result of incomplete relaxation to quantum equilibrium at long wavelengths on expanding space (during a ‘pre-inflationary phase’) in the de Broglie-Bohm formulation of quantum theory. The first anomaly we consider is the reported large-scale power deficit. This could arise from incomplete relaxation for the amplitudes of the primordial perturbations. It is shown, by numerical simulations, that if the pre-inflationary era is radiation dominated then the deficit in the emerging power spectrum will have a characteristic shape (a specific dependence on wavelength). It is also shown that our scenario is able to produce a power deficit in the observed region and of the observed magnitude, for an appropriate choice of cosmological parameters. The second anomaly we consider is the reported large-scale anisotropy. This could arise from incomplete relaxation for the phases of the primordial perturbations. We report on recent numerical simulations for phase relaxation, and we show how to define characteristic scales for amplitude and phase nonequilibrium. While difficult questions remain concerning the extent to which the data might support our scenario, we argue that we have an (at least) viable model that is able to explain two apparently independent cosmological anomalies at a single stroke.

12 June 2014 : Antony Valentini (Physics, Clemson), “Hidden variables in the early universe I: quantum nonequilibrium and the cosmic microwave background”.

Abstract: Assuming inflationary cosmology to be broadly correct, we discuss recent work showing that the Born probability rule for primordial quantum fluctuations can be tested (and indeed is being tested) by measurements of the cosmic microwave background (CMB). We consider in particular the hypothesis of ‘quantum nonequilibrium’ — the idea that the universe began with an anomalous distribution of hidden variables that violates the Born rule — in the context of the de Broglie-Bohm pilot-wave formulation of quantum field theory. An analysis of the de Broglie-Bohm field dynamics on expanding space shows that relaxation to quantum equilibrium is generally retarded (and can be suppressed) for long-wavelength field modes. If the initial probability distribution is assumed to have a less-than-quantum variance, we may expect a large-scale power deficit in the CMB — as appears to be observed by the Planck satellite. Particular attention is paid to conceptual questions concerning the use of probabilities ‘for the universe’ in modern theoretical and observational cosmology.
[Key references: A. Valentini, ‘Inflationary Cosmology as a Probe of Primordial Quantum Mechanics’, Phys. Rev. D 82, 063513 (2010) [arXiv:0805.0163]; S. Colin and A. Valentini, ‘Mechanism for the suppression of quantum noise at large scales on expanding space’, Phys. Rev. D 88, 103515 (2013) [arXiv:1306.1579].]
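
For reference, in the pilot-wave formulation the configuration $X$ is guided by the wavefunction according to

$$ \frac{dX}{dt} \;=\; \frac{j(X,t)}{|\psi(X,t)|^2}, $$

where $j$ is the usual quantum probability current; ‘quantum equilibrium’ is the Born-rule condition $\rho = |\psi|^2$ for an ensemble distribution $\rho$, and ‘quantum nonequilibrium’ is the hypothesis that the universe began with $\rho \neq |\psi|^2$.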

5 June 2014 : Mike Cuffaro, “Reconsidering quantum no-go theorems from a computational perspective”

Abstract: Bell’s and related inequalities should be thought of as “no-go” theorems only in a highly qualified sense. More properly, they should be understood as imposing constraints on locally causal models which aim to recover quantum mechanical predictions. Thinking of them as no-go theorems is nevertheless mostly harmless in most circumstances; i.e., the necessary qualifications are, in typical discussions of the foundations of quantum mechanics, understood as holding unproblematically. But the situation can change once we leave the traditional context. In the context of a discussion of quantum computation and information, for example, our judgements regarding which locally causal models are to be ruled out as implausible will differ from the corresponding judgements in the traditional context. In particular, the “all-or-nothing” GHZ inequality, which is traditionally considered to be a more powerful refutation of local causality than statistical inequalities like Bell’s, has very little force in the context of a discussion of quantum computation and information. In this context it is only the statistical inequalities which can legitimately be thought of as no-go theorems. Considering this situation serves to emphasise, I argue, that there is a difference in aim between practical sciences like quantum computation and information and the foundations of quantum mechanics traditionally construed: describing physical systems as they exist and interact with one another in the natural world is different from describing what one can do with physical systems.

22 May 2014 Elise Crull, “Whence Physical Significance in Bimetric Theories?”

Abstract: Recently there has been lively discussion regarding a certain class of alternative theories to general relativity called bimetric theories. Such theories are meant to resolve certain physical problems (e.g. the existence of ghost fields and dark matter) as well as philosophical problems (e.g. the apparent experimental violation of relativistic causality and assigning physical significance to metrics).
In this talk, I suggest that a new type of bimetric theory wherein matter couples to both metrics may yield further insights regarding those same philosophical questions, while at the same time addressing (perhaps to greater satisfaction!) the physical worries motivating standard bimetric theories.

15 May 2014: Julian Barbour (Independent), “A Gravitational Arrow of Time”.

Abstract: My talk (based on arXiv:1310.5167 [gr-qc]) will draw attention to a hitherto unnoticed way in which scale-invariant notions of complexity and information can be defined in the problem of N point particles interacting through Newtonian gravity. In accordance with these definitions, all typical solutions of the problem with nonnegative energy divide at a uniquely defined point into two halves that are effectively separate histories. They have a common ‘past’ at the point of division but separate ‘futures’. In each half, the arrow from past to future is defined by growth of the complexity and information. All previous attempts to explain how time-symmetric laws can give rise to the various arrows of time have invoked special boundary conditions. In contrast, the complexity and information arrows are inevitable consequences of the form of the gravitational law and nothing else. General relativity shares key structural features with Newtonian gravity, so it may be possible to obtain similar results for Einsteinian gravity.
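
For reference (in the notation of arXiv:1310.5167, up to normalisation conventions), the scale-invariant complexity is the ratio of two characteristic lengths of the N-body configuration,

$$ C_S \;=\; \frac{\ell_{\mathrm{rms}}}{\ell_{\mathrm{mhl}}}, \qquad \ell_{\mathrm{rms}}^{\,2} \;=\; \frac{1}{M^2}\sum_{i<j} m_i m_j\, r_{ij}^{\,2}, \qquad \frac{1}{\ell_{\mathrm{mhl}}} \;=\; \frac{1}{M^2}\sum_{i<j} \frac{m_i m_j}{r_{ij}}, $$

where $M = \sum_i m_i$; $C_S$ grows as the system clusters on small scales while spreading on large ones.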

8 May 2014 : Simon Saunders (Philosophy, Oxford), “Reference to indistinguishables, and other paradoxes”.

Abstract: There is a seeming paradox about indistinguishables: if described only by totally symmetric properties and relations, or by totally (anti)-symmetrized states, then how is reference to them possible? And we surely do refer to subsets of indistinguishable particles, and sometimes individual elementary particles (as in: the electrons, protons, and neutrons of which your computer screen is composed). Call it the paradox of composition.
The paradox can be framed in the predicate calculus as well, in application to everyday things: indistinguishability goes over to weak discernibility. It connects with two other paradoxes: the Gibbs paradox and Putnam’s paradox. It also connects with the h