“A Fork in the Middle Path: Explaining the Success of Mature Physical Theories” (5/3/17)

May 24, 2017

Shahin Kaveh

Abstract: The problem of explaining the success of science has been debated for a long time. This debate has led not only to different answers to this question, but also to different interpretations of the question itself, each of which calls for a different type of answer. Recently, Stanford has summarized this debate and identified a “Middle Path” for explaining success, characterized by the commitment that the success of a successful theory must be explained through a systematic relationship between the theory’s constructs and the inner workings of the system, whatever this relationship may be (the Maddy-Wilson principle). However, a survey of the views on offer shows that only two types of Middle Path explanations have been given: ones that draw on Truthlikeness, and ones that give up on fleshing out this systematic relationship in any general terms at all. I aim to offer a third way within the Middle Path, on which one can satisfy the Maddy-Wilson principle without drawing on Truthlikeness.


“A Pluralist View of Biological Individuality” (4/28/17)

May 24, 2017

Haixin Dang

Abstract: In this paper, I focus on one family of views: the so-called monist accounts of biological individuality, which have been most prominently defended by Ellen Clarke (2013) and Peter Godfrey-Smith (2009).  These accounts of biological individuality are monist because they assume that there ought to be one unified concept of the biological individual: there is only one correct way to pick out the fundamental units of the living.  I argue that the monist view is problematic and instead defend a pluralist view of biological individuality.  I argue, first of all, that within the purportedly monist accounts, the kinds of mechanisms/criteria defended by Clarke and Godfrey-Smith in fact pick out more than the narrow class of entities they believe they have identified.  I then defend a pluralist view by showing that, once we think beyond the scope of population genetics, we will find that there exist many different kinds of biological individuals.


“Technique-Driven Research: Clarifying the nature of exploratory experimentation” (4/6/17)

May 24, 2017

David Colaco

Abstract: Recently, historians and philosophers of science have sought to account for experimental research that is not driven by the evaluation of theory, but instead is motivated by the desire to explore.  Along these lines, I discuss how techniques can drive research and allow scientists to explore systems.  I describe the preparation technique CLARITY, which drives cutting-edge microscopy research in neuroscience.  Though technique-driven research is exploratory and experimental, it fails to satisfy the conditions of several popular accounts of exploratory experimentation.  In light of this discrepancy, I critically assess these accounts, and reappraise what conditions are necessary for exploratory research.


“Incompressible Patterns: CRISPR vs Dennett” (3/30/17)

May 24, 2017

Katie Creel

Abstract: Dennett’s classic paper defines “Real Patterns” as present in data if “there is a description of the data that is more efficient than the bit map, whether or not anyone can concoct it” (Dennett 1991, 34). However, compressibility is not the right criterion for pattern realism. A better pattern ontology is one based on informational relationships between the pattern and the perceiver of the pattern, whether human, biological, or machine.

A simple compression algorithm such as Huffman coding should be perfect for Dennett’s purposes. It compresses text losslessly by building a binary tree in which letters are assigned unique codes based on their frequency. Instead, Huffman coding illustrates the problem with compression as a metric: lack of generalizability. If the algorithm were used only once, it could compress a novel into one character: “W” for all of War and Peace. But this would be no informational savings. What makes compression work is that the “cost” of the compression algorithm is amortized over many uses.
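The amortization point can be made concrete with a minimal Huffman sketch (a hypothetical illustration added here, not part of the talk): frequent letters get short codes, the encoding is prefix-free and lossless, but the code table itself must be stored somewhere, and only repeated use makes that overhead negligible.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free code table from character frequencies."""
    freq = Counter(text)
    # Heap entries: (frequency, unique tie-breaker, {char: partial code}).
    # The tie-breaker keeps Python from ever comparing the dicts.
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate one-symbol alphabet
        return {ch: "0" for ch in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        # Merging prepends a bit: left subtree gets "0", right gets "1".
        merged = {ch: "0" + c for ch, c in t1.items()}
        merged.update({ch: "1" + c for ch, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

text = "letters are assigned unique codes based on their frequency"
codes = huffman_codes(text)
encoded = "".join(codes[ch] for ch in text)
# The bit string is far shorter than 8 bits per character, but only because
# the code table (the "cost" of the algorithm) lives outside the message;
# amortized over many texts, that fixed cost shrinks toward nothing.
```

Used once, the table can even be a single arbitrary symbol standing for the whole text, which is exactly the “W for War and Peace” point: without reuse, the savings are illusory.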

Further, any discrete chunk of randomness can be recognized as a pattern if it has the right informational relationship with its recognizer. Such recognition relationships between random sequences and detectors occur in genetic material. New tools for genetic manipulation such as CRISPR use a recognition relationship with sequences of base pairs to snip and replace precise segments of DNA. Using case studies, I suggest that we should think of patterns as representing the informational relationship between pattern and perceiver.


“Implicit learning, attention and consciousness: where gatekeepers fear to tread” (3/24/17)

March 21, 2017

Mahi Hardalupas

Abstract: An influential view on the relation between attention and consciousness is that attention is necessary and sufficient for consciousness, which means an agent cannot be conscious of anything outside the scope of attention.  This is the “gatekeeping” view defended by Prinz and De Brigard. The justification for gatekeeping relies on evidence from visual spatial attention tasks; thus, criticism of gatekeeping normally focuses on questioning the interpretation of these studies. In this talk, I intend to take a different approach and look to another domain of psychology: implicit learning. I will show how debates within implicit learning raise doubts about whether psychologists assume the gatekeeping view in their work, and illustrate this with reference to a study in incidental auditory learning. Ultimately, I conclude that the gatekeeping view cannot be considered the assumed intuitive position of psychologists without more evidence from subfields of psychology other than visual spatial attention.


“A Logical Obscurity” (3/17/17)

March 21, 2017

Joshua Eisenthal

Abstract: There is unambiguous evidence that Wittgenstein had a deep and life-long appreciation of Heinrich Hertz’s seminal work, Principles of Mechanics. Besides two direct references to Principles in the Tractatus (at 4.04 and 6.361), Wittgenstein also included Hertz’s name on one of the rare occasions when he directly listed his influences. Furthermore, Wittgenstein quoted a passage from Hertz’s introduction both times that he gave a programmatic address at Cambridge, and even considered using an extract from this passage as the motto for the Philosophical Investigations. However, it is not clear how this passage from Hertz’s introduction relates to the rest of Hertz’s book. This has resulted in a serious obstacle to understanding what the impact of Principles was on Wittgenstein’s philosophy. In my paper I present a detailed analysis of the context and significance of this passage from Hertz’s introduction. 

Hertz asserted that his aim in Principles was to give a ‘complete and definite presentation of the laws of mechanics’. In motivating this project, Hertz complained of a logical obscurity in the traditional Newtonian formulation of mechanics, and drew particular attention to the concern felt amongst physicists over the nature of force. It is this passage, concerning the seemingly mysterious nature of force, that so struck Wittgenstein. Of particular importance is the way the passage ends: 

‘the answer which we want is not really an answer to this question [“What is the nature of force?”]. It is not by finding out more and fresh relations and connections that it can be answered; but by removing the contradictions existing between those already known, and thus perhaps by reducing their number. When these painful contradictions are removed, the question as to the nature of force will not have been answered; but our minds, no longer vexed, will cease to ask illegitimate questions.’

Hertz’s characterization of this ‘illegitimate question’, and the way he took himself to respond to it in Principles, resonated profoundly with Wittgenstein. As Wittgenstein remarked in The Big Typescript: ‘As I do philosophy, its entire task is to shape expression in such a way that certain worries disappear. ((Hertz.))’

In order to be able to compare Hertz’s methods with Wittgenstein’s, we need an account of the context and significance of this passage from Hertz’s introduction. A fundamental difficulty in this task arises from the fact that there is no satisfactory account of the ‘logical obscurity’ that motivated Hertz to write Principles in the first place. FitzGerald suggested that Hertz had simply misunderstood Newton’s third law, but it is implausible that this would have led Hertz to spend the last four years of his life reformulating mechanics. Mach suggested that Hertz had been troubled by the fact that forces are not directly observable, but a careful reading reveals that Hertz regarded an appeal to the unobservable as necessary in scientific theorizing. No other commentators, either historical or contemporary, have offered a satisfactory account of what troubled Hertz in the traditional formulation of mechanics. 

In my paper I argue that the source of Hertz’s concerns can be traced back to tacit shifts between “bottom-up” central forces and “top-down” constraint forces. Although such shifts are pervasive, they are clearly problematic from a logically rigorous point of view. In particular, the velocity dependence of constraint forces threatens to undermine a clear derivation of the conservation of energy. I argue that an appreciation of these issues reveals the tension that Hertz perceived in the Newtonian notion of force. I then show how Hertz dissolved this tension through the logically perspicuous notion of force that he derived within his own framework. Thus we have in view a precise characterization of the achievement that seems to have inspired Wittgenstein’s approach to analogous tensions in philosophy.



“Daedal Data: The Problem of Empirical Adequacy” (2/24/17)

February 25, 2017

Nora Boyd

Abstract: Whatever else our theories about the natural world are, they ought to be consistent with the evidence produced by our interactions with it: our theories ought to be at least empirically adequate. This is the minimal commitment of empiricism. Yet the central notions of evidence and empirical adequacy have not been satisfactorily elucidated.  Prominent accounts of evidence treat it as detachable from the manner in which it was produced.  However, considered as detached results, the corpus of empirical evidence appears to be contradictory and discontinuous.  Empirically derived parameter values evolve, sometimes radically, over time, and the very concepts used to interpret evidence change between epistemic contexts.  It would be a fool’s errand to try to make our theories adequate with respect to evidence in this sense.  In this talk, I lay the groundwork for a new empiricist philosophy of science by furnishing a non-detached characterization of evidence and an epistemology of empirical adequacy appropriate to it.  I illustrate these accounts using case studies from astrophysics and cosmology, including observations of the Hulse-Taylor pulsar, historical observations of supernovae, and the history of measurements of the Hubble parameter.