
Contemporary Scientific Realism

The Challenge from the History of Science

Oxford University Press is a department of the University of Oxford. It furthers the University’s objective of excellence in research, scholarship, and education by publishing worldwide. Oxford is a registered trade mark of Oxford University Press in the UK and certain other countries.

Published in the United States of America by Oxford University Press 198 Madison Avenue, New York, NY 10016, United States of America.

© Oxford University Press 2021

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of Oxford University Press, or as expressly permitted by law, by license, or under terms agreed with the appropriate reproduction rights organization. Inquiries concerning reproduction outside the scope of the above should be sent to the Rights Department, Oxford University Press, at the address above.

You must not circulate this work in any other form and you must impose this same condition on any acquirer.

Library of Congress Control Number: 2021934069

ISBN 978–0–19–094681–4

DOI: 10.1093/oso/9780190946814.001.0001

Printed by Integrated Books International, United States of America

Contents

1. History and the Contemporary Scientific Realism Debate (Timothy D. Lyons and Peter Vickers)

PART I HISTORICAL CASES FOR THE DEBATE

2. Theoretical Continuity, Approximate Truth, and the Pessimistic Meta-Induction: Revisiting the Miasma Theory (Dana Tulodziecki)

3. What Can the Discovery of Boron Tell Us About the Scientific Realism Debate? (Jonathon Hricko)

4. No Miracle after All: The Thomson Brothers’ Novel Prediction that Pressure Lowers the Freezing Point of Water (Keith Hutchison)

5. From the Evidence of History to the History of Evidence: Descartes, Newton, and Beyond (Stathis Psillos)

6. How Was Nicholson’s Proto-Element Theory Able to Yield Explanatory as well as Predictive Success? (Eric R. Scerri)

7. Selective Scientific Realism and Truth-Transfer in Theories of Molecular Structure (Amanda J. Nichols and Myron A. Penner)

8. Realism, Physical Meaningfulness, and Molecular Spectroscopy (Teru Miyake and George E. Smith)

PART II CONTEMPORARY SCIENTIFIC REALISM

9. The Historical Challenge to Realism and Essential Deployment (Mario Alai)

10. Realism, Instrumentalism, Particularism: A Middle Path Forward in the Scientific Realism Debate (P. Kyle Stanford)

11. Structure not Selection (James Ladyman)

12. The Case of the Consumption Function: Structural Realism in Macroeconomics (Jennifer S. Jhun)

13. We Think, They Thought: A Critique of the Pessimistic Meta-Meta-Induction (Ludwig Fahrbach)

14. The Paradox of Infinite Limits: A Realist Response (Patricia Palacios and Giovanni Valente)

15. Realist Representations of Particles: The Standard Model, Top Down and Bottom Up (Anjan Chakravartty)

1 History and the Contemporary Scientific Realism Debate

Timothy D. Lyons
Department of Philosophy, Indiana University–Purdue University Indianapolis
tdlyons@iu.edu

Peter Vickers
Department of Philosophy, Durham University
peter.vickers@durham.ac.uk

The scientific realism debate began to take shape in the 1970s, and, with the publication of two key early ’80s texts challenging realism, Bas van Fraassen’s The Scientific Image (1980) and Larry Laudan’s “A Confutation of Convergent Realism” (1981), the framework for that debate was in place. It has since been a defining debate of philosophy of science. As originally conceived, the scientific realism debate is one characterized by dichotomous opposition: “realists” think that many/most of our current best scientific theories reveal the truth about reality, including unobservable reality (at least to a good approximation); and they tend to justify this view by the “no miracles argument,” or by an inference to the best explanation, from the success of scientific theories. Further, many realists claim that scientific realism is an empirically testable position. “Antirealists,” by contrast, think that such a view is lacking in epistemic care. In addition to discussions of the underdetermination of theories by data and, less commonly, competing explanations for success, many antirealists—in the spirit of Thomas Kuhn, Mary Hesse, and Larry Laudan—caution that the history of science teaches us that empirically successful theories, even the very best scientific theories, of one age often do not stand up to the test of time.

The debate has come a long way since the 1970s and the solidification of its framework in the ’80s. The noted dichotomy of “realism”/“antirealism” is no longer a given, and increasingly “middle ground” positions have been explored. Case studies of relevant episodes in the history of science show us the specific ways in which a realist view may be tempered, but without necessarily collapsing into a full-blown antirealist view. A central part of this is that self-proclaimed “realists,” as well as “antirealists” and “instrumentalists,” are exploring historical cases in order to learn the relevant lessons concerning precisely what has, and what hasn’t, been retained across theory change. In recent decades one of the most important developments has been the “divide et impera move,” introduced by Philip Kitcher (1993) and Stathis Psillos (1999)—possibly inspired by Worrall (1989)—and increasingly embraced by numerous other realists. According to this “selective” strategy, any realist inclinations are directed toward only those theoretical elements that are really doing the inferential work to generate the relevant successes (where those successes are typically explanations and predictions of phenomena). Such a move is well motivated in light of the realist call to explain specific empirical successes: those theoretical constituents that play no role, for instance, in the reasoning that led to the successes, likewise play no role in explaining those successes. Beyond that, however, and crucially, the divide et impera move allows for what appears to be a testable realist position consistent with quite dramatic theory-change: the working parts of a rejected theory may be somehow retained within a successor theory, even if the two theories differ very significantly in a great many respects. Even the working parts may be retained in a new (possibly approximating) form, such that the retention is not obvious upon an initial look at a theoretical system but instead takes considerable work to identify. This brings us to a new realist position, consistent with the thought that many of our current best theories may one day be replaced. Realism and antirealism are no longer quite so far apart, and this is progress.

Timothy D. Lyons and Peter Vickers, History and the Contemporary Scientific Realism Debate In: Contemporary Scientific Realism. Edited by: Timothy D. Lyons and Peter Vickers, Oxford University Press. © Oxford University Press 2021. DOI: 10.1093/oso/9780190946814.003.0001

A major stimulus for, and result of, this progress has been a better understanding of the history of science. Each new case study brings something new to the debate, new lessons concerning the ways in which false theoretical ideas have sometimes led to success, or the ways in which old theoretical ideas “live on,” in a different form, in a successor theory. At one time the debate focused almost exclusively on three famous historical cases: the caloric theory of heat, the phlogiston theory of combustion, and the aether theory of light. An abundance of literature was generated, and with good reason: these are extremely rich historical cases, and there is no simple story to tell of the successes these theories enjoyed, and the reasons they managed to be successful despite being very significantly misguided (in light of current theory). But the history of science is a big place, and it was never plausible that all the important lessons for the debate could be drawn from just three cases. This is especially obvious when one factors in a “particularist” turn in philosophy of science where focus is directed toward particular theoretical systems. These days, many philosophers are reluctant to embrace the grand generalizations about “science” that were once sought, taking those generalizations to be a dream too far. Science works in many different ways, in different fields and in different contexts. It follows that the realism debate ought to be informed by a rich diversity of historical cases.

It is with this in mind that the present volume is put forward. In recent years a flood of new case studies has entered the debate, and is just now being worked through. At the same time it is recognized that there are still many more cases out there, waiting to be analyzed in a way sensitive to the concerns of the realism debate. The present volume advertises this fact, introducing as it does several new cases as bearing on the debate, or taking forward the discussion of historical cases that have only very recently been introduced. Meanwhile, the debate is hardly static, and as philosophical positions shift this affects the very kind of historical cases that are likely to be relevant. Thus the work of introducing and analyzing historical cases must proceed hand in hand with philosophical analysis of the different positions and arguments in play. It is with this in mind that we divide the present volume into two parts, covering “Historical Cases for the Debate” and “Contemporary Scientific Realism,” each comprising seven chapters.

Many of the new historical cases are first and foremost challenges to the realist position, in that they tell scientific stories that are apparently in tension with even contemporary, nuanced realist claims. The volume kicks off in Chapter 2 with just such a case, courtesy of Dana Tulodziecki. For Tulodziecki, the miasma theory of disease delivered very significant explanatory and predictive successes, while being radically false by present lights. Further, even the parts of the theory doing the work to bring about the successes were not at all retained in any successor theory. Thus, Tulodziecki contends, the realist must accept that in this case false theoretical ideas were instrumental in delivering successful explanations and predictions of phenomena.

This theme continues in Chapter 3, with Jonathon Hricko’s study of the discovery of boron. This time the theory in question is Lavoisier’s oxygen theory of acidity. Hricko argues that the theory is not even approximately true, and yet it enjoyed novel predictive success of the kind that has the power to persuade. Like Tulodziecki, Hricko argues that the realist’s divide et impera strategy can’t help—the constituents of the theory doing the work to generate the prediction cannot be interpreted as approximately true, by present lights.

We meet with a different story in Chapter 4, however. Keith Hutchison provides a new case from the history of thermodynamics, concerning the successful prediction that pressure lowers the freezing point of water. Hutchison argues that although, at first, it seems that false theoretical ideas were instrumental in bringing about a novel predictive success, there is no “miracle” here. It is argued that the older Carnot theory and the newer Clausius theory are related in intricate ways, such that in some respects they differ greatly, while in “certain restricted situations” their differences are largely insignificant. Thus the realist may find this case useful in her bid to show how careful we need to be when we draw antirealist morals from the fact that a significantly false theory enjoyed novel predictive success.

Chapter 5 takes a slightly different approach, moving away from the narrow case study. Stathis Psillos surveys a broad sweep of history ranging from Descartes through Newton and Einstein. We have here a “double case study,” considering the Descartes–Newton relationship and the Newton–Einstein relationship. Psillos argues against those who see here examples of dramatic theory-change, instead favoring a limited retentionism consistent with a modest realist position. Crucially, Psillos argues, there are significant differences between the Descartes–Newton relationship and the Newton–Einstein relationship, in line with the restricted and contextual retention of theory predicted by a nuanced, and epistemically modest, contemporary realist position.

Chapter 6—courtesy of Eric Scerri—introduces another new historical case, that once again challenges the realist. This time the theory is John Nicholson’s atomic theory of the early 20th century, which, Scerri argues, “was spectacularly successful in accommodating as well as predicting some spectral lines in the solar corona and in the nebula in Orion’s Belt.” The theory, however, is very significantly false; as Scerri puts it, “almost everything that Nicholson proposed was overturned.” Hence, this case is another useful lesson in the fact that quite radically false scientific theories can achieve novel predictive success, and any contemporary realist position needs to be sensitive to that.

Chapter 7 turns to theories of molecular structure at the turn of the 20th century. Amanda Nichols and Myron Penner show how the “old” Blomstrand–Jørgensen chain theory was able to correctly predict the number of ions that will be dissociated when a molecule undergoes a precipitation reaction. While prima facie a challenge to scientific realism, it is argued that this is a case where the divide et impera strategy succeeds: the success-generating parts of the older theory are retained within the successor, Werner’s coordination theory.

The final contribution to the “History” part of the volume—Chapter 8—concerns molecular spectroscopy, focusing on developments in scientific “knowledge” and understanding throughout the 20th century and right up to the present day. Teru Miyake and George E. Smith take a different approach from the kind of historical case study most commonly found in the realism debate. Siding with van Fraassen on the view that Perrin’s determination of Avogadro’s number, so commonly emphasized by realists, does not prove fruitful for realism, and focusing on diatomic molecules, they emphasize instead the extraordinary amount of evidence that has accumulated after Perrin and over the past ninety years for various theoretical claims concerning such molecules. Taking van Fraassen’s constructive empiricism as a foil, they indicate that, in this case, so-called realists and antirealists may really differ very little when it comes to this area of “scientific knowledge.”

The second part of the volume turns to more general issues and philosophical questions concerning contemporary scientific realist positions. This part kicks off in Chapter 9 with Mario Alai’s analysis of the divide et impera realist strategy: he argues that certain historical cases no longer constitute counterexamples when hypotheses are essential in novel predictions. He proposes refined criteria of essentiality, suggesting that while it may be impossible to identify exactly which components are essential in current theories, recognizing in hindsight those which were not essential in past theories is enough to make the case for deployment realism.

In Chapter 10, Kyle Stanford makes the case that so-called realists and instrumentalists are engaged in a project together. For Stanford, these traditionally diametrically opposed protagonists are now working together to “actively seek to identify, evaluate, and refine candidate indicators of epistemic security for our scientific beliefs.” As philosophy of science, and the realism debate in particular, becomes increasingly “local,” Stanford also argues for “epistemic guidance intermediate in generality” between the sweeping generalizations of 1970s and ’80s realism, and a radical “particularism” where any realist claim should always be specific to one particular theory, or theoretical claim.

Chapter 11 turns to the relationship between the realist’s divide et impera strategy and the structural realist position. James Ladyman argues that structural realism is not a form of selective realism (or at least doesn’t have to be). For Ladyman, structural realism represents a departure from standard scientific realism, not a modification of it. He also argues that scientific realists face ontological questions (not only epistemic ones), and he defends a “real patterns” approach to what he calls the “scale relativity of ontology.” This allows for equally “realist” claims to be made at the level of fundamental physics and at the macroscopic level.

In Chapter 12, Jennifer Jhun explores the realism debate in a different territory. In particular, she considers the possibility of taking a structural realist attitude toward macroeconomic theory. Taking the consumption function as a case study, she argues that a better take on macroeconomic theory involves a compromise between structural realism and instrumentalism. For Jhun, when it comes to economics (at least), “theories are instruments used to find out the truth.”

In Chapter 13, Ludwig Fahrbach considers a prominent antirealist argument against the claim that realism can be defended against the pessimistic meta-induction (PMI) by invoking the exponential growth of scientific evidence. The antirealist response to this defense depends on the claim that realists could have said the same thing in the past. Fahrbach labels this antirealist response the “PMMI,” the pessimistic meta-meta-induction, and his challenge to it focuses on a particular weak spot common to both the traditional PMI and the PMMI. Thus realists unimpressed by the traditional PMI will not be moved by the new PMMI.

Chapter 14 turns to another important aspect of the modern realism debate: the use of “radically false” theoretical assumptions such as infinite limits in many contemporary, highly successful theories. Patricia Palacios and Giovanni Valente note how “infinite idealizations” misrepresent the target system, and sometimes it appears that the introduction of such blatant falsity is necessary to achieve empirical success. Focusing on various examples in physics, such as classical and quantum phase transitions as well as thermodynamically reversible processes, Palacios and Valente propose a realist response to such cases.

Last but not least, in Chapter 15 Anjan Chakravartty tackles the standard model of particle physics, and in particular what the realist might say about representations of fundamental particles. Introducing the realist “tightrope,” Chakravartty discusses the trade-off between committing to too little, and committing to too much. But in the end, he argues, this is a tightrope that is not too thin to walk.

These articles have been carefully collected for this volume over many years, and in particular during the Lyons/Vickers 2014–18 Arts and Humanities Research Council (AHRC) project “Contemporary Scientific Realism and the Challenge from the History of Science.” The project enjoyed seven major events over its lifetime, out of which the fourteen substantive chapters of this volume ultimately grew. The seven events were:

(i) “The History of Chemistry and Scientific Realism,” a two-day workshop held at Indiana University–Purdue University Indianapolis, United States, December 6–7, 2014.

(ii) “The History of Thermodynamics and Scientific Realism,” a one-day workshop held at Durham University, UK, on May 12, 2015.

(iii) “Testing Philosophical Theories against the History of Science,” a one-day workshop held at the Oulu Centre for Theoretical and Philosophical Studies of History, Oulu University, Finland, on September 21, 2015.

(iv) “Quo Vadis Selective Scientific Realism?”—a symposium at the biennial conference of the European Philosophy of Science Association, Düsseldorf, Germany, September 23, 2015.

(v) “Contemporary Scientific Realism and the Challenge from the History of Science,” a three-day conference held at Indiana University–Purdue University Indianapolis, United States, February 19–21, 2016.

(vi) “Quo Vadis Selective Scientific Realism?”—a three-day conference held at Durham University, UK, August 5–7, 2017.

(vii) “The Structure of Scientific Revolutions,” a two-day workshop held at Durham University, UK, October 30–31, 2017.

The editors of this volume owe a great debt to the participants of all of these events, with special thanks to those who walked this path with us a little further to produce the fourteen excellent chapters here presented. We are also grateful to the unsung heroes, the many anonymous reviewers, who not only helped us to select just which among the numerous papers submitted would be included in the volume but also provided thorough feedback to the authors, helping to make each of the chapters that did make the cut even stronger. The volume has been a labor of love; we hope that comes across to the reader.

References

Kitcher, P. (1993). The Advancement of Science. Oxford: Oxford University Press.

Laudan, L. (1981). “A Confutation of Convergent Realism.” Philosophy of Science 48: 19–49.

Psillos, S. (1999). Scientific Realism: How Science Tracks Truth. London: Routledge.

Van Fraassen, B. (1980). The Scientific Image. Oxford: Oxford University Press.

Worrall, J. (1989). “Structural Realism: The Best of Both Worlds?” Dialectica 43: 99–124.

PART I HISTORICAL CASES FOR THE DEBATE

2 Theoretical Continuity, Approximate Truth, and the Pessimistic Meta-Induction: Revisiting the Miasma Theory

Dana Tulodziecki
Department of Philosophy, Purdue University
dtulodzi@purdue.edu

2.1 Introduction

The pessimistic meta-induction (PMI) targets the realist’s claim that a theory’s (approximate) truth is the best explanation for its success. It attempts to undercut the alleged connection between truth and success by arguing that highly successful yet wildly false theories are typical of the history of science. There have been a number of prominent realist responses to the PMI, most notably those of Worrall (1989), Kitcher (1993), and Psillos (1999). All of these responses try to rehabilitate the connection between a theory’s (approximate) truth and its success by attempting to show that there is some kind of continuity between earlier and later theories, structural in the case of Worrall and theoretical/referential in the cases of Kitcher and Psillos, with other responses being variations on one of these three basic themes.1

In this paper, I argue that the extant realist responses to the PMI are inadequate, since there are cases of theories that were both false and highly successful (even by the realist’s own, more stringent criteria for success) but that, nevertheless, do not exhibit any of the continuities that have been suggested by realists as possible candidates for preservation. I will make my case by discussing an example of such a theory: the 19th-century miasma theory of disease. Specifically, I show that this theory made a number of important and successful use-novel predictions, despite the fact that its central theoretical element—miasma—turned out not to exist. After showing that miasma was crucially involved in virtually every successful prediction the miasma theory made, I argue that not only is there no ontological continuity between the miasma theory and its successor, but neither can a case be made for any other kind of continuity, be it in terms of structure, laws, mechanisms, or kind-constitutive properties. I conclude by arguing that realists face problems regardless of whether the miasma theory is approximately true: if it is, the prospects for any kind of substantive realism are dim; if it is not, the miasma case constitutes a strike against the “convergent” part of convergent realism.

1 It is worth pointing out that this is the case even for the (very) different structural realisms that abound. In particular, even ontic structural realism, which differs substantially from Worrall’s epistemic version, shares with Worrall’s approach an emphasis on mathematical continuities among successor theories (see, for example, Ladyman (1998) and French and Ladyman (2011)).

Dana Tulodziecki, Theoretical Continuity, Approximate Truth, and the Pessimistic Meta-Induction: Revisiting the Miasma Theory In: Contemporary Scientific Realism. Edited by: Timothy D. Lyons and Peter Vickers, Oxford University Press. © Oxford University Press 2021. DOI: 10.1093/oso/9780190946814.003.0002

I will proceed as follows: In section 2.2, I briefly outline the PMI and the responses by Worrall, Kitcher, and Psillos, including the more stringent notion of success that Psillos argues is required in order for theories to be genuinely successful. In section 2.3, I outline the most sophisticated version of the miasma theory of disease and show that it ought to be considered a genuinely successful theory, even on Psillos’s own terms. Section 2.4 is concerned with showing that none of the extant realist responses to the PMI can account for the case of the miasma theory. After discussing some objections in section 2.5, I conclude in section 2.6.

2.2 The PMI and Realist Responses

The PMI received its most sophisticated and explicit formulation in Laudan (1981, 1984). The argument’s main target is the Explanationist Defense of Realism, according to which the best explanation for the success of science is the (approximate) truth of our scientific theories (see, for example, Boyd (1981, 1984, 1990)). Anti-realists employ the PMI to undercut this alleged connection between truth and success by pointing to the (in their view) large number of scientific theories that were discarded as false, yet regarded as highly successful. Laudan, for example, provides a list of such theories that includes, among others, the phlogiston theory of chemistry, the caloric theory of heat, and the theory of circular inertia (1984: 121). However, since “our own scientific theories are held to be as much subject to radical conceptual change as past theories” (Hesse 1976: 264), it follows, the anti-realist argues, that the success of our current theories cannot legitimize belief in unobservable entities and mechanisms: just as past theories ended up wrong about their postulates, we might well be wrong about ours.2

Realist responses to the PMI typically come in several stages: first, realists try to winnow down Laudan’s list by including only those theories that are genuinely successful (Psillos 1999: chapter 5). While, according to Laudan, a theory is successful “as long as it has worked reasonably well, that is, so long as it has functioned in a variety of explanatory contexts, has led to several confirmed predictions, and has been of broad explanatory scope” (1984: 110), Psillos argues that “the notion of empirical success should be more rigorous than simply getting the facts right, or telling a story that fits the facts. For any theory (and for that matter, any wild speculation) can be made to fit the facts—and hence to be successful—by simply ‘writing’ the right kind of empirical consequences into it. The notion of empirical success that realists are happy with is such that it includes the generation of novel predictions which are in principle testable” (1999: 100). The specific type of novel prediction that Psillos has in mind is so-called use-novel prediction: “the prediction P of a known fact is use-novel relative to a theory T, if no information about this phenomenon was used in the construction of the theory which predicted it” (101). Once success is understood in these more stringent terms, Psillos claims, Laudan’s list is significantly reduced.

With the list so shortened, the next step of the realist maneuver is to argue that those theories that remain on Laudan’s list ought to be regarded as (approximately) true, since they don’t, in fact, involve the radical discontinuity with later theories that Laudan suggests. Rather than being discarded wholesale during theory-change, it is argued that important elements of discarded theories get retained: according to Worrall (1989, 1994), theories’ mathematical structures are preserved, according to Kitcher (1993) and Psillos (1996, 1999), those parts of past theories that were responsible for their success are. Because these components are preserved in later theories, the argument goes, we ought to regard as approximately true those parts of our current theories that are essentially involved in generating their successes, since those are the parts that will carry over to future theories, just as essential elements from earlier theories were carried over to our own. As I will show in section 2.4, however, none of these strategies will work for the miasma theory: despite the fact that the miasma theory made a number of use-novel predictions, and so counts as genuinely successful by realist standards, none of the elements or structures that were involved in those successes were retained by its successor.3

2 Laudan’s argument is usually construed as a reductio (see Psillos (1996, 1999)). For some recent discussions about how to properly interpret the argument, see Lange (2002), Lewis (2001), and Saatsi (2005).

3 The miasma case is especially significant in view of the markedly different analysis that Saatsi and Vickers (2011) provide of Kirchhoff’s theory of diffraction. Saatsi and Vickers diagnose a specific kind of underdetermination at work in the Kirchhoff case and argue that “it should not be implausible to anyone that given the enormous variation in the nature and methods of scientific theories across the whole spectrum of ‘successful science’, some domains of enquiry can be more prone to this kind of underdetermination than others” (44). In fact, they believe that “witnessing Kirchhoff’s case, there is every reason to expect that from a realist stance we can grasp the features of physics and mathematics that contribute to such differences” (44). As a result, they take themselves to have “argued for the prima facie plausibility” of a realist response that focuses on “showing how the field of theorizing in question is idiosyncratic in relevant respects, so that Kirchhoff’s curious case remains isolated and doesn’t provide the anti-realist with grounds for projectable pessimism” (44). The miasma case, however, is not prone to any of the idiosyncrasies Saatsi and Vickers identify (or any other idiosyncrasies, as far as I can tell).

2.3 The Miasma Theory and Its Successes

The most sophisticated version of the miasma theory saw its heyday in the mid-1800s. The situation with respect to the various accounts of disease at that time was fairly complicated, however, and so it is in some sense misleading to speak of the miasma theory of disease, since there was a whole cluster of related views that went under this label, rather than one easily identifiable position (see, for example, Baldwin (1999), Eyler (2001), Hamlin (2009), Pelling (1978), and Worboys (2000)). However, since all members of that cluster shared basic assumptions about the existence and nature of miasma, I will disregard this complication here, and treat them as one. According to this basic miasma view, diseases were caused and transmitted by a variety of toxic odors (“miasmas”) that themselves were the result of rotting organic matter produced by putrefaction or decomposition. The resulting bad air would then act on individual constitutions to cause one of several diseases (cholera, yellow fever, typhus, etc.), depending on a number of more specific factors. Some of these were thought to be extraneous, such as weather, climate, and humidity, and would affect the nature of the miasmas themselves; others were related directly to the potential victims and thought to render them more or less susceptible to disease, such as their general constitution, moral sense, age, and so on. Lastly, there were a variety of local conditions that could exacerbate the course and severity of the disease, such as overcrowding and bad ventilation.

Although the miasma theory is sometimes contrasted with so-called “contagionist” views of disease (the view that diseases could be transmitted directly from individual to individual), this opposition is also misleading, for two reasons. First, both contagionists and anti-contagionists subscribed to the basic miasmatic assumptions just described; the debate centered not on the existence or nature of miasmas but on whether people themselves were capable of producing additional miasma-like effects and through these “exhalations” directly giving the disease to others. Second, although some diseases were generally accepted as contagious (smallpox, for example), and some were generally held to be non-contagious (malaria), most diseases (yellow fever, cholera, typhoid fever, typhus, and so on) fell somewhere in between these two extremes. Pelling (1978: 9) appropriately terms these diseases the “doubtful diseases”: instead of being straightforwardly contagionist or anti-contagionist about them, people espoused so-called “contingent contagionism” with respect to them, holding that they could manifest as either contagious or non-contagious, depending on the exact circumstances, and, sometimes, even transform from one into the other (see also Hamlin (2009)).

This version of the miasma theory was extremely successful with respect to a number of different types of phenomena. Most famous are probably the sanitary successes that it ultimately inspired, but it also managed to provide explanations of disease phenomena that any theory of disease at the time had to accommodate. These included the fact that diseases were known to be highly seasonal, that particular regions (especially marshy ones) were affected particularly harshly, that specific locations (prisons, workhouses, etc.) were often known to suffer worse than their immediate surroundings, that sickness and mortality rates in urban centers were much worse than those in rural areas, and that particular geographical regions and countries were struck much worse by disease than others. The miasma theory managed to explain all of these through its claim that the decomposition and putrefaction of organic material were responsible for producing miasmas. Diseases peaked when conditions for putrefaction were particularly favorable: this was the reason why certain diseases were particularly bad during periods of high temperature and in certain geographical regions (for example, the many fevers in Africa), why urban centers were much more affected than rural areas, and why even specific locations in otherwise more or less healthy areas could be struck (sewage, refuse, and general “filth” would sit around in badly ventilated areas). It should also be noted that miasma theorists were not just making vague or simple-minded pronouncements about stenches producing toxic odors, but embraced very specific and often highly complex accounts of how various materials and conditions gave rise to miasmas: Farr, for example, drew in some detail on Liebig’s chemical explanations (1852, lxxx–lxxxiii; see also Pelling (1978: chapters 3 and 4) and Tulodziecki (2016)).
Moreover, there were debates about exactly what sorts of materials were prime for potential miasmas, such as debates about various sources of vegetable vs. animal matter, and so on. Since, however, I cannot do justice to the details of these accounts and their corresponding successes here, I will focus on a somewhat simpler, yet particularly striking, example of use-novel success, while merely noting that others could be given.

The example in question is that of William Farr’s (then famous) elevation law (1852). Farr (1807–83), although not himself a physician, was viewed as an authority on infectious diseases in mid-1800s Britain. Among the positions he held were those of Statistical Superintendent of the General Register Office and member of the Committee of Scientific Inquiries. Through his various writings, especially those on the major British cholera epidemics, he established his credentials as a medical authority.4

As we have seen, according to the miasma theory, any decomposing organic material could in principle give rise to miasmas. However, it was thought that the soil at low elevations, especially around riverbanks, was a particularly good source for producing highly concentrated miasma, since such soil held plenty of organic material and the conditions for putrefaction were particularly favorable. It thus followed directly from the miasma theory that, if miasmas were really produced in the manner described and responsible for disease, the concentration of noxious odors ought to be higher closer to such sources and dissipate with increasing distance. Correspondingly, it was to be expected that mortality and sickness rates would be higher in close proximity to sources of miasma, declining as one moved away. Farr confirmed that this was the case, in a number of different ways.

First, he found that “nearly 80 percent of the 53,000 registered cholera deaths in 1849 occurred among four-tenths of the population living on one-seventh of the land area” (Eyler 1979: 114), and, moreover, that “cholera was three times more fatal on the coast than in the interior of the country” (Farr 1852: lii). Furthermore, he found that those deaths that did occur inland were in either seaport districts or close to rivers, noting that “[c]holera reigned wherever it found a dense population on the low alluvial soils of rivers” and that it “was almost invariably most fatal in the port or district lying lowest down the river” (lii). Concerning coastal deaths, he found that the “low-lying towns on the coast were all attacked by cholera,” while the high-lying coast towns “enjoyed as much immunity as the inland towns” (liv). Further, he noted that mortality increased and decreased relative to the size of the port, with smaller ports having lower mortality. The Welsh town of Merthyr-Tydfil constituted an exception to this, having a “naturally” favorable location, yet relatively high mortality. However, Farr also noted that Merthyr-Tydfil had a reputation for filth, with the Health of Towns’ Commissioners’ Report noting that “[f]rom the poorer class of the inhabitants, who constitute the mass of the population, throwing all slops and refuse into the nearest open gutter before their houses, from the impeded courses of such channels, and the scarcity of privies, some parts of town are complete networks of filth emitting noxious exhalations” (liv). In addition, much of the refuse was being carried to the local riverbeds, with the result that “the stench is almost intolerable in many places” (lv). In one area, close to the river, an “open, stinking, and nearly stagnant gutter, into which the house refuse is as usual flung, moves slowly before the doors” (lvi).

4 For more on Farr’s life and ideas, see Eyler (1979).

All of these phenomena were exactly what was to be expected on the miasma theory: wherever there was disease, one ought to be able to trace it back to miasmatic conditions and, similarly, wherever conditions particularly favorable to decomposition were to be found, disease ought to be rampant. If there were exceptions to the general rule, such as towns like Merthyr-Tydfil that were not naturally vulnerable, one ought to be able to find alternative sources of miasma without difficulty. Farr also proceeded to check these results against a variety of data from different countries and concerning different diseases, and further confirmed what he had found. Moreover, the miasma theory did not merely accommodate these findings; rather, all of the phenomena followed naturally from the account.

In addition, while the miasma theory made predictions about what areas ought to be affected by cholera and to what degrees, for example, none of these predictions were confirmed until Farr analyzed the data from the General Board of Health in the late 1840s (indeed, since much of this data was not collected until shortly before that time, it would have been impossible to confirm these predictions until then). Thus, since it was not even clearly known to what extent these predictions were borne out, they could not have played a role in formulating the miasma theory in the first place and, so, ought to count as use-novel. One might object that, use-novel or not, these predictions were simply too vague to qualify a theory as genuinely successful in the realist sense. I will note, however, that (i) the predictions were as specific as a theory of this type would allow, (ii) the predictions were no more vague than Snow’s later predictions about what ought to be expected on the (correct) assumption that cholera was waterborne (see Snow 1855a, 1855b) and, so, if one regards Snow’s predictions as successful, one ought also to regard the miasmatic predictions as successful, and (iii) miasmatists actually did a lot better than this (and, indeed, better than Snow ever did) by providing some detailed and quantifiable results.

Farr’s elevation law is a particularly striking example of this and it is to this that I will turn now. Farr’s law related cholera mortality to the elevation of the soil. However, Farr did not just predict that there ought to be a relationship between these two variables, but upon analyzing more than 300 pages of data, he found that the “mortality from cholera is in the inverse ratio of the elevation” (Farr 1852: lxi). Farr grouped the various London districts by their altitude above the Thames high water mark, in brackets of 20 feet (0–20, 20–40, and so on) and was able to capture the exact relation between the decline of cholera and increased soil elevation in the form of an equation. Specifically, he found that:

The mortality from cholera on the ground under 20 feet high being represented by 1, the relative mortality in each successive terrace [i.e. the terraces numbered “2,” “3,” etc.] is represented by ½ [for terrace 2], ⅓ [for terrace 3], ¼, ⅕, ⅙: or the mortality on each successive elevation is ½, ⅔, ¾, ⅘, ⅚, &c. of the mortality on the terrace immediately below it. (ibid.: lxiii; see Figure 2.1)

Figure 2.1   Farr, William. (1852). “Report on the Mortality of Cholera in England, 1848–49.” London: Printed by W. Clowes, for H.M.S.O., 1852, p. lxii.

He then generalized this result: “Let e be any elevation within the observed limits 0 and 350, and c be the average rate of mortality from cholera at that elevation; also let eʹ be any higher elevation, and cʹ the mortality at that higher elevation” (lxiii). Then, adding a as a constant, Farr found that the formula

c = c′ × (e′ + a)/(e + a)

represented the decreasing series he obtained when considering the mortality from cholera for districts with specific mean elevations. Farr then calculated the expected series according to the formula, compared it to the actual series recorded in London, and found remarkable agreement (see Figure 2.2, lxiii).
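The arithmetic behind Farr’s law can be sketched briefly. On the purely illustrative assumptions (mine, not Farr’s) that we take the midpoint of each 20-foot bracket as its elevation and set the constant a to 10, the formula c = c′ × (e′ + a)/(e + a) reproduces both the 1, ½, ⅓, ¼, … terrace series and the successive ratios ½, ⅔, ¾, … quoted above:

```python
# Illustrative sketch of Farr's elevation law: c = c' * (e' + a) / (e + a),
# i.e. mortality is inversely proportional to (elevation + a).
# The constant a = 10 and the 20-foot bracket midpoints (10, 30, 50, ...)
# are hypothetical choices made here to reproduce the series in the text.

def farr_mortality(e, e_ref=10.0, c_ref=1.0, a=10.0):
    """Predicted cholera mortality at elevation e (feet), scaled so that
    the lowest terrace (midpoint e_ref) has mortality c_ref."""
    return c_ref * (e_ref + a) / (e + a)

midpoints = [10, 30, 50, 70, 90, 110]  # midpoints of the 20-ft brackets
series = [farr_mortality(e) for e in midpoints]
# series is 1, 1/2, 1/3, 1/4, 1/5, 1/6, and each terrace's mortality is
# 1/2, 2/3, 3/4, ... of that of the terrace immediately below it.
```

On Farr’s actual data, of course, the constant and the reference mortality would be fitted to the observed series rather than stipulated in advance.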

Seeking further confirmation, Farr immediately proceeded to “submit the principle to another test, by comparing the elevation and the mortality from cholera of each sub-district,” and found that this “entirely confirms the announced law” (xv–xvi).5 Trying to illustrate just how good Farr’s numbers were and how convincing they must have seemed, Langmuir (1961: 174) plotted Farr’s result about cholera mortality in the various London subdistricts, grouped by elevation (Figure 2.3). According to Langmuir, Farr had “found a confirmation that I believe would be impressive to any scientist at any time” (173).

5 The tables appear on pp. clxvi–ix of Farr’s Report (1852).

Figure 2.2   Farr, William. (1852). “Report on the Mortality of Cholera in England, 1848–49.” London: Printed by W. Clowes, for H.M.S.O., 1852, p. lxiii.

Even more remarkably, it turned out that Farr’s predictions did not just hold for the (sub-)districts of London, but were also confirmed by others in different regions. For example, “William Duncan, Medical Officer of Health for Liverpool, wrote that when he grouped the districts of his city by elevation as Farr had done, that cholera mortality in the last epidemic obeyed Farr’s elevation law for Liverpool as well” (Eyler 1979: 228).

Now, these predictions of Farr’s were certainly use-novel: Farr was predicting new phenomena that were hitherto unknown and that were later borne out by a variety of data from different regions, in different contexts, and from different times. Moreover, Farr’s law clearly could not have played a role in the construction of the miasma theory since, first, it was obviously not even formulated by then, but, second, even the data on which the law was based (the statistics from the General Register Office, collected on Farr’s initiative) did not exist and, indeed, in the case of Duncan in Liverpool, no one had even thought about collecting the relevant information. In short: it followed from Farr’s law that cholera mortality and soil elevation ought to exhibit a specific relation that was then found to occur in the various sub-districts of London and various other parts of the country.
