CALCULATED SURPRISES
OXFORD STUDIES IN PHILOSOPHY OF SCIENCE
General Editor: Paul Humphreys, University of Virginia
Advisory Board
Anouk Barberousse (European Editor)
Robert W. Batterman
Jeremy Butterfield
Peter Galison
Philip Kitcher
Margaret Morrison
James Woodward
Science, Truth, and Democracy
Philip Kitcher
The Devil in the Details: Asymptotic Reasoning in Explanation, Reduction, and Emergence
Robert W. Batterman
Science and Partial Truth: A Unitary Approach to Models and Scientific Reasoning
Newton C. A. da Costa and Steven French
The Book of Evidence
Peter Achinstein
Inventing Temperature: Measurement and Scientific Progress
Hasok Chang
The Reign of Relativity: Philosophy in Physics 1915–1925
Thomas Ryckman
Inconsistency, Asymmetry, and Non-Locality: A Philosophical Investigation of Classical Electrodynamics
Mathias Frisch
Making Things Happen: A Theory of Causal Explanation
James Woodward
Mathematics and Scientific Representation
Christopher Pincock
Simulation and Similarity: Using Models to Understand the World
Michael Weisberg
Systematicity: The Nature of Science
Paul Hoyningen-Huene
Causation and Its Basis in Fundamental Physics
Douglas Kutach
Reconstructing Reality: Models, Mathematics, and Simulations
Margaret Morrison
The Ant Trap: Rebuilding the Foundations of the Social Sciences
Brian Epstein
Understanding Scientific Understanding
Henk de Regt
The Philosophy of Science: A Companion
Anouk Barberousse, Denis Bonnay, and Mikael Cozic
Calculated Surprises: A Philosophy of Computer Simulation
Johannes Lenhard
Introduction
What is calculable? This is not a new question—astronomical phenomena were being handled mathematically even in Antiquity. However, it became particularly relevant with the emergence of modern science, where it was often related to the insight that motions on our planet follow mathematical regularities (Galileo) and that one can even formulate universal laws of nature that make no distinction between the physics of the heavens and that of the earth (Newton).
This approach of modeling something mathematically and thereby making it calculable was a major turning point in the emergence of our scientific world.1 This is where computer simulation comes in, because it offers new recipes with which anything can be made calculable.
However, is computer simulation simply a continuation of existing forms of mathematical modeling with a new instrument—the digital computer? Or is it leading to a transformation of mathematical modeling? The present book addresses these questions. To extend the recipe metaphor, what is new about simulation has more to do with the recipe than with the actual ingredients. Nonetheless, anybody who is willing to try out a new recipe may well have to face a calculated surprise. Georg Christoph Lichtenberg (1742–1799) was already thinking along these lines when he wrote the following in one of his Sudelbücher (Waste books):
How many ideas float about dispersed in my mind of which many pairs, if they were ever to meet, could produce the greatest discoveries. But they lie as far apart as the sulfur of Goslar from the saltpeter of the East Indies and the charcoal piles of the Eichsfeld, which when combined would produce gunpowder. How long must the ingredients of gunpowder have existed before gunpowder!
(Lichtenberg, 2012, pp. 159–160, Waste book K 308)
1. The classic accounts by Koyré (1968), Dijksterhuis (1961), or Burtt (1964) emphasize a standpoint viewing mathematization as the central feature of modern physics.
Admittedly, there are also numerous recipe variants that are hardly worth discussing at all. Is computer simulation an excitingly new and powerful mixture or just a minor variation? A conclusive answer to this question requires a philosophical analysis that not only explores current forms of the scientific practice of computer modeling but also integrates the historical dimension of mathematical modeling. This is the only way to determine which continuities and differences are actually of philosophical interest. The key thesis in this book is that computer and simulation modeling really do form a new type of mathematical modeling.
This thesis consists of a twofold statement: first, that simulation reveals new, philosophically relevant aspects; and second, that these new aspects do not just stand unconnected alongside each other but also combine in ways that justify talking about simulation as a “type.” Both elements of this thesis are controversial in the philosophy of science. One side completely denies any philosophical newness; another side emphasizes individual aspects as characteristics that—according to my thesis—reveal their actual type only when combined.
The widespread skepticism regarding any novelty is based largely on a misinterpretation that views the computer as a logical-mathematical machine which merely seems to strengthen existing trends toward mathematization and formalization. Viewing the amazing speed of computation as the main defining feature of the computer suggests a premise along the lines of “everything as before, just faster and more of it.” Such a standpoint ignores the philosophically relevant newness of simulation modeling right from the start and fails to recognize the transformative potential of computer modeling.
The far-reaching popularity of this standpoint is explained in part by the history of the use of the computer as a scientific instrument. Even though the computer has by now achieved an unparalleled rate of penetration and its use today is a matter of course in (amazingly) many sciences, the situation was completely different in the early decades.
In the pioneering days of the computer, a number of methods were developed that often laid the foundations for simulation procedures that have since come into common use. These include, for example, the Monte Carlo method (Metropolis & Ulam, 1949; von Neumann & Richtmyer, 1947) and the closely related Markov chain Monte Carlo method (Metropolis et al., 1953), cellular automata (Rosenblueth & Wiener, 1945; von Neumann, 1951), and artificial neural networks (McCulloch & Pitts, 1943). Other methods such as finite differences also received much attention and refinement through the application of computers. However, these contributions were of a more programmatic nature, often ingenious suggestions for developing computer methods and techniques. Naturally, they were far ahead of the scientific practice of their time.
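To give readers unfamiliar with these methods a concrete impression, the following deliberately minimal sketch in Python (my own toy illustration, not drawn from the historical sources just cited) shows the elementary idea behind Monte Carlo estimation: a quantity, here the area of the quarter unit circle and hence π, is approximated by random sampling rather than by analytical calculation.

import random

def estimate_pi(n_samples: int = 100_000, seed: int = 1) -> float:
    """Estimate pi by drawing random points in the unit square and
    counting how many fall inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The hit ratio approximates the area of the quarter circle, pi/4.
    return 4.0 * inside / n_samples

print(estimate_pi())  # prints a value close to 3.14

The point of the sketch is merely that the computer substitutes massed random repetition for analytical derivation; the methods cited above are, of course, far more sophisticated.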
At that time, a computer was an exotic and enormous machine, a kind of calculating monster requiring much care and attention, and it was viewed with some skepticism by mathematically schooled natural scientists. Few fields had a need for extensive calculations, and these were also viewed as a “brute-force” alternative to a theoretically more advanced mathematization. Although the study of computer methods might not have been thought to reduce scientists to the rank of menial number crunchers, it did at least seem to distract them from more valuable mathematical modeling.
It took two further decades and the use of mainframe computers at computer centers throughout the world for a simulation practice to develop that would involve a significant number of researchers. Simulation now appeared on the scientific agenda, as witnessed by, for example, the annual Winter Simulation Conferences—international meetings of scientists from various disciplines who define themselves as both developers and users of simulation methods.
The need to consolidate the work of the growing group of computer modelers resulted in a few isolated works on computer simulation. These viewed the computer as being fully established as a new tool that would secure its own place in the canon of scientific methods. An early example is the telling question posed in the title of Robert Brennan’s (1968) Simulation Is Wha-a-t? However, within less than a decade, Bernard Zeigler’s book
Theory of Modelling and Simulation (1976) had already become a standard reference.
Starting roughly in the mid-1980s, computer-based modeling spread rapidly. The decisive factors underlying this development seem to have been the readily available computing capacity and the extended possibilities of interaction based on new visualization technologies. At the same time, semantically advanced languages were developed that opened up computer modeling to a large group of scientists who were not experts in machine languages. This development merged epistemological and technological aspects. With a rapidly growing store of available programs and program packages, science joined industry in making simulation and computer modeling a standard work approach. The topic of simulation was now visible from the “outside,” as seen in the regular reporting on simulation-related topics in popular science magazines such as Scientific American.
One could expect this to have led to more analyses addressing the topic of simulation in the philosophy of science. However, that was not the case. I believe this is because the long-established view of the computer as a logical machine, mentioned above, continued to suggest that its breadth and limitations had been determined once and for all, so to speak, by its logical architecture. My thesis, in contrast, is that mathematical modeling has undergone a fundamental change. To back up this diagnosis, I shall take examples from scientific practice and evaluate their methodological and epistemological implications. These examples assign a central role to modeling as an autonomous process that mediates between theory and intended applications. I shall discuss this further in this introduction.
First, however, I wish to return to the second part of my key thesis that simulation is not determined by one particular new feature or by the quantity of its epistemological and methodological features but, rather, by the ways in which these features are combined. Here, I can draw on the existing literature that has analyzed such characteristics of simulation.
Simulations only began to attract the attention of the philosophy of science from about 1990 onward. Paul Humphreys’s and Fritz Rohrlich’s contributions to the 1990 Philosophy of Science Association (PSA)
conference can be seen as early indicators of this gradual growth of interest. For roughly two decades now, there has been a real growth in the literature on simulation in the philosophy of science. Modeling and simulation have become regular topics in specialist journals, and they have their own series of conferences. There are now also publications on various aspects of simulation and computer modeling that address a broader audience, from authors such as Casti (1997) or Mainzer (2003). However, the first monograph to take a philosophy of science approach to computer simulation came from Paul Humphreys (2004). It has been followed more recently by the books of Winsberg (2010), Weisberg (2013), and Morrison (2014), which offer illuminating perspectives on the topic. A knowledgeably commented overview of the most important contributions to the discussion in the philosophy of science can be found in Humphreys (2012).
An important starting point for the present work is the recent debate on models in the philosophy of science. This debate arose from the controversy over what role the theories and laws of the natural sciences play when it comes to applications.2 Models, it has now been generally acknowledged for roughly a decade, play an important role by mediating between theories, laws, phenomena, data, and applications. Probably the most prominent presentation of this perspective can be found in Morgan and Morrison’s edited book Models as Mediators (1999), which continues to be recognized as a standard work.
This model debate delivers an appropriate framework for simulation. I agree particularly with Margaret Morrison (1999) when she describes models as “autonomous agents.” This role, I shall argue, is even taken to an extreme in simulation modeling. Their instrumental components, along with their more comprehensive modeling steps, make simulation models even less dependent on theory; they have become, so to speak, more autonomous. The increasing differentiation of the model debate suggests that philosophical, historical, and sociological aspects could be considered together. The books edited by Gramelsberger (2011), Knuuttila et al. (2006), Lenhard et al. (2006), and Sismondo and Gissis (1999) offer a broad spectrum of approaches to this debate.
2. Cartwright (1983) can be viewed as an important instigator of this controversy, which I shall not be discussing further here.
The next question to ask is: Which particular features characterize simulations? The answers to be found in the contemporary literature take different directions. The largest faction takes the most important feature to be simulation experiments, often combined with viewing simulation as neither (empirical) experiment nor theory. Members of this faction include Axelrod (1997), Dowling (1999), Galison (1996), Gramelsberger (2010), and Humphreys (1994). Others consider the outstanding feature to be increasing complexity. Simulation and computer methods in general are treated as an adequate means of studying a complex world or, at least, complex phenomena within that world. Back in 1962, Herbert Simon already articulated the corresponding vision that has come to be viewed as standard since the 1990s: “Previously we had no models or frames or any set of mathematical equations for dealing with situations that combined two properties: a large number of individual parts, that is, degrees of freedom, and a complex structure regulating their interaction” (quoted in Braitenberg & Hosp, 1995, p. 7, translated). Closely related to this is the new syntax of mathematics—that is, the form of models tailored to fit computers, which moves away from the previously standard mathematical analysis. Instead, simulation draws strongly on visualizations (see, e.g., Rohrlich, 1991).
The emphasis on single characteristics is generally accompanied by a trend toward specialization. Most studies restrict their scope in advance to a specific modeling technique. For example, Galison’s (1996) claim that simulation represents a tertium quid between experiment and theory is based on a study of the early Monte Carlo method; Rohrlich (1991) and Keller (2003) see the true potential of simulation as being linked to cellular automata; whereas Humphreys (1991, 2004) and Winsberg (2003) basically derive their propositions from the method of finite differences. Others derive characteristics from the ways in which modeling techniques have become established in specific disciplines. For example, Lehtinen and Kuorikoski (2007) address agent-based models in economics, whereas Heymann and Kragh (2010) along with Gramelsberger and Feichter (2011) focus on the construction and use of climate models.
Without wishing to question the advantages of such a focused and particularistic approach, I wish to orient the present book toward more fundamental commonalities that hold for a large class of modeling techniques. Indeed, a thesis proposing a new type of mathematical modeling needs to possess a broad validity. Therefore, I place great value on basing my argumentation across the following chapters on a range of different modeling techniques and illustrating these with a series of examples from different scientific fields.
However, the task of characterizing simulation both epistemologically and methodologically raises a problem similar to that of having to steer one’s way between Scylla and Charybdis: a too general perspective that views simulation as merging completely into mathematical modeling—and purports to see the computer as an instrument without any decisive function—is bound to be unproductive because the specifics become lost. Simulation would then seem to be nothing new—merely the mathematical modeling we have already known for centuries presented in a new garb. On the other hand, although a too detailed perspective would cast light on a host of innovations, these would remain disparate and end up presenting simulation as a compendium of novelties lacking not only coherence but also any deeper relevance for the philosophy of science. Accordingly, one challenge for the present study is to select both an adequate range and an adequate depth of field.
Hence, my starting point is determined in two ways: First, I focus attention on the process of modeling, on the dynamics of construction and modification. Second, I take mathematical modeling as a counterfoil. Here, I am indebted mostly to the work of Humphreys (1991, 2004), which convinced me of the value of this counterfoil approach. My main thesis is that by taking the form of simulation modeling, mathematical modeling is transformed into a new mode. Or, to put it another way, simulation modeling is a new type of mathematical modeling. On the one hand, this means that simulation is counted among the established classical and modern forms of mathematical modeling. On the other hand, a more precise analysis reveals how the major inherent properties of simulation modeling contribute to a novel explorative and iterative mode of modeling characterized by the ways in which simulation models are constructed and fitted.
Hence, as stated, I shall try to steer my way between Scylla and Charybdis—that is, to avoid taking a perspective that is too general because it would then be impossible to appreciate the characteristics of simulation, but to simultaneously avoid selecting a focus that is too specific because this would make it necessary to address each different technique such as Monte Carlo or cellular automata separately. Admittedly, this will inevitably overlook several interesting special topics such as the role of stochasticity (implemented by a deterministic machine)—but even Odysseus had to sacrifice six of his crew.
What is it that makes simulation modeling a new type of modeling? First, it has to be seen that this question addresses a combination of philosophy of science and technology. Not only can mathematical modeling address the structure of some phenomena as an essential feature of the dynamics in question; it can also apply quantitative procedures to gain access to prediction and control. This general description then has to be modified specifically owing to the fact that the modeling in which I am interested here is performed in close interaction with computer technology. One direction seems self-evident: the (further) development of computers is based primarily on mathematical models. However, the other direction is at least just as important: the computer as an instrument channels mathematical modeling.
Initially, this channeling can be confirmed as an epistemological shift. Traditionally, mathematical modeling has been characterized by its focus on the human subjects who are doing the active modeling in order to gain insight, control, or whatever. Now, an additional technological level has entered into this relationship. Mathematical modeling is now (also) performed with computers; it integrates computers into its activities: the adequacy of models is measured in terms of the phenomena, the intended application, and then the instrument as well. However, this subjects mathematization to a profound change. Throughout this book, it will often become apparent how a dissonance emerges between continuity with the older type of mathematical modeling and the formation of the new type. Simulation has to achieve a sort of balancing act that compensates for the transformations introduced by the computer, making it, so to speak, compatible through additional constructions within the model.
A further aspect of this channeling that needs to be considered here is the special relationship to complexity. In the twentieth century, this term grew into a central concept in the sciences, and it has been shaped decisively by computer-based modeling. Complex systems are considered to lack transparency because their components define the system dynamics only through a multitude of interactions. This somewhat vague paraphrase is rendered more precise by the specific, defined meaning of computational complexity (proceeding from the work of the mathematician Kolmogorov). This complexity sets a limit on computer models in the sense that complex problems either cannot be solved or—depending on the degree of complexity—can be solved only slowly by computational power. Computers consequently stand for an unfettered increase in computational power but are nonetheless restricted to the domain of simple or less complex problems. The groundbreaking insights of the meteorologist Lorenz have led to the recognition of how relevant complex systems are in natural events. Parallel to this—and this is the important point—computer models themselves are complex: complexity characterizes both the object and the instrument.
Broadly speaking, complexity is not restricted to exceptionally large systems that are studied with computer methods. The degree and extent of complexity are far more comprehensive and have permeated into broad areas of simulation modeling itself.
Therefore, studying the dynamics of the system being examined and studying those of the model itself become separate tasks. The specific reflexivity of simulation is documented in the way that this type of mathematical modeling has to mediate between an excess of power measured in “flops” and a multifaceted problem of complexity that limits the range of this power.
In many cases, it is not possible to reduce complexity: this would be neither appropriate for the phenomena concerned nor methodologically feasible. This is what changes the core activity of mathematical modeling. It has less to do with overcoming complexity than with handling it. In other words, simulation modeling is a mode of modeling that balances high complexity with usability—even when this makes it necessary to abandon some of the “classical” virtues of mathematical modeling.
The main characteristics of simulation modeling presented in what follows depend on and mutually reinforce one another. This delivers the stability that enables them to form a new type of modeling. Each of the first four chapters of part I deals with one of the following elements of simulation modeling.
• Experiment and artificiality: This is where the channeling effect of the technological level is revealed, because computers can process only discrete (stepwise and noncontinuous) models. In contrast to traditional mathematical modeling, such models seem artificial. If a continuous theoretical model is available that is then redesigned as a computer model, this discretization results in a flood of local interactions (e.g., between finite elements); a minimal sketch following this list illustrates the point. The model dynamics then have to be observed in a computer experiment. Conversely, in the course of the modeling process, the simulated dynamics can be brought closer to theoretically determined dynamics or to known phenomena using special built-in parameters. This requires the addition of artificial components—that is, components that are introduced for instrumental reasons.
• Visualization and interaction: Exploiting the enormous processing capacity of computers generally requires a powerful interface on which it is possible to prepare very large datasets for researchers in ways that enable them to estimate what they will then need for further stages of construction. Visualization offers an excellent possibility here, because of the quite amazing human capacities in this field. When the modeling process calls for the repeated running of an adjustment loop, this interface is correspondingly of great importance.
• Plasticity: This describes a property of models. Some classes of models intentionally use a weak structure in order to exploit not their fit but their ability to fit. One can then talk about structural underdetermination. In other model classes, the dynamics are strongly codetermined by a theoretical structure. Even when such a model cannot then be taken as structurally underdetermined, plasticity can still be an important feature. It then draws on the effects that arise from the ways in which the (artificial) parameters are set. The more flexible a model is, the more significant is the phase of modeling during which the parameters are adjusted. Put briefly, the plasticity of a model is proportional to the breadth of iterative and exploratory adjustment during the modeling.
• Epistemic opacity: This kind of opacity has several sources. By their very definition, complex models are nontransparent. Things are similar for case distinctions that are very simple for a computer to process algorithmically but practically impossible for human beings to follow. Moreover, the artificial elements discussed in chapter 1 contribute to opacity; these are added chiefly for instrumental reasons and in response to the performance observed—that is, for reasons that are not theoretically motivated. The value of such elements becomes apparent only during the further course of the modeling process. This lack of insight into the model dynamics is a fundamental difference from traditional mathematization, which promises to produce transparency through idealization. Things are channeled differently in simulation modeling, where the iterative sounding out of the model dynamics serves as a methodological surrogate for intelligibility.
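The discretization mentioned in the first item above can be made tangible with a minimal sketch in Python (again my own toy example, not one of the case studies treated later): the one-dimensional heat equation, a continuous model, is replaced by an explicit finite-difference scheme whose dynamics unfold through purely local interactions and can only be found out by running them step by step.

def step(u, alpha, dx, dt):
    """Advance the discretized heat equation u_t = alpha * u_xx by one
    explicit time step; each interior point is updated from its immediate
    neighbors only, i.e., through purely local interactions."""
    r = alpha * dt / (dx * dx)  # should stay <= 0.5 for numerical stability
    new_u = u[:]
    for i in range(1, len(u) - 1):
        new_u[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new_u

# A single hot cell in the middle of an otherwise cold rod.
u = [0.0] * 21
u[10] = 1.0
for _ in range(100):  # observe the dynamics step by step, as in a computer experiment
    u = step(u, alpha=1.0, dx=0.1, dt=0.004)
print([round(v, 3) for v in u])

Nothing in this toy program is specific to any scientific case; it merely shows how continuity is traded for a grid of stepwise, local interactions whose global behavior has to be observed rather than derived.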
Chapter 5 summarizes how these components fit together to form an iterative and explorative mode of modeling. Even though the proportions of these components may vary from case to case, it is their combination that confirms simulation modeling to be an independent and novel type. Simulation models develop their own substantial dynamics, which are not (just) derived from the theories fed into them but are (also) essentially a product of the conditions of their construction. The emphasis on these intrinsic dynamics and the way the construction mediates between empirical data and theoretical concepts place computer simulation in the framework of a Kantian epistemology.
I agree with Humphreys (2009) when he refers to the concept of the “style of reasoning” in categorizing simulation.3 Based on the results discussed in this book, simulation modeling has to be specified as a combinatorial style of reasoning. The new mode of modeling is having—I wish to argue further—a major influence on the progress of science; namely, it is pointing to a convergence of the natural sciences and engineering. This convergence involves instrumental and pragmatic approaches focusing on predictions and interventions. Simulation modeling brings the natural sciences and engineering into a systematic proximity.
Part II is dedicated to the conceptual shifts triggered by the establishment of the new mode or style. The extent of the transformative power that simulation and computer modeling are exerting on traditional mathematical modeling can be pursued further on the basis of two concepts. Chapter 6 discusses the concept of solution, which is actually formulated very clearly with respect to an equation or, at least, to a mathematically formulated problem. In the course of simulation, the talk is of a numerical solution, which is not a solution at all in the older, stricter sense but, rather, something that has to be conceived in a highly pragmatic way. The concept of solution is then no longer defined according to criteria coming purely from within mathematics, but instead according to whether the numerical “solution” functions adequately relative to some practical purpose. I shall draw on the dispute between Norbert Wiener and John von Neumann when discussing the dissonance between a standpoint that views numerical solutions as being derived from mathematical ones and the opposite position that regards numerical solutions as existing in their own right.
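The conceptual point can be previewed with a toy illustration in Python (mine, not an example taken from chapter 6): the equation x² = 2 has the exact solution √2, whereas the computer delivers an iteratively refined approximation that is accepted once it is adequate for the purpose at hand.

def newton_sqrt(a: float, tol: float = 1e-12) -> float:
    """Approximate the square root of a by Newton's iteration.
    The loop stops when the result is good enough (within tol),
    not when an exact solution has been reached."""
    x = a  # initial guess
    while abs(x * x - a) > tol:
        x = 0.5 * (x + a / x)
    return x

print(newton_sqrt(2.0))  # 1.4142135623730951: a pragmatic, not an exact, solution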
When it comes to applications, one central problem is how to validate a model. Simulation represents a kind of extreme case regarding both the number of modeling steps and the complexity of a model's dynamics. Does this create new validation problems? I address this issue in chapter 7, in which I present a two-layered answer. On the first layer, the validation problem remains the same: the extent of the modeling simply makes it more precarious. On the second layer, a conceptual problem emerges: simulation reaches the limits of analysis in complex cases, and this leads to the problem of confirmation holism. A key concept here is modularity. Organizing complex tasks into modules is seen as the fundamental strategy when working with complex designs. I argue that the modularity of simulation models erodes for systematic reasons. As modeling proceeds, submodels that were clearly circumscribed in the planning phase almost inevitably start to overlap more and more. The practices of parameterization and kluging are analyzed as principal drivers of this erosion. These limit the depth of the analysis of model behavior; that is, under some circumstances, it may no longer be possible to trace individual features of behavior back to individual modeling assumptions. The only method of validation this leaves is a (stricter) test on the level of the model's behavior—as performed in technological testing procedures.
3. The differentiation of scientific thinking styles goes back to Alistair Crombie (1988; see also, for more detail, 1994). Ian Hacking specified the concept more rigorously as “style of reasoning” (1992). The main point is that such a style first specifies “what it is to reason rightly” (Hacking, 1992, p. 10). “Hypothetical modeling” is one of the six styles identified by Crombie.
Part III, which consists of the final chapter 8, addresses two issues. First, it draws a conclusion from the autonomy of simulation modeling and asks what this means in terms of novelty. Based on my previous analyses, I wish to defend a moderate position. The newness of simulation modeling consists essentially in the ways in which it constructs mathematical models and operates with them. This is the source of its potential as a “combinatorial style of reasoning.” At the same time, I reject the exaggerated demands and claims often linked to the topic of simulation—for example, that modern science nowadays refers only to a world in silico, or that simulation has finally given us a tool with which to unveil complexity. Simulation modeling functions in an explorative and iterative mode that includes repeated revision of the results. Or, to put it another way, what is certain about simulation is its provisional nature.
Second, chapter 8 offers an outlook for a philosophical and historical thesis on the relation between science and technology. Chapter 5 already related the mode of simulation modeling to the convergence of the natural sciences and engineering. It is particularly the types of mathematization striven toward in both domains that are drawing closer together. This
movement affects the concept of scientific rationality. I illustrate this claim by looking at the divide between a realist and an instrumentalist standpoint. Does science flourish by subscribing to theoretical insight or, rather, to material manipulation? Often, this divide is treated as irreconcilable. Simulation modeling, however, changes the picture because it intertwines theory, mathematics, and technological instrumentation, thus making the divide look much less severe and significant. Finally, the analysis of simulation modeling suggests the need to rethink what constitutes modern scientific rationality.