Proceedings of the 42nd Hawaii International Conference on System Sciences - 2009
A simple model for the reliability of an infrastructure system controlled by agents

B. A. Carreras, BACV Solutions, Oak Ridge, TN 37831, bacv@comcast.net
D. E. Newman, Physics Department, University of Alaska, Fairbanks, AK 99775, ffden@uaf.edu
Abstract

A simple dynamic model of agent operation of an infrastructure system is presented. The system evolves over a long time scale through a daily increase in consumer demand, which raises the overall load on the system, and through an engineering response to failures that upgrades the failed components. The system is controlled by adjusting the component upgrade rate and the component replacement time. Two agents operate the system. Their behavior is characterized by their risk-averse or risk-taking attitudes in operating the system, their response to large events, and the effect of learning time on their adaptation to new conditions. Risk-averse operation reduces the frequency of failure events and the number of failures per unit time. However, risk aversion increases the probability of extreme events.
1. Introduction

Infrastructure systems suffer rare, non-periodic, large-scale breakdowns that cause large economic and other losses to the community and can sometimes cause personal injury and even loss of life. The initiating causes of these events are very diverse, ranging from weather, to malfunction of system components, to willful acts. Whether intentional or not, these events can threaten national security. Examples of such extreme events include the August 14th, 2003 blackout in the northeastern United States and the consequences of Hurricane Katrina in the New Orleans area. The underlying cause of this behavior is that infrastructures are pushed to their capability limits by continuously rising consumer demand and by the economic constraints upon them. Infrastructures therefore normally operate close to a critical point where events of all sizes are possible.
I. Dobson, ECE Department, University of Wisconsin, Madison, WI 53706, dobson@engr.wisc.edu
Matthew Zeidenberg, Teachers College, Columbia University, New York, NY 10027, zeidenberg@tc.columbia.edu
Present infrastructure systems are complex technological systems for which such extreme events are "normal accidents" [1]. As Perrow indicates, such normal accidents are characteristic of these systems, and it is not possible to eliminate them. Additionally, these extreme events tend to generate a risk-averse attitude in the people managing and operating infrastructure systems. This change in attitude, in turn, modifies the probabilities of occurrence of such events. Some negative consequences of risk-averse operation of complex systems have been explored by Bhatt et al. [2], who used a model of failure propagation inspired by how forest fires spread [3]. Altmann et al. [4], using a model of human reactions to river floods, have shown that the commonly employed method of fighting extreme events by changing protection barriers in reaction to them is generally less efficient than the use of constant barriers to contain them.

In this paper, we use a simple model of infrastructures to further explore some of the consequences of such changes in operational attitude toward extreme events. We explore a range of operator behavior, varying from risk-taking to risk-averse operation. Initial results of our studies were published in [5].

In studying infrastructures, we can treat them either as static systems with external forcing or as dynamical systems. As static systems, that is, with external forcing fixed at a given time, they can be tuned to be closer to or farther from a critical point at which they reach their operational limits. As dynamical systems under the constant pressure of increasing consumer demand and economic constraints, they are pushed continually, in a self-consistent manner, toward this critical point.
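The dynamical picture just described, in which slowly growing demand pushes components against their capacity limits while an engineering response upgrades components that fail, can be illustrated with a toy simulation. The sketch below is not the authors' model; all parameters (growth rate, upgrade factor, load distribution) are illustrative assumptions, chosen only to show how failures emerge once demand growth drives the system toward its limits.

```python
import random

def simulate(days=1000, n=100, demand_growth=1.001,
             upgrade_factor=1.05, seed=0):
    """Toy sketch (not the paper's model): n components share a total
    demand that grows daily; a component whose load exceeds its capacity
    fails and is then upgraded, mimicking the engineering response."""
    rng = random.Random(seed)
    capacity = [1.0] * n            # initial component capacities
    demand = 0.5 * n                # total demand starts well below limits
    failures_per_day = []
    for _ in range(days):
        demand *= demand_growth     # slow daily growth in consumer demand
        # distribute demand unevenly so occasional overloads occur
        loads = [demand / n * rng.uniform(0.5, 1.5) for _ in range(n)]
        failed = [i for i, (l, c) in enumerate(zip(loads, capacity)) if l > c]
        for i in failed:
            capacity[i] *= upgrade_factor   # upgrade each failed component
        failures_per_day.append(len(failed))
    return failures_per_day

history = simulate()
```

Early in the run, loads stay below capacity and no failures occur; as demand grows, overloads appear and the upgrade response begins to track them, which is the self-consistent push toward the critical point that the text describes.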
Although treating an infrastructure as a static system lacks realism, it has the advantage of allowing us to better understand how the system performs as it approaches this critical point, and how policies that may be reasonable away from the critical point can become dangerous as the system nears its operational limits.
978-0-7695-3450-3/09 $25.00 © 2009 IEEE