Control and Dynamical Systems, Caltech

Overview of CDS applications in networks, biology, and physics

Professor John Doyle, Control and Dynamical Systems, California Institute of Technology

Tuesday, September 26, 2000
1:00 PM to 2:00 PM
Steele 102

Through design or evolution, complex systems in engineering and biology develop highly structured, elaborate internal configurations, with layers of feedback and signaling. This makes them robust to the uncertainties in their environment and components for which such complexity was selected, but it also makes the resulting system potentially vulnerable to rare or unanticipated perturbations. Such fragility can lead to large cascading failures from tiny initiating events. Perturbation of one gene or a single line of software code, or the introduction of an exotic species or trace amounts of a toxin, rarely causes significant system-wide impact, yet occasionally can cascade into complete system failure.

The traditional view of stability is the robustness of an attractor to uncertainty in initial conditions. It is now well known, however, that in many practical applications, uncertainty due to external disturbances and noise, and to parameter variations and unmodelled dynamics, is more important, often substantially so. In addition, even a system with a single globally, exponentially stable equilibrium can exhibit arbitrarily poor robustness to other kinds of uncertainty. As a result of decades of controls research, there is now a rich variety of theory and tools to analyze and optimize the robustness of control and dynamical systems, and much progress has been made in recent years in classifying the complexity of many problems and in developing practical algorithms. Much of this theory makes rich connections between problems in matrix analysis, operator theory, control, and dynamical systems. While the theory has been motivated by the robust control of complex engineering systems, a variety of new applications in physics and biology have recently emerged, including shear flow turbulence, time irreversibility, measurement and control of quantum systems, and biological regulatory networks. Perhaps the most exciting new application area is the control of communication/computer networks, and their future convergence with other engineering and financial networks.
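
To make the last point concrete, here is a minimal numerical sketch in Python (not drawn from the talk; the particular loop transfer function and parameter values are assumed purely for illustration). The closed loop is exponentially stable for every damping ratio zeta > 0, yet the peak of the sensitivity function, the worst-case amplification of disturbances, grows without bound as zeta shrinks, so nominal stability by itself says almost nothing about robustness.

import numpy as np

def sensitivity_peak(zeta, wn=1.0):
    """Peak of |S(jw)| = |1 / (1 + L(jw))| over a log-spaced frequency grid."""
    w = np.logspace(-2, 2, 20000)           # frequencies, rad/s
    s = 1j * w
    L = wn**2 / (s * (s + 2 * zeta * wn))   # assumed open-loop transfer function
    S = 1.0 / (1.0 + L)                     # sensitivity: disturbance -> output
    return np.max(np.abs(S))

for zeta in [0.7, 0.1, 0.01, 0.001]:
    # Closed-loop poles solve s^2 + 2*zeta*wn*s + wn^2 = 0 and are strictly
    # stable for every zeta > 0, yet the worst-case amplification explodes.
    print(f"zeta = {zeta:6.3f}   peak |S| ~ {sensitivity_peak(zeta):8.1f}")

With zeta = 0.001 the peak is roughly 1/(2*zeta) = 500, so disturbances or unmodelled dynamics near the resonant frequency are amplified about five hundredfold even though the equilibrium is globally exponentially stable.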

The extreme "robust, yet fragile" character of complex systems severely complicates the challenge of connecting phenomena on widely different time and space scales, and in particular, exactly those phenomena most critical to understanding and preventing large cascading events. A consequence is that "typical" behavior of complex systems is often quite simple, so that a naive view leads to simple models and (wrong) explanations of most phenomena. Only extreme circumstances not easily replicable in laboratory experiments or simulations reveal the role of the enormous internal complexity in biological and engineering systems. This talk will argue that "robust, yet fragile" is not an accident, but is the inevitable result of fundamental tradeoffs. It is the single most important common feature of complexity in technology and biology, with profound implications for computational issues. Motivating examples will be drawn from gene regulation and signal transduction networks, web/internet traffic, power outages, forest fires and other large ecosystem events, stock market volatility, distributed software, commercial aircraft design, automobile airbags, weather and climate forecasting, and formula one racing.

What is perhaps surprising is that the tools of robust control not only contribute to our understanding of complex engineering and biological systems, but are also beginning to resolve some of the most persistent open problems at the foundations of physics. Of particular interest are the nature of shear flow turbulence, the connections between microscopic reversibility and macroscopic irreversibility, including fluctuation/dissipation "theorems," and the measurement problem in quantum mechanics. What these problems have in common is the "robust, yet fragile" character of their connection of microscopic to macroscopic phenomena. Control theoretic tools allow the systematic creation of robust mesoscopic models which offer new and more rigorous interpretations of classic physical observations in fluid, statistical, and quantum mechanics. This talk will briefly review new research directions in these areas and the promising, though preliminary, results that have been obtained.

Additional details and background:

Recently, Carlson and Doyle introduced Highly Optimized Tolerance (HOT) to describe essential and common properties of complex systems in biology, ecology, technology, and socio-economic systems. To function outside an idealized laboratory setting, such systems must be robust to uncertainty in their environment and components, but occasional catastrophic breakdowns remind us that this cannot be taken for granted. HOT systems arise when deliberate robust design aims for a specific level of tolerance to uncertainty, which is traded off against the cost of the compensating resources. Optimization of this tradeoff may involve some mixture of explicit planning, as in engineering, and mutation and natural selection, as in biology, but we use the word "design" loosely to encompass both. The resulting features of HOT systems are high performance and high throughput, ubiquitous power law distributions of event sizes, and potentially high sensitivities to design flaws and unanticipated or rare uncertainties.
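
As a minimal sketch of the kind of tradeoff HOT formalizes (an illustrative simplification, not Carlson and Doyle's exact model), suppose event i occurs with probability p_i, spending resource r_i limits its loss to l_i = r_i^(-beta), and the total resource budget is fixed. Minimizing the expected loss by Lagrange multipliers allocates resources to common events and leaves rare events free to become large, tying probability to event size through a power law with exponent -(1 + 1/beta). The Python sketch below checks this numerically; beta, the budget, and the exponential event weights are all assumed purely for illustration.

import numpy as np

beta, budget = 1.0, 1.0
rng = np.random.default_rng(0)

# Event probabilities; exponential weights are assumed purely for illustration.
p = rng.exponential(size=5000)
p /= p.sum()

# Lagrange condition for minimizing sum(p * r**(-beta)) subject to sum(r) = budget:
# r_i proportional to p_i**(1/(1+beta)), rescaled to spend the whole budget.
r = p**(1.0 / (1.0 + beta))
r *= budget / r.sum()
loss = r**(-beta)                  # size of event i if it occurs

# The optimum ties probability to size: p proportional to loss**(-(1 + 1/beta)).
slope = np.polyfit(np.log(loss), np.log(p), 1)[0]
print(f"fitted exponent {slope:.2f}   predicted {-(1.0 + 1.0 / beta):.2f}")

With beta = 1 the predicted exponent is -2: small, common events are heavily protected, while large losses are reserved for rare events, which is the "robust, yet fragile" signature in its simplest form.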

HOT leads to new models in which design and evolution result in rare, structured states that are robust to the perturbations they were designed to handle, yet fragile to design flaws and unanticipated perturbations. As biological systems evolve and man-made systems are designed, they follow a spiral of increasing complexity in order to suppress unwanted sensitivities or to take advantage of some opportunity for increased productivity or throughput. However, each step towards increasing complexity is inevitably accompanied by new sensitivities, so the spiral continues. We argue that such robustness tradeoffs are the most essential feature of real complex systems, and that under a wide range of conditions they naturally lead to power laws. HOT systems are a particular challenge in the study of multiscale phenomena, since their macroscopic behavior is both highly robust and extremely sensitive to different microscopic details. For example, organisms and ecosystems exhibit remarkable robustness to large variations in temperature, moisture, nutrients, and predation, but can be catastrophically sensitive to tiny perturbations, such as a genetic mutation, an exotic species, or a novel virus. Engineers deliberately design systems to be robust to common uncertainties, but cost and performance tradeoffs force an acceptance of some hypersensitivity to hopefully rare perturbations. In evolved and designed systems alike, this tradeoff leads to the "robust, yet fragile" characteristic of complexity.

An example which has been studied in some detail is Web layout design viewed as a coding problem (Doyle and Carlson; Zhu, Yie, and Doyle), and its impact on self-similar internet traffic. The aim here is to minimize the average size, and thus approximately the latency, of files downloaded during Web browsing sessions, but with the novel object of design being Web site layout rather than codeword selection. This modest introduction of geometry into a coding context completely changes the results, producing distributions for file transfers that are heavy tailed. It has also been shown (e.g., in the work by Willinger, Paxson, Floyd, Crovella, and their colleagues) that both LAN and WAN traffic have strongly self-similar characteristics, quite unlike the traditionally assumed Poisson traffic models. These discoveries have inspired recent but extensive research on the modeling of network traffic statistics, their relationship to network protocols, and their (often huge) impact on network performance. Real network traffic exhibits long-range dependence and high burstiness over a wide range of time scales, and it has been widely argued that the dominant source of this behavior is heavy-tailed Web and other application traffic being streamed out onto the network by TCP, creating long-range correlations in packet rates. Thus the HOT web layout work can be thought of as bringing some initial closure to the question of the origins of self-similar traffic, while raising new questions and suggesting new congestion control strategies.

That self-similar statistics are ubiquitous in complex systems is not a new observation (see, e.g., advocates of self-organized criticality, SOC), nor is it one without controversy. HOT offers a radically different alternative theory for the nature of complexity. Power laws and "phase transitions" in complex networks are viewed as just two of the more obvious features of their intrinsic "robust, yet fragile" character: intrinsic, natural, and permanent features not only of Web traffic over TCP/IP but of the statistics of complex systems in general, including power grids, ecosystems, and financial markets. Thus, beyond web layout, HOT offers a remarkably rich conceptual framework for thinking about complex networks. HOT also shows how statistical physics can blend with robust control and information theory to give new perspectives on complex networks, but also that the existing tools are inadequate to answer the questions that arise. For example, while the web layout problem can be described in terms familiar to physicists and information theorists, the obvious standard tools that would appear relevant, such as Shannon coding and the renormalization group, are of no use in solving the resulting design problem or in explaining observed data.
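
The traffic mechanism described above can be illustrated with a small simulation, in the spirit of (but not reproducing) the cited studies; all parameter values below are assumed purely for illustration. Superposing many on/off sources shows that heavy-tailed transfer durations leave the aggregate traffic bursty across aggregation scales (long-range dependence), whereas exponentially distributed durations wash out toward Poisson-like behavior.

import numpy as np

rng = np.random.default_rng(1)
T, n_sources = 2**16, 200                        # time bins, concurrent sources

def traffic(heavy_tailed):
    """Packet counts per time bin from n_sources renewal on/off sources."""
    counts = np.zeros(T)
    for _ in range(n_sources):
        t = 0
        while t < T:
            if heavy_tailed:
                on = int(rng.pareto(1.5) * 10) + 1   # infinite-variance transfers
            else:
                on = int(rng.exponential(20)) + 1    # light-tailed transfers
            off = int(rng.exponential(200)) + 1
            counts[t:min(t + on, T)] += 1            # source is transmitting
            t += on + off
    return counts

def variance_time_slope(x, levels=(1, 4, 16, 64, 256)):
    """Slope of log Var vs log m for m-aggregated means: about -1 is
    Poisson-like, slopes nearer 0 indicate long-range dependence."""
    v = [np.var(x[: (len(x) // m) * m].reshape(-1, m).mean(axis=1)) for m in levels]
    return np.polyfit(np.log(levels), np.log(v), 1)[0]

print("heavy-tailed sources: slope", round(variance_time_slope(traffic(True)), 2))
print("exponential sources:  slope", round(variance_time_slope(traffic(False)), 2))

For a self-similar process with Hurst parameter H the variance-time slope is 2H - 2, so slopes well above -1 in the heavy-tailed case correspond to H well above 1/2, the signature of long-range dependence in aggregate traffic.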

It is becoming widely recognized that important research challenges in biology have many parallels with those in complex engineering systems. Emphasis is shifting from components and molecules to the study of the vast networks that biological molecules form to regulate and control life, and the central role that robustness plays in complex systems has begun to move from an exclusively engineering concern to one of interest to biologists as well. Biological systems are particularly extreme in their "robust, yet fragile" characteristics. This is the most critical and universal characteristic of complexity, not only in biology but also in technological and social systems, and it is motivating new efforts to extend and integrate the currently fragmented theoretical foundations for studying robustness and complexity. This stands in sharp contrast to the emphasis on self-organization, emergence, phase transitions, criticality, fractals, self-similarity, edge-of-chaos, and so on, that have been popularized as a "new science of complexity." This talk will take a critical look at these and several other issues relevant not only to internet traffic but also to biological complexity, and particularly at how "robust, yet fragile" features dominate a systems view of everything from gene regulation to medical treatment to ecosystem structure. For links to recent background material, see the website for a talk at SFI:

http://www.cds.caltech.edu/~doyle/SFI_networks.html
