What Is a System?
A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time. The system may be buffeted, constricted, triggered, or driven by outside forces. But the system’s response to these forces is characteristic of itself, and that response is seldom simple in the real world.
The system, to a large extent, causes its own behavior! An outside event may unleash that behavior, but the same outside event applied to a different system is likely to produce a different result.
Once you start listing the elements of a system, there is almost no end to the process. You can divide elements into sub-elements and then sub-sub-elements. Pretty soon you lose sight of the system.
System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
System purposes need not be human purposes and are not necessarily those intended by any single actor within the system. In fact, one of the most frustrating aspects of systems is that the purposes of sub-units may add up to an overall behavior that no one wants.
Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.
More than the Sum of Its Parts
The behavior of a system cannot be known just by knowing the elements of which the system is made.
Many of the interconnections in systems operate through the flow of information. Information holds systems together and plays a great role in determining how they operate.
Resilience is a measure of a system’s ability to survive and persist within a variable environment. The opposite of resilience is brittleness or rigidity.
Resilience arises from a rich structure of many feedback loops that can work in different ways to restore a system even after a large perturbation. A single balancing loop brings a system stock back to its desired state.
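The pull of a single balancing loop is easy to sketch (a toy Python model with invented numbers, not anything from the book): each step closes a fixed fraction of the gap between the stock and its desired state.

```python
# Toy balancing loop (numbers invented): each step closes a fixed
# fraction of the gap between a stock and its desired state.
def balancing_step(stock, goal, adjustment=0.25):
    return stock + adjustment * (goal - stock)

stock, goal = 20.0, 100.0   # start well below the desired state
for _ in range(30):
    stock = balancing_step(stock, goal)
# The gap shrinks geometrically, so the stock homes in on the goal.
```

Resilience would come from many such loops working through different routes, so the stock can still be restored when any one of them is knocked out.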
Many chronic diseases, such as cancer and heart disease, come from breakdown of resilience mechanisms that repair DNA, keep blood vessels flexible, or control cell division. Ecological disasters in many places come from loss of resilience, as species are removed from ecosystems, soil chemistry and biology are disturbed, or toxins build up.
Hard to See
Resilience is something that may be very hard to see, unless you exceed its limits, overwhelm and damage the balancing loops, and the system structure breaks down. Because resilience may not be obvious without a whole-system view, people often sacrifice resilience for stability, or for productivity, or for some other more immediately recognizable system property.
The most marvelous characteristic of some complex systems is their ability to learn, diversify, complexify, evolve.
Like resilience, self-organization is often sacrificed for purposes of short-term productivity and stability. Productivity and stability are the usual excuses for turning creative human beings into mechanical adjuncts to production processes.
Threatens Power Structures
Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new structures, whole new ways of doing things. It requires freedom and experimentation, and a certain amount of disorder. These conditions that encourage self-organization often can be scary for individuals and threatening to power structures. As a consequence, education systems may restrict the creative powers of children instead of stimulating those powers. Economic policies may lean toward supporting established, powerful enterprises rather than upstart, new ones. And many governments prefer their people not to be too self-organizing.
The world, or at least the parts of it humans think they understand, is organized in subsystems aggregated into larger subsystems, aggregated into still larger subsystems. A cell in your liver is a subsystem of an organ, which is a subsystem of you as an organism, and you are a subsystem of a family, an athletic team, a musical group, and so forth. These groups are subsystems of a town or city, and then a nation, and then the whole global socioeconomic system that dwells within the biosphere system.
Hierarchies are brilliant systems inventions, not only because they give a system stability and resilience, but also because they reduce the amount of information that any part of the system has to keep track of.
Hierarchical systems are partially decomposable. They can be taken apart and the subsystems with their especially dense information links can function, at least partially, as systems in their own right. When hierarchies break down, they usually split along their subsystem boundaries.
To be a highly functional system, hierarchy must balance the welfare, freedoms, and responsibilities of the subsystems and total system—there must be enough central control to achieve coordination toward the large system goal, and enough autonomy to keep all subsystems flourishing, functioning, and self-organizing.
Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.
Limits to Growth
In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no physical system can grow forever in a finite environment.
For any physical entity in a finite environment, perpetual growth is impossible. Ultimately, the choice is not to grow forever but to decide what limits to live within. If a company produces a perfect product or service at an affordable price, it will be swamped with orders until it grows to the point at which some limit decreases the perfection of the product or raises its price. If a city meets the needs of all its inhabitants better than any other city, people will flock there until some limit brings down the city’s ability to satisfy peoples’ needs.
There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.
Growth has costs as well as benefits, and we typically don’t count the costs—among which are poverty and hunger, environmental destruction, and so on—the whole list of problems we are trying to solve with growth! What is needed is much slower growth, very different kinds of growth, and in some cases no growth or negative growth.
Dominance is an important concept in systems thinking. When one loop dominates another, it has a stronger impact on behavior. Because systems often have several competing feedback loops operating simultaneously, those loops that dominate the system will determine the behavior.
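Shifting dominance can be sketched with a toy model (all rates invented, my own illustration): a reinforcing loop and a balancing loop act on one stock, and whichever is stronger at the current stock level wins.

```python
# Toy model (rates invented): a reinforcing and a balancing loop act on
# one stock; whichever is stronger at the current stock level dominates.
def step(stock, rate=0.10, capacity=1000.0):
    reinforcing = rate * stock                      # growth feeds on itself
    balancing = rate * stock * (stock / capacity)   # strengthens near the limit
    return stock + reinforcing - balancing

stock = 10.0
trajectory = [stock]
for _ in range(200):
    stock = step(stock)
    trajectory.append(stock)
```

Early on the reinforcing loop dominates and growth is near-exponential; as the stock approaches the capacity, the balancing loop takes over and the trajectory levels off into an S-shape.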
The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback. Even nonphysical information takes time to feed back into the system.
Changing the delays in a system can make it much easier or much harder to manage. You can see why system thinkers are somewhat fanatic on the subject of delays. We’re always on the alert to see where delays occur in systems, how long they are, whether they are delays in information streams or in physical processes. We can’t begin to understand the dynamic behavior of systems unless we know where and how long the delays are.
The ability to delay gratification is a demonstration of wisdom.
Balancing Feedback Loops
Balancing feedback loops are goal-seeking or stability-seeking. Each tries to keep a stock at a given value or within a range of values.
Any balancing feedback loop needs a goal (the thermostat setting), a monitoring and signaling device to detect deviation from the goal (the thermostat), and a response mechanism (the furnace and/or air conditioner, fans, pumps, pipes, fuel, etc.).
The presence of a feedback mechanism doesn’t necessarily mean that the mechanism works well. The feedback mechanism may not be strong enough to bring the stock to the desired level. Feedbacks—the interconnections, the information part of the system—can fail for many reasons. Information can arrive too late or at the wrong place. It can be unclear or incomplete or hard to interpret. The action it triggers may be too weak.
Every balancing feedback loop has its breakdown point, where other loops pull the stock away from its goal more strongly than it can pull back. That can happen in this simulated thermostat system, if I weaken the power of the heating loop (a smaller furnace that cannot put out as much heat).
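A sketch of that thermostat discussion (all numbers invented, not the book's simulation): a furnace with a power cap works against a constant heat leak to the outside. The ample furnace settles close to the setpoint (a little below it, since the leak never stops, which echoes the point above about feedback not fully reaching the goal); the undersized one is simply overwhelmed.

```python
# Toy thermostat (numbers invented): heating is capped by furnace power,
# while heat continuously leaks toward the outside temperature.
def simulate_room(furnace_max, setpoint=18.0, outside=-10.0,
                  leak=0.1, hours=48):
    temp = outside
    for _ in range(hours):
        heating = min(furnace_max, max(0.0, setpoint - temp))  # capped furnace
        temp += heating - leak * (temp - outside)              # heat in, leak out
    return temp

strong = simulate_room(furnace_max=20.0)  # ample power: settles near the goal
weak = simulate_room(furnace_max=2.0)     # undersized: the loop is overwhelmed
```

The weak-furnace room stabilizes far below the setpoint: the balancing loop still exists, but the leak loop pulls harder than it can pull back.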
The second kind of feedback loop is amplifying, reinforcing, self-multiplying, snowballing—a vicious or virtuous circle that can cause healthy growth or runaway destruction.
Reinforcing Feedback Loops
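A reinforcing loop compounds on itself, which is why doubling time is the natural way to characterize it; the usual rule of thumb is doubling time ≈ 70 divided by the growth rate in percent. A quick toy check (code mine, not from the book):

```python
# Count the first step at which a compounding stock has at least doubled.
def doubling_steps(rate):
    stock, steps = 1.0, 0
    while stock < 2.0:
        stock *= 1.0 + rate
        steps += 1
    return steps
```

At 7 percent growth the rule of 70 predicts roughly 70/7 = 10 periods; the discrete count lands at 11, the first step that actually crosses the doubling line.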
Thinking in Systems
You’ll be thinking not in terms of a static world, but a dynamic one. You’ll stop looking for who’s to blame; instead you’ll start asking, “What’s the system?”
Look at Flows and Stocks
When a systems thinker encounters a problem, the first thing he or she does is look for data, time graphs, the history of the system. That’s because long-term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening, but why.
Economists Tend to Miss Stocks
These behavior-based models are more useful than event-based ones, but they still have fundamental problems. First, they typically overemphasize system flows and underemphasize stocks. Economists follow the behavior of flows, because that’s where the interesting variations and most rapid changes in systems show up. Economic news reports on the national production (flow) of goods and services, the GNP, rather than the total physical capital (stock) of the nation’s factories and farms and businesses that produce those goods and services. But without seeing how stocks affect their related flows through feedback processes, one cannot understand the dynamics of economic systems or the reasons for their behavior.
Second, and more seriously, in trying to find statistical links that relate flows to each other, econometricians are searching for something that does not exist. There’s no reason to expect any flow to bear a stable relationship to any other flow. Flows go up and down, on and off, in all sorts of combinations, in response to stocks.
Biases in Working with Systems
You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy.
Thinking Linearly, Statically
Many relationships in systems are nonlinear. Their relative strengths shift in disproportionate amounts as the stocks in the system shift. Nonlinearities in feedback systems produce shifting dominance of loops and many complexities in system behavior.
So the world often surprises our linear-thinking minds. If we’ve learned that a small push produces a small response, we think that twice as big a push will produce twice as big a response. But in a nonlinear system, twice the push could produce one-sixth the response, or the response squared, or no response at all.
Forgetting about Delays
When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem. (example of climate change)
Missing the Whole
Different Levels of Magnification
You can see some things through the lens of the human eye, other things through the lens of a microscope, others through the lens of a telescope, and still others through the lens of systems theory. Everything seen through each kind of lens is actually there. Each way of seeing allows our knowledge of the wondrous world in which we live to become a little more complete. (Ways of Looking)
Map is Not the Territory
The map is not the territory; the menu is not the food.
Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models. So are the ways I picture the world in my head.
Our models fall far short of representing the world fully. That is why we make mistakes and why we are regularly surprised. In our heads, we can keep track of only a few variables at one time. We often draw illogical conclusions from accurate assumptions, or logical conclusions from inaccurate assumptions.
Forgetting that Boundaries are Fabricated
Systems rarely have real boundaries. Everything, as they say, is connected to everything else, and not neatly. There is no clearly determinable boundary between the sea and the land, between sociology and anthropology, between an automobile’s exhaust and your nose. There are only boundaries of word, thought, perception, and social agreement—artificial, mental-model boundaries.
If we’re to understand anything, we have to simplify, which means we have to make boundaries. Often that’s a safe thing to do.
Boundaries can produce problems when we forget that we’ve artificially created them.
Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system.
Change comes first from stepping outside the limited information that can be seen from any single place in the system and getting an overview. From a wider perspective, information flows, goals, incentives, and disincentives can be restructured so that separate, bounded, rational actions do add up to results that everyone desires. It’s amazing how quickly and easily behavior changes can come, with even slight enlargement of bounded rationality, by providing better, more complete, timelier information.
Problematic System Archetypes
Policy-Resistant Systems (Balancing Loops)
Balancing loops stabilize systems; behavior patterns persist.
When various actors try to pull a system stock toward various goals, the result can be policy resistance. Any new policy, especially if it’s effective, just pulls the stock farther from the goals of other actors and produces additional resistance, with a result that no one likes, but that everyone expends considerable effort in maintaining.
THE WAY OUT
Let go. Bring in all the actors and use the energy formerly expended on resistance to seek out mutually satisfactory ways for all goals to be realized—or redefinitions of larger and more important goals that everyone can pull toward together.
The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality. If everyone can work harmoniously toward the same outcome (if all feedback loops are serving the same goal), the results can be amazing.
Tragedy of the Commons (delays in feedback)
The Tragedy of the Commons arises from missing (or too long delayed) feedback from the resource to the growth of the users of that resource.
The more users there are, the more resource is used. The more resource is used, the less there is per user. If the users follow the bounded rationality of the commons (“There’s no reason for me to be the one to limit my cows!”), there is no reason for any of them to decrease their use. Eventually, then, the harvest rate will exceed the capacity of the resource to bear the harvest. Because there is no feedback to the user, overharvesting will continue.
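The dynamic can be sketched in a toy simulation (all parameters invented): users keep growing with no feedback from the resource, so harvest eventually outruns regeneration and the resource collapses.

```python
# Toy commons (parameters invented): users grow with no feedback
# from the state of the resource they depend on.
def run_commons(steps=60, user_growth=0.08):
    users, resource = 5.0, 1000.0
    for _ in range(steps):
        harvest = min(resource, users * 4.0)   # each user takes 4 units
        resource += 0.02 * resource - harvest  # slow regeneration
        users *= 1.0 + user_growth             # nothing tells users to stop
    return resource
```

With user growth switched off, the same parameters are sustainable indefinitely (harvest exactly matches regeneration); the collapse comes entirely from the missing feedback.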
Three Options to Deal with It
- Educate and exhort. Help people to see the consequences of unrestrained use of the commons. Appeal to their morality. Persuade them to be temperate. Threaten transgressors with social disapproval or eternal hellfire.
- Privatize the commons. Divide it up, so that each person reaps the consequences of his or her own actions. If some people lack the self-control to stay below the carrying capacity of their own private resource, those people will harm only themselves and not others.
- Regulate the commons. Garrett Hardin calls this option, bluntly, “mutual coercion, mutually agreed upon.” Regulation can take many forms, from outright bans on certain behaviors to quotas, permits, taxes, incentives. To be effective, regulation must be enforced by policing and penalties.
Drift to Low Performance (eroding goals)
The actor tends to believe bad news more than good news (Negativity Bias). As actual performance varies, the best results are dismissed as aberrations, the worst results stay in the memory. The actor thinks things are worse than they really are.
And to complete this tragic archetype, the desired state of the system is influenced by the perceived state. Standards aren’t absolute. When perceived performance slips, the goal is allowed to slip.
The balancing feedback loop that should keep the system state at an acceptable level is overwhelmed by a reinforcing feedback loop heading downhill. The lower the perceived system state, the lower the desired state. The lower the desired state, the less discrepancy, and the less corrective action is taken. The less corrective action, the lower the system state. If this loop is allowed to run unchecked, it can lead to a continuous degradation in the system’s performance.
Drift to low performance is a gradual process. If the system state plunged quickly, there would be an agitated corrective process. But if it drifts down slowly enough to erase the memory of (or belief in) how much better things used to be, everyone is lulled into lower and lower expectations, lower effort, lower performance.
Modern industrial culture has eroded the goal of morality. The workings of the trap have been classic, and awful to behold.
Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. This is just what you would expect. After all, we’re only human. The far more numerous examples of human goodness are barely noticed. They are “not news.” They are exceptions. Must have been a saint. Can’t expect everyone to behave like that.
There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst. If perceived performance has an upbeat bias instead of a downbeat one, if one takes the best results as a standard, and the worst results only as a temporary setback, then the same system structure can pull the system up to better and better performance.
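A toy version of the archetype (numbers mine): perception reads performance a couple of points low, the standard drifts toward perception, and effort only closes the gap to the drifting standard. Holding the standard absolute, the first antidote, removes the drift entirely; anchoring to the best past results would work the same way in reverse.

```python
# Toy drift-to-low-performance loop (all numbers invented).
def run_performance(eroding_goal, steps=80):
    state, goal = 100.0, 100.0
    for _ in range(steps):
        perceived = state - 2.0               # negativity bias reads low
        if eroding_goal:
            goal += 0.2 * (perceived - goal)  # standard slips toward perception
        state += 0.25 * (goal - state)        # effort closes gap to the goal
    return state

drifted = run_performance(eroding_goal=True)   # goal and state ratchet down
held = run_performance(eroding_goal=False)     # absolute standard: no drift
```

The eroding-goal run settles into a steady downhill slide even though every individual step looks like reasonable corrective action.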
(Negative) Escalation (price wars)
Escalation comes from a reinforcing loop set up by competing actors trying to get ahead of each other. The goal of one part of the system or one actor is not absolute, like the temperature of a room thermostat being set at 18°C (65°F), but is related to the state of another part of the system, another actor. Like many of the other system traps, escalation is not necessarily a bad thing. If the competition is about some desirable goal, like a more efficient computer or a cure for AIDS, it can hasten the whole system toward the goal. But when it is escalating hostility, weaponry, noise, or irritation, this is an insidious trap indeed.
Can be Positive but Tread Lightly
Escalation also could be about peacefulness, civility, efficiency, subtlety, quality. But even escalating in a good direction can be a problem, because it isn’t easy to stop. Each hospital trying to outdo the others in up-to-date, powerful, expensive diagnostic machines can lead to out-of-sight health care costs. Escalation in morality can lead to holier-than-thou sanctimoniousness. Escalation in art can lead from baroque to rococo to kitsch. Escalation in environmentally responsible lifestyles can lead to rigid and unnecessary puritanism.
One way out of the escalation trap is unilateral disarmament—deliberately reducing your own system state to induce reductions in your competitor’s state. Within the logic of the system, this option is almost unthinkable. But it actually can work, if one does it with determination, and if one can survive the short-term advantage of the competitor.
The only other graceful way out of the escalation system is to negotiate a disarmament. That’s a structural change, an exercise in system design. It creates a new set of balancing controlling loops to keep the competition in bounds.
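The escalation structure itself is tiny, just two stocks each chasing “a bit more than the other” (toy numbers, my own sketch):

```python
# Toy escalation (numbers invented): each actor's goal is the other
# actor's state plus a small margin — a reinforcing loop between them.
def escalate(steps, margin=1.0, response=0.5):
    a, b = 10.0, 10.0
    for _ in range(steps):
        a += response * ((b + margin) - a)   # A wants a bit more than B
        b += response * ((a + margin) - b)   # B wants a bit more than A
    return a, b
```

Neither actor wants a high absolute level, only a small edge over the other, yet both stocks climb without bound while staying nearly tied.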
Rich Get Richer
whenever the winners of a competition receive, as part of the reward, the means to compete even more effectively in the future. That’s a reinforcing feedback loop, which rapidly divides a system into winners who go on winning, and losers who go on losing. (This seems to be exactly what happens with PoW and PoS consensus, and what some alternative designs try to avoid.)
There are many devices to break the loop of the rich getting richer and the poor getting poorer: tax laws written (unbeatably) to tax the rich at higher rates than the poor; charity; public welfare; labor unions; universal and equal health care and education; taxation on inheritance.
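A toy sketch of the loop and one such device (all numbers invented): winnings buy competitive edge, and a flat transfer on the rich-poor gap stands in for, say, a progressive tax.

```python
# Toy success-to-the-successful loop (numbers invented): each round's
# gains are split according to accumulated resources, superlinearly.
def compete(steps=50, transfer=0.0):
    rich, poor = 60.0, 40.0
    for _ in range(steps):
        share = rich**2 / (rich**2 + poor**2)  # resources buy competitive edge
        rich += 10.0 * share                   # winners take the bigger slice
        poor += 10.0 * (1.0 - share)
        levy = transfer * (rich - poor)        # redistribution device
        rich -= levy
        poor += levy
    return rich / (rich + poor)                # rich actor's share of the total
```

Without the transfer, the rich actor's share ratchets upward every round; even a modest transfer rate holds the loop near its starting balance.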
Addiction (Shifting the Burden to the Intervenor)
Example in Food Systems
Are insects threatening the crops? Rather than examine the farming methods, the monocultures, and the destruction of natural ecosystem controls that have led to the pest outbreak, just apply pesticides. That will make the bugs go away, and allow more monocultures, more destruction of ecosystems. That will bring back the bugs in greater outbursts, requiring more pesticides in the future.
Breaking an addiction is painful. It may be the physical pain of heroin withdrawal, or the economic pain of a price increase to reduce oil consumption, or the consequences of a pest invasion while natural predator populations are restoring themselves. Withdrawal means finally confronting the real (and usually much deteriorated) state of the system and taking the actions that the addiction allowed one to put off.
If you are the intervenor, work in such a way as to restore or enhance the system’s own ability to solve its problems, then remove yourself. If you are the one with an unsupportable dependency, build your system’s own capabilities back up before removing the intervention. Do it right away. The longer you wait, the harder the withdrawal process will be. (Tao Te Ching 17: invisible leadership)
Rule beating means evasive action to get around the intent of a system’s rules—abiding by the letter, but not the spirit, of the law.
Rule beating is usually a response of the lower levels in a hierarchy to overrigid, deleterious, unworkable, or ill-defined rules from above.
There are two generic responses to rule beating.
- One is to try to stamp out the self-organizing response by strengthening the rules or their enforcement—usually giving rise to still greater system distortion. That’s the way further into the trap.
- The way out of the trap, the opportunity, is to understand rule beating as useful feedback, and to revise, improve, rescind, or better explain the rules. Designing rules better means foreseeing as far as possible the effects of the rules on the subsystems, including any rule beating they might engage in, and structuring the rules to turn the self-organizing capabilities of the system in a positive direction. Design, or redesign, rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules.
Seeking the Wrong Goal
One of the most powerful ways to influence the behavior of a system is through its purpose or goal. That’s because the goal is the direction-setter of the system, the definer of discrepancies that require action, the indicator of compliance, failure, or success toward which balancing feedback loops work. If the goal is defined badly, if it doesn’t measure what it’s supposed to measure, if it doesn’t reflect the real welfare of the system, then the system can’t possibly produce a desirable result.
GDP as Prime Example
The gross national product does not allow for the health of our children, the quality of their education, or the joy of their play. It does not include the beauty of our poetry or the strength of our marriages, the intelligence of our public debate or the integrity of our public officials. It measures neither our wit nor our courage, neither our wisdom nor our learning, neither our compassion nor our devotion to our country. It measures everything, in short, except that which makes life worthwhile. - Robert F. Kennedy
The world would be a different place if instead of competing to have the highest per capita GNP, nations competed to have the highest per capita stocks of wealth with the lowest throughput, or the lowest infant mortality, or the greatest political freedom, or the cleanest environment, or the smallest gap between the rich and the poor.
Leverage Points: Places to Intervene in a System
Counterintuitive—that’s Forrester’s word to describe complex systems. Leverage points frequently are not intuitive. Or if they are, we too often use them backward.
12. Numbers
Constants and parameters such as subsidies, taxes, standards
Numbers, the sizes of flows, are dead last on my list of powerful interventions. Diddling with the details, arranging the deck chairs on the Titanic. Probably 90—no, 95, no, 99 percent—of our attention goes to parameters, but there’s not a lot of leverage in them. It’s not that parameters aren’t important—they can be, especially in the short term and to the individual who’s standing directly in the flow. People care deeply about such variables as taxes and the minimum wage, and so fight fierce battles over them. But changing these variables rarely changes the behavior of the national economy system. If the system is chronically stagnant, parameter changes rarely kick-start it. If it’s wildly variable, they usually don’t stabilize it. If it’s growing out of control, they don’t slow it down.
Putting different hands on the faucets may change the rate at which the faucets turn, but if they’re the same old faucets, plumbed into the same old system, turned according to the same old information and goals and rules, the system behavior isn’t going to change much. Electing Bill Clinton was definitely different from electing the elder George Bush, but not all that different, given that every president is plugged into the same political system.
Can be Powerful
Parameters become leverage points when they go into ranges that kick off one of the items higher on this list. Interest rates, for example, or birth rates, control the gains around reinforcing feedback loops. System goals are parameters that can make big differences.
11. Buffers
The sizes of stabilizing stocks relative to their flows
Stocks that are big, relative to their flows, are more stable than small ones. In chemistry and other fields, a big, stabilizing stock is known as a buffer.
Buffers are usually physical entities, not easy to change.
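A quick toy illustration of why size matters (numbers mine): the same disturbance pattern hits a big stock and a small one, and only the small one swings wildly in relative terms.

```python
# Toy buffer comparison (numbers invented): identical inflow/outflow
# disturbances hit stocks of different sizes.
def relative_swing(initial_stock):
    level, lo, hi = initial_stock, initial_stock, initial_stock
    flows = [5.0, -3.0, 4.0, -6.0, 2.0, -4.0, 6.0, -5.0]  # same disturbances
    for f in flows * 5:
        level += f
        lo, hi = min(lo, level), max(hi, level)
    return (hi - lo) / initial_stock   # swing as a fraction of the stock

big = relative_swing(1000.0)   # barely moves in relative terms
small = relative_swing(20.0)   # same absolute swings, violent relative ones
```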
10. Stock-and-Flow Structures
Physical systems and their nodes of intersection
Often physical rebuilding is the slowest and most expensive kind of change to make in a system. Some stock-and-flow structures are just plain unchangeable.
Physical structure is crucial in a system, but is rarely a leverage point, because changing it is rarely quick or simple. The leverage point is in proper design in the first place.
9. Delays
The lengths of time relative to the rates of system changes
Not easy to change
I would list delay length as a high leverage point, except for the fact that delays are not often easily changeable. Things take as long as they take. You can’t do a lot about the construction time of a major piece of capital, or the maturation time of a child, or the growth rate of a forest. It’s usually easier to slow down the change rate, so that inevitable feedback delays won’t cause so much trouble. That’s why growth rates are higher up on the leverage point list than delay times.
8. Balancing Feedback Loops
The strength of the feedbacks relative to the impacts they are trying to correct
Market as a (Distorted) Balancing Feedback Loop
Take markets, for example, the balancing feedback systems that are all but worshipped by many economists. They can indeed be marvels of self-correction, as prices vary to moderate supply and demand and keep them in balance. Price is the central piece of information signaling both producers and consumers. The more the price is kept clear, unambiguous, timely, and truthful, the more smoothly markets will operate. Prices that reflect full costs will tell consumers how much they can actually afford and will reward efficient producers. Companies and governments are fatally attracted to the price leverage point, but too often determinedly push it in the wrong direction with subsidies, taxes, and other forms of confusion.
These modifications weaken the feedback power of market signals by twisting information in their favor. The real leverage here is to keep them from doing it. Hence, the necessity of antitrust laws, truth-in-advertising laws, attempts to internalize costs (such as pollution fees), the removal of perverse subsidies, and other ways of leveling market playing fields
7. Reinforcing Feedback Loops
The strength of the gain of driving loops
Reinforcing feedback loops are sources of growth, explosion, erosion, and collapse in systems. A system with an unchecked reinforcing loop ultimately will destroy itself. That’s why there are so few of them. Usually a balancing loop will kick in sooner or later. The epidemic will run out of infectible people—or people will take increasingly stronger steps to avoid being infected.
Reducing the gain around a reinforcing loop—slowing the growth—is usually a more powerful leverage point in systems than strengthening balancing loops, and far preferable to letting the reinforcing loop run. (Good analogy: better to slow down the car than to just get better brakes.)
Rich people collect interest; poor people pay it. Rich people pay accountants and lean on politicians to reduce their taxes; poor people can’t. Rich people give their kids inheritances and good educations. Antipoverty programs are weak balancing loops that try to counter these strong reinforcing ones. It would be much more effective to weaken the reinforcing loops. That’s what progressive income tax, inheritance tax, and universal high-quality public education programs are meant to do.
6. Information Flows
The structure of who does and does not have access to information
Missing information flows is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure. The Tragedy of the Commons that is crashing the world’s commercial fisheries occurs because there is little feedback from the state of the fish population to the decision to invest in fishing vessels. Contrary to economic opinion, the price of fish doesn’t provide that feedback. As the fish get more scarce they become more expensive, and it becomes all the more profitable to go out and catch the last few. That’s a perverse feedback, a reinforcing loop that leads to collapse. It is not price information but population information that is needed.
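The perverse price loop versus a population-information quota can be sketched with a toy fishery (all parameters invented, my own illustration of the argument):

```python
# Toy fishery (parameters invented): scarcity raises price, high price
# attracts more fishing effort — a perverse reinforcing loop. A quota
# tied to population size feeds the missing information back in.
def fishery(use_quota, steps=150):
    fish, effort = 1000.0, 10.0
    for _ in range(steps):
        catch = min(fish, 0.5 * effort)
        if use_quota:
            catch = min(catch, 0.015 * fish)  # harvest tied to the population
        fish += 0.02 * fish - catch           # regrowth minus catch
        price = 1000.0 / max(fish, 1.0)       # scarcity raises the price
        effort *= 1.0 + 0.05 * price          # high price attracts more boats
    return fish
```

Both runs see the same price dynamics; only the quota run feeds population information into the harvest decision, and only it survives.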
Accountability Requires Transparency
There is a systematic tendency on the part of human beings to avoid accountability for their own decisions. That’s why there are so many missing feedback loops—and why this kind of leverage point is so often popular with the masses, unpopular with the powers that be, and effective, if you can get the powers that be to permit it to happen.
5. Rules
Incentives, punishments, constraints
Power over the rules is real power. That’s why lobbyists congregate when Congress writes laws, and why the Supreme Court, which interprets and delineates the Constitution—the rules for writing the rules—has even more power than Congress. If you want to understand the deepest malfunctions of systems, pay attention to the rules and to who has power over them.
4. Self-Organization: The power to add, change, or evolve system structure
The ability to self-organize is the strongest form of system resilience. A system that can evolve can survive almost any change, by changing itself. The human immune system has the power to develop new responses to some kinds of insults it has never before encountered.
Any system, biological, economic, or social, that gets so encrusted that it cannot self-evolve, a system that systematically scorns experimentation and wipes out the raw material of innovation, is doomed over the long term on this highly variable planet. The intervention point here is obvious, but unpopular. Encouraging variability and experimentation and diversity means “losing control.” Let a thousand flowers bloom and anything could happen! Who wants that? Let’s play it safe and push this lever in the wrong direction by wiping out biological, cultural, social, and market diversity.
3. Goals: The purpose or function of the system
Unknown Goals and Corporations as Cancer
Even people within systems don’t often recognize what whole-system goal they are serving. “To make profits,” most corporations would say, but that’s just a rule, a necessary condition to stay in the game. What is the point of the game? To grow, to increase market share, to bring the world (customers, suppliers, regulators) more and more under the control of the corporation, so that its operations become ever more shielded from uncertainty. John Kenneth Galbraith recognized that corporate goal—to engulf everything—long ago.6 It’s the goal of a cancer too. Actually it’s the goal of every living population—and a bad one only when it isn’t balanced by higher-level balancing feedback loops that never let an upstart power-loop-driven entity control the world. The goal of keeping the market competitive has to trump the goal of each individual corporation to eliminate its competitors, just as in ecosystems, the goal of keeping populations in balance and evolving has to trump the goal of each population to reproduce without limit.
2. Paradigms: The mind-set out of which the system (its goals, structure, rules, delays, parameters) arises
The shared ideas in the minds of society, the great big unstated assumptions, constitute that society’s paradigm, or deepest set of beliefs about how the world works. (Giorgio’s examples: growth is always good, money measures value, nature is to be used for our benefit, one can “own land”)
Everything follows Paradigms
Paradigms are the sources of systems. From them, from shared social agreements about the nature of reality, come system goals and information flows, feedbacks, stocks, flows, and everything else about systems.
Copernicus and Kepler showing that the earth is not the center of the universe, Einstein hypothesizing that matter and energy are interchangeable, Adam Smith postulating that the selfish actions of individual players in markets wonderfully accumulate to the common good: people who have managed to intervene in systems at the level of paradigm have hit a leverage point that totally transforms systems. You could say paradigms are harder to change than anything else about a system, and therefore this item should be lowest on the list, not second-to-highest. But there’s nothing physical or expensive or even slow in the process of paradigm change. In a single individual it can happen in a millisecond.
You keep pointing at the anomalies and failures in the old paradigm. You keep speaking and acting, loudly and with assurance, from the new one. You insert people with the new paradigm in places of public visibility and power. You don’t waste time with reactionaries; rather, you work with active change agents and with the vast middle ground of people who are open-minded.
Systems modelers say that we change paradigms by building a model of the system, which takes us outside the system and forces us to see it whole. I say that because my own paradigms have been changed that way.
1. Transcending Paradigms
There is yet one leverage point that is even higher than changing a paradigm. That is to keep oneself unattached in the arena of paradigms, to stay flexible, to realize that no paradigm is “true,” that every one, including the one that sweetly shapes your own worldview, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension.
If no paradigm is right, you can choose whatever one will help to achieve your purpose. If you have no idea where to get a purpose, you can listen to the universe.
It is in this space of mastery over paradigms that people throw off addictions, live in constant joy, bring down empires, get locked up or burned at the stake or crucified or shot, and have impacts that last for millennia.
The higher the leverage point, the more the system will resist changing it—that’s why societies often rub out truly enlightened beings.
There are no cheap tickets to mastery. You have to work hard at it, whether that means rigorously analyzing a system or rigorously casting off your own paradigms and throwing yourself into the humility of not-knowing.
Self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable. The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best. We can never fully understand our world, not in the way our reductionist science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty.
The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned.
We can’t control systems or figure them out. But we can dance with them!
I already knew that, in a way. I had learned about dancing with great powers from whitewater kayaking, from gardening, from playing music, from skiing. All those endeavors require one to stay wide awake, pay close attention, participate flat out, and respond to feedback. It had never occurred to me that those same requirements might apply to intellectual work, to management, to government, to getting along with people.
But there it was, the message emerging from every computer model we made. Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity—our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.
A society that talks incessantly about “productivity” but that hardly understands, much less uses, the word “resilience” is going to become productive and not resilient. A society that doesn’t understand or use the term “carrying capacity” will exceed its carrying capacity. A society that talks about “creating jobs” as if that’s something only companies can do will not inspire the great majority of its people to create jobs, for themselves or anyone else.
Language shapes our world.
- Before you disturb the system in any way, watch how it behaves
- Pay Attention to What Is Important, Not Just What Is Quantifiable
- Make Feedback Policies for Feedback Systems
- You can imagine why a dynamic, self-adjusting feedback system cannot be governed by a static, unbending policy. It’s easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties, the best policies not only contain feedback loops but meta-feedback loops. (Examples from Jimmy Carter: a proposal to help the Mexican economy until immigration slowed, and a tax on gas imports scaled to their quantity, so the higher the imports, the higher the tax.)
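The idea of a policy that reads the state of the system can be sketched as follows (the tax schedule and all numbers are hypothetical, loosely in the spirit of the Carter import-tax proposal):

```python
# A feedback policy: instead of a fixed tax rate, the rate is recomputed each
# period from the current level of imports, forming a balancing loop.
# Baseline, rates, and growth figures are invented for illustration.
def import_tax(imports, baseline=100.0, base_rate=0.05, slope=0.002):
    """Tax rate rises as imports rise above the baseline."""
    excess = max(imports - baseline, 0.0)
    return base_rate + slope * excess

imports = 150.0
for year in range(10):
    rate = import_tax(imports)   # the policy reads the system state
    imports *= 1.03              # underlying demand growth
    imports *= 1 - rate          # a higher tax suppresses imports

print(import_tax(150.0) > import_tax(100.0))
```

A static policy would fix `rate` once; the feedback version automatically pushes harder the further the system drifts from the baseline, and relaxes as it returns.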
- Locate Responsibility in the System
- “Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision making directly, quickly, and compellingly to the decision makers. (Intrinsic responsibility seems like something to strive for: decision makers get feedback on their actions and can quickly learn and correct. Example of a central university thermostat vs. one thermostat in each room.)
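The thermostat example can be sketched as a toy simulation (rooms, loss rates, and heater gains are hypothetical): with one central sensor, the leaky room’s temperature never feeds back to its own heater.

```python
# Two rooms; room 1 leaks far more heat than room 0. A central thermostat
# senses only room 0, so room 1's state never reaches its own heater; a
# per-room thermostat closes the feedback loop locally.
def simulate(per_room, steps=100, setpoint=20.0, losses=(0.5, 2.0)):
    """Return final temperatures of both rooms after `steps` time steps."""
    temps = [20.0, 20.0]
    for _ in range(steps):
        for i, loss in enumerate(losses):
            temps[i] -= loss                             # heat leaks out
        for i in range(len(temps)):
            sensed = temps[i] if per_room else temps[0]  # what the sensor sees
            temps[i] += 0.8 * max(setpoint - sensed, 0.0)  # heater responds
    return temps

central = simulate(per_room=False)   # one sensor drives every heater
local = simulate(per_room=True)      # each room senses its own temperature
print(local[1] > central[1])         # the leaky room fares better locally
```

The consequence of room 1’s heat loss is felt by room 1’s own controller only in the per-room design; that is intrinsic responsibility in miniature.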
- Stay Humble, Stay a Learner
- In a world of complex systems, it is not appropriate to charge forward with rigid, undeviating directives. “Stay the course” is only a good idea if you’re sure you’re on course. Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes.
Everything is interconnected (interbeing)
- The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem. It will not be possible in this integrated world for your heart to succeed if your lungs fail, or for your company to succeed if your workers fail, or for the rich in Los Angeles to succeed if the poor in Los Angeles fail, or for Europe to succeed if Africa fails, or for the global economy to succeed if the global environment fails.
- At any given time, the input that is most important to a system is the one that is most limiting.