Thesis #14: Complexity is subject to diminishing returns


Joseph Tainter’s 1988 *The Collapse of Complex Societies* remains the definitive work in the field of collapse. Tainter reviews other explanations of collapse–including economics, invasion and environmental problems–and finds them all insufficient. While these factors certainly play their roles, they are also the very same stressors that complexity is *supposed* to deal with. Thus, while they might suffice as proximate causes, their insufficiency only underlines the ultimate cause all the more. Why does a complex society become vulnerable to the very kinds of stress that, at an earlier time in its history, it would simply have shrugged off?

Tainter’s answer lies with complexity itself, and the law of diminishing returns. As a society becomes more complex, each further increase in complexity becomes more costly. The escalation of complexity becomes increasingly difficult to maintain, until it finally becomes impossible.

It is well worth noting, as Tainter does, that complexity is a function of energy. He writes:

Human societies and political organizations, like all living systems, are maintained by a continuous flow of energy. From the simplest familial unit to the most complex regional hierarchy, the institutions and patterned interactions that comprise a human society are dependent on energy. At the same time, the mechanisms by which human groups acquire and distribute basic resources are conditioned by, and integrated within, sociopolitical institutions. Energy flow and sociopolitical organization are opposite sides of an equation. Neither can exist, in a human group, without the other, nor can either undergo substantial change without altering both the opposite member and the balance of the equation. Energy flow and sociopolitical organization must evolve in harmony.

Not only is energy flow required to maintain a sociopolitical system, but the amount of energy must be sufficient for the complexity of that system. Leslie White observed a number of years ago that cultural evolution was intricately linked to the quantities of energy harvested by a human population. The amounts of energy required per capita to maintain the simplest human institutions are incredibly small compared with those needed by the most complex. White once estimated that a cultural system activated primarily by human energy could generate only about 1/20 horsepower per capita per year. This contrasts sharply with the hundreds to thousands of horsepower at the command of members of industrial societies. Cultural complexity varies accordingly. Julian Steward pointed out the quantitative difference between the 3,000 to 6,000 cultural elements early anthropologists documented for the native populations of western North America, and the more than 500,000 artifact types that U.S. military forces landed at Casablanca in World War II.

More complex societies are more costly to maintain than simpler ones, requiring greater support levels per capita. As societies increase in complexity, more networks are created among individuals, more hierarchical controls are created to regulate these networks, more information is processed, there is more centralization of information flow, there is increasing need to support specialists not directly involved in resource production, and the like. All this complexity is dependent upon energy flow at a scale vastly greater than that characterizing small groups of self-sufficient foragers or agriculturalists. The result is that as a society evolves toward greater complexity, the support costs on each individual will also rise, so that the population as a whole must allocate increasing portions of its energy budget to maintaining organizational institutions. This is an immutable fact of societal evolution, and is not mitigated by type of energy source.

So, we see with the rise of complexity two distinct phenomena arising with relation to energy. First, greater complexity allows for more energy to be unlocked. Agriculture is more complex than foraging, and yields more calories than foraging; an oil rig is far more complex than a bow drill for making fire, and yields far more energy. At the same time, complexity also has an energy cost–a cost which grows greater the more complex a society is. Thus, complexity is an investment. It has a benefit, and it has a cost, both in terms of energy.

It is also worth noting that societies are often compelled to make every investment in complexity that they are capable of making. Human population is a function of food supply (thesis #4), and thus of energy, while the Prisoner’s Dilemma forces complex societies into a positive feedback loop of increasing investment in complexity (thesis #12); a society is driven both by its own population pressures and by the threat of competition from those societies that *do* make such investments. As such, complexity becomes a function of energy flow, such that given information about a society’s energy flow, its level of complexity can be accurately predicted.

However, Tainter has also highlighted the *cost* of complexity–a cost which, due to the law of diminishing returns, is constantly increasing even as the benefits of complexity diminish. This provides a counter-force to the positive feedback loop of societal complexity. Eventually, further complexity becomes far too costly, making the positive feedback loop impossible to pursue any longer. When that occurs, as Tainter highlights, it means collapse.

Tainter discusses four aspects of complexity in his discussion of complexity’s marginal returns:

  1. Agriculture and resource production.
  2. Information processing.
  3. Sociopolitical control and specialization.
  4. Overall economic productivity.

To this, I would like to add for the purposes of our current discussion:

  5. Technological innovation.

It stands to reason that if each of these five elements of complexity is subject to diminishing returns, then we may conclude that the thesis, “Complexity is subject to diminishing returns,” is reasonable as well.

Agriculture and resource production.

The Law of Diminishing Marginal Returns was originally formulated in the context of agricultural production. It was observed that adding more workers to a field would increase productivity. However, when this was pursued far enough, it became evident that the added productivity of any given worker was not strictly additive. Two workers could double the yield of just one, but eventually a point was reached where each additional worker meant less of an increase over the previous one. Each new worker still added some additional yield, but that additional yield began to approach zero. Meanwhile, the investment of one more worker remained the same. Thus, the marginal return–how much is returned per investment–went down. The point at which adding another unit of investment, such as another worker, ceases to have a simple additive effect on returns is called the point of diminishing returns. Past that point, investment cost remains the same, but the benefits returned begin to approach zero.
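
To make the arithmetic concrete, here is a minimal sketch of the pattern (the square-root yield function and all the numbers are purely illustrative assumptions, not drawn from Tainter or from any agricultural data): total yield keeps rising, but the marginal yield of each additional worker shrinks toward zero while the cost of adding that worker stays the same.

```python
# Illustrative sketch of diminishing marginal returns.
# The yield function is hypothetical; any concave production
# function shows the same pattern.

def total_yield(workers: int) -> float:
    """Total harvest (arbitrary units) for a given number of workers."""
    return 100 * workers ** 0.5  # output grows, but ever more slowly

previous = 0.0
for n in range(1, 11):
    current = total_yield(n)
    marginal = current - previous  # extra yield contributed by the nth worker
    print(f"worker {n:2d}: total = {current:6.1f}, marginal = {marginal:5.1f}")
    previous = current

# The first worker adds 100.0 units; the tenth adds only about 16.2.
# The investment per worker is constant, but the return per investment falls.
```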

When we abstract to any kind of subsistence technology, we see this is also tied to the “low-hanging fruit” problem–in this case, *literal* fruit. If a forager band picks the largest, sweetest, most nutritious, and easiest to acquire fruit first, then any expansion of harvesting must, necessarily, involve more effort (as they took the easiest to acquire fruit first, so the remaining fruit must be more difficult to obtain), for less reward (as they took the largest, sweetest and most nutritious fruit first, so the remaining fruit must be smaller, more bitter, and/or less nutritious). The same principle extends to horticulture and agriculture, as well. The first fields will be planted in the most fertile, easily tilled soil; further cultivation must, then, take place in less fertile and/or more difficult soil. Thus, either the cost will go up, the yield will go down, or–as is usually the case–both.

Information processing.

Jeff Vail has often written on the inefficiency of hierarchy’s information processing capabilities. The span of control limits how many subordinates any hierarch can effectively administer (usually around 5), while the SNAFU principle and signal degradation limit how deep a hierarchy can go before suffering severe efficiency problems (see thesis #11). Thus, while hierarchy provides the only readily available alternative to simply working inside the limit of Dunbar’s number imposed by human neurology, it has a set of limits all its own. To expand a hierarchy beyond those limits means overwhelming each hierarch beyond the span of control, creating a hierarchy so deep that signal degradation becomes an overwhelming concern, or both. This seriously limits the effectiveness of each new investment in expanding such a hierarchy, and it necessitates a new class of specialists dedicated simply to information processing. This increases the cost of expanding a hierarchical information processing structure–a cost which yields increasingly little benefit as signal degradation sets in. As an example, in “‘Span of Control’ and Inefficiency of Hierarchy,” Vail writes:

The US Federal Government’s National Incident Management System (NIMS) is based upon the Incident Command System (ICS) methodology developed by wildfire fighters to create a standard for command and control systems (hierarchy) as government agencies respond to incidents. NIMS and ICS both state that the maximum desirable span of control is 5, meaning that one supervisor should control no more than 5 subordinates. The US Military follows a similar formula: one commander controls three subordinate units, as well as a staff function, which results in a span of control of roughly 5. This military formula is virtually identical around the world–a time-tested formula for maximum span of control. The military formula, however, is more revealing, for while it uses a 5:1 span of control, the *operational* span of control is only 3:1 (that is, the number of subordinate units that actually carry out the fundamental mission of the organization). The remaining two (roughly) staff positions under each commander are actually information processing assistants necessary to make even the 3:1 span of control effective. Without getting into too much detail, those staff positions are normally broken down into an executive officer, who is in turn responsible for the commander’s administrative staff, and a deputy commander, who is in turn responsible for the commander’s non-administrative staff (Intelligence, Logistics, Human Resources, etc.). As a result of the executive officer and deputy commander concept, the non-operational tail actually extends down two layers from each “operational” commander at the higher levels.
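
A rough way to see the arithmetic behind Vail’s point is to count the supervisory layers a workforce requires under a fixed span of control. The sketch below is a simplification under stated assumptions (a uniform span of five, and none of the staff positions Vail describes), so the real overhead would be higher still.

```python
import math

def hierarchy_layers(workers: int, span: int = 5) -> tuple[int, int]:
    """Return (layers, supervisors) needed to manage `workers` front-line
    people when no one supervises more than `span` subordinates."""
    layers, supervisors, level = 0, 0, workers
    while level > 1:
        level = math.ceil(level / span)  # supervisors needed at the next level up
        supervisors += level
        layers += 1
    return layers, supervisors

for n in (25, 1_000, 100_000):
    layers, sup = hierarchy_layers(n)
    print(f"{n:7,d} workers -> {layers} supervisory layers, {sup:,} supervisors")

# Depth grows with the logarithm of the workforce: every expansion adds new
# layers through which information must pass (and degrade), plus a growing
# class of people who process information rather than produce anything.
```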

Tainter discusses education and R&D under the heading of “information processing,” and shows that each of them is also subject to diminishing returns, and both for much the same reason. Basic information is not only easily obtained, it is the foundation for all other information. By comparison, more advanced knowledge is more difficult to obtain, but is much more narrowly applicable, as it applies only to a specific field of research or learning. In education, one can compare how easily children learn to read, and how universally important that skill is, with the extreme cost of a Ph.D., which is much narrower in its usefulness. In science and R&D, we can note the low cost of a “paradigm shift” like evolution and how much such shifts have informed our knowledge, versus much costlier research that is far more esoteric in its application. Thus, we see the problem of “low-hanging fruit” applied to knowledge itself. The knowledge we come to first forms the basis of all other things we learn, making it by definition more widely applicable. The knowledge we build on top of it comes at a greater cost, but it is much more esoteric. It is worth noting that Tainter *does* discuss the role of a “paradigm shift” in essentially establishing a new marginal return curve for such fields.

Education especially faces an increasing burden as society becomes more complex, and there is simply more society that each individual is expected to be conversant in. An American child requires some two decades of education in order to become fully conversant in the various areas of mathematics, science and culture that are expected of any individual in contemporary America. By comparison, most forager cultures had taught their entire culture to their children by their sixth birthday, leaving plenty of time for those children to learn up to 1,000 different species of wild, edible plants, as well as advanced hunting techniques, so that they could be fully self-sufficient by the age of 12.

Sociopolitical control and specialization.

The diminishing returns of sociopolitical complexity are the bread and butter of 24-hour news networks and any politician running on a platform of “reform.” It is precisely the inefficiencies engendered by such diminishing returns that have so often been bemoaned in the political process–and it is precisely because this is an intractable feature of sociopolitical complexity that every politician’s promise to “clean up government” ultimately fails. Tainter identifies six reasons for diminishing sociopolitical marginal returns:

  1. Increasing size of bureaucracies.
  2. Increasing specialization of bureaucracies.
  3. The cumulative nature of organizational solutions.
  4. Increasing taxation.
  5. Increasing costs of legitimizing activities.
  6. Increasing costs of internal control and external defense.

Very often, more efficient administration is an excellent response to some stress. After 9/11, noting the failure of information processing that allowed the attacks to take place, the Bush administration created the Department of Homeland Security in order to effect better information processing across many of the diverse federal agencies involved. Ultimately, however, this added several more levels of hierarchy–and thus decreased the information processing capabilities of that hierarchy (by introducing more signal degradation), while increasing its cost (by requiring more information processing personnel–more bureaucracy–to handle such inefficiencies). Thus we see that much of the reason for the diminishing returns on sociopolitical complexity lies in the diminishing returns on information processing through a complex structure.
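
As a hedged illustration of that cost (the 90 percent figure below is an invented assumption, not a measured value), suppose each handoff between hierarchical levels preserves only a fraction of a message intact; the odds that a signal survives the whole chain then fall off geometrically with the depth of the hierarchy.

```python
# Hypothetical illustration of signal degradation through hierarchy
# (see thesis #11): assume each handoff preserves 90% of a message.
FIDELITY_PER_LEVEL = 0.9

for levels in (2, 5, 8, 12):
    intact = FIDELITY_PER_LEVEL ** levels
    print(f"{levels:2d} levels: {intact:.0%} of the original signal survives")

# Every layer added to improve coordination multiplies the handoffs,
# eroding the very information-processing capacity the reorganization
# was meant to improve, while adding the cost of staffing that layer.
```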

Sociopolitical structures must also undertake legitimizing activities in order to justify their existence. Ancient Rome had “bread and circuses” on a monumental scale; today, welfare programs take up the bulk of the non-military federal budget in the United States. Tainter explains:

The appeasement of urban mobs presents the classic illustration of this principle. Any level of activities undertaken to appease such populations–the bread and circuses syndrome–eventually becomes the expected minimum. An increase in the cost of bread and circuses, which seems to have been required in Imperial Rome to legitimize such things as the accession of a new ruler or his continued reign, may bring no increased return beyond a state of non-revolt. Rewards to Roman military personnel would often follow the same pattern, particularly when bounties were granted upon a ruler’s accession. Roman soldiers regarded such bounties as a right.

By far the greater expense, though, for both Rome and the United States, was the military. Tainter explains why this is also subject to diminishing returns:

If increased complexity develops to deal with internal unrest or external threats, this solution may yield no tangible benefit for much of the population. Arms races present a classic example. Increasing costs of military hardware, and military and civilian personnel, when undertaken to meet a competitor’s like increases, yield no increased security for the added cost. Such increased costs are often undertaken merely to maintain the balance-of-power status quo. As a military apparatus increases in complexity its administrative costs increase disproportionately, as Parkinson’s figures indicate, usually to little or no competitive advantage.

Overall economic productivity.

Economics does not call many things “laws,” but it has granted that honor to the Law of Diminishing Marginal Returns, because it governs nearly every facet of the economy–and thus, the economy itself.

As GNP rises, per capita rates of economic growth decline, so that as an economy expands, its rate of growth slows down. Many economists tie this to “using up” innovations, requiring that new innovations be made–thus, incurring the cost of further R&D, which is itself bound by diminishing marginal returns, as we have already discussed. Tainter hypothesizes that this may be but one application of a more abstract principle: as the marginal return curves of other areas of complexity require more and more resources simply to maintain the status quo, there is less and less capital available for investment in the future growth of the economy.

Technological innovation.

One aspect of complexity which Tainter does not specifically address as such is technological innovation, the oft-cited counterbalance that makes no trend “inevitable.” This faith in the messianic power of technology to save us from all ills is an irrational statement of religious belief. There is no rational, logical or scientific reason to believe it to be so. In fact, logic, science and reason more often present us with the *limitations* of technology. For instance, Einstein showed that nothing can travel faster than the speed of light, for very real reasons. Science fiction authors often like to compare this to old pronouncements–made without any logical case–that the sound barrier could never be broken. The difference is not the type of claim, of course, but the evidence backing it up. Computational theory recognizes a large set of problems that are impossible for a computer to solve, and another class that can only be solved in exponential time, making them forever impractical, regardless of what innovations we make in computer hardware. Jevons’s Paradox highlights the futility of relying on more efficient technologies to limit the use of resources–by making the use of a resource more efficient, such a technology results in *greater* overall use, not less. We all know pronouncements like the one falsely attributed to Charles H. Duell, U.S. Commissioner of Patents, in 1899: “Everything that can be invented has been invented.” The reasoning goes that because such statements were wrong in the past, any similar statements made in the future must also be wrong. This is nearly as egregious a logical error as the belief that technology can itself solve all problems.

Yet, technology is, itself, subject to diminishing returns. Tainter explains:

Technical innovation, particularly the institutionalized variety we know today, is unusual in human history. It requires some level of investment in research and development. Such investment is difficult to capitalize in an agriculturally-based society that produces little surplus per capita. Technical innovation often responds to labor shortages, which in the ancient world were the exception. As a result, technical development in societies not based on a fossil fuel economy tends to be minimal. Where technical innovation in ancient societies did occur, it often tended actually to depress the productivity of labor.

In industrial societies, technical innovation responds to market factors, particularly physical needs and economic distress. It is not, though, always the panacea that is imagined. In an input-output analysis of the U.S. economy from 1947-58, corrected for inflation, Carter found that ‘technological change (or progress!) had actually added about $14 billion to the task of satisfying the same final [national] demand.’ Technological innovation, as discussed above, is subject to the law of diminishing returns, and this tends to reduce (but not eliminate) its long-term potential for resolving economic weakness. Using the data cited by Wolfle, Scherer observes that if R&D expenditures must grow at 4-5 percent per year to boost productivity 2 percent, such a trend cannot be continued indefinitely or the day will come when we must all be scientists. He is accordingly pessimistic about the prospects for long-term productivity growth. Colin Renfrew correctly points out (in the context of discussing the development of civilization in the Aegean) that economic growth is itself susceptible to declining marginal productivity.
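
Scherer’s point in the quotation above can be checked with some back-of-envelope arithmetic. The growth rates come from the quotation itself; the 2.5 percent starting share of the economy is a hypothetical round number used only for illustration.

```python
# Back-of-envelope check on Scherer's argument (rates from the quotation;
# the starting share is a hypothetical figure for illustration only).
rd_growth = 0.045      # R&D spending grows ~4.5% per year
output_growth = 0.02   # productivity/output grows ~2% per year
share = 0.025          # assume R&D starts at 2.5% of total output

years = 0
while share < 1.0:
    share *= (1 + rd_growth) / (1 + output_growth)  # R&D's share of output grows
    years += 1

print(f"At these rates, R&D would absorb the entire economy in ~{years} years")
# The share roughly doubles every 28-29 years, so buying 2% productivity
# growth with 4-5% R&D growth cannot continue indefinitely -- eventually
# "we must all be scientists."
```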

The lever is perhaps the simplest technology possible. It is cheap, virtually impossible to break, and highly effective for all manner of tasks, and it is incorporated into many other kinds of technology. As a piece of technology becomes more complex, it becomes more prone to breaking. As any computer programmer knows, simplicity and robustness usually go together, which is why the elegance of simplicity is held up as an ideal in Eric Raymond’s description of the bazaar model. Some of our greatest technological achievements came so cheaply because they were, in fact, accidents. Penicillin, perhaps our greatest medical achievement, was discovered by accident; its total development cost approximately $20,000. Compare this to contemporary drug development, with R&D budgets running well into the millions of dollars and more, and timelines averaging about 20 years.

Ultimately, a new technology is just another piece of complexity, and it is precisely that complexity, rather than any one crisis we presently face, that is the ultimate cause of collapse. Other crises may serve as proximate causes, but it is the marginal return curve on complexity itself that seals the fate of any complex society. Thus, any “techno-fix” may succeed in solving a given proximate cause of collapse only by contributing still more to the ultimate cause of collapse–complexity itself. Nor does this even consider the profoundly negative, unexpected consequences that so many technologies yield.

Technology is subject to diminishing returns; that does not mean innovation will end, only that it will become (on average) increasingly mundane even as it costs more and more. Moreover, technology cannot solve the underlying, systemic issues we face. Technology has its place, and it can be a wonderful thing–but it is not a panacea, and the exuberant faith the Enlightenment placed in it is certainly misplaced.


Agriculture, information processing, sociopolitical control, economic activity and technological innovation are all subject to diminishing returns, because complexity itself is subject to diminishing returns. Tainter writes:

A society increasing in complexity does so as a system. That is to say, as some of its interlinked parts are forced in a direction of growth, others must adjust accordingly. For example, if complexity increases to regulate regional subsistence production, investments will be made in hierarchy, in bureaucracy, and in agricultural facilities (such as irrigation networks). The expanding hierarchy requires still further agricultural output for its own needs, as well as increased investment in energy and minerals extraction. An expanded military is needed to protect the assets thus created, requiring in turn its own sphere of agricultural and other resources. As more and more resources are drained from the support population to maintain this system, an increased share must be allocated to legitimization or coercion. This increased complexity requires specialized administrators, who consume further shares of subsistence resources and wealth. To maintain the productive capacity of the base population, further investment is made in agriculture, and so on.

The illustration could be expanded, tracing still further the interdependencies within such a growing system, but the point has been made: a society grows in complexity as a system. To be sure, there are instances where one sector of a society grows at the expense of others, but to be maintained as a cohesive whole, a social system can tolerate only certain limits to such conditions.

Thus, it is possible to speak of sociocultural evolution by the encompassing term ‘complexity,’ meaning by this the interlinked growth of the several subsystems that comprise a society.

Tainter then presents the idealized marginal returns curve below, and adds some discussion regarding key points along the way.

[Figure: Tainter’s graph of the diminishing marginal returns on complexity]
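
Since the figure itself is not reproduced here, the sketch below stands in for it with an idealized curve of the general shape Tainter describes. The logistic benefit function and its numbers are stand-ins of my own choosing, not Tainter’s data; they reproduce only the general shape against which the points discussed below (B~1~C~1~, B~2~C~2~, B~1~C~3~) are defined.

```python
import math

def benefit(cost: float) -> float:
    """Total benefit of complexity at a given cost (logistic, arbitrary units)."""
    return 100 / (1 + math.exp(-(cost - 5)))

previous = benefit(0)
for cost in range(1, 11):
    total = benefit(cost)
    marginal = total - previous  # benefit gained from the last unit of cost
    print(f"cost {cost:2d}: total benefit {total:6.1f}, marginal benefit {marginal:5.1f}")
    previous = total

# Marginal benefit rises at first, peaks near the curve's inflection point
# (around cost 5 in this toy example), and then falls: each further unit of
# complexity buys less, while its cost stays the same or grows.
```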

At point B~1~C~1~, the marginal returns of complexity reach an inflection point as they near the point of diminishing returns (B~2~C~2~). Between B~1~C~1~ and B~1~C~3~, a complex society is at increasing risk of collapse. It is at B~1~C~3~ that collapse actually occurs. The costs of complexity relative to its benefits are simply too high, and substantial numbers across the society begin to see benefits to “dropping out” of the complexity of that society. In ancient Rome, we might see the bagaudae or the Alamanni as examples of this trend among the lower classes, and the various landlords who essentially “seceded” from Rome as their wealthier analogues. In the contemporary United States, we might see the first stirrings of such signs among the Hippies; currently, we might see echoes of it among permaculture enthusiasts, voluntary simplicity advocates, and of course, primitivists. We might even see the open source movement itself as a reaction, trying to maintain investments in technological complexity by creating greater simplicity in administration and information processing. We might find an upper-class echo of this behavior in the kind of elite resignation that Peggy Noonan discusses in her 27 October 2005 editorial for the Wall Street Journal, “A Separate Peace.”

It is at this point that collapse occurs, because the costs of complexity have become so high that the society is no longer willing to put forward any further investment in it. Tainter discusses the effect of energy subsidies–such as fossil fuels–which can extend the curve, heighten the curve, or even allow one curve to follow another. But these merely modify the situation; they do not change the basic fact that complexity is subject to diminishing marginal returns, and thus, any society that pursues greater complexity as the answer to every stress–that is, any civilization (see thesis #13)–must *eventually* collapse. The question is not if, but when.