Friday, October 24, 2008

Societal Complexity and Diminishing Returns in Security

By Nader Elhefnawy

Originally published in International Security 29.1 (Summer 2004), pp. 152-174
Copyright 2004 by the President and Fellows of Harvard College and the Massachusetts Institute of Technology

Discussions of security in recent years have frequently concerned the ways in which the complexity of technologically advanced societies can be an Achilles' heel. Various observers and real-life events have drawn attention to how relatively unsophisticated threats against air travel, power grids, computer networks, financial systems, lean manufacturing processes, and the like can have devastating effects. There has been little effort, however, to approach the problems that complexity poses for security in a comprehensive way, though this is not to say that work applicable to such a purpose has failed to appear.

In The Collapse of Complex Societies, Joseph Tainter argues that societies can reach and pass a point of diminishing marginal returns to investment in societal complexity.1 Eventually, this eats away at their slack, which can be understood as that "human and material buffering capacity that allows organizations and social systems to absorb unpredicted, and often unpredictable, shocks." In other words, slack consists of the untapped human and material resources available for pursuing new endeavors or meeting emergencies. This loss of slack leaves these organizations and systems increasingly vulnerable to collapse as a result of such a shock, for example, a military invasion.2 Although concern with diminishing returns in the areas of defense and economics has long been a part of discussions of international security, appearing in the theoretical summations of authors such as Robert Gilpin and John Mearsheimer, Tainter's work (and the field of complexity studies in general) has had little impact on security studies to date.3

This article argues that security is becoming an area of diminishing returns to complexity for today's advanced societies because of the diminishing returns from investments in complexity in general, the risks posed by the interconnections this growing complexity creates, and the rising cost of security forces. Before proceeding to the argument's details, however, some clarification of what "complexity" means, why it rises, and why it can lead to diminishing returns is in order. According to one definition, complexity refers to "asymmetric relationships that reflect organization and restraint" between the parts of a system.4 As such, the characteristic features of complex systems are their composition from a large number of components with a dense web of connections between them; a high degree of interdependence within them; an openness to outside environments, rather than their being self-contained; "synergy," meaning that the whole is more than the sum of its parts; and nonlinear functioning, so that changes in these systems have effects disproportionate to their size, either larger or smaller.5 Such nonlinearity and synergy come with an exponentially increased range of possible interactions, including unplanned interactions, making an incomplete understanding of at least some processes also an aspect of complex systems.6

A more complex human society therefore "has more institutions, more subgroups and other parts, more social roles, greater specialization, and more networks between its parts. It also has more vertical and horizontal controls and a greater interdependence of parts," which may interact in unexpected ways.7 Also in keeping with the greater interdependence and interconnection between a larger number of components, more complex societies have a larger information load. This is the case for both communications and information processing.

As a larger information-processing burden indicates, complexity carries costs. Institutions, networks, and the like require energy for their sustenance; vertical and horizontal controls inhibit personal freedom; and so forth. Given that such costs make human beings generally "complexity averse," why does complexity tend to rise over time? W. Brian Arthur offers three explanations.8

The first is that competition and interdependence among entities create niches that new entities can fill in a process of "coevolutionary development." "Diversity begets diversity"—that is, a multiplicity and variety of elements allowing a greater variety of possible connections, which translates into an ever-more elaborate web, in technology or economics as well as ecosystems.9 The advent of the automobile, for instance, created niches for paved roads, motels, and traffic lights.10

The second explanation Arthur offers is that competitive environments encourage individual entities to improve their performance by adding new subsystems in what is known as "structural deepening." In other words, a system becomes more complex so that it can operate in a wider range of environments, sense and react to exceptional circumstances, service other systems so they operate better, or enhance reliability.11 Arthur offers the example of the jet engine, which in Frank Whittle's design in the 1930s contained only a single moving part. Today's jet engines, which can put out thirty to fifty times as much thrust, are much more complex with up to 22,000 parts.

The third explanation Arthur offers is that large systems of entities can incorporate simpler systems to boost performance in what is known as "software capture."12 (The entry of new member states into the European Union, for instance, results in the EU becoming a more complex entity.)

Nevertheless, even though complexity proves a successful strategy much or most of the time, this is not always the case. As Murray Gell-Mann has observed, a complex system "under the influence of selection pressures in the real world, [engages] in a search process... that is necessarily imperfect."13 A system does not always adapt appropriately, and even adaptations adequate to a particular challenge can have effects that are maladaptive.14

This article explores the three trends mentioned above. The first trend, analyzed in the section "Complexity and Slack," is one of diminishing returns from investments in complexity in general. This section establishes that advanced societies are becoming more complex. It then advances the economic evidence for diminishing returns to investments in complexity. Finally, it demonstrates that societal slack is shrinking as a result.

The second trend, analyzed in the section "Interconnection and Vulnerability," is the tendency in social and technological systems toward tighter coupling between their components, and more "scale-free" networks, with their attendant vulnerabilities, as societies grow more complex. The result is that more points exist to be attacked while the effect of any single attack is magnified, creating a need for stronger protection at more points.

The third trend, discussed in the section "Rising Security Costs," is the increasing cost of the means of security in the face of threats such as terrorism and weapons of mass destruction. In particular, this section examines security expenses rarely looked at in such studies, such as the cost of police and other emergency and law enforcement units, private security outlays, and "hidden" expenditures such as insurance rates, all of which are apparently headed upward.

In the conclusion this article brings the implications of all three trends together. The end result, it argues, is that as today's advanced societies grow more complex, they become less able to absorb shocks. At the same time they offer more points to attack, any one of which can have greater effects because of their heightened interconnectivity. The cost of defending against any of these shocks also rises. As a result, these societies are less and less secure. The conclusion also discusses some possible ways of coping with the problem, central to which is the judicious development and application of a range of new technologies.

Complexity and Slack
Tainter's focus is on a particular category of maladaptation, societal investment in complexity for diminishing returns, in which the returns to each unit of investment shrink (or even become negative). This shrinking in turn reduces societal slack over the long term, a dynamic that Tainter argues is at work in today's advanced societies. There are three necessary prerequisites to substantiating this line of argument: (1) establishing that advanced societies are actually becoming more complex; (2) determining that such changes correspond to a pattern of diminishing returns; and (3) confirming that these diminishing returns are consuming slack.

Complexity and Advanced Societies
The measurement of complexity is a highly controversial matter, with some experts questioning even the feasibility of trying to do so.15 Nevertheless, a basis for measurement has been suggested by theorists, with some consensus existing that the quantity of information required to operate or represent a given structure is a guide to a system's level of complexity.16 Where social systems are concerned, a feudal, agrarian economy requires less information to operate than an advanced industrial economy, and is therefore less complex. The change in an entity's information load is one way of measuring whether that entity is becoming more complex, as would be the case with a shift from an agrarian economy to an industrial one.
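
One way to make this notion of an information load concrete, offered here purely as an illustrative formalization rather than a measure used by Tainter or the other theorists cited, is Shannon's information measure: if a system can occupy n distinguishable states (roles, transactions, configurations) with probabilities p_i, the average number of bits needed to describe its state is

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i,$$

which grows as the states become more numerous and more evenly occupied. On this reading, an industrial economy, with vastly more distinct roles and transactions than a feudal one, requires more information to describe and operate, and so carries the heavier load.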

The indicators report a dramatically rising information load as economies are "informatized."17 One expression of this is that spending on information and communications technology as a share of nonresidential fixed investment rose from 15.2 percent to 31.4 percent in the United States between 1980 and 2000, with comparable increases in the European Union and Japan.18 There is also a rising volume of communication, travel, and trade—in short, interconnection and interactivity, which can be identified with increasing complexity. Virtually every indicator of the level of such traffic shows a long, upward trend, with per capita traffic volume doubling in North America between 1960 and 1990.19 The daily volume of person-to-person email messages, virtually zero twenty years ago, stood at 21 billion in 2002, a figure that the International Data Corporation estimates will rise to 35 billion by 2006.

Between 1950 and 2000, world trade generally expanded faster than gross domestic product (GDP), and more than three times as fast during the 1990s.20 Although it is often stated that international trade levels were higher in the pre-1914 period than they are currently, this observation tends to ignore the more complex character of the trade. One aspect of this greater complexity is increased "production sharing," that is, trade in components or parts rather than fully fabricated manufactures.21 The growing trade in services, exemplified by the outsourcing of work from accounting to software writing internationally, and the sheer volume of international financial flows have no previous analog. Lowered trade barriers, moreover, have commonly brought more rather than less legal and administrative infrastructure, as with entities such as the World Trade Organization. At the same time, contrary to the claims of those who believe that globalization is bringing about the death of the state, governments—a major source of complexity—have grown larger rather than smaller in this period.22

In short, complexity, and in particular the complexity created by more technology and economic integration, is increasing rapidly—but to what end? The second issue, namely whether these investments are producing diminishing returns, remains unanswered.

Complexity and Diminishing Returns
An obvious approach to determining the relationship between complexity and diminishing returns is to look at economic trends, given the overwhelmingly economic orientation of the increased complexity. There is widespread evidence that several economic sectors are showing diminishing returns to investment in complexity, including agriculture, energy production from fossil fuels, and heavy, bulk-processing manufacturing such as steelmaking.23 The same is true for certain elements of the service sector, such as education and health care.24 The costs of these two services are increasing at a markedly more rapid rate than economic growth, and without showing commensurate improvements in either their contribution to economic productivity or health standards.25

Several areas of high technology, such as aerodynamics, are also producing diminishing returns, requiring much greater investment for much more modest results.26 The most striking exception to this pattern, at least according to the conventional wisdom, is information technology, which is widely credited with driving the continuing increases reported in productivity.27 It should be remembered, however, that the "information" society remains underpinned by the older technologies of moving parts and fossil-fuel energy sources.28 It should also be remembered that the full costs of information technology are rarely taken into account, this being an area where capital (i.e., software and computers) depreciates very rapidly, eating more deeply into productivity increases than is generally noted by economists.29

Moving beyond single sectors of the economy, there is evidence that economies in the aggregate are also producing diminishing returns. The world economy grew by 5.3 percent a year in the 1960s, 3.9 percent in the 1970s, 3.2 percent in the 1980s, and only 2.3 percent in the 1990s.30 Moreover, alternative indicators suggest that even these sagging GDP growth figures paint too bright a picture. GDP does not take into account the ecological, social, or long-term economic costs of economic activity (i.e., the maladaptations accompanying rising complexity, as Gell-Mann might put it). For that reason, some economists have turned to other indicators that take such costs into account and that, incidentally, make explicit a connection between certain kinds of complexification and diminishing returns.

Net domestic product (NDP), for instance, is broadly comparable to GDP but takes into account the depreciation of capital. Given the rapid depreciation of information technology relative to more traditional kinds of capital, the gap between rates of U.S. GDP and NDP growth has increased from 0.1 percent in the 1960s and 1970s to 2 percent from the late 1990s on.31 Such a wide gap suggests that "real" economic growth in the advanced economies is lower than reported, with equipment depreciation consuming a significant portion of their gains. A 4 percent growth rate would be effectively cut to a "real" rate of 2 percent for the 1995-2000 period.
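
Stated as a rough back-of-the-envelope relation (ignoring compounding and other refinements in the national accounts), the adjustment amounts to subtracting the depreciation gap from headline growth:

$$g_{\text{NDP}} \approx g_{\text{GDP}} - \delta, \qquad \text{for example } 4\% - 2\% = 2\% \text{ for 1995-2000},$$

where the gap term δ is the share of measured growth consumed by the depreciation of capital.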

Another alternative is the Genuine Progress Indicator (GPI), which accounts for a still wider range of the side effects of economic activity that can undermine growth in the long run, including resource depletion, environmental damage, lopsided income distributions, unemployment, and debt.32 While U.S. per capita GDP grew 55 percent between 1973 and 1993, GPI per capita declined some 45 percent.33 Rather than a slowing increase in national wealth, the conclusion that can be drawn from the use of GPI is its gradual erosion through negative growth.

Diminishing Returns and Reduced Slack
A long-term trend of diminishing returns to heightened complexity is strongly suggestive of shrinking slack but insufficient to prove it, the two trends being closely connected but not synonymous. Another approach is necessary to settle the third issue, whether societal slack is actually shrinking. The most obvious is an examination of the level of governmental activity relative to a state's income over an extended period, particularly taxation, spending, and debt. Government, after all, is uniquely positioned to command slack in the event of exogenous shocks, making its ability to do so a way of measuring how much slack exists in the system. A combination of increased taxation, spending, and debt indicates that governments are less and less able to live within their means, that public goods are becoming more expensive, and that a society is spending a higher share of its income on debt service rather than investment or consumption.34 With taxes and debt levels high, governments also have less leeway to raise taxes further or undertake new types of activity. In other words, because more of their resources are already committed elsewhere, less slack is available to be mobilized when it is needed.

As already stated, government has grown in recent decades, with taxes and spending rising throughout the developed world—as have debt burdens. Tax revenue rose from 31.5 percent to 38.4 percent of GDP between 1970 and 2002 among the Group of Seven advanced industrial nations.35 Spending rose at an even swifter rate, with the result that the proportion of gross debt to income almost doubled between 1977 and 2002 alone.36 Notably this rise continued despite post-Cold War reductions in military spending; the scaling back of welfare states, as with lower public spending on education; major reductions in public spending on infrastructure and research and development; and the savings that privatizing and decentralizing government services were supposed to generate.37

There are also reasons to think that states will continue moving in this direction of greater spending and indebtedness.38 The most widely discussed of these is that mandatory spending is increasing as a percentage of government expenditures, so that even with more money being levied, governments have less leeway in making spending decisions.39 This is partly because of the pressure that aging populations put on social safety nets, which appear to be growing less effective as a way of achieving their goals, particularly pension plans and health care systems.40 Also consistent with a trend toward older populations (but not solely due to it), savings rates have declined throughout the advanced world, and private debt has risen, meaning a shallower well from which governments can draw in times of need.41

Defense economics in recent years underscore this tightening of finances. Yet defenders of recent increases in U.S. defense spending have frequently argued that the United States managed to spend 37.5 percent of GDP on defense in 1945, and then 5 percent or more of its annual income in the Cold War.42 Implicitly, the United States could do the same today. What those making this argument commonly miss is that the federal government was much smaller at the outset of World War II, and also less indebted.43 Postwar economic growth was also sufficient to enable the United States to "grow out of" its debt, cutting it by three-quarters as a share of GDP.44 By contrast, the spike in U.S. national debt in the 1980s suggests that the decade's defense expenditures, in the range of 5-6 percent of GDP, were less supportable than the much higher levels of the 1950s.45 Current defense spending in the area of 4 percent of GDP in the early years of the twenty-first century produces budget deficits and increases in the debt burden comparable to those of the 1980s.

In short, the less-taxed, less indebted United States of the World War II era had considerable slack on which to draw, and a high growth rate enabled a rapid fiscal recovery. More recent trends, however, have been toward diminishing returns to investment in complexity, as evidenced by slowing growth. Moreover, these investments have consumed slack, as seen in rising debt levels and shrinking savings. Nor is this to be regarded as a temporary aberration, as indicators suggest this pattern will continue well into the early decades of the twenty-first century.

Interconnection and Vulnerability
In addition to leaving advanced societies less slack, greater complexity may mean more vulnerability because of the higher level of interconnection it entails. Certainly, the opposite is typically considered true: that is, a high level of interconnection between components can often contain or ameliorate disruptions, a point well established in the ecological literature. Where societies are concerned, the existence of a large number of interconnections empowers a state to respond to security threats by enabling it to better monitor its domain and move military and police forces as needed.46 A large number of interconnections also suggests that it is a fairly simple matter to "summon aid to the injured points, erect bypasses around them, and find substitutes for them" in the event of disruptions, as Martin Van Creveld put it in Technology and War.47

The reverse, however, can be true just as often. In practice, the same infrastructures that allow states to cope with threats also open avenues for the very threats they are meant to guard against, be they an invading army or a terrorist cell. Consequently, the conduits must be guarded not only against exploitation by a hostile force but also against attacks on the conduits themselves—passenger aircraft, for instance, being favorite targets of terrorists. More important, the dense web of connections within a society can more widely propagate the effects of any attacks that do occur.

The question then becomes: what sort of interconnections give a society the ability to recover quickly and which do the opposite? Charles Perrow has made the case that it is a question of the tightness of coupling within a system. Tightly coupled systems, which are short on slack, are also intolerant of delay and contain invariant sequences not allowing for improvisation.48 For that reason there is no room for failure, which means that buffers and redundancies have to be built in, rather than being "fortuitously available." Consequently, tightly coupled systems are highly susceptible to "idiosyncratic threats," meaning that "if one can find a weakness through which safety factors can be overloaded or bypassed, he can cause imploding, catastrophic failure."49 Power grids demonstrate how this can happen. In November 1965 the shutdown of one of six lines carrying power into Ontario's electric grid from the Beck plant outside Toronto disabled much of the Canadian system.50 When the demand for electricity from Canada went off-line, Beck's output into New York doubled, surging through the U.S. grid, endangering plants all over the northeastern United States, and compelling utilities to take their systems off-line. The result was that in the space of four seconds, much of Canada and the northeastern United States were left in the dark. Although this blackout resulted from an accident rather than an attack, it does suggest possibilities for sabotage capable of producing similar effects.
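
The dynamic can be illustrated with a deliberately simple sketch, written here in Python with invented numbers; it is not a model of the actual 1965 grid, only of the logic of tight coupling: a set of lines runs near capacity, one trips, its load is shared among the survivors, and any line pushed past its limit fails in turn.

# Minimal, hypothetical sketch of cascading failure in a tightly coupled system.
# The loads and capacities are invented for illustration only.

def cascade(loads, capacities, first_failure):
    """Fail one line, redistribute the shed load evenly across surviving lines,
    and repeat until no further lines are overloaded."""
    failed = {first_failure}
    while True:
        survivors = [i for i in range(len(loads)) if i not in failed]
        if not survivors:
            return failed
        shed = sum(loads[i] for i in failed)
        per_line = shed / len(survivors)
        newly_failed = {i for i in survivors
                        if loads[i] + per_line > capacities[i]}
        if not newly_failed:
            return failed
        failed |= newly_failed

# Six lines running close to capacity (little "headroom"): one trip takes down the rest.
loads = [90] * 6
capacities_tight = [100] * 6
print(sorted(cascade(loads, capacities_tight, first_failure=0)))  # all six lines fail

# The same system with generous slack absorbs the loss of a single line.
capacities_slack = [200] * 6
print(sorted(cascade(loads, capacities_slack, first_failure=0)))  # only line 0 fails

With slack in the capacities, the failure stays local; without it, the same initiating event propagates through the whole system, which is the pattern the 1965 blackout exhibited.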

Oil pipelines offer another example of a tightly coupled system, one entailing more localized but also longer-term disruption than is generally the case with power grid failures. The sabotage of the pipeline between Iraq and Turkey in August 2003 closed off the flow of oil from the fields in Kirkuk—40 percent of Iraq's total production.51 No other way exists to move the oil, and the shutdown of the damaged section rendered the rest of the pipeline inoperative, at a cost of an estimated $7 million a day in lost oil sales and a small but noticeable rise in world oil prices. Moreover, even after the repair of such a pipeline, it takes days before the process can function normally again. Meanwhile the infrastructure remains vulnerable to further disruption, the Kirkuk pipeline proving no exception to the rule. Consequently, authorities' estimates of how long the critical pipeline would remain inoperable grew from three weeks to three months (into early November). In the face of later attacks, the authorities suggested that the pipeline could be out of service indefinitely, though operations did finally resume in March 2004.52

Certainly, electric grids and oil pipelines may be dismissed as relatively old-fashioned technology. Nevertheless, they will remain critical parts of modern infrastructures for a long time to come. There is also good reason to believe that the connections most characteristic of advanced societies are creating tighter coupling, and the greater vulnerability that goes along with it. One reason is that tight coupling is widely seen as the key to extracting greater efficiency and productivity from a system.

This pattern is reflected in many of the post-Fordist production approaches that rely on computers and other information technologies. The combination of accelerated production rates and smaller, more highly specialized workforces makes a given disruption (i.e., the loss of a man-hour of work) more costly. The number of man-hours required to produce a ton of steel, for instance, dropped from 10.5 in 1980 to 2.2 by 2000 largely because of automation and information technology.53 Older, "less efficient" ways of going about an activity are eliminated, and user autonomy is restricted through standardization of process and procedure, making sequences more rigid and leaving even less room for improvisation.54

Just-in-time (JIT) manufacturing, in which components are delivered just as they are needed, is an example of such a low-slack, tightly coupled process—and one that can be quickly disrupted when those deliveries are interfered with, as demonstrated by the terrorist attacks of September 11, 2001. The tightening of security following the attacks produced major bottlenecks in supply-chain management systems, with trucking coming to a near halt along the borders the United States shares with Canada and Mexico.55 According to a U.S. government report, "Transportation issues played havoc with order flows and drove up shipping costs"; the plants most dependent on JIT (such as those belonging to Ford, Honda, and Toyota) stopped working. At the macroeconomic level, the result was a 1 percent drop in U.S. industrial production in the month of September, largely the result of the week of disruption before normal operations resumed.56

Tighter coupling can also be regarded as a consequence of "coevolutionary development," in which niches create still other niches, as with the example of the automobile cited earlier. Without automobiles, there is little need for motels, traffic lights, and paved roads, so that economic sectors involved with these similarly suffer. Given the connection of such trends with rises in complexity, it stands to reason that more complex economies are more dependent on niche activities tightly interconnected with one another. The effects of this in the face of disruption were also demonstrated by the September 11 attacks.

The terrorist attacks on one niche of the aviation industry (air travel, which suffered $15 billion in losses in 2001 and 2002 because of the September 11 attacks and subsequent disruption) translated into losses for other sectors.57 Deliveries by Boeing, the world's largest supplier of commercial airliners, were down 28 percent in 2002 from the previous year, to 381 from 527.58 In anticipation of the reduced demand for its aircraft after the attacks, Boeing alone announced 30,000 layoffs.59 Along with the major airlines and Boeing, airports and other related concerns scaled back operations, eventually cutting about 400,000 jobs worldwide according to one estimate.60 These job losses in turn had "multiplier" effects in the areas where they were located, while the falloff in air travel and international travel generally damaged other sectors such as tourism, which many estimates indicate lost millions of jobs as a result of the attacks.61 Meanwhile insurance companies, faced with paying out indemnities in the tens of billions of dollars, increased premiums for some types of insurance more than 300 percent in a year's time, with an across-the-board doubling thought likely, a permanent and significant elevation of the cost of doing business.62 In short, even excluding elevated spending on security and longer-term effects such as higher insurance rates, the September 2001 attacks did hundreds of billions of dollars in economic damage in the United States and abroad.63

Buffers and redundancies can be built into tightly coupled systems, such as a backup pipeline, larger inventories, or greater reserves of capital, to keep companies going in lean times. These are deliberate investments rather than something fortuitously available, however, and the redundancies themselves, aside from being costly, can be a cause of additional breakdowns. This appears to have been the case with, for instance, nuclear reactors, where the addition of redundant components was "the main line of defense" against failure, but also "the main source of failures."64

Planners may also consciously choose not to invest in such buffers. Redundancy and slack are conventionally perceived as a drag on efficiency, and therefore as a cost to be cut rather than a good in social and technological systems.65 At the same time, the tendency in business is toward leaner operations, ever-shorter time horizons, and rising competitive pressures.66 There is therefore significant pressure to opt for efficiency over reserve capabilities that may be useful in the event of an emergency. One result is that a lack of unused capacity, or "headroom," in an electric grid's power lines frequently turns what might otherwise be an isolated system failure or local problem into a large-scale power failure.67 Such attitudes also tend to persist even after such incidents, as has been the case with just-in-time manufacturing, calls to rethink such practices after September 11 proving short-lived.68

It also stands to reason that with some aspects of a complex system's functioning typically ill understood, decisionmaking in this area is likely to be of poorer quality than in other cases. In some instances, buffers are inherently difficult to build, as with certain kinds of interconnection. Scale-free networks, such as the internet, tend to have critical hubs or nodes that are connected to a far larger number of elements than average (as opposed to random nets such as the highway system, where each element connects with only a few others).69

The internet, for instance, is held together by a relative handful of web pages of disproportionate value. The same is also the case with the air travel system in the United States, which relies on hubs where passengers catch connecting flights. One consequence of the "hub-and-spokes" system is the ease with which disruptions can translate into major, widespread delays. Recent research has shown that networks such as these may be highly vulnerable to coordinated attacks, the critical threshold for the propagation of a "contagion" through them being virtually zero. Additionally, attacking as few as 5 percent to 15 percent of the elements in such a network can bring down the entire network.70 Such possibilities have been hinted at by the internet's susceptibility to spam email and viruses, single examples of which, such as the "Love Bug," inflicted billions of dollars in damages—and in the case of the Love Bug, also remained pervasive a year after its supposed eradication. Even though the key nodes in a system can be better protected, they are not always easy to identify; in any event, the connectivity of systems rules out a "silver-bullet solution."71
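
The contrast between random failure and targeted attack can be made concrete with a small simulation sketch, given here in Python using the open-source networkx library; the Barabási-Albert graph stands in for a scale-free network, and all parameters are illustrative rather than calibrated to the internet or to the studies cited above.

# Illustrative comparison of hub-targeted attack and random failure in a
# scale-free-like network. Requires the networkx library; parameters are arbitrary.
import random
import networkx as nx

def largest_component_fraction(G, nodes_to_remove):
    """Fraction of the original nodes remaining in the largest connected
    component after the given nodes are removed."""
    H = G.copy()
    H.remove_nodes_from(nodes_to_remove)
    if H.number_of_nodes() == 0:
        return 0.0
    largest = max(nx.connected_components(H), key=len)
    return len(largest) / G.number_of_nodes()

random.seed(1)
G = nx.barabasi_albert_graph(n=2000, m=2, seed=1)  # scale-free-like topology
k = int(0.05 * G.number_of_nodes())                # remove 5 percent of nodes

# Targeted attack: remove the most highly connected hubs first.
hubs = sorted(G.degree, key=lambda pair: pair[1], reverse=True)[:k]
after_attack = largest_component_fraction(G, [node for node, _ in hubs])

# Random failure: remove the same number of randomly chosen nodes.
after_random = largest_component_fraction(G, random.sample(list(G.nodes), k))

print(f"Largest component after targeted attack on hubs: {after_attack:.0%}")
print(f"Largest component after random failures:         {after_random:.0%}")

Removing the same small share of nodes at random typically leaves such a network largely intact, while removing the hubs fragments it far more severely, which is the qualitative vulnerability the research on scale-free networks describes.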

Tighter coupling between the components of today's complex societies, and in particular the vulnerability of scale-free networks to coordinated attack, mean that an attack on any one point can have wider effects, reducing the "tolerance for breakdowns and errors" anywhere along the line.72 The tendency is also toward more components, as niches for new ones are created and new subsystems added, so that more points exist to be attacked. A larger effort is therefore required to protect a larger number of targets, a rough analog to defending a longer front, making for an abrupt increase in a society's security burden without a comparable expansion in its resources. This has already been the case: U.S. security expenditures have risen far faster since September 11 than the economy has grown over the same period, in a process that may be only in its early phases and that, given the nature of the conflict, may have no clear end point.

This is very much what is suggested by the prospect of "terrorist-proofing" modern society, for example by improving ventilation systems and providing better safeguards for energy distribution systems, as called for in a June 2002 report of the National Research Council.73 This is in line with a historical pattern: an attack on a single target leads to the fortification (terrorist-proofing) of entire classes of targets indefinitely, as with the heightened precautions surrounding the boarding of airliners following September 11. Indeed, in this case it also led to the fortification of facilities not targeted by the terrorists in the attacks, such as the broad effort to upgrade harbor security. The resulting effort to deploy U.S. law enforcement personnel in foreign ports to clear ships bound for the United States raises another key point: a characteristic of complex systems is an openness that blurs boundaries, so that it is the absence rather than the presence of connection that becomes difficult to establish. Where security is concerned, this fosters a propensity to keep "push[ing] the border outwards."74 In the case of the war on terror, any state that is seen to harbor terrorists is regarded as a threat to be met with force as necessary, effectively placing the whole planet within that border and pushing the perimeter out to its theoretical limit.

Even so, the cost of even the most ambitious counterterror effort may seem paltry as compared with arms races between great powers. It is well worth remembering, however, that empires facing no great power threats have been overwhelmed by unconventional threats against which they attempted to fortify themselves. Following the barbarian attacks and the spike in banditry of the third century, the Roman Empire shifted from preclusively defending its borders to creating broad areas of military control, and ultimately civilians were safe only behind fortifications.75 The attendant loss of key linkages, the sacrifice of economic productivity to security accompanying such circumstances, and the massive cost of fortifying the empire (and other, related controls) fed into a pattern that eventually drove the Roman Empire to collapse.
