macroresilience

resilience, not stability

Archive for August, 2010

Evolvability, Robustness and Resilience in Complex Adaptive Systems

with 14 comments

In a previous post, I asserted that “the existence of irreducible uncertainty is sufficient to justify an evolutionary approach for any social system, whether it be an organization or a macro-economy.” This is not a controversial statement – Nelson and Winter introduced their seminal work on evolutionary economics as follows: “Our evolutionary theory of economic change…is not an interpretation of economic reality as a reflection of supposedly constant “given data” but a scheme that may help an observer who is sufficiently knowledgeable regarding the facts of the present to see a little further through the mist that obscures the future.”

In microeconomics, irreducible uncertainty implies a world of bounded rationality in which many heuristics are not signs of irrationality but rational and effective tools of decision-making. The focus of this blog, however, is the implications of human action under uncertainty for macro-economic outcomes. In previous posts (1,2) I have elaborated upon the resilience-stability tradeoff and its parallels in economics and ecology. This post focuses on another issue critical to the functioning of all complex adaptive systems: the relationship between evolvability and robustness.

Evolvability and Robustness Defined

Hiroaki Kitano defines robustness as follows: “Robustness is a property that allows a system to maintain its functions despite external and internal perturbations….A system must be robust to function in unpredictable environments using unreliable components.” Kitano makes it explicit that robustness is concerned with the maintenance of functionality rather than specific components: “Robustness is often misunderstood to mean staying unchanged regardless of stimuli or mutations, so that the structure and components of the system, and therefore the mode of operation, is unaffected. In fact, robustness is the maintenance of specific functionalities of the system against perturbations, and it often requires the system to change its mode of operation in a flexible way. In other words, robustness allows changes in the structure and components of the system owing to perturbations, but specific functions are maintained.”

Evolvability is defined as the ability of the system to generate novelty and innovate, thus enabling the system to “adapt in ways that exploit new resources or allow them to persist under unprecedented environmental regime shifts” (Whitacre 2010). At first glance, evolvability and robustness appear to be incompatible: the generation of novelty involves a leap into the dark, an exploration rather than an act of “rational choice”, and the search for a beneficial innovation carries with it a significant risk of failure. It’s worth noting that in social systems this dilemma vanishes in the absence of irreducible uncertainty. If all adaptations are merely a realignment to a known systemic configuration (“known” in either a deterministic or a probabilistic sense), then an inability to adapt needs other explanations, such as organisational rigidity.

Evolvability, Robustness and Resilience

Although it is common to equate resilience with robustness, resilient complex adaptive systems also need the ability to innovate and generate novelty. As Allen and Holling put it: “Novelty and innovation are required to keep existing complex systems resilient and to create new structures and dynamics following system crashes”. Evolvability also enables the system to undergo fundamental transformational change – it could be argued that such innovations are even more important in a modern capitalist economy than in the biological or ecological arena. The rest of this post elaborates on how macro-economic systems can be both robust and evolvable at the same time – the apparent conflict between evolvability and robustness arises from a fallacy of composition in which macro-resilience is assumed to arise from micro-resilience, when in fact it arises from the very absence of micro-resilience.

EVOLVABILITY, ROBUSTNESS AND RESILIENCE IN MACRO-ECONOMIC SYSTEMS

The pre-eminent reference on how a macro-economic system can be both robust and evolvable at the same time is the work of Burton Klein in his books “Dynamic Economics” and “Prices, Wages and Business Cycles: A Dynamic Theory”. But as with so many other topics in evolutionary economics, no one has summarised it better than Brian Loasby: “Any economic system which is to remain viable over a long period must be able to cope with unexpected change. It must be able to revise or replace policies which have worked well. Yet this ability is problematic. Two kinds of remedy may be tried, at two different system levels. One is to try to sensitize those working within a particular research programme to its limitations and to possible alternatives, thus following Menger’s principle of creating private reserves against unknown but imaginable dangers, and thereby enhancing the capacity for internal adaptation….But reserves have costs; and it may be better, from a system-wide perspective, to accept the vulnerability of a sub-system in order to exploit its efficiency, while relying on the reserves which are the natural product of a variety of sub-systems….
Research programmes, we should recall, are imperfectly specified, and two groups starting with the same research programme are likely to become progressively differentiated by their experience, if there are no strong pressures to keep them closely aligned. The long-run equilibrium of the larger system might therefore be preserved by substitution between sub-systems as circumstances change. External selection may achieve the same overall purpose as internal adaptation – but only if the system has generated adequate variety from which the selection may be made. An obvious corollary which has been emphasised by Klein (1977) is that attempts to preserve sub-system stability may wreck the larger system. That should not be a threatening notion to economists; it also happens to be exemplified by Marshall’s conception of the long-period equilibrium of the industry as a population equilibrium, which is sustained by continued change in the membership of that population. The tendency of variation is not only a chief cause of progress; it is also an aid to stability in a changing environment (Eliasson, 1991). The homogeneity which is conducive to the attainment of conventional welfare optima is a threat to the resilience which an economy needs.”

Uncertainty can be tackled at the micro-level by maintaining reserves and slack (liquidity, retained profits), but this comes at the price of slack at the macro-level in the form of lost output and employment. Note that this is essentially a Keynesian conclusion, similar to the way in which individually rational saving decisions can lead to collectively sub-optimal outcomes. From a systemic perspective, it is preferable to substitute a diverse set of micro-fragilities for this micro-resilience. But how do we induce the loss of slack at the firm level? And how do we ensure that this loss of micro-resilience occurs in a sufficiently diverse manner?

The “Invisible Foot”

The concept of the “Invisible Foot” was introduced by Joseph Berliner as a counterpoint to Adam Smith’s “Invisible Hand” to explain why innovation was so hard in the centrally planned Soviet economy: “Adam Smith taught us to think of competition as an “invisible hand” that guides production into the socially desirable channels….But if Adam Smith had taken as his point of departure not the coordinating mechanism but the innovation mechanism of capitalism, he may well have designated competition not as an invisible hand but as an invisible foot. For the effect of competition is not only to motivate profit-seeking entrepreneurs to seek yet more profit but to jolt conservative enterprises into the adoption of new technology and the search for improved processes and products. From the point of view of the static efficiency of resource allocation, the evil of monopoly is that it prevents resources from flowing into those lines of production in which their social value would be greatest. But from the point of view of innovation, the evil of monopoly is that it enables producers to enjoy high rates of profit without having to undertake the exacting and risky activities associated with technological change. A world of monopolies, socialist or capitalist, would be a world with very little technological change.” To maintain an evolvable macro-economy, the invisible foot needs to be “applied vigorously to the backsides of enterprises that would otherwise have been quite content to go on producing the same products in the same ways, and at a reasonable profit, if they could only be protected from the intrusion of competition.”

Entry of New Firms and the Invisible Foot

Burton Klein’s great contribution, along with that of other dynamic economists of the time (notably Gunnar Eliasson), was to highlight the critical importance of the entry of new firms in maintaining the efficacy of the invisible foot. Klein believed that “the degree of risk taking is determined by the robustness of dynamic competition, which mainly depends on the rate of entry of new firms. If entry into an industry is fairly steady, the game is likely to have the flavour of a highly competitive sport. When some firms in an industry concentrate on making significant advances that will bear fruit within several years, others must be concerned with making their long-run profits as large as possible, if they hope to survive. But after entry has been closed for a number of years, a tightly organised oligopoly will probably emerge in which firms will endeavour to make their environments highly predictable in order to make their short-run profits as large as possible….Because of new entries, a relatively concentrated industry can remain highly dynamic. But, when entry is absent for some years, and expectations are premised on the future absence of entry, a relatively concentrated industry is likely to evolve into a tight oligopoly. In particular, when entry is long absent, managers are likely to be more and more narrowly selected; and they will probably engage in such parallel behaviour with respect to products and prices that it might seem that the entire industry is commanded by a single general!”

Again, it cannot be emphasised enough that this argument does not depend on incumbent firms leaving money on the table – on the contrary, they may redouble their attempts at static optimisation. From the perspective of each individual firm, innovation is an incredibly risky process, even though the result of such dynamic competition may be reasonably predictable from the perspective of the industry or the macro-economy. Of course, firms can and do mitigate this risk in various ways, but the argument only claims that no single firm, however dominant, can replicate in-house the “risk-free” innovation dynamics of a vibrant industry.

Micro-Fragility as the Hidden Hand of Macro-Resilience

In an environment free of irreducible uncertainty, evolvability suffers, leading to reduced macro-resilience. “If firms could predict each other’s advances they would not have to insure themselves against uncertainty by taking risks. And no smooth progress would occur” (Klein 1977). Conversely, “because firms cannot predict each other’s discoveries, they undertake different approaches towards achieving the same goal. And because not all of the approaches will turn out to be equally successful, the pursuit of parallel paths provides the options required for smooth progress.”
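Klein’s point lends itself to a small simulation. The sketch below is my own illustration, not anything from “Dynamic Economics”: it assumes each firm’s innovation attempt is an independent draw from a heavy-tailed (lognormal) distribution, so that any single attempt is wildly unpredictable while the best of twenty parallel attempts is both larger and far less variable. That gap is the sense in which diverse, individually fragile bets produce “smooth progress” at the level of the industry. All distributions and parameters are illustrative assumptions.

```python
"""
Monte Carlo sketch of the parallel-paths argument (illustrative only).
A single firm's innovation payoff is a risky draw; the industry-level
advance is the best of many independent draws, which is both bigger
and far more predictable.
"""
import random
import statistics

random.seed(42)

def innovation_attempt():
    # One firm's attempt: usually modest, occasionally a large advance.
    return random.lognormvariate(0.0, 1.0)

def industry_best(n_firms):
    # The advance available to the industry is the best of the parallel attempts.
    return max(innovation_attempt() for _ in range(n_firms))

def cv(xs):
    # Coefficient of variation: dispersion relative to the mean.
    return statistics.stdev(xs) / statistics.mean(xs)

n_trials = 10_000
single = [innovation_attempt() for _ in range(n_trials)]
industry = [industry_best(20) for _ in range(n_trials)]

print(f"single firm        : mean={statistics.mean(single):.2f}, cv={cv(single):.2f}")
print(f"best of 20 parallel: mean={statistics.mean(industry):.2f}, cv={cv(industry):.2f}")
# The single attempt is highly variable (micro-fragility); the best of
# twenty parallel attempts is much less so, i.e. industry progress is
# 'smooth' even though most individual approaches fail.
```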

The Aftermath of the Minsky Moment: A Problem of Micro-Resilience

Within the context of the current crisis, the pre-Minsky-moment system was a homogeneous system with no slack, which enabled the attainment of “conventional welfare optima” but at the cost of an incredibly fragile and unevolvable condition. The logical evolution of such a system after the Minsky moment is of course still a homogeneous system, only now with significant firm-level slack built in – which is equally unsatisfactory. In such a situation, the kind of macro-economic intervention matters as much as its force. For example, in an ideal world, monetary policy aimed at reducing the borrowing rates of incumbent banks and corporates would flow through into reduced borrowing rates for new firms. In a dynamically uncompetitive world, such a policy will only serve the interests of the incumbents.

The “Invisible Foot” and Employment

Vivek Wadhwa argues that startups are the main source of net job growth in the US economy, and Mark Thoma links to research that confirms this thesis. Even if one disagrees, the “invisible foot” argument implies that if the old guard is to contribute to employment, it must be forced to give up its “slack” by the strength of dynamic competition – and dynamic competition is maintained by preserving conditions that encourage the entry of new firms.

MICRO-EVOLVABILITY AND MACRO-RESILIENCE IN BIOLOGY AND ECOLOGY

Note: The aim of this section is not to draw any falsely precise equivalences between economic resilience and ecological or biological resilience, but simply to highlight the commonality of the micro-macro fallacy of composition across complex adaptive systems – a detailed comparison will hopefully be the subject of a future post. I have tried to keep the section on biological resilience as brief and simple as possible, but an understanding of the genotype-phenotype distinction and neutral networks is essential to make sense of it.

Biology: Genotypic Variation and Phenotypic Robustness

In the specific context of biology, evolvability can be defined as “the capacity to generate heritable, selectable phenotypic variation. This capacity may have two components: (i) to reduce the potential lethality of mutations and (ii) to reduce the number of mutations needed to produce phenotypically novel traits” (Kirschner and Gerhart 1998). The apparent conflict between evolvability and robustness can be reconciled by distinguishing between genotypic and phenotypic robustness and evolvability. James Whitacre summarises Andreas Wagner’s work on RNA genotypes and their structure phenotypes as follows: “this conflict is unresolvable only when robustness is conferred in both the genotype and the phenotype. On the other hand, if the phenotype is robustly maintained in the presence of genetic mutations, then a number of cryptic genetic changes may be possible and their accumulation over time might expose a broad range of distinct phenotypes, e.g. by movement across a neutral network. In this way, robustness of the phenotype might actually enhance access to heritable phenotypic variation and thereby improve long-term evolvability.”
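The reconciliation Whitacre describes can be made concrete with a toy model. The sketch below is my own construction, not Wagner’s actual RNA-folding model: genotypes are bit strings, the phenotype is the majority bit within each block (a deliberately robust, many-to-one map), and a random walk accepts only phenotypically neutral point mutations. The phenotype never changes, yet cryptic genetic change accumulates and the set of distinct phenotypes reachable by one further mutation grows as the walk drifts across the neutral network. The block/majority encoding and all parameters are illustrative assumptions.

```python
"""
Toy neutral-network walk (illustrative only): phenotypic robustness to
mutation allows cryptic genetic variation to accumulate, which in turn
exposes a wider range of phenotypes to future mutations.
"""
import random

random.seed(1)
BLOCK, N_BLOCKS = 5, 8              # 8 traits, each encoded by 5 bits
GENOME_LEN = BLOCK * N_BLOCKS

def phenotype(g):
    # Each trait is the majority bit of its block: a robust, many-to-one map.
    return tuple(int(sum(g[i:i + BLOCK]) > BLOCK // 2)
                 for i in range(0, GENOME_LEN, BLOCK))

def one_mutant_phenotypes(g):
    # Phenotypes reachable from genotype g by a single point mutation.
    out = set()
    for i in range(GENOME_LEN):
        m = list(g)
        m[i] ^= 1
        out.add(phenotype(m))
    return out

g = [random.randint(0, 1) for _ in range(GENOME_LEN)]
wild_type = phenotype(g)
at_start = one_mutant_phenotypes(g) - {wild_type}
exposed = set(at_start)

for _ in range(2000):
    i = random.randrange(GENOME_LEN)
    g[i] ^= 1                        # propose a point mutation
    if phenotype(g) != wild_type:    # accept only phenotypically neutral changes
        g[i] ^= 1
        continue
    exposed |= one_mutant_phenotypes(g) - {wild_type}

print("phenotype preserved throughout :", phenotype(g) == wild_type)
print("novel phenotypes one mutation away at the start:", len(at_start))
print("novel phenotypes exposed along the neutral walk:", len(exposed))
```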

Ecology: Species-Level Variability and Functional Stability

The notion of micro-variability being consistent with and even being responsible for macro-resilience is an old one in ecology as Simon Levin and Jane Lubchenco summarise here: “That the robustness of an ensemble may rest upon the high turnover of the units that make it up is a familiar notion in community ecology. MacArthur and Wilson (1967), in their foundational work on island biogeography, contrasted the constancy and robustness of the number of species on an island with the ephemeral nature of species composition. Similarly, Tilman and colleagues (1996) found that the robustness of total yield in high-diversity assemblages arises not in spite of, but primarily because of, the high variability of individual population densities.”
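The statistical core of the Tilman result, sometimes called the portfolio effect, can be illustrated in a few lines. The sketch below is a stylised stand-in (my assumption, not Tilman’s grassland model or data): each species’ annual biomass fluctuates independently around a common mean, so individual populations remain just as variable as diversity rises, while the variability of the community total falls roughly as one over the square root of the number of species.

```python
"""
Stylised 'portfolio effect' sketch (illustrative only): individual
populations fluctuate strongly, yet the community total is stable,
and more so in more diverse assemblages.
"""
import random
import statistics

random.seed(7)

def cv(xs):
    # Coefficient of variation: dispersion relative to the mean.
    return statistics.stdev(xs) / statistics.mean(xs)

def simulate(n_species, n_years=200):
    # Each species' annual biomass is an independent noisy draw around a
    # common mean; community yield is the sum across species each year.
    series = [[random.lognormvariate(0.0, 0.6) for _ in range(n_years)]
              for _ in range(n_species)]
    totals = [sum(year) for year in zip(*series)]
    return cv(series[0]), cv(totals)  # one population vs. the community total

for n in (1, 4, 16):
    species_cv, total_cv = simulate(n)
    print(f"{n:2d} species: single-population CV = {species_cv:.2f}, "
          f"total-yield CV = {total_cv:.2f}")
# Individual populations stay just as variable as diversity rises, but
# the total becomes markedly more stable: macro-level constancy built
# on, not despite, micro-level fluctuation.
```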

The concept is also entirely consistent with the “Panarchy” thesis, which views an ecosystem as a nested hierarchy of adaptive cycles: “Adaptive cycles are nested in a hierarchy across time and space which helps explain how adaptive systems can, for brief moments, generate novel recombinations that are tested during longer periods of capital accumulation and storage. These windows of experimentation open briefly, but the results do not trigger cascading instabilities of the whole because of the stabilizing nature of nested hierarchies. In essence, larger and slower components of the hierarchy provide the memory of the past and of the distant to allow recovery of smaller and faster adaptive cycles.”

Misc. Notes

1. It must be emphasised that micro-fragility is a necessary but not sufficient condition for an evolvable and robust macro-system. The role of not just redundancy but degeneracy is critical, as is the size of the population.

2. Many commentators use resilience and robustness interchangeably. I draw a distinction primarily because my definitions of robustness and evolvability are borrowed from biology, whereas my definition of resilience is borrowed from ecology – which, in my view, treats a robust and evolvable system as a resilient one.


Written by Ashwin Parameswaran

August 30th, 2010 at 8:38 am

Amar Bhide on “Robotic Finance”: An Adaptive Explanation

with 6 comments

In the HBR, Amar Bhide notes that models have replaced discretion in many areas of finance, particularly in banks’ mortgage lending decisions: “Over the past several decades, centralized, mechanistic finance elbowed aside the traditional model….Mortgages are granted or denied (and new mortgage products like option ARMs are designed) using complex models that are conjured up by a small number of faraway rocket scientists and take little heed of the specific facts on the ground.” For the most part, the description of the damage done by “robotic finance” is accurate, but the article ignores why this mechanisation came about. It is easy to assume that the dominance of models over discretion was simply a grand error by the banking industry. But in reality, the “excessive” dependence on models was an entirely rational and logical evolution of the banking industry given the incentives and the environment that bankers faced.

A reliance on models rather than discretion cripples the adaptive capabilities of the firm: “No contract can anticipate all contingencies. But securitized financing makes ongoing adaptations infeasible; because of the great difficulty of renegotiating terms, borrowers and lenders must adhere to the deal that was struck at the outset. Securitized mortgages are more likely than mortgages retained by banks to be foreclosed if borrowers fall behind on their payments, as recent research shows.” But why would firms choose such rigid and inflexible solutions? There are many answers to this question, but all of them depend on the obvious fact that adaptable solutions cost more than rigid ones. It is far less expensive to analyse the creditworthiness of mortgages with standardised models than with people on the ground.

This increased efficiency comes at the cost of catastrophic losses in a crisis, but long periods of stability inevitably select for efficient and rigid solutions over adaptable and flexible ones. This may be a consequence of moral hazard or principal-agent problems, as I have analysed many times on this blog, but it does not depend on either. A preference for rigid routines may be an entirely rational response to a long period of stability under uncertainty – both from an individual’s perspective and from an organisation’s. Probably the best exposition of this problem was given by Brian Loasby in his book “Equilibrium and Evolution” (pages 56-7): “Success has its opportunity costs. People who know how to solve their problems can get to work at once, without considering whether some other method might be more effective; they thereby become increasingly efficient, but also increasingly likely to encounter problems which are totally unexpected and which are not amenable to their efficient routines…The patterns which people impose on phenomena have necessarily a limited range of application, and the very success with which they exploit that range tends to make them increasingly careless about its limits. This danger is likely to be exacerbated by formal information systems, which are typically designed to cope with past problems, and which therefore may be worse than useless in signalling new problems. If any warning messages do arrive, they are likely to be ignored, or force-fitted into familiar categories; and if a crisis breaks, the information needed to deal with it may be impossible to obtain.”

It is now obvious why banks stuck with such rigid models during the “Great Moderation”, but it is less obvious why they have not discarded them voluntarily after the “Minsky Moment”. The answer lies in the difficulty that organisations and other social systems face in making dramatic systemic U-turns even when the logic for doing so is clear – hence the importance of mitigating the TBTF problem and enabling the entry of new firms. As I have asserted before: “A crony capitalist economic system that protects the incumbent firms hampers the ability of the system to innovate and adapt to novelty. It is obvious how the implicit subsidy granted to our largest financial institutions via the Too-Big-To-Fail doctrine represents a transfer of wealth from the taxpayer to the financial sector. It is also obvious how the subsidy encourages a levered, homogenous and therefore fragile financial sector that is susceptible to collapse. What is less obvious is the paralysis that it induces in the financial sector and by extension the macroeconomy long after the bailouts and the Minsky moment have passed.”


Written by Ashwin Parameswaran

August 23rd, 2010 at 4:34 am

Coming Out of Anonymity

with 6 comments

I recently exited banking to start work on an entrepreneurial venture I’ve been excited about for a while, which means that I have updated my “About” page. The most rewarding aspect of writing this blog has been the feedback that I’ve received. Hopefully, my “unanonymisation” will encourage those of you who are uncomfortable with the idea of engaging with an anonymous person to converse with me. Thanks for reading.


Written by Ashwin Parameswaran

August 9th, 2010 at 12:31 pm


Raghuram Rajan on Monetary Policy and Macroeconomic Resilience

with 16 comments

Amongst economic commentators, Raghuram Rajan has stood out recently for his consistent calls to raise interest rates from “ultra-low to the merely low”. Predictably, this suggestion has been met with outright condemnation by many economists of both Keynesian and monetarist persuasions. Rajan’s case against ultra-low rates draws on many arguments, but this post will focus on just one of them – one that comes straight out of the “resilience” playbook. In 2008, Raghu Rajan and Doug Diamond co-authored a paper whose conclusion Rajan summarises in his FT article: “the pattern of Fed policy over time builds expectations. The market now thinks that whenever the financial sector’s actions result in unemployment, the Fed will respond with ultra-low rates and easy liquidity. So even as the Fed has maintained credibility as an inflation fighter, it has lost credibility in fighting financial adventurism. This cannot augur well for the future.”

Much as he accused the Austrians, Paul Krugman accuses Rajan of being a “liquidationist”. This is not a coincidence – Rajan and Diamond’s thesis is quite explicit about its connections to Austrian Business Cycle Theory: “a central bank that promises to cut interest rates conditional on stress, or that is biased towards low interest rates favouring entrepreneurs, will induce banks to promise higher payouts or take more illiquid projects. This in turn can make the illiquidity crisis more severe and require a greater degree of intervention, a view reminiscent of the Austrian theory of cycles.” But as the summary hints, Rajan and Diamond’s thesis is fundamentally different from ABCT. The conventional Austrian story identifies excessive credit inflation and interest rates below the “natural” rate of interest as the driver of the boom/bust cycle, whereas Rajan and Diamond identify the anticipation by economic agents of low rates and “liquidity” facilities in every economic downturn as the driver of systemic fragility. The adaptation of banks and other market players to this regime makes the eventual bust all the more likely. As Rajan and Diamond note: “If the authorities are expected to reduce interest rates when liquidity is at a premium, banks will take on more short-term leverage or illiquid loans, thus bringing about the very states where intervention is needed.”

Rajan and Diamond’s thesis is limited to the impact of such policies on banks, but as I noted in a previous post, market players also adapt to this implicit commitment from the central bank to follow easy-money policies at the first hint of economic trouble. The thesis is essentially a story of the Greenspan-Bernanke era and the damage that the Greenspan Put has caused. It also explains the dramatically diminishing returns inherent in the Greenspan Put strategy: as the central bank’s stabilising policies become entrenched in the expectations of market players and, crucially, banks, the central bank has to do more and more in each subsequent cycle (lower rates, larger liquidity facilities) to achieve less and less.


Written by Ashwin Parameswaran

August 3rd, 2010 at 6:30 am