Robust distributed solutions – a different view of uncertainty

By Graham Harris

In a series of books and papers Andreas Wagner has explored the basis of the robustness of living organisms, and his discoveries have broken new ground. In books such as Robustness and Evolvability in Living Systems (2005) and Arrival of the Fittest (2015), he has shown how living organisms depend for their survival on genetic and metabolic networks that possess modularity and distributed robustness.

These networks achieve robustness not through simple duplication and redundancy of parts but through distributed robustness: many alternative routes for synthesis and control. Because they possess many different, complementary pathways, they are robust to gene deletions (“knock-outs”) and to changes of substrate. Function is retained even when individual pathways are changed, blocked or deleted.
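
As a toy illustration of what “many alternative routes” buys (a sketch only: the node names and the use of the networkx library are my choices, not Wagner’s), consider a small directed graph with two independent routes from a substrate to a product. Blocking one route does not stop the product being reachable.

```python
# Toy sketch of distributed robustness: two alternative routes from
# substrate S to product P. Names and topology are invented for illustration.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([("S", "A"), ("A", "B"), ("B", "P")])   # route 1: S -> A -> B -> P
G.add_edges_from([("S", "C"), ("C", "P")])               # route 2: S -> C -> P

print(nx.has_path(G, "S", "P"))   # True: the product is reachable

G.remove_node("B")                # simulated "knock-out" blocks route 1
print(nx.has_path(G, "S", "P"))   # still True: route 2 carries the load
```

Note the difference from simple redundancy: there is no second copy of “B”, only a structurally different route to the same end point.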

The many alternative pathways and solutions ensure that life goes on in the face of uncertainty: there is robustness in the face of noise and perturbation. The evolved strategy is not necessarily efficient or optimal, but it is (in Nassim Taleb’s terminology) “anti-fragile”: it is capable of recovering from change and, indeed, can come back stronger than before. Wagner has also pointed out that what we see are the persistent solutions – the neutral spaces – in which life can operate whilst absorbing change and the vagaries of mutation and chance.

The key, as Wagner has discovered, is that despite the enormous multidimensional set of possibilities in diverse networks and pathways, evolution can always find adjacent solutions when confronted by mutations or by environmental change. Thus life copes with uncertainty by rapidly and easily exploring what Stuart Kauffman called “the adjacent possible.”

Ecological systems are not themselves alive, but their component organisms are. Evolution and biodiversity ensure modularity in function and distributed robustness. We call these persistent states “ecosystems,” but they are not in the least like the models we build of them. Networks and linkages fluctuate in space and time, and the whole ensemble of species and populations is open to immigration and emigration. Historical contingency and local interactions drive unpredictable unfolding and development (Jonah’s second law applies).

Networks of ecological interactions are not optimal – they show neither “maximum entropy” solutions nor maximisation of water or nutrient use – but they are not random assemblages either. There is pattern in nature, but it lies, self-organised, between chance and necessity.

These properties of living systems ensure robustness, evolvability and adaptability. They are not “efficient” in the modern sense of highest performance at least cost. Metabolic overheads are costly but ensure robustness to noise and longer-term change. There is a mismatch between the infrastructural and economic systems we, as humans, construct and the natural world in which they are embedded. The naïve realist world of science and the Enlightenment is at odds with the natural world.

Our naïve realist systems methodology is an abstraction from a fluid, heterogeneous and robustly distributed network of interactions. We build trivial models of highly dynamic networks and interactions. We expect Jonah’s first law to apply and are surprised when it does not. Our expectations are rarely met. We treat all uncertainty as aleatory when much of it is epistemic and arises from unknown and deeply contingent causes.

Our view of “uncertainty” arises from a wish to statistically define and make predictions about (and therefore manage) the unfolding of eco- and other “systems.” As such, uncertainty is only a problem if you take a realist, rationalist view of the world. Species are robust to noise and change, and have evolved anticipatory models of change through genetics, metabolism, behaviour and culture. These responses to “uncertainty” may take time and may not be optimal; some experiments fail, but life (and societies) persist and develop over time.

I have written before about the lack of outcomes and the low statistical power achieved by programs of works and measures in environmental management. Now we begin to see why there is such a disconnect. As Robert Ulanowicz wrote, heterogeneity defeats laws. Stuart Kauffman has also observed that “no laws entail” the properties and evolution of life.

Fluid and heterogeneous networks exhibiting distributed robustness make it hard to define the effects of single genes, agents or species – so chance, contingency and distributed robustness are responsible for the low power displayed by statistical searches for “genes for” specific diseases, for the causes of events, or for the effects of particular management measures. Despite this we still desire to patent genes and to find drugs that act as “silver bullets.” Such treatments and responses are the exceptions, not the norm.
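
A small simulation makes the low-power point concrete (a sketch only: the sample size, the number of factors and the effect sizes are invented, not estimates from any real study). When an outcome is the sum of a hundred tiny contributions plus noise, a naive one-at-a-time test detects only a small fraction of the effects that really are there.

```python
# Sketch: an outcome driven by many small, distributed effects.
# How many of the individual (real) effects does a one-at-a-time test detect?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_samples, n_factors = 200, 100
X = rng.normal(size=(n_samples, n_factors))    # e.g. variants, species, measures
beta = np.full(n_factors, 0.05)                # every effect is real but tiny
y = X @ beta + rng.normal(size=n_samples)      # outcome = distributed effects + noise

detected = sum(stats.pearsonr(X[:, j], y)[1] < 0.05 for j in range(n_factors))
print(f"{detected} of {n_factors} real effects reach p < 0.05")   # typically only a handful
```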

“Big data” and the requirement for large population samples are the rationalist response to this situation. Projects such as genome-wide association studies are a response to the problem of low power, but they are merely large and expensive fishing expeditions. John Ioannidis has repeatedly warned that we need to be very cautious about “big error” in such projects and that many (or most) published research findings are false. The pressure to publish only positive results leads to errors in statistical practice and to the increasing trend towards retractions. Publish in haste and repent at leisure.
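
Ioannidis’s warning can be reduced to one line of arithmetic: the probability that a “significant” finding is actually true depends on the prior odds that the tested association is real, not just on the p-value. The figures below are illustrative only, not drawn from any particular study.

```python
# Positive predictive value of a "significant" result (Ioannidis-style calculation):
# PPV = power * prior / (power * prior + alpha * (1 - prior))
def ppv(prior, power=0.8, alpha=0.05):
    return power * prior / (power * prior + alpha * (1 - prior))

print(f"fishing expedition, prior 1/1000: PPV = {ppv(0.001):.2f}")   # ~0.02
print(f"well-founded hypothesis, prior 1/2: PPV = {ppv(0.5):.2f}")   # ~0.94
```

At a prior of one in a thousand, even a well-powered study’s “discoveries” are overwhelmingly false positives.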

Distributed robustness also leads to weak relationships between biodiversity and ecosystem services, despite the current fascination of economists and natural resource managers with the monetization of such services and with market-based instruments. Again, naïve realist and rationalist – physics-envious – approaches have infected ecology, sociology and economics. Species lists are not good indicators of ecosystem performance. Yes, there is evidence of a decline in ecosystem function with a decline in biodiversity, but, as we might expect, the effect is noisy and weak and does not apply at all times in all places.

Given the above it is easy to see why, in river management for example, it is proving difficult to “chase through” cause and effect in catchments and receiving waters. There are too many alternative routes and metabolisms (many of them hidden underground) and there is fluidity in space and time coupled with distributed robustness. Even hydrology is not just a problem for physicists – living organisms inhabit catchments and influence outcomes. Keith Beven has discussed the modeling and prediction issues.

Reductionist approaches and methodologies cannot be successfully applied to the “continuum” from soil to water – cause and effect get diluted and convoluted as space and time are traversed. Exceptions and “paradoxical properties” abound, laws are hard to define and, as in metabolism, there is a strong effect of past events and “upstream” influences.

What, then, is the practical effect of distributed robustness? While systems biology is beginning to sort some of this out at the level of the organism, we still predominantly see naïve realist approaches applied to inappropriate problems. Physics envy has its place for certain kinds of trivial systems but ecology needs a new mind-set and environmental management needs a rethink.

As Dan Schindler and Ray Hilborn have recently written (Science, 347: 953-4, 2015) there is a big difference between robust methodologies and the usual approaches to uncertainty and risk management.

Robustness can be likened to a ball rolling about in a basin of attraction. The position of the ball is not static, but it stays within a defined region and is rarely tossed out of the basin. Even if it is hit by an unexpected perturbation, life is robust and is able to exploit the ready availability of the adjacent possible. Robustness entails interactions involving the whole of the system; it provides long-term security and solutions, is self-regulating or self-organising, tolerates or actively exploits variability, exhibits persistent and dynamic neutral spaces, and is only amenable to indirect management. Monitoring may not provide convincing evidence of outcomes; statistical power will be low. When dealing with living and social systems, surprises should be expected.

Risk management, as it is usually practised, can be likened to keeping a ball perched precariously on the top of a convex surface. The usual approach involves defining and managing uncertainty through continuous monitoring; a requirement for evidence and for feedback from interventions (adaptive management); the elimination of variability wherever possible; the control of perturbations to maintain steady states or equilibria; and short-term security from single or defined risks. Models and software systems that are used to assess and manage risk will always suffer from the rationalist bias and from contingencies and “unknown unknowns.”
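
The two ball analogies can be put into a few lines of simulation (a minimal sketch: the linear forces and the noise level are arbitrary choices, not a model of any real system). With a restoring force the trajectory wanders but stays near the bottom of the basin; with a destabilising force the same noise soon pushes the ball off the hilltop.

```python
# Sketch: noisy dynamics in a basin of attraction versus on a hilltop.
import numpy as np

rng = np.random.default_rng(1)
dt, steps, noise = 0.01, 5000, 0.2

def largest_excursion(slope):
    """Euler-Maruyama for dx = slope * x * dt + noise * dW, starting at x = 0."""
    x, peak = 0.0, 0.0
    for _ in range(steps):
        x += slope * x * dt + noise * np.sqrt(dt) * rng.normal()
        peak = max(peak, abs(x))
    return peak

print(f"basin (slope -1):   max |x| = {largest_excursion(-1.0):.2f}")   # stays small
print(f"hilltop (slope +1): max |x| = {largest_excursion(+1.0):.2g}")   # ends up enormous
```

Conventional risk management amounts to constantly correcting the hilltop ball; distributed robustness relies on the shape of the basin itself.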

Life and risk management treat uncertainty quite differently. Life actively exploits uncertainty through distributed channels and anticipatory models. Solutions change over time; they are not optimal, but they are robust and persistent. We humans hate uncertainty and attempt to “flat-line” variability wherever possible – think flood control, river regulation and dam building to ensure secure water supplies. Enlightened rationalists want to “control” or “predict” uncertainty to achieve a secure return on investment. Life finds an adjacent work-around and persists.

This fundamental disjunction in worldview and practice ensures that there is, at present, a total misalignment between human ambition and the rest of life on this planet. Change is afoot however, and we are learning how to do things differently. Innovation and new technologies are producing new solutions to these apparently intractable problems. I shall return to this topic in future blogs.

2 thoughts on “Robust distributed solutions – a different view of uncertainty”

  1. Research on and development of resilient and fit-for-purpose infrastructure solutions need more of your blogs, Graham!
    As long as infrastructure planning and investment processes rely on linear and feed-forward trend analyses without any consideration given to feedback effects and inherent uncertainties, governments will continue to make the same mistakes with increasing risks of failure and diminishing returns.

  2. Thanks Pascal, this same disjunction is a key to better environmental management too. See my next (rather long) upcoming blog….
