Epistemic uncertainty is important: ask the Thanksgiving turkeys

By Graham Harris

As Dr Nick Winder has pointed out, since the 1970s we have found ourselves more and more having to comprehend and deal with recursive, open, non-stationary and evolving entities that we have come to call systems or “systems of systems.”

When dealing with these “complex” second-order cybernetic systems of various kinds – be they “soft” biological and social, or “hard” physical and technological – scientists have run through a series of conceptual abstractions: a series of intellectual bandwagons, each of which has tried to grasp some aspect of this “complexity.” We have seen waves of interest in fractals, self-organized criticality, power laws, “small world” networks and the like.

All are attempts to find simplified, universal explanatory models for messy, recursive and individual-based networks. As the scale of our dealings and the complexity of the interactions have increased over time – think of the increasing scope of global communications, supply chains and climate modifications – our concept of a “system” has become more and more of a conceptual abstraction. So, increasingly, has our use of the word “complexity” – the naïve realist’s attempt to nail porridge to the wall.

Yes, some physical and engineered “hard systems” can still be approximated as closed and final entities, where universal models can be constructed and predictions can be made. (Remember Jonah’s Law.) Realism and rationalism have their place. However, the view that all abstracted systems can be treated as such – AKA physics envy – has led to all-pervasive and simple-minded attitudes to risk and uncertainty. There are no simple solutions to these problems.

If the theoretical tools of equilibrium analysis and perturbation theory are applied to uncertainty, it is treated as merely aleatory and statistically tractable, and we expect to see phenomena such as regression to the mean. This is the widespread attitude of science, naïve realism and rationalism, and it has spread to the social sciences – even to investment and business risk analysis.
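
As a toy illustration of what that aleatory framing assumes – a hypothetical sketch of my own, not anything drawn from the literature cited here – the snippet below generates independent Gaussian noise and shows that extreme observations tend to be followed by values much closer to the mean. This is the comfortable, tractable world the standard toolkit presumes.

```python
# Hypothetical sketch: what "merely aleatory" uncertainty looks like.
# With independent Gaussian noise, an extreme observation tends to be
# followed by one much closer to the mean - classic regression to the mean.
import random

random.seed(1)
series = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Pair each value with its successor and look at what follows the extremes.
pairs = list(zip(series[:-1], series[1:]))
extremes = [(x, y) for x, y in pairs if x > 2.0]

print(f"mean of the whole series:   {sum(series) / len(series):+.3f}")
print(f"mean of the extreme values: {sum(x for x, _ in extremes) / len(extremes):+.3f}")
print(f"mean of what follows them:  {sum(y for _, y in extremes) / len(extremes):+.3f}")
```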

Because evidence is the flip side of uncertainty, conceptually minimising uncertainty and making it tractable leads to an increased reliance on evidence. This is as widespread a business bandwagon as it is a scientific one and leads to a focus on narrowly defined profit, efficiency, cost-effectiveness and ROI. It automatically generates social and environmental externalities. But, as Nassim Taleb has repeatedly pointed out, “we have been fooled by randomness.”

While in second-order cybernetic systems there is no environment without a reflexive observer, in some trivial cases the role of the observer can be neglected. Since the Industrial Revolution, technology and commerce have been treated in this way, and many such projects were successful. But Nick Winder’s point stands: since the 1970s we have seen the increasing global interpenetration of “hard” and “soft” interactions through the inclusion of behavioural, social and environmental factors, to the point where we now see almost complete fusion. This is forcing changes in policies and practices – but many are not giving up without a fight.

Given the deep roots of reflexive and contingent causation that we see in second-order cybernetic problems, it is entirely possible that many current problems are not only uncertain; they may well be incomplete, undecidable and contradictory. Remember, “complex” problems are not computable – there is no largest, universal model. We shall have to find better ways to live with this, or we shall go on making the same mistakes.

Trying to lean more on theory than observation, and to abstract “well-behaved” concepts and models from a much messier Nature that we barely understand, leads inevitably to problems arising from epistemic uncertainty. Keith Beven has written extensively about this problem in the context of hydrology, and he has been fighting a long-running battle with his scientific colleagues about the significance of epistemic uncertainty, especially when it is traditionally treated as being merely aleatory. Donald Rumsfeld’s unknown unknowns can come back to bite us; and they do. As Beven points out, the fundamental problem is one of the reliability of induction. This was David Hume’s problem too, a long time ago. Taleb characterizes this as the Thanksgiving turkey problem: well, the bird was cared for and well fed every other day of its life, was it not?
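
To make the turkey’s inference concrete – again a purely hypothetical sketch of my own, not Taleb’s or Beven’s formulation – here is the bird’s inductive reasoning rendered as a few lines of code. Every well-fed day pushes its estimated probability of being fed tomorrow a little higher, and the estimate is at its most confident on the eve of Thanksgiving.

```python
# Hypothetical sketch of the turkey's inductive inference: each day of
# observation raises its estimated probability of being fed tomorrow.
days_observed = 1000          # days of care and feeding before Thanksgiving
fed = [1] * days_observed     # 1 = fed and cared for on that day

# A simple Laplace "rule of succession" estimate of P(fed tomorrow).
p_fed_tomorrow = (sum(fed) + 1) / (days_observed + 2)
print(f"After {days_observed} good days, P(fed tomorrow) ~= {p_fed_tomorrow:.4f}")

# The estimate says nothing about day 1001: the process is not stationary,
# and the event that matters most lies entirely outside the data.
```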

It is ironic that, while computers have led us astray – giving us the ability to conceive and build massive realist and rationalist computational models, and fooling us into thinking that this would lead to a new nirvana of wisdom, predictability and democracy (see Adam Curtis’s series of films “All Watched Over by Machines of Loving Grace”) – it is now the communication and information-processing aspects of computer networks that are opening up and interlinking the worlds of science, technology, society and the environment. The Internet of people and things is causing huge change, and old business models are being destroyed by innovation, high-speed communications and distributed reflexive intelligence. Uncertainty is increasing: major change is now afoot.

As Taleb has also pointed out, these interlinked networks of hard and soft infrastructure and environmental interactions are characterised not by tractable Gaussian probability density distributions and regression to the mean, but by fat-tailed log-normal and power-law distributions in which extreme events are more frequent and less predictable than we expect. Such distributions undermine our presumed reliance on evidence and prediction, especially when epistemic uncertainty is actually widespread and induction is fundamentally unsafe. The risks are increasing, but the US President can only save one Thanksgiving turkey each year!
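
A crude numerical comparison – my own sketch, using a Pareto distribution as a stand-in for the fat-tailed, power-law world Taleb describes – makes the point about tail weight: events that a Gaussian model treats as vanishingly rare turn up routinely when the tails are heavy.

```python
# Hypothetical sketch comparing tail weight: how often do we see a value
# above 5 under a standard Gaussian versus a heavy-tailed (Pareto) model?
# The comparison is deliberately crude - the two distributions are not
# calibrated to each other - but the contrast in tail behaviour is the point.
import random

random.seed(7)
N = 1_000_000
gaussian = [random.gauss(0.0, 1.0) for _ in range(N)]
# A Pareto with shape alpha = 1.5 has a power-law tail and infinite variance.
pareto = [random.paretovariate(1.5) for _ in range(N)]

gauss_extremes = sum(1 for x in gaussian if x > 5.0)
pareto_extremes = sum(1 for x in pareto if x > 5.0)

print(f"Gaussian draws above 5: {gauss_extremes} of {N}")
print(f"Pareto draws above 5:   {pareto_extremes} of {N}")
```

Under the Gaussian model a value above five almost never appears in a million draws; under the power-law model it appears tens of thousands of times.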

Given the widespread predilection for physics envy, rationalist assumptions and simple-minded risk assessments, it is not surprising that we are seeing so many large-scale infrastructure projects and attempts at financial and environmental management founder on the rocks of social and natural variability. One classic example is the EU Water Framework Directive where, so far, more than €80 billion has been spent on water and environmental management infrastructure to little effect. The goal was to return European surface waters to good environmental condition (originally by 2015), but recent reviews have shown how the project was designed around a particular set of unrealistic assumptions about ecological dynamics, the roles of “experts” with particular naïve realist mindsets, the associated design of “appropriate” works and measures, indicators and “evidence,” and the business models of rationalist government bureaucracies and commercialised water companies and institutions. Because of the way it was set up, the entire enterprise has come unstuck. It has foundered on a mismatch between assumptions and reality.

Physics envy works when dealing with trivial, closed and predictable problems (Jonah’s Law holds), but more and more we are dealing with what Nick Winder called plesionic problems (from the Greek plesion, neighbour). Plesionic problems are open, evolving and unpredictable. They involve agents interacting in neighbourhood arenas – with each other and with their unique environments – in adaptive and anticipatory ways shaped by contingent and evolutionary history. Charles Darwin was right.

Working on these problems requires reform of our mindsets and assumptions (fundamental reform of basic mindsets is more important than the narrower pursuit of economic efficiency that is so common these days). To succeed we need to bring a range of mindsets and skills to bear, and the skills of collaboration and synthesis must replace competition and analysis. Much new learning is required, because we generally lack the social tools and skills to manage these interdisciplinary interactions.

Plesionic problems place great emphasis on relationships, trust and phronesis (systemic wisdom). In the last century there was a psychological shift from Alasdair MacIntyre’s virtue ethics – humility, kindness, honesty and goodness – to what might be called résumé ethics – wealth, fame and status (misquoting David Brooks). Excessive competition has destroyed trust and collaboration and encouraged short-term thinking and instrumental reasoning. Plesionic interactions have suffered accordingly. We can see these outcomes around us.

Dealing with plesionic problems is a complete change from present practice and it is difficult – but I do note a shift in emphasis as we come to grips with these more “complex” problems. The Executive Director of Financial Stability at the Bank of England, Andrew Haldane, reflects the change in spirit. In a recent interview with New Scientist he is quoted as saying “In a system that’s very noisy and messy, sometimes the best you can do is avoid the worst. In other words, build lots of safeguards into that system.”

There is no safe method of induction – especially in “systems” abstracted from plesionic arenas. Epistemic uncertainty is rife. Models and mindsets fail us; major environmental management programs fail to deliver. Innumerable Thanksgiving turkeys get eaten every year, yet we try to muddle through. We need to find better ways to invest under uncertainty. Living with greater risk and uncertainty involves an increased emphasis on robustness and resilience – a strategy that life employs at a variety of levels, but one which is quite different from strategies defined by the usual naïve rationalist approaches. I shall return to this topic in my next blog.

2 thoughts on “Epistemic uncertainty is important: ask the Thanksgiving turkeys”

  1. Dear Graham,

    In a world full of uncertainties, the fate of Thanksgiving turkeys is unfortunately highly predictable! Your witty take on uncertainty and the dangerous use of rational determinism to unpack plesionic problems took me back to my beloved Stephen Jay Gould and his punctuated equilibrium theory of evolution. I suspect that many plesionic problems are highly path dependent and tend to lock a given system into a sub-set of plausible futures. Often, this speciation process narrows the range of plausible future states of the system. Sometimes, the system reaches some of its boundaries and is forced to evolve the other way, using all the degrees of freedom available to reach a new state of stasis in a firework of totally unpredictable attempts to ‘find a solution’, without any guarantee that it is the best one – a pure peripatric speciation process. What is true for Pliocene fossils might also be true for modern, self-organising socio-technical systems (including their political dimension).
