Thinking Systems #3: Thinking about Systems Theory

By Graham Harris

It is always instructive to step back from the coalface occasionally and look at the history of particular disciplines. More often than not it is possible to see that certain sets of ideas became the bedrock of our thinking – and that others did not.

For the last 60 or 70 years – since World War II at least – systems theorists have had a fascination with cybernetics: the construction of mathematical computer models of systems, based on diagrams of boxes and arrows and on the solution of sets of differential equations. The roots of this can be found in activities such as the work of the RAND Corporation and the development of war games and military logistics for WWII. After the War the discipline was developed further through a series of Macy Conferences in the USA, chaired by Warren McCulloch.

What the Macy Conferences focused on was a particular approach to cybernetics, one that has subsequently been called first-order cybernetics. First-order cybernetics is based on the very familiar mathematical modeling relations that entail simple machine analogies, Newtonian dynamics and naïve realism. A full explanation of the formal assumptions of this approach to systems science can be found in Robert Rosen’s three key books: “Anticipatory Systems” (1985), “Life itself” (1991) and “Essays on life itself” (2000).
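To make the flavour of that first-order style concrete, here is a minimal sketch: a thermostat reduced to a box-and-arrow feedback loop and a single differential equation, integrated numerically. The example, the equation and every parameter value are my own illustrative assumptions; nothing here is drawn from Rosen or the Macy Conference literature.

```python
# A first-order cybernetic model in miniature (illustrative assumptions only):
# a controller "box" and an environment "box" joined by a feedback arrow,
# collapsed into one differential equation and solved with Euler steps.

def simulate_thermostat(setpoint=20.0, ambient=5.0, gain=0.8,
                        leak=0.1, dt=0.1, steps=600):
    """Integrate dT/dt = gain*(setpoint - T) - leak*(T - ambient)."""
    temperature = ambient                          # start at ambient temperature
    history = []
    for _ in range(steps):
        error = setpoint - temperature             # feedback signal (the arrow back)
        heating = gain * error                     # controller box: proportional response
        cooling = leak * (temperature - ambient)   # environment box: heat loss
        temperature += dt * (heating - cooling)    # Euler update of the single state
        history.append(temperature)
    return history

trace = simulate_thermostat()
print(f"final temperature: {trace[-1]:.2f}")       # settles near (not exactly at) the setpoint
```

Notice the assumptions baked in: a fixed observer standing outside the system, simple cause and effect, and a machine analogy. It is exactly this framing that, as we shall see, Rosen argues breaks down for living systems.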

When I say that these approaches are very familiar, what I actually mean is that they are widely used – in truth, used uncritically and without thinking. Scientists use a “method” – a particular methodology – that is founded on very strict assumptions about ontology (the nature of being; the way the world is) and epistemology (what we know about the world and how we know it). Rosen’s books are widely ignored: first, because scientists and engineers simply do not read such philosophical works, and second, because Rosen provides a cogent critique of these assumptions when they are applied to life and living systems.

Newtonian physics forces a particular, and restricted, epistemology onto the ontology. Newtonian dynamics, which makes very strict assumptions about the nature of cause and effect, works very well for certain types of problem and is applicable to many “hard” systems problems (yes, physics works). But such an approach is not applicable to “softer” living systems – to life itself – without doing violence to philosophy and the assumed modeling relations. So one of the reasons why we often get perverse outcomes from systems modeling and prediction is that we have been trying to shoehorn a broad set of problems into a narrow and inappropriate methodology. The development of computers merely reinforced this approach. We do indeed need a broader and more inclusive epistemology and methodology.

Some critical thinkers spotted this inconsistency early on. In a paper entitled “Science and complexity”, published in American Scientist, 36: 536-544 (1948), Warren Weaver wrote: “science has, to date, succeeded in solving a bewildering number of relatively easy problems, whereas the hard problems, and the ones which perhaps promise most for man’s future, lie ahead” (p. 543). In Science, 139: 81-88 (1963) George Gaylord Simpson wrote: “Biology, then, is the science that stands at the center of all science. It is the science most directly aimed at science’s major goal and most definitive of that goal. And it is here, in the field where all the principles of all the sciences are embodied, that science can become truly unified.” (p. 88, my emphasis)

We have fallen into the trap of thinking that there is just one systems theory – an exact science applicable to all problems. In the context of systems science, West Churchman published “Challenge to reason” in 1968, in which he recognized the need to balance the “hard” exact sciences with the “softer” disciplines. Churchman reached back to Immanuel Kant to argue that we need to drop the “requirement… of classical rationalism: the fixed axioms, the established truths that exist independent of man’s moods and wishes. Instead the rational is to become an interplay of thought, imagination and mood, necessarily not ‘consistent’, and yet not relativistic and skeptical either” (p. 207).

An approach to a broader methodology exists, but it lies in disciplines either not read by systems theorists or actively despised by those scientists and their “hangers-on” (those suffering from a severe case of physics envy – some sociologists and economists, for example) who still cleave to the “superior” realist, rationalist and value-free methodology of received science.

Such approaches can be found in Robert Rosen’s work on “relational biology” and in his call for a different modeling relation for living systems. They can also be found in the humanities and in sociology, where “deconstruction” has been popular for decades. Deconstruction has been derided and vilified by scientists for its relativism and iconoclasm, but there are valuable insights here that we must explore. When we deal with systems we are indeed faced with a “Challenge to reason” and limits to our knowledge.

A second-order cybernetics, a second-order systems theory, now exists. Second-order cybernetics was actually born at those Macy Conferences after the War but was dismissed as too “soft” and a poor cousin to the rigorous exact sciences. It languished in an intellectual corner until it was given a new life by people like Katherine Hayles and Niklas Luhmann. The key to second-order cybernetics is to deconstruct the first-order practices and to bring the observer into the loop. We shall take these ideas up and develop them in future blogs.

What I am trying to do is pick a middle path: all is not relative and merely a matter of opinion, and there are absolutes – there is a “hard” ontology that rules our lives. (See my 2007 book “Seeking sustainability in an age of complexity”.) There is room for a bounded rationality. What I am trying to develop is a broader epistemology to deal with that, and with fundamental uncertainty, ethics and values.

In the conclusion to his 1948 paper Warren Weaver wrote: “If science deals with quantitative problems of a purely logical character, if science has no recognition of or concern for value and purpose, how can modern scientific man achieve a balanced good life, in which logic is the companion of beauty, and efficiency is the partner of virtue? In one sense the answer is very simple: our morals must catch up with our machinery.”

Just remember, that was written in 1948. In the post-War “white heat of the technological revolution” (quoted from a speech to the UK Labour Party conference in Scarborough in 1963 by the then opposition leader, Harold Wilson) and in the years since, those valid questions were ignored.

As Sandra Mitchell wrote in “Unsimple truths” (2009): “if we project the overly simplistic old views of science as the epistemology of science, then when simple explanations and methods fail in complex situations, it appears to policymakers that science fails. The danger is that holding science up to the wrong standard will diminish the value of what science discovers about nature, and could create an environment in which science is no longer consulted to inform policy.” To avoid this scenario we need to develop a more sophisticated understanding of the kinds of explanations and solutions that science and other approaches can provide when we come to grips with the “unruly complexity” that Peter Taylor wrote about in 2005: the complex, heterogeneous and ever-changing systems that make up our world.
