Robustness in Networks


Our thinking habits are often based on simple cause-and-effect relations, as can be seen in Newton's mechanics, engineering and economics. However, with the coming of quantum mechanics, interconnected computers and globally interrelated economies, perhaps our standard cause-and-effect thinking needs to be replaced by systems thinking.

Systems thinking is the process of understanding how things, regarded as systems, influence one another within a whole. In nature, systems thinking examples include ecosystems in which various elements such as air, water, movement, plants, and animals work together to survive or perish. In organizations, systems consist of people, structures, and processes that work together to make an organization “healthy” or “unhealthy”.

In a recent blogpost, "Training in economics is a serious handicap", we shared David Orrell's opinion that neo-classical economics is a dangerous doctrine. (A doctrine is a codified body of beliefs or teachings, the accepted principles of a branch of knowledge or belief system.) He proposes a systems approach to economics, in order to avoid systemic risk.

Systemic risk is the risk of collapse of an entire financial system or market, as opposed to the risk associated with any one entity. It can be defined as "financial system instability, potentially catastrophic, caused or exacerbated by idiosyncratic events or conditions in financial intermediaries". It refers to the risks imposed by interlinkages and interdependencies in a system or market, where the failure of a single entity or cluster of entities can cause a cascading failure that could bankrupt or bring down the entire system or market. Such risk could be reduced by introducing the concept of robustness. Systemic risk is also sometimes erroneously referred to as "systematic risk".

To make the economy more robust, David Orrell, in his book Economyths (How the Science of Complex Systems is Transforming Economic Thought), proposes:

  • Modularity. Modularity is the degree to which a system’s components may be separated and recombined.
  • Redundancy. Redundancy is the duplication of critical components or functions of a system with the intention of increasing reliability of the system, usually in the case of a backup or fail-safe.
  • Diversity.  A degree of diversity in a system can help it adapt to change.
  • Controlled shut-down. If entities (banks, companies) are damaged beyond repair, they must be removed from the system before the damage infects the whole.
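The effect of modularity can be illustrated with a toy cascade model (our own sketch, not from Orrell's book). Each bank fails once more than a threshold fraction of its counterparties have failed; the network layouts, thresholds and seed failures below are illustrative assumptions, chosen only to show that a modular topology can contain a shock that a fully connected one spreads everywhere.

```python
def cascade(neighbors, seeds, threshold=0.35):
    """Propagate failures through a bank network.

    A bank fails once more than `threshold` of its neighbors have
    failed. Returns the set of failed banks (illustrative model only).
    """
    failed = set(seeds)
    changed = True
    while changed:
        changed = False
        for bank, links in neighbors.items():
            if bank in failed or not links:
                continue
            if sum(n in failed for n in links) / len(links) > threshold:
                failed.add(bank)
                changed = True
    return failed

# Fully connected: 6 banks, everyone exposed to everyone.
full = {i: [j for j in range(6) if j != i] for i in range(6)}

# Modular: two clusters of 3 banks with a single bridge link (0 <-> 3).
modular = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1],
           3: [0, 4, 5], 4: [3, 5], 5: [3, 4]}

print(sorted(cascade(full, {0, 1})))     # all six banks fail
print(sorted(cascade(modular, {0, 1})))  # failure contained to [0, 1, 2]
```

With the same two initial failures, the dense network collapses entirely, while the modular one loses only one cluster: the bridge bank's exposure to the failed cluster stays below its threshold.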

One problem: robustness comes at the price of efficiency. Later we will explore Nassim Nicholas Taleb's ideas about robustness and fragility. See also our blogpost Mainstream thinking about designing systems.

Go here to look at our new eBook.

Photo “Business Network” by jscreationzs

2 Replies to “Robustness in Networks”

  1. Drawing analogies between financial and biological networks, the Bank of England’s research chief Andrew Haldane and University of Oxford ecologist Robert May have together argued that we should be less concerned with the robustness of individual banks than the contagious effects of one bank’s problems on others to which it is connected (Nature, vol 469, p 351). Approaches like this might help markets to avoid failures that come from within the system itself, Kirman says.
