Complex is not (the same as) difficult (7)


Pieter Jansen and Fredrike Bannink

 

Information theory.

Information theory makes complexity quantifiable and mathematical. The theory was developed by Shannon (1948) for information technology, but it can be applied to all complex systems; it is equally suited to biological processes and to understanding how we think. Information theory is all about organizing information and managing uncertainty. In essence, it is surprisingly simple.

 

The bit is the smallest unit of information.

But what does the bit measure? A bit can take only two values: yes or no, on or off. Information arises when an event takes place whose outcome was uncertain beforehand. In essence, everything is information. We can describe biology in terms of information theory, and our body as an information processor. Information is not found only in the instructions of genes, and memory is not found only in the brain; both reside in all (parts of) our cells.
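
To make the bit concrete: in Shannon's theory, an event with probability p carries -log2(p) bits of information, so a fair yes/no outcome carries exactly one bit. A minimal sketch in Python (the probabilities below are our own illustrative choices):

    import math

    def information_content(p: float) -> float:
        """Surprisal of an event with probability p, in bits."""
        return -math.log2(p)

    # A fair coin flip resolves exactly one yes/no uncertainty: 1 bit.
    print(information_content(0.5))    # 1.0
    # An unlikely event (p = 1/8) is more informative when it happens.
    print(information_content(0.125))  # 3.0
    # An event that was certain beforehand carries no information.
    print(information_content(1.0))    # -0.0, i.e. zero bits
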
Dawkins (1986), an evolutionary biologist, states, “If you want to understand life, you have to think about information technology”. Today there is no longer any difference between physics and information theory. Information is more fundamental than matter; matter results from information. “It from bit”, states Wheeler (1994), a theoretical physicist.

 

As in thermodynamics, information theory uses the concept of entropy: the amount of uncertainty or disorder in a given system. In dynamic systems spontaneous structures emerge, and when energy is added, self-organization arises. Biological organisms have a high degree of self-organization. They interact with their environment and face the ongoing challenge of keeping their internal entropy limited. After all, the second law of thermodynamics states that the entropy of an isolated system tends to increase over time.
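
Shannon's definition makes this amount measurable: the entropy of a system whose states occur with probabilities p1, ..., pn is H = -(sum of pi * log2 pi) bits. A minimal Python sketch (the distributions are purely illustrative):

    import math

    def entropy(probs):
        """Shannon entropy H = -sum(p * log2 p) of a distribution, in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Maximum disorder: four equally likely states.
    print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
    # More order: one state dominates, so uncertainty drops.
    print(entropy([0.85, 0.05, 0.05, 0.05]))  # ~0.85 bits
    # Complete order: a single certain state leaves no uncertainty.
    print(entropy([1.0]))                     # -0.0, i.e. zero bits
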
Hirsh, Mar, and Peterson (2012, p. 304) developed the entropy model of uncertainty (EMU) for biological processes.

 

“We propose the entropy model of uncertainty (EMU), an integrative theoretical framework that applies the idea of entropy to the human information system to understand uncertainty-related anxiety. Four major tenets of EMU are proposed:
(a) Uncertainty poses a critical adaptive challenge for an organism, so individuals are motivated to keep it at a manageable level;
(b) uncertainty emerges as a function of the conflict between competing perceptual and behavioral affordances;
(c) adopting clear goals and belief structures helps to constrain the experience of uncertainty by reducing the spread of competing affordances;
(d) uncertainty is experienced subjectively as anxiety and is associated with activity in the anterior cingulate cortex and with heightened noradrenaline release.”

 

In other words, all organisms have certain possibilities for action, and a constant flow of information must be weighed against these possibilities. Many options signify a high level of uncertainty, that is, high entropy. Our brain ‘measures’ the activity of these options and functions as an entropy meter. A goal or belief structure limits the number of options and is therefore helpful in reducing entropy. Finding more or better goals, beliefs, and values leads to a greater reduction of entropy and gives direction to our behavior.
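
A rough illustration of this point, under the simplifying assumption (ours, not the authors') that all open options are equally likely: with n equally likely options the entropy is log2(n) bits, so ruling options out lowers it directly.

    import math

    def entropy_of_options(n: int) -> float:
        """Entropy in bits of n equally likely options: log2(n)."""
        return math.log2(n)

    # No goal: eight competing affordances, high uncertainty.
    print(entropy_of_options(8))  # 3.0 bits
    # A clear goal rules out six of them: entropy drops sharply.
    print(entropy_of_options(2))  # 1.0 bit
    # A single remaining course of action: no uncertainty left.
    print(entropy_of_options(1))  # 0.0 bits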

 

In the cause-and-effect model, statistics are used to make certain issues uniform and suitable for causal analysis. Information theory instead uses the mathematics of probability theory to calculate uncertainty. These calculations determine, again and again, the direction in which a manageable level of entropy can be reached, which has proven useful in complex and dynamic settings.
Note that deep learning and data mining use information theory to find patterns. These patterns are then often used, in a reductionist way, as subjects in a cause-and-effect model; this, too, is a way of dealing with uncertainty for analytical purposes.
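
As a sketch of how such a pattern-finder works: a decision-tree learner, a standard data-mining technique, chooses the split that yields the highest information gain, i.e. the largest drop in entropy. The toy labels below are invented for illustration:

    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a list of class labels, in bits."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(parent, children):
        """Entropy reduction achieved by splitting parent into children."""
        n = len(parent)
        weighted = sum(len(ch) / n * entropy(ch) for ch in children)
        return entropy(parent) - weighted

    # Toy data: four "yes" and four "no" labels, maximal uncertainty (1 bit).
    parent = ["yes"] * 4 + ["no"] * 4
    # A feature that splits the data into two pure groups removes all entropy.
    print(information_gain(parent, [["yes"] * 4, ["no"] * 4]))  # 1.0 bit gained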

 

References

Dawkins, R. (1986). The blind watchmaker. New York: Norton, p. 112.

Hirsh, J.B., Mar, R.A., & Peterson, J.B. (2012). Psychological entropy: A framework for understanding uncertainty-related anxiety. Psychological Review, 119(2), 304-320.

Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379-423, 623-656.

Wheeler, J.A. (1994). At home in the universe. New York: American Institute of Physics, p. 296.