
[Figure: a graph with Randomness on the horizontal axis and Complexity on the vertical axis]
The axes on the above graph are:
- Randomness -- completely predictable on the left, utterly unpredictable on the right
- Complexity -- in this context, how much of the past you need to remember to make predictions
For example, think of a fair coin, one that comes up heads or tails with equal probability. The result of each flip is completely random and is not influenced by anything that went before. The Complexity of the result is zero, because nothing you can remember about previous flips will help you predict the next one.
Then think of a coin with two heads: it will always come up heads. Each flip is completely predictable, and there is no need to remember anything (well, after the first flip, anyway) in order to predict the next result. So the Complexity is also zero.
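Both zero-Complexity cases are easy to check empirically. Here is a minimal Python sketch (the helper names and the 100,000-flip sample size are mine, purely for illustration) that estimates heads frequencies with and without memory of the previous flip:

```python
import random

def fair_coin(n):
    """Simulate n flips of a fair coin."""
    return [random.choice("HT") for _ in range(n)]

def freq_heads(flips, prev=None):
    """Fraction of flips that are heads, optionally restricted to flips
    whose immediate predecessor was `prev`."""
    if prev is None:
        sample = flips
    else:
        sample = [b for a, b in zip(flips, flips[1:]) if a == prev]
    return sum(1 for f in sample if f == "H") / len(sample)

flips = fair_coin(100_000)
print("P(H)          ~", freq_heads(flips))       # ~0.5
print("P(H | prev=H) ~", freq_heads(flips, "H"))  # ~0.5
print("P(H | prev=T) ~", freq_heads(flips, "T"))  # ~0.5
# All three estimates agree: remembering the last flip buys nothing.

two_headed = ["H"] * 100_000                      # a two-headed coin
print("P(H) two-headed =", freq_heads(two_headed))  # exactly 1.0
# Perfectly predictable, and no memory is needed at all.
```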
Now imagine a special coin-flipping system that always comes up tails when the previous flip was heads, but behaves randomly when the previous flip was tails. To make predictions here you need to remember one bit of information: the result will be tails with 100% probability after a head, but with only 50% probability after a tail. This is a more complex process, and it is not completely random, so it falls in the middle of the graph above.
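A direct simulation makes that one remembered bit visible. This is only a sketch of the process as described, with an assumed arbitrary first flip and hypothetical function names:

```python
import random

def special_coin(n, seed=None):
    """Simulate the process described above: a head is always followed
    by a tail, while a tail is followed by heads or tails at random."""
    rng = random.Random(seed)
    flips = [rng.choice("HT")]              # arbitrary first flip (the
                                            # text doesn't say how it starts)
    while len(flips) < n:
        if flips[-1] == "H":
            flips.append("T")               # after a head: tails, always
        else:
            flips.append(rng.choice("HT"))  # after a tail: 50/50
    return flips

flips = special_coin(100_000)

def p_tails_after(prev):
    """Estimate P(next = T | previous = prev) from the simulated flips."""
    followers = [b for a, b in zip(flips, flips[1:]) if a == prev]
    return followers.count("T") / len(followers)

print("P(T | prev=H) ~", p_tails_after("H"))  # ~1.0
print("P(T | prev=T) ~", p_tails_after("T"))  # ~0.5
# The one remembered bit (was the last flip a head?) is exactly the
# state an optimal predictor of this process needs to carry.
```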
A great many natural systems behave in this more-complex, less-random way. By analyzing their data streams, we can find regimes of order in what looks like random noise.
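As an illustration of what such an analysis might look like, assuming the data arrives as a stream of discrete symbols, the sketch below compares how uncertain the next symbol is with and without one symbol of memory. Conditional Shannon entropy is one standard way to quantify this, though it is my choice here rather than a method named above, and the helper functions are hypothetical:

```python
import random
from collections import Counter, defaultdict
from math import log2

def entropy(counts):
    """Shannon entropy, in bits, of a table of symbol counts."""
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values() if c)

def conditional_entropy(stream, k):
    """Average uncertainty (bits) about the next symbol, given the
    previous k symbols of the stream."""
    followers = defaultdict(Counter)
    for i in range(k, len(stream)):
        followers[tuple(stream[i - k:i])][stream[i]] += 1
    total = sum(sum(c.values()) for c in followers.values())
    return sum(sum(c.values()) / total * entropy(c)
               for c in followers.values())

# Regenerate the special coin stream from the previous sketch.
rng = random.Random(1)
stream = ["T"]
for _ in range(100_000):
    stream.append("T" if stream[-1] == "H" else rng.choice("HT"))

print("H(next)          ~", conditional_entropy(stream, 0))  # ~0.92 bits
print("H(next | 1 flip) ~", conditional_entropy(stream, 1))  # ~0.67 bits
# One flip of memory removes about a quarter of the apparent uncertainty:
# structure has been found in what first looked like noise.
```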