Knowing is Not a Binary State

As humans, we store information and ideas in complex three-dimensional structures, accounting for nuance, the weight of logic, and auxiliary topics. When we need to use an idea we slice through that 3D spatial matrix and render it flat: into a sentence, for example, or a decision.

The person on the other side then unpacks what we said into their own mental map, and on we go.

As we started using machines for telecommunication we reaped all of the advantages, with relatively little loss of meaning.

Ironically, the deeper we go into the algorithmic age, the more loss of signal we should expect. By asking machines to extract meaning (to make decisions, for example) we relinquish understanding and set the machines up to fail.

We must not confuse a transistor for a translator.

We are now asking a binary machine (Shannon, 1948) to unpack levels of knowing that are deeply qualitative. We are asking a black-and-white machine to see shades of gray.

Machines, conditionally and unequivocally, operate on a single plane: a single train track, a single hockey rink, one dimension. Meaning comes from the connections between all of those, and from the liminality.

We are the only ones capable of a system-level view, and it is that bird's-eye view that understands meaning.

