Truth by Doc Hall (Video 7 minutes)
Is Truth Just a Better Lie?
The arts of human deception have always resided within us. Online technology multiplies opportunities to practice them. Every morning before breakfast I trash a half-dozen or more e-mail scams. Fifty years ago, no one began scamming you until you were awake.
Scams are deceptions. By Harry Frankfurt’s definition, deliberate deception is a lie. Bullshit, by contrast, merely makes up “facts and stories” to persuade – land a sale, get a vote, impress somebody at a bar – without caring whether they are true. Veracity is irrelevant to successful persuasion.
A lie is a deliberate untruth. But can we ever know “truth,” and if so, can we describe it? It turns out that we can never know the truth – reality itself. We’re incapable of it.
Deep thinkers thrashing through their philosophical weeds have questioned whether reality exists ever since Plato noted that if it does, we can never see it directly. He likened us to cave dwellers who cannot see outside, but must infer (guess) what is going on out there from shadows on the cave wall. And shadows lend themselves to the arts of deception.
Today, physiology explains the same thing. Humans perceive only a smidge of the phenomena in which we are immersed. For example, we can only see light in the visible spectrum, a sliver of the total electromagnetic spectrum. To guess at the rest, instruments must translate the invisible into something we can sense. And then using our direct senses, we focus on whatever draws our attention, ignoring the rest. Our perceptual limitations force us to focus.
It’s as if each of us were born into our own little mental prison, shaped by unique life experience. Nobody sees exactly the same shadows on their walls. To learn something new – to guess closer to reality – we must fight our own mental and physical limitations.
When sensing a broad spectrum of information, overload befuddles our brains. It’s like sorting 500 e-mails a day for a few morsels in the chaff. To sort, we pre-decide what we will pay attention to and what we won’t, and tune a spam filter to help. However, filtering risks missing important messages that don’t fit our preconceptions. Inescapably, we’re preconception biased; all of us; no exceptions. Only fools think they are exceptions. Nobody escapes being fooled.
When interacting with nature and other people, our perceptual feedback loops become complex (does she think that I think…?). Our preconceptions clash. Fully grasping the workings of nature or the behavior of others is hopeless, but many of us barely try. We are perceptually lazy. We prefer people – and news – with similar biases. We can relax.
Although we can never see reality, technology can get us closer to it. A farmer who monitors soil temperature, moisture, pH, and composition has more data from which to exercise intuition. When meeting other people, we sense intent and trustworthiness. Will they do what they say? Can they? And persuasion by deception begins. For example, in budget negotiations in large organizations, a player must open with a big lie. Otherwise she’ll probably lose – and fail her department. Best liar wins; that’s why we lie. Technical problems are “tame problems.” Clashes of human intent set up “wicked problems.”
But in any kind of problem solving, we struggle against the instinct to conserve brain energy. Blame somebody else. Invent a hokum story. Honest searches to get closer to truth (reality) fight mental and emotional laziness. Science battles it all the time. The rest of society often seems disinclined even to consider letting “truth” be the only winner.
Obfuscation
“Obfuscating with Transparency” read the headline of a recent editorial in Science. It accused the EPA of excluding evidence from policy decisions by requiring that the public have access to the data for all studies cited. The public often doesn’t have such access, for various reasons. Under the claim of requiring transparency, this rule excludes many findings from consideration. The EPA ruling appears to be politically motivated, but it raises a host of complex issues.
Peer reviewers usually have access to check data. The public may not. Privacy is a big reason in safety studies; however, miscreants may also cherry-pick data from a study to attempt to refute conclusions they don’t like. Interpreting data – to get closer to reality – is hard enough without taking flak from parties not wanting to get closer to reality.
Reviewers often find errors, but no one can check every study in detail. Since reviewing others’ data is seldom the most exciting part of a science career, the rigor of peer review invites criticism. Reviewers can’t escape using intuitive judgment on the validity of findings. Without anyone intending to be deceptive, systemic issues of reviewer bias are considerable.
Negative findings are seldom submitted for publication, although negatives are important to others working in an area. About 40% of all reported findings can’t be replicated. Almost every issue of Science reports a published study that has been pulled. Some studies are suspect because a source of funding automatically injects bias. A few scientists are disciplined for outright fraud. Scientists are human too, and their struggles trying to get closer to the truth hold lessons for all of us.
Keeping up is a problem in science, as elsewhere. Accessing published articles is not easy. Most journal articles are behind a paywall, and journal subscriptions are expensive. University libraries complain about it; both the number of publications and the volume of articles keep rising. Plagued by information overload, scientists can only keep up with findings significant to them. But which ones are those? Something significant can “come out of left field.”
Changes to ease these problems are underway, but science needs a new system of learning. The rest of us need a new system of learning too.
Systemic Learning
Despite its flaws in practice, science is organized for learning; active learning that pushes back on ignorance and self-deception, learning that takes us closer to reality. And science covers a wide swath of fields. The rest of society is not organized for learning in this sense. Except perhaps in educational institutions, learning is directed at other goals. Businesses research customers and learn how to improve efficiency serving them. None would say that they are not learning, but it is not central to their mission.
Given the magnitude of environmental perils coming at us, the Compression Institute has proposed that almost all work organizations become Vigorous Learning Organizations, structured and culturally tuned for operationally learning how to improve the human condition in a rapidly changing world. For starters, such organizations have leadership for learning, think systemically, share a common mission, and cultivate behavior for learning.
We’re optimistic that real people can collectively learn much better. But we have to concentrate on it, and put much more emotional, as well as intellectual, energy into it. If an organization can attain its goal (make a profit) without this, it probably won’t work up the extra human energy. However, the time is upon us to totally reconceive how an economy must work in an ecologically regenerative world.
Balancing the needs of nature with the human stakeholders of a work organization mandates more complex objectives than profitability, as challenging as that goal may seem today. And these objectives are not static. The present troubles of learning by science show that this is no small human challenge.
Pessimists speculate that advancing human civilization cannot escape destroying itself. We’re optimistic that the human race can actually step up to a new level of civilization. Technology will probably aid learning, but it is biased too, just like we are, and part of learning is overcoming our biases, or preconceptions. Vigorous Learning implies new objectives in life, abandoning our self-deceptions of a bygone world fading fast. Truth is just a better lie from which we can learn more.