March 29, 2011
Systemic bias is an inseparable part of organizational culture, of the common patterns of thinking that go with it, and of the premises by which all human systems function. Therefore each of us has a systemic bias, though we may be unaware of its unstated assumptions. Without sensing those assumptions and learning to question them, we enter clashes with other people assuming that only they have systemic biases. When no party can clarify why they differ, and all are determined to “win,” major system changes degenerate into political contests of will among factions unable to fathom why their motivations are so different.
So are some systemic biases better than others? Yes. Some serve the human needs of their time much better than others, although no human system can be perfect.
A science-based system (and its biases) is superior when business environments are changing rapidly, buffeted by many unprecedented challenges. It promotes faster learning with fewer mistakes and less rancor. The book Compression labeled such a system a Vigorous Learning Enterprise. If your organization has deployed problem solving based on PDCA or DMAIC, it is headed toward resolving problems more objectively, but it can develop much further if clashes of systemic bias are brought into the open.
Science’s ideal is to abide by the evidence, wherever it falls. Most of us give that only lip service. The systemic bias of business is to do what pays. Both struggle with “ethics”: each decides what benefits all of us, or minimizes harm, from a different systemic bias.
But even scientists are gravely challenged to be objective and open to learning. For starters, even Wikipedia lists 55 ways that experimental bias can warp an experiment. Conducting an experimental program that unravels a welter of complexity seems simple only in retrospect.
Although deliberate deception is rare, like Dr. Hwang Woo-suk faking evidence of cloning human cells, systemic bias pervades pure science. Scientists’ mindsets glom onto old theories and abandon them reluctantly, as described in Thomas Kuhn’s The Structure of Scientific Revolutions (1962), which popularized that now overworked term, “paradigm.”
Of course, one reason for delay in accepting new ideas is that reviewers must probe them for holes in logic, data, or analysis. They may compare them with conflicting hypotheses offering explanations of the same phenomena. Open-minded reviewers need time to comb through all this – and are rarely rewarded for doing so promptly. Methodology and lab skills factor in too, because clean work leaves less clutter to fog observation. Not every scientist whose work marks a turning point wins a Nobel Prize; systemic bias can delay recognition for decades. For example, E = mc² never garnered Einstein a Nobel. (He got one for explaining the photoelectric effect, showing that light is quantized.)
But systemic biases favoring established theory are less contentious than clashes between science and those holding altogether different systemic biases. Moralists of all stripes take umbrage at studies offending their view of morality. Financial interests launch PR attacks on any aspect of a study that threatens pocketbooks, seeking to undermine its credibility with the public. Shielding researchers from such attacks is the major argument for academic tenure, and it was long the reason why scientists advocated waiting for peer confirmation before “going public” with a controversial study. Finally, and most commonly, researchers are tempted to slant findings toward outcomes that might please their funders.
Compression Thinking’s systemic biases start with obtaining much better outcomes while physically using much less, interpreting what we do physically before evaluating it financially, and seeing what we do as part of much bigger systems.
These biases conflict with deeply held systemic biases in industrial societies, starting with unending economic expansion and evaluating almost everything in monetary terms. These assumptions shape our language and daily thought patterns: we describe animals hunting food as “going shopping,” schooling as “skill set expansion,” prisons as “commodity markets for correctional services,” and soldiers as “assets to deploy.” Systemic biases common to industrial societies confine their arguments on how to deal with economic malaise to philosophical differences over how to prime market growth.
From this 30,000-foot perspective it’s hard to imagine how Compression Thinking would change our daily lives. But to deal with Compression, any work organization first has to survive in an expansionary commercial environment. Bridging this chasm in systemic biases is harder than any technical problem associated with Compression.
Compression Thinking also counters a Western systemic bias permeating both science and commerce: reductionism. Other terms describe reductionism, but in brief it is the belief that a whole system is the sum of its parts. In science, reductionism breaks a complex system into parts, studying them one at a time and trying to infer the nature of the whole from these bits and pieces. In business and economics, reductionism assumes that a total economy is the sum of its parts, so if people are employed, businesses are profitable, and all are making more money, surely all is right with the world. That’s what a constantly growing GNP suggests. But studies of complex systems tell us otherwise. For example, overall human physiology cannot be inferred from detailed knowledge of the biology of its many different kinds of cells.
Compression Thinking’s systemic bias is to always look at what we do as fitting into larger systems, perhaps global in scope. But digging into that is a topic for another episode.