The Efficiency Paradox: What Big Data Can’t Do by Edward Tenner
Edward Tenner, a historian and critic of technology, writes regularly for The Atlantic. Known for an earlier book, Why Things Bite Back, Tenner popularized the idea of unintended consequences. The Efficiency Paradox illustrates its argument with examples ranging from ancient times to today's leading edge. In it, Tenner interprets efficiency as market efficiency or learning efficiency, rather than as industrial efficiency: doing more with less.
What drew me to The Efficiency Paradox is that, without intending to, Tenner contrasts how humans learn with how artificial intelligence (AI) learns. Other reviewers complain that Tenner never concludes whether AI is a boon benefiting humanity or an ogre destroying it, but that was not his aim; his purpose was to project AI's inherent capabilities and limitations.
The artificial voice of Google Duplex, now in deployment, is so well done that few of us can distinguish it from a human voice, which makes it controversial. Will this duplicity set us up for scams by millions of AI bots? Shelley Palmer, who keeps up with AI, wonders whether sex with a robot would be ethical – or legal – or detectable. And if this blows your mind, thought-activated interaction with robots using a neural lace interface is in active development.
Change is sneaking in faster than we can absorb it. Amazon now assesses your life pattern to offer you personalized prices. Uber offers personalized prices too, jacking them up by 400% at peak demand times. Airlines and hotels are refining their own personalized pricing algorithms.
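As a concrete illustration, here is a minimal sketch of how a demand-based surge multiplier might work; the function, its inputs, and the 4x cap (the "400%" above) are assumptions for illustration, not Uber's actual algorithm.

```python
def surge_multiplier(ride_requests: int, available_drivers: int,
                     cap: float = 4.0) -> float:
    """Toy demand-based multiplier: price scales with the ratio of
    requests to available drivers, capped at 4x (the "400%" peak
    mentioned above). Illustrative only; real ride-share pricing
    is proprietary and far more complex."""
    if available_drivers == 0:
        return cap
    return min(cap, max(1.0, ride_requests / available_drivers))

base_fare = 12.50
peak_fare = base_fare * surge_multiplier(ride_requests=300, available_drivers=80)
print(f"{peak_fare:.2f}")  # 46.88 -- a 3.75x surge on a $12.50 base
```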
Two hundred years of charging all retail buyers the same price are ending. Mass production – economy of scale – is old hat. Software matching buyers with sellers, a new concept of efficiency, even applies to news services. Custom news feeds deliver whatever our prior preferences suggest we want to see. This assumes that what we want is what we really need to know, which narrows learning and divides us into antagonistic camps. Brooding pessimists now declare all sources of news to be fakes – "agnotology," Tenner calls it. He fears that our "crisis of truth" has only begun.
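A minimal sketch of how such preference matching can narrow a feed, assuming a hypothetical tag-overlap score (no real news service publishes its ranking algorithm): articles are scored against a profile built from past clicks, so whatever you already read keeps floating to the top.

```python
from collections import Counter

def rank_feed(articles, click_history):
    """Score each article by overlap with topics the reader has
    already clicked on -- the feedback loop that narrows a feed."""
    profile = Counter(tag for art in click_history for tag in art["tags"])
    return sorted(articles,
                  key=lambda art: sum(profile[tag] for tag in art["tags"]),
                  reverse=True)

history = [{"tags": ["politics", "scandal"]}, {"tags": ["politics"]}]
feed = [{"title": "New coral reef discovered", "tags": ["science"]},
        {"title": "Tax bill analysis", "tags": ["politics"]}]
for article in rank_feed(feed, history):
    print(article["title"])  # the politics story outranks the reef every time
```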
Algorithmic pricing and shill-infested customer ratings are reviving travel agencies to help us navigate travel-info minefields. Consumer investment long ago went the same way. Only pros using ultrafast algorithms can play a dozen different capital markets at once. The same financial "instrument" may have 800 or more different prices, but that's where the money goes. Financial services were an estimated 4% of the economy in 1900; by 2014, they were 9%. Is this market efficiency or bloated complexity?
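To see why only software can play this game, consider a toy best-price scan across trading venues; the venue names and quotes below are invented, and real systems scan hundreds of simultaneous quotes in microseconds.

```python
# Toy best-execution scan: one instrument quoted on many venues at once.
# Venue names and prices are invented for illustration.
quotes = {"venue_a": 101.42, "venue_b": 101.38,
          "venue_c": 101.45, "venue_d": 101.37}  # ...real instruments: hundreds
best = min(quotes, key=quotes.get)  # find the cheapest quote across venues
print(f"buy on {best} at {quotes[best]}")  # venue_d at 101.37
```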
The Internet of Things will cram more complexity into an environment saturated with 5G radiation. Tenner expects many unintended consequences from "sofas interacting with doorknobs," but he does not mention biological scientists' warnings that unlimited 5G radiation could be a public health hazard.
This is a mere sample of Tenner’s overview of AI affecting everything, for good or ill. He burns many pages on effective search – humans vs. AI. Learning is a search process, so Tenner is coaching us to be more effective learners in a software-soaked society.
We'd like AI to make learning go faster, but we should ease up on that pedal: effective learning is not efficient. Tenner suggests paying search engines more for slow, deep searches that turn up more unexpected findings. We are more creative when our concepts are challenged rather than reinforced. Scanning book titles in old libraries can be wonderfully serendipitous; we find things that were never digitized and that fall outside our profiles of interest.
From there, Tenner explores AI in education, engineering, and health care. (Is a cyborg doctor possible?) He presents too much to summarize, so The Efficiency Paradox is good to read in detail if learning to learn interests you. Fortunately, Tenner distilled his research into two lists of key points:
Tenner opens with seven sins of excessively “efficient” learning:
- Little serendipity; you see what you look for with no random discovery.
- Hyperfocus; no stimulation to see larger patterns.
- Self-amplifying cascades – market crashes, fallen trees causing blackouts – remain unsuspected.
- Skill erosion (e.g. you’re totally lost if your GPS navigation suddenly conks out).
- Perverse behavior from quantitative indicators. For example, work performance ratings inflate, and teachers teach to the test. In organizations, performance monitoring justifies huge bureaucratic head counts while robbing working-level people of learning time.
- Data deluges. In many fields, data is expanding faster than per-terabyte storage costs are falling. Humans can't make sense of it, and the AI processing it is biased – if it, too, can keep up.
- Monoculture obsolescence. A successful algorithm is replicated until it can no longer respond to change. In agriculture, the Irish Potato Famine is a prime example. More broadly, it is dangerous to assume that conditions we now see will last forever.
Tenner finishes with seven prescriptions for human learning in an AI era:
- Face-to-face human interaction has no AI substitute. Trust and intuition evolve through full sensory experiences.
- “Perfect 5” learning with AI seems best, where 5 is the midpoint of a 10-point scale with human learning at one end and AI algorithm learning at the other. That is, you use the tool; don’t let the tool use you by atrophying your own learning.
- Preserve intuition; unconscious social judgment is impossible for AI to learn.
- Enjoy creative waste – play with novelties way off track from well-worn ruts.
- “Analog serendipity” – browse and stop to observe with no purpose in mind.
- Desirable difficulty – go slow; question each step, like engineering a design by slide rule instead of by software. This cultivates critical thinking.
- “Cognitive bootstrapping” – don’t zero in on a search quickly. Explore divergent trails.
Tenner is aware that our environmental challenges are serious, but like most authors, he sees them as just one more problem in a world full of them. He does not recognize the possibility of huge social upheaval as a consequence, but he does do Compression Thinking a favor: we need all the insight we can garner on how to learn more effectively and how to create effective learning organizations.
Finally, Tenner notes that the issues he raises have been creeping up for some time – too slowly to excite much passion, but now jolting to urgency. An example Tenner manages to omit is the slow "dumbing down" of pop music over the past 50 years. My wife and I occasionally wonder why new music does not perk up our ears. Are we too old to relate to kids' music?
It turns out that musicology studies show we're not unreceptive; there's simply less to receive. Sound homogenization began in the 1960s with electronically mixed music.
Pop music of the 1940s and 1950s was more complex, varying more in tone, timbre, volume, and lyrics. No matter who the performing artist, most pop hits today are written by just two composers, with minimal backup instruments. The business model dictates that all new songs resemble prior hits. Promoting a new artist to playlist status takes big money at high financial risk, so to stir attention, every new exposure site must be saturated. If you can't get a big hit, hedge to at least avoid a big flop.
Consequently, pop musical themes are psychologically designed to be repetitive. So are the lyrics – in some cases, little more than oop, doop, ya, ya, ya. Compressing the dynamic range drowns musical nuance in decibels with a boom-box thump.
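For readers unfamiliar with the recording term, here is a minimal sketch of dynamic range compression, with arbitrary threshold and ratio values; squeezing loud and soft passages toward one level and then boosting the whole track is what buries nuance in decibels.

```python
def compress(samples, threshold=0.3, ratio=4.0, makeup_gain=2.0):
    """Hard-knee compressor: amplitudes above the threshold are scaled
    down by `ratio`, then the whole track is boosted by `makeup_gain`.
    Loud peaks and quiet details end up much closer together."""
    out = []
    for s in samples:
        sign, mag = (1 if s >= 0 else -1), abs(s)
        if mag > threshold:
            mag = threshold + (mag - threshold) / ratio  # squash the peak
        out.append(sign * mag * makeup_gain)
    return out

quiet, loud = 0.1, 0.9
result = compress([quiet, loud])
print([round(x, 3) for x in result])  # [0.2, 0.9]: a 9:1 range squeezed to 4.5:1
```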
Pop music illustrates that attention funneling by AI has not suddenly popped out of nowhere; mass-market commercial logic contributes. The pattern matching of Amazon et al. was preceded by, say, the automotive industry. How many car models can you identify without reading a nameplate or logo? Has auto design suffered from me-too push marketing?
How would learning change if we gave priority to environmental issues in business? The complexity of the environment is not artificial, as it is in financial markets. To escape the funnel, we have to un-funnel our learning and make the customer just one consideration among many. The environment would become our most important customer. That switch would throw business learning down a whole new set of tracks.