Big Other (video 5 minutes)
Big Other
by Robert W. “Doc” Hall
During the first few hundred years of industrial expansion, mankind profited by learning to control nature. During the last 20 years or so, we have learned to profit big time by controlling – or manipulating – human nature. Let’s call this new phenomenon “Big Other,” a take-off on Orwell’s “Big Brother.” Orwell’s Big Brother was mind-control by English Socialism, suggestive of Nazi Germany or the Soviet Union. “Big Other” is pervasive intrusion of digital marketing running amok.
Suspicious minds sense government control, a single point source. But “Big Other” is diffuse, a business model anybody can use to compete for our attention and to influence our decisions. The big social media companies, however, also provide platforms for promoting political ideologies. Big Other exploits our psychological vulnerabilities most effectively when we remain unaware that we are being influenced.
Americans swim in propaganda. Don’t think so? How many ads do you see daily? Although impossible to determine precisely, estimates range up to 4000 per day. On top of this load, add infomercials, fake news, slanted stories, surveys designed to suck you in, plus outright scams and hacks. What share of your mind is your own, and how much belongs to somebody else?
Online, all of us are subject to targeted ads. Google, Facebook, and their ilk track your data constantly, algorithmically sniffing out clues to your inclinations. Then they immediately direct you to what you want to see – or to what some influencer pays them for you to see. Buyers of targeted ads know whether you clicked through to their web site and whether you bought anything. Plain old print ads can’t do that, so targeted ads are displacing print advertising.
Have you ever paused at a web site showing throw rugs, only to see banner ads for throw rugs pop up immediately afterward? You’ve been targeted by Big Data and algorithms. Subtle influence takes many other forms too, like product placement, paying a movie company to feature a product as part of a movie set, or paying Instagram stars to display a product, with fees depending on their number of followers (real or fake).
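For readers who wonder what that targeting looks like under the hood, here is a minimal sketch of the retargeting loop in Python. The names (record_page_view, pick_banner_ad, the category labels) are invented for illustration and are not any real ad platform’s API; the point is only the principle: record what you dwell on, then serve the closest-matching ad.

```python
# A minimal, hypothetical sketch of retargeting: a tracker records what you
# pause on, and an ad server later ranks banner ads by that history.
from collections import Counter

browsing_history = Counter()  # product category -> number of views

def record_page_view(category):
    """Called (in effect) by a tracking pixel whenever you view a page."""
    browsing_history[category] += 1

def pick_banner_ad(available_ads):
    """Return the ad matching the category you have dwelt on most."""
    for category, _count in browsing_history.most_common():
        if category in available_ads:
            return available_ads[category]
    return available_ads.get("default", "generic brand ad")

# You pause on a couple of throw-rug pages...
record_page_view("throw_rugs")
record_page_view("throw_rugs")
record_page_view("garden_tools")

# ...and the next site you visit shows a throw-rug banner.
ads = {"throw_rugs": "20% off throw rugs!", "default": "generic brand ad"}
print(pick_banner_ad(ads))  # -> 20% off throw rugs!
```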
Incessant commercial messaging dominates Big Other, but ugly messages go viral too. The New Zealand mosque shooter was motivated by the chance to spread fear via social media. Seventeen minutes of live slaughter on Facebook quickly migrated to Twitter, YouTube, and other media – exactly as the killer planned. Extreme connectivity is dry kindling for extremist message fires.
Big Other is the business model of tech unicorns like Google, Facebook, and Twitter. Business success (higher revenue, higher stock prices) depends on single-mindedly holding eyeballs to screens, maximizing daily users, time online, and click-throughs. We can blame the big companies as evil while personally abetting their intrusions. Anybody who wins an auction for our eyeball time can place targeted ad messages.
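Those auctions for eyeball time run behind nearly every page load. The sketch below, with invented advertiser names and a simplified second-price style rule, shows the incentive structure: advertisers are ranked by bid times predicted click-through rate, so the platform earns more the more reliably it can predict, and provoke, our clicks. Real platform auctions weigh many more signals, so treat this as an assumption-laden illustration, not any company’s actual mechanism.

```python
# Hypothetical sketch of an "eyeball time" auction using a simplified
# second-price rule; real ad auctions are far more elaborate.

def run_ad_auction(bids, predicted_ctr):
    """Rank advertisers by bid * predicted click-through rate; the winner
    pays roughly the minimum needed to beat the runner-up."""
    ranked = sorted(bids, key=lambda a: bids[a] * predicted_ctr[a], reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    price = bids[runner_up] * predicted_ctr[runner_up] / predicted_ctr[winner]
    return winner, round(price, 4)

bids = {"rug_seller": 1.20, "shoe_brand": 0.90, "political_pac": 1.05}
ctr  = {"rug_seller": 0.03, "shoe_brand": 0.05, "political_pac": 0.02}
print(run_ad_auction(bids, ctr))  # the shoe brand wins this slot of your attention
```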
Few executives intend to be evil. They are trapped in their own business models, their own Frankensteins. With money rolling in, social consequences are easy to rationalize: inventors of a technology are not responsible for how others use it, the same argument used by gun manufacturers. They are merely using artificial intelligence to make our lives more convenient and connected. Social consequences are society’s problem.
Ethical codes are ineffective. For example, Fast Company reported that Google replaced “Don’t Be Evil” in its code of conduct with “Do the Right Thing,” a stronger ethic. However, recent news spilling out of Google indicates that good intentions to create social good can’t buck the pressure to make more money.
Few social media enthusiasts foresaw ugly consequences early on, but after the 2016 U.S. presidential election, critics came out of the woodwork. Allegations of Russian meddling and Facebook’s Cambridge Analytica scandal opened eyes. However, meddling in that election – to win, to profit, or just to have fun – was an equal-opportunity sport. Confusion runs so deep that the case of the teenage fake-news artists in Macedonia is still being untangled. Mounting criticism is framed as problems of privacy, subjugation of free will, abdication of human responsibility, and the stirring of hate.
Two leading critics are Shoshana Zuboff and Roger McNamee. Both have recent books criticizing algorithmic manipulation of Big Data as a perversion of capitalism. Both finished writing before Facebook came under criminal investigation, but both of these technology fans are alarmed by “artificial intelligence” being used to control us. Elizabeth Warren proposes breaking up the big unicorn companies. That presumes the problem is market size, a 20th-century idea, but the problem is the business model.
We lack social restraint on the applications of artificial intelligence. Even more seriously, we lack social restraint on the use of any technology – old or new – harmful to both humanity and the ecologies on which we depend. Beyond social media and into the Internet of Things, are we risking a Techpocalypse, or radiation poisoning from 5G?
Artificial Intelligence
Artificial intelligence (or AI) has a maze of definitions and subcategories, so AI is a fuzzy concept. In brief, if a device can detect its environment and, without human direction, take action to achieve its goals, it uses AI. But no AI device has yet passed a Turing Test, in which a human judge cannot tell whether a human or software is behind a device’s responses. Whether that level of AI is even achievable remains doubtful and controversial.
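To make that definition concrete, the toy loop below senses an environment (a temperature reading) and acts toward a goal without human direction. The thermostat example and all of its names are invented purely for illustration; they come from no particular AI framework, and real AI systems replace the hand-written rules with learned models.

```python
# A toy sense-and-act loop illustrating the definition above: detect the
# environment, then act toward a goal without human direction.
import random

def sense_temperature():
    """Stand-in for a sensor reading of the environment."""
    return random.uniform(15.0, 30.0)

def act(reading, goal=21.0):
    """Choose an action that moves the environment toward the goal state."""
    if reading < goal - 1.0:
        return "turn heater on"
    if reading > goal + 1.0:
        return "turn cooler on"
    return "do nothing"

for _ in range(3):  # the agent repeatedly senses and acts
    reading = sense_temperature()
    print(f"{reading:.1f} C -> {act(reading)}")
```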
But AI is a business buzzword. Of European start-ups claiming to use AI, 40% were assessed as “not really.” Buzzwords play on a bias that more complex technology will surely help us and surely can’t make human problems worse. Ergo, technology developers claim no responsibility for its misuse. That is a cop-out. Is not an AI developer responsible for setting the objectives of AI algorithms?
Whether, on balance, any technology benefits humanity is a human problem that humans have not been able to resolve. AI can help concerned humans preserve human health and ecological health, for instance. But covertly nudging human decisions at the speed of thought goes beyond normal marketing. Sure, by market dogma the hidden hand of the market converts all this self-interest to common benefit, but by the same logic we also exacerbate conflicting goals and human strife. Could AI help curb our human psychic weaknesses?
Possibly, but not if AI algorithm goals are limited to confiscating our attention, playing to our psychological vulnerabilities, and dividing us more than uniting us. Privacy controls and squelching trolls only put Band-Aids on the same old human problems of values and paradigms in mortal conflict. What collective actions will lead to quality-of-life outcomes for everyone, including nature? Can AI help with that?
In its own way, perhaps China is trying. A trend-setting Chinese social medium called Tik Tok now has 500 million users, 300 million of them in China, and it is rapidly migrating West. Tik Tok flicks short videos, 15 seconds max, in a phone-friendly vertical format mesmerizing to 12-to-24-year-olds. Users can create their own videos, to demonstrate their creative goofiness if nothing else. To match videos to viewers, Tik Tok relies on AI recommendation software.
Unlike Facebook, Tik Tok does not ask users to set up a network of friends. Videos beckon instantly. When the creative video playground shows itself, groups form around it. It’s a pretty benign distraction, but Tik Tok does have monetization motives, and users can stumble onto porn and bad company. However, Tik Tok minimally threatens the Chinese government, which is developing Propaganda 2.0 to give China’s social credit system an online presence. Is this Big Other easing people into Big Brother?
Tik Tok’s content is more centrally controlled than in Western social media. Chinese Communists may want to control the internet to stay in power, but they also want to align people behind a vision of China’s future. Are they using AI to propagate new values? This violates Western values for independent thinking – or does it? Advertising and Big Other already demonstrate that in practice, Americans are less independent of thought than they think they are.
Big Other and “The Apparatus,” a Larger System of Economic Thinking
Big Other is a creature of “The Apparatus,” a larger system of economic thinking that thrives on a set of myths we tell ourselves, myths that are increasingly out of whack with emerging realities. The myths of The Apparatus promote private property, individual self-reliance, entrepreneurship, and economic growth. Despite warning signs, The Apparatus lumbers along on its own momentum. It’s how we do things. We don’t know how to do much without using it. We can’t imagine a new system with new values.
We squabble politically about the social justice of The Apparatus. Its trickle-down myth misses a lot of us. Its presumption of endless growth conflicts with preserving life on earth, living with nature instead of endlessly conquering it (or destroying it).
Living at the behest of The Apparatus is like being subject to the monarchs of old. Whether a king was benevolent or cruel, feudal systems remained similar. Only the values of the incumbent and court politics made any difference. Did they fight for power or did they govern to improve the welfare of subjects?
Democracy attempts to collectively impose popular will on The Apparatus, our modern system. Even when it succeeds, the basic system remains. We think we are governed by laws, but existing laws merely codify The Apparatus and its myths. We need to redesign The Apparatus, locally, nationally, and globally.
How would a new system be structured? What supporting myths would guide us to it? Capitalist-socialist, labor-management, and 1%-99% divides excite us, but none of them deeply questions the flawed goals of The Apparatus. Social media AI with goals limited to growing its own profitability doesn’t serve us well. Likewise, if the goals of The Apparatus are limited to economic growth and profitability, it no longer serves humanity’s most basic need, keeping the human race going. We serve The Apparatus when the system should serve us. Serving us instead would require a huge rethinking of the most basic human values.
We have always lived by systems, but in daily living we don’t much notice them. Many initiatives to revise The Apparatus are underway, notably local economy projects, where habits and lifestyles really change. What’s needed are programs extending to all humanity, to design (or evolve) a system to promote continued life, all life, and to the extent possible, quality of life.