In the 1997 sci-fi comedy Men in Black, Agent K (Tommy Lee Jones) tells his partner, Agent J (Will Smith), an important thing about humanity: “A person is smart. People are dumb, panicky, dangerous animals, and you know it.”
While taking life lessons from a goofy movie probably isn’t the road to great personal growth, Agent K might be on to something. Cynicism aside, he’s got a point: people in the throes of a herd mentality often make very bad decisions. A couple of recent studies lend strong support to this idea.
Most of us like to think that we are perfectly rational creatures. Before we decide on a course of action, we calculate its utility — attempting to maximize benefits and minimize costs.
All of our calculations have context. They are in some instances Bayesian and in others, swept along by incalculable anomic passions.
Many researchers use the concept of “bounded rationality” to describe the middle ground where we most often operate.
One such researcher, Pieter Van Baal, adds an important detail: “This does not mean that everybody includes the same parameters as costs and benefits or that they evaluate those parameters in the same way.”
In other words, my calculation in a given situation may be different from yours. This explains in part why poker and eBay are so popular — they pit our estimations against one another.
Of course, acting out of bounded rationality means that we are making decisions using incomplete — and often deeply flawed — information. When that bad data gets amplified by the herd, a whole lot of crazy stuff can happen. In the world today, there is no greater amplification system than the Internet.
To this point, professor of philosophy Vincent F. Hendricks at the University of Copenhagen says: “Group behavior that encourages us to make decisions based on false beliefs has always existed. However, with the advent of the Internet and social media, this kind of behavior is more likely to occur than ever, and on a much larger scale, with possibly severe consequences for the democratic institutions underpinning the information societies we live in.”
Hendricks and fellow researchers Pelle G. Hansen and Rasmus Rendsvig just published a study in which they analyze a number of social information processes that are enhanced by modern information technology. The two most interesting observations they make can be summarized as “echo chambers” and “information selection.”
In the first of these, echo chambers, the researchers conclude that online forums are especially prone to self-reinforcing viewpoints, the consequence of which can be a radicalization of participants. The entire group may shift to a more extreme position after a discussion even though individual group members did not necessarily subscribe to the radicalized view prior to the discussion. Attention Fox News and NPR listeners — your radios and TVs get other channels.
The other tendency of social media is information selection — other customers who bought the widget you just bought also bought these widgets … Companies use expertly developed algorithms to tailor their marketing to your purchasing and viewing profile. Here again, we are victims of the echo. In each case our bounded rationality becomes more tightly bounded. We see ample proof of this in Congress and state legislatures.
This idea is also an ancient one. You might recall 2 Timothy 4:3–4, “For the time will come when men will not put up with sound doctrine. Instead, to suit their own desires, they will gather around them a great number of teachers to say what their itching ears want to hear.”
In the 1970s, British social psychologist Henri Tajfel explored a phenomenon he called the “minimal group paradigm.” The basic aim of this approach is to discover the minimum criteria required for discrimination to occur between groups.
Tajfel figured out that it doesn’t take much. We find comfort in “we versus they.”
To this end, people will self-bin into all kinds of distinctions: Republicans or Democrats; Americans or everybody else; Red Sox or Yankees; Fords or Chevys; Edward or Jacob … aliens or humans.