In Part 1, I indicated my interest in the Netflix series The Untold History of the United States (Oliver Stone, 2012), and my own issues with trust. Here, I continue with commentary on the underlying issues of how we trust, as well as the immense difficulty we have with too much information, or (mis)information.
Cognitive Biases
In attempting to understand trust, I recently looked up the nature of cognitive biases[1]. To quote Wikipedia, “Cognitive biases are tendencies to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgment.” Depending on which source I consulted, I found between 180 and 250 distinct biases, ranging from anchoring bias (the tendency to anchor conclusions on the first piece of “trusted” information acquired) to the Zeigarnik effect (the tendency for interrupted tasks to be remembered better than completed ones). I found the list fascinating, and recognized that many of the biases would have great survival value in a simple culture.
But ours is not a simple culture. When overwhelmed with too much information, I (and almost certainly any human being) will rapidly sort the information for importance according to my biases, especially my other-than-conscious biases. I know I do this every day — and (perhaps as my bias) I believe I am very sophisticated in my understanding of human communication. Heaven help those who are less sophisticated.
Whom To Trust
As I said recently, I have previously written about the means by which we establish trust (Whom Do You Trust?), and the TIC process that people use. To reiterate (as I regard it as a very important process to understand), people:
translate (T) the new information into language they can understand more easily, interpret (I) it within their own system of meaning, and then corroborate (C) this meaning with groups they already trust. For example, if I want to process information about new electric cars, I translate (T) the information into my current understanding of cars, think about (I) what cars mean to me, and then go ask (C) my friends what they think about electric cars.
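The three TIC steps described above can be caricatured as a small pipeline. This is only a toy sketch of the idea, not anything from the original article: every function name and all of the example data below are hypothetical illustrations.

```python
# Toy illustration of the TIC (Translate, Interpret, Corroborate) process.
# All names and data here are hypothetical, invented for this sketch.

def translate(raw_info, familiar_terms):
    """T: restate new information in terms the person already knows."""
    return [familiar_terms.get(word, word) for word in raw_info.split()]

def interpret(translated, personal_meanings):
    """I: attach personal significance to each translated term."""
    return {term: personal_meanings.get(term, "neutral") for term in translated}

def corroborate(interpretation, trusted_group_opinions):
    """C: keep only the meanings that a trusted group echoes back."""
    return {
        term: meaning
        for term, meaning in interpretation.items()
        if trusted_group_opinions.get(term) == meaning
    }

# Example: processing information about electric cars
familiar = {"EV": "electric-car"}
meanings = {"electric-car": "expensive but clean"}
friends = {"electric-car": "expensive but clean"}   # the trusted group agrees

t = translate("EV range anxiety", familiar)
i = interpret(t, meanings)
c = corroborate(i, friends)
print(c)  # {'electric-car': 'expensive but clean'}
```

Note how the final `corroborate` step silently discards everything the trusted group does not confirm ("range" and "anxiety" vanish), which is precisely the filtering that makes the choice of trusted group so consequential.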
Thus the fundamental basis of trust is how we select those around us whom we will believe, or at least with whom we will associate. But the groups we trust may have biases of their own, often in many ways. Examples include the colonial stances of the 19th century and the information presented in The Untold History . . . .
Such biases are especially important in light of George Marshall’s book Don’t Even Think About It: Why Our Brains Are Wired To Ignore Climate Change, which I recently reviewed (7 parts, beginning here).
I recently wrote to a friend, concerning our mutual need to find a way to have the Canadian people mobilize for climate disruption, that we need:
a big frame that allows the conservatives and doubters to engage together with those committed. We have to interact so as to establish trust, not so much with the people like [Steve] Bannon, but with those who listen to him and still have uncertainty. The frame could be something like: ‘What do you want for the future? We are all in this together, and even though most of us have uncertainty, we need to pull together to create a better world. Let’s all talk to each other as if the other has truth in what they are saying: both those who are uncertain about climate disruption, and those who are more certain.’
But I continue to wonder to what extent my own biases and those of others interfere with our ability to cooperate on this super-wicked difficulty.
And if we don’t cooperate, the consequences are immense, if not disastrous.
[1] (A) List of Cognitive Biases, https://en.wikipedia.org/wiki/List_of_cognitive_biases, accessed 2017 February 14; (B) Cognitive Bias Codex, https://en.wikipedia.org/wiki/List_of_cognitive_biases#/media/File:Cognitive_Bias_Codex_-_180%2B_biases,_designed_by_John_Manoogian_III_(jm3).jpg, accessed 2017 February 14