Know your brain: the just world bias

Academics spend much of their lives inside their heads, mulling cause and effect, conjuring experiments, weighing options. This quiet time is absolutely essential. But what if we aren’t seeing clearly? What if our logic is weighted toward one conclusion? What if our mental mirror has defects, or, worse, is of the funhouse variety?

As scientists we design experiments to minimize bias. Indeed, science as a way of knowing acknowledges bias and attempts to circumvent it. We have one often-unacknowledged ally in this cause: experimental psychologists.

Now, I know, many associate psychology with the treatment–effective or not–of mental illness. But one of the most useful products of psychological research is the uncovering of inborn biases–the defects in the lenses through which our brains perceive the world. Our wiring, evolved on the plains of Africa, rejiggered in clans, tribes, and societies, is what allowed our ancestors to survive, not necessarily to see the world clearly.

We are semi-rational beings. How do we avoid being run over by the Semi?

We need to know our own biases so we can work to circumvent them. We need to know our students’ biases so we can work, as teachers, to circumvent them.

And so I introduce an occasional series called “Know your brain.”

This one comes from the always dependable Oliver Burkeman, who opens his column:

‘Deep down, and whatever our political opinions, many of us seem to believe that life is fair – that by and large people get what they deserve’

Now we all know, rationally, that this isn’t true. Many of life’s events are stochastic; much of our work in life is preparation to temper that fact. But then again, can you think of a group of humans more predisposed toward an acceptance of cause and effect than scientists (except–in the Department of Strange Bedfellows–those who think that everything that happens is God’s will)? Well, consider this:

This annoying but by now well-substantiated finding is known as the “just world hypothesis”, and the most famous demonstration of it was a series of clever experiments by the psychologist Melvin Lerner. In one, he showed people what appeared to be live footage of a woman receiving painful electric shocks for making errors in a memory test. (She was actually his accomplice.) Some groups of viewers had the option of ending her ordeal; others didn’t. The latter – forced to watch suffering with no chance of relieving it – formed far lower opinions of the woman, seemingly to “bring about a more appropriate fit between her fate and her character”. Those opinions were worst when they were told the woman got no financial reward for her pains. The greater the injustice, the more people appeared to need to believe the victim brought it on herself.

So the likelihood of us rationalizing a cause out of thin air is

 (the size of the injustice) / (our power to curb the injustice). {thanks Brian}
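The heuristic above can be sketched as a few lines of Python. This is a tongue-in-cheek illustration, not anything from Lerner’s actual experiments; the function name and the numbers are invented for the example.

```python
def rationalization_likelihood(injustice: float, power_to_curb: float) -> float:
    """Unitless score: the size of the injustice divided by our power to curb it.

    The bigger the injustice and the less power we have to relieve it,
    the more likely we are to rationalize a cause out of thin air.
    """
    if power_to_curb <= 0:
        raise ValueError("power_to_curb must be positive")
    return injustice / power_to_curb


# Same injustice, different power to intervene (values are illustrative):
print(rationalization_likelihood(10, 5))  # viewers who could end the ordeal -> 2.0
print(rationalization_likelihood(10, 1))  # viewers forced to watch helplessly -> 10.0
```

Note the division: as the comment below points out, the likelihood is inversely proportional to one’s power to curb the injustice, which matches Lerner’s finding that the helpless viewers derogated the victim most.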



2 Responses to Know your brain: the just world bias

  1. Interesting! Shouldn’t the operator in the last paragraph, though, be a divided by sign? Based on the summary you quoted, the likelihood of rationalizing is inversely proportional to one’s power to curb injustice.
