The moral gravity bias

The moral gravity bias is a hypothetical cognitive bias or logical fallacy that can occur in judgments about emotionally sensitive topics. When someone else’s moral evaluations of two items differ from one’s own, there is a tendency to assume that the other person’s average evaluation of the two items is lower than one’s own average evaluation (or, in particular, that the other person strongly lowers the evaluation of one of the items).

As an abstract example, suppose someone claims that the badness of X is less than the badness of Y. If you believe the two items are equally bad (or incomparable), and if both items evoke strong emotions, then you exhibit the moral gravity bias if you tend to believe that the other person is underestimating the badness of X rather than overestimating the badness of Y. From the other person’s mere claim that “X < Y”, you draw the illogical conclusion that the badness of X is being underestimated, trivialized, or minimized to (almost) zero, or, more generally, that the position of X is being lowered rather than the position of Y being raised.
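To make this concrete, here is a tiny sketch with invented numbers (a hypothetical 0-to-10 badness scale, not part of the original argument): both readings below are equally consistent with the bare claim “X < Y”, yet the bias picks the first.

```python
# Hypothetical badness scores on an invented 0-10 scale.
my_scores = {"X": 8, "Y": 8}          # I think X and Y are equally bad.

# Two readings of the other person's claim "badness(X) < badness(Y)":
reading_lowered_x = {"X": 2, "Y": 8}  # they trivialize X (the gravity-biased reading)
reading_raised_y  = {"X": 8, "Y": 10} # they consider Y even worse (equally consistent)

for reading in (reading_lowered_x, reading_raised_y):
    assert reading["X"] < reading["Y"]  # both readings satisfy the claim "X < Y"
```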

A concrete example of the moral gravity bias is the controversy caused by Richard Dawkins’ statement: “X is bad. Y is worse. If you think that’s an endorsement of X, go away and don’t come back until you’ve learned how to think logically.” Dawkins then gave examples involving pedophilia and rape, complaining that a claim such as “rape X is bad, rape Y is worse” often provokes the illogical conclusion that rape X is endorsed. Dawkins was criticized by those who believe that all rapes are equally bad (or incomparable). These critics say that Dawkins underestimates the badness of X, because he lowers X by saying “X < Y”. But there is another possible movement: raising Y. It might be that Dawkins considers rape X exactly as bad as his critic does, but considers rape Y worse than his critic does. In other words, it is equally possible that Dawkins raises the badness of Y when he says “X < Y”, and he could just as well reply that his critic underestimates the badness of Y when the critic says “X = Y”. Yet none of Dawkins’ critics replied that Dawkins overestimated the badness of Y.

Another concrete example of the moral gravity bias can be seen in animal rights discussions. Proponents of animal rights claim that non-human animals should receive strong rights because they are sentient and care about their own well-being. Critics claim that those animals should have a low moral status (and hence deserve only weaker rights) because they lack moral or rational agency. The animal rights advocate replies that some humans, such as some mentally disabled people, also lack those levels of rational agency. Being against speciesist discrimination, the animal rights advocate claims that X, the moral status of a non-human animal, and Y, the moral status of a mentally disabled human, are equal. This claim that “X = Y” provokes the reaction from critics (who believe that “X < Y”) that mentally disabled people are being degraded, that the position (the moral status) of Y is underestimated and lowered to the level of mere beasts. Yet the claim that “X = Y” can also be interpreted as raising the position of X (the animals) up to the level of Y (the humans), which is what most animal rights advocates intend. The animal rights advocate replies that s/he does not underestimate the moral status of mentally disabled humans, but that the critics underestimate the moral status of non-human animals. According to the animal rights advocate, the position of mentally disabled humans is as high as it is according to the critics, but the position of non-human animals should be raised.

A third example: the Israel/Palestine conflict. Hamas has publicly said it would like to kill every Jew. Sam Harris asked: “What if the positions were reversed and Hamas had the strength of the Israeli army and Israel had the strength of Hamas?” Hamas would then probably respond worse than Israel has responded in the recent Gaza wars. Harris’ question provoked furious accusations that he minimized the cruelty of Israel’s bombing of Gaza’s children. But saying that the response of Hamas would be worse (or merely asking whether it would be worse) does not imply that Israel’s response in the Gaza wars was justifiable or proportional self-defense. It might be that Sam Harris believes Israel’s response is exactly as bad as his critics believe, and that the hypothetical response of a strong Hamas would be even worse.

One more example: if a person strongly believes that two candidates X and Y are equally qualified for a job, and then hears another speaker claim that “both X and Y are good for the job, but X is more qualified than Y”, the person often believes that the speaker underestimates Y rather than overestimates X. The person thinks that the speaker considers Y underqualified for the job, rather than considering X overqualified.

Similarly, if a sexist claims that “men are smarter than women”, a feminist with a moral gravity bias tends to believe that the sexist underestimates women rather than overestimates men: that the position of women is being lowered rather than the position of men being raised. Hence the feminist would sooner say “No, women are not stupid; women are also very smart” than “No, men are not that smart either”. This example demonstrates that in some cases the moral gravity bias can be a good rule of thumb: most sexists do indeed degrade the position of women by saying that they are less smart; those sexists believe that women are less smart than feminists believe them to be. As with other cognitive biases, the moral gravity bias does not always produce erroneous judgments.

To conclude, we can state the bias formally. Suppose person X believes that M_X(A) < M_X(B), where M_X denotes the moral assessment according to person X, and A and B are, for example, two different options or situations. And suppose person Y believes that M_Y(A) = M_Y(B). Person X has the moral gravity bias if X believes that M_Y(A) and M_Y(B) are both close to M_X(A), instead of, for example, close to M_X(B) or somewhere in between M_X(A) and M_X(B). Conversely, if person X believes that M_X(A) = M_X(B) and person Y believes that M_Y(A) < M_Y(B), person X has the bias if X believes that M_Y(A) is less than M_X(A) while M_Y(B) is close to M_X(B), instead of M_Y(A) being close to M_X(A) and M_Y(B) being higher than M_X(B).
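As a minimal illustration of this formal statement, the following Python sketch scores badness on an invented numeric scale and selects, between the two readings consistent with a claim “A < B”, the one the bias predicts: the reading whose average falls below one’s own. All function names and numbers are hypothetical, not part of the original text.

```python
# A minimal numerical sketch of the moral gravity bias. It assumes, purely
# for illustration, that moral badness can be scored on a single numeric scale.

def interpretations(my_scores, claim):
    """Given my badness scores for two items, and another person's claim
    that badness(low) < badness(high), return two readings consistent with
    that claim: either they lowered `low`, or they raised `high`."""
    low, high = claim
    lowered = {low: my_scores[low] - 1, high: my_scores[high]}  # "they underestimate low"
    raised = {low: my_scores[low], high: my_scores[high] + 1}   # "they overestimate high"
    return lowered, raised

def gravity_biased_reading(my_scores, claim):
    """Pick the reading the moral gravity bias predicts: the one whose
    average badness falls below my own average."""
    lowered, raised = interpretations(my_scores, claim)
    my_avg = sum(my_scores.values()) / 2
    return lowered if sum(lowered.values()) / 2 < my_avg else raised

# I believe A and B are equally bad (the case M_X(A) = M_X(B));
# someone else claims "A < B" (their M_Y(A) < M_Y(B)).
mine = {"A": 10, "B": 10}
print(gravity_biased_reading(mine, ("A", "B")))
# -> {'A': 9, 'B': 10}: the biased reading that they underestimate A,
#    although {'A': 10, 'B': 11} (they overestimate B) fits the claim too.
```

Nothing in the sketch forces the biased reading over the other; both fit the bare inequality, which is exactly the point of the formal definition above.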
