How emotional attachments make us less effective

I have changed a lot over the last few years, thanks to the new effective altruism movement. I changed my mind about many topics (such as GMOs and ecocentric environmental ethics), I changed my activism and volunteering, I changed my donations to charities, I changed some of my moral beliefs and ideas, and so on. I realized that a lot of the things I used to believe or do were not very effective in terms of doing the most good, and that a lot of my moral beliefs were not really coherent or altruistic.

In order to change my beliefs and choices, I faced one big obstacle: emotional attachments. I was, so to speak, married to some organizations and strongly attached to some moral theories (such as deep ecology). When my actions or beliefs were criticized, I felt strong emotional resistance, a strong urge to defend my beliefs. When I changed my mind about a subject such as GMOs, it felt like an emotional shock.

When it comes to beliefs, this emotional attachment results in an overconfidence bias; when it comes to activism and volunteering, it results in a commitment bias.

Overconfidence bias

The overconfidence bias means that you have more confidence in a belief than can be justified by evidence or reason: the level of confidence you ascribe to a belief is higher than the probability that you are right about it. If you feel 100% certain about a set of beliefs but are right about them only 80% of the time, you have a 20% overconfidence level.
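
To make this concrete, here is a minimal sketch of how such a calibration check could be computed; the track record below is hypothetical, not data from the text.

```python
# A minimal sketch of measuring overconfidence from a (hypothetical) track record.

beliefs = [
    # (stated confidence, turned out to be right?)
    (1.0, True), (1.0, True), (1.0, True), (1.0, True), (1.0, False),
]

avg_confidence = sum(conf for conf, _ in beliefs) / len(beliefs)     # 1.0
hit_rate = sum(1 for _, right in beliefs if right) / len(beliefs)    # 0.8

# Feeling 100% certain while being right 80% of the time:
print(f"overconfidence: {avg_confidence - hit_rate:.0%}")            # 20%
```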

If you have strong feelings about a belief, for example a strong preference for it to be true, you are likely to have an overconfidence bias. The problem with this overconfidence bias is that updating your beliefs becomes difficult. Suppose you feel 100% certain about something, but now you are faced with new evidence that contradicts your belief. From a rational point of view, you should update your confidence in that belief according to a so-called Bayesian updating process: when you receive new evidence, your new confidence level is your old confidence level multiplied by a Bayesian updating factor that depends on the evidence. If the new evidence strongly confirms your belief, the updating factor is high; if it strongly contradicts your belief, the factor is close to 0. However, if you are 100% certain about a belief, you assign a 0% probability to the possibility that the opposite is true. And 0% multiplied by any updating factor remains 0%, so no evidence can ever convince you that the opposite is true.
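
To see this zero-probability trap in numbers, here is a minimal sketch of Bayesian updating; the function name and the prior and likelihood values are illustrative assumptions, not something from the text.

```python
# A minimal sketch of Bayesian updating of a confidence level.

def update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Return the posterior confidence in a belief after seeing evidence.

    prior: confidence in the belief before the evidence (0..1)
    likelihood_if_true: probability of the evidence if the belief is true
    likelihood_if_false: probability of the evidence if the belief is false
    """
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# A 90% prior meets strong counter-evidence and drops substantially:
print(update(0.90, 0.1, 0.9))   # 0.5

# A 100% prior can never move, no matter how strong the evidence,
# because the 0% weight on "belief is false" stays 0% after multiplication:
print(update(1.00, 0.1, 0.9))   # 1.0
```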

When I realized that I had an overconfidence bias, that some of my beliefs could be wrong and that it is irrational to feel absolutely sure about something, I started using this idea of Bayesian updating more often. I looked at the confidence levels of my beliefs as if they were meters on a control panel. Every time new information arrives, I change my confidence levels: the meters on the control panel go up or down as my confidence increases or decreases. Avoiding emotional attachments, and keeping in mind that I have to watch the confidence levels of all my beliefs – including the beliefs that generate strong emotions – like meters on a control panel, allows me to be more flexible, to change my mind more readily. I even changed my mind about moral principles that were very dear to me (see my biggest moral mistake).

And this is one of the major strengths of people in the effective altruism community: they are flexible and open about updating their beliefs. It is very unlikely that all the beliefs held by an altruist (an activist or a politician) are true. In fact, a lot of the beliefs I thought to be true were falsified by new evidence. And we know that changing one’s mind is not easy, so we have to train ourselves to become better at it. We have to resist the social temptation to stick to our guns. Changing one’s mind should be considered a sign of trustworthiness and integrity, not of fickleness or unreliability. The more people (activists, politicians) change their minds, the more socially acceptable it becomes. Dare to think, dare to change.

Commitment bias

The commitment bias is an irrational escalation of commitment, where you continue the same behavior (e.g. supporting the same organization or project) or make the same decisions over and over again, even when you are faced with increasingly negative results or new evidence that the decision was probably wrong (e.g. that the project is not effective). Maintaining the behavior is based on the cumulative prior investments and aligns with previous decisions, but it can be irrational. This commitment bias is a kind of sunk cost fallacy: previous investments in an organization or project (the sunk costs) influence your choice to keep supporting that organization or project, even when you encounter other organizations or projects that are much more effective (a sketch of sunk-cost-free decision making follows below). It is also a kind of loss aversion: if we have invested a lot in a project, giving it up is hard because we are afraid to lose the investment. We are afraid to think that it was all for nothing. Losing a project is perceived as worse than never having acquired that project, similar to the phenomenon that losing money is considered worse than not gaining that same amount of money.
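
As a rough sketch of what sunk-cost-free decision making looks like, the comparison below ranks projects by expected future impact only; the project names and numbers are hypothetical.

```python
# A minimal sketch of choosing between projects while ignoring sunk costs.

projects = {
    # name: (hours already invested, expected future impact per extra hour)
    "old project": (500, 1.0),
    "new project": (0, 3.0),
}

# The rational choice depends only on expected future impact; the hours
# already invested are sunk and should not enter the comparison.
best = max(projects, key=lambda name: projects[name][1])
print(best)  # "new project", despite the 500 hours sunk into the old one
```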

In the past I invested a lot in some organizations and projects that were not very effective. It took serious effort to accept the evidence that there were better, more effective things to do, and to let go of my old projects and commitments. When I realized that I had strong commitment biases towards certain projects, I started to look at my projects in a different way. I started thought experiments by asking questions like: “What if I hadn’t put all that effort into the project myself, but had instead inherited it from someone else? What if I were offered this project now?” For example, after I have written an article and receive feedback to rewrite it, rewriting becomes difficult because I have invested effort in that article and changing it is emotionally hard. But then I try to think as if someone else wrote the article and I get the opportunity to rewrite it based on the feedback. This makes it emotionally easier to rewrite.

One of the major strengths of people in the effective altruism community is that they are more willing to give up commitments and switch to new projects, avoiding the commitment bias and the sunk cost fallacy. Effective altruists are not married to a project or organization; they are very flexible and able to change projects. This attitude is what allows them to consistently make the most effective choices to do the most good.

Other biases

The overconfidence and commitment biases are accompanied by other cognitive biases that make us less effective. There is the confirmation bias: the tendency to seek out only the information that supports one’s preconceptions, and to discount information that does not. A confirmation bias is at play when, in order to justify a commitment to a project, people are unwilling to accept negative information about it, for example that the project is less effective than other projects. And when we strongly believe something, the confirmation bias makes us overconfident, because we keep seeking out information that confirms the belief. This is related to so-called active information avoidance: when we receive negative information, for example that our project is less good or our belief is wrong, we actively try to avoid that information. Even very intelligent people have these biases: they can actively avoid evidence that contradicts their beliefs and be susceptible to irrational escalations of commitment.

Debiasing: the art of letting go

Strong emotions are not always reliable and can be very obstructive when it comes to doing the most good. From now on, when it comes to altruism and doing good, I will avoid this kind of emotional attachment to beliefs and projects. I train myself in changing my mind, in being able to let go of bad ideas and projects. I will hold strong emotional attachments only to real persons, not to ideas, projects, organizations or behavioral choices that are intended to do the most good.

When I formed the conscious intention to update my beliefs according to new information, in order to arrive at the most accurate beliefs necessary to do the most good, I changed my mind about more things than expected. I underestimated my cognitive biases; I underestimated how much my mind would change due to effective altruism. This means I had a bias blind spot: I was not fully aware of the influence of my cognitive biases. As a result, I started to believe that other altruists may also have this bias blind spot and would have to change their minds about many things once they intentionally try to counteract their cognitive biases. This means that debiasing ourselves – in particular debiasing altruists and politicians – is very important, because we might easily be wrong about more things than we believe or expect.

 
