Mapping Biases to Testing: Confirmation Bias

I use terminology from earlier blog posts about biases. If you missed those posts, read part 1 here, where I explain the terminology; the second post covered the Anchoring Effect.

Let me state the ‘bad news’ up front: you cannot fully avoid the confirmation bias. That’s actually a good thing, because if you could you wouldn’t be human. We jump to conclusions (with System 1) when our assumptions are likely to be correct and a mistake is acceptable (Page 79, Kahneman). For example, when you meet a new person you immediately judge them based on stereotypes: the clothes they wear, their posture, and so on. This happens so quickly that you cannot stop it.

However, jumping to conclusions can be a bad thing when the situation is unfamiliar, the stakes are high, and there is no time to collect more information (Page 79, Kahneman). This is what testing is about! We often deal with unfamiliar situations, the stakes are high, and we usually face a deadline. How do we deal with this? Let’s explore what the confirmation bias has to do with testing.

The confirmation bias is an umbrella term with more specific biases in its category: the halo effect, ‘What you see is all there is’ (WYSIATI) and the availability heuristic, to name a few from “Thinking, Fast and Slow”. We’ll also see that heuristics, which are very important in our work, are strongly linked to the confirmation bias.

The Halo Effect

The halo effect is explained as “the tendency to like or dislike everything about a person”. I think this can also apply to systems, or to new parts of a system you are testing for the first time. A first impression is hard to erase. Confession time: after working in the same team for a while, I started expecting a certain level of quality from the different developers. Developer A was a very structured and organised person who took pride in writing clean code. He would involve me in development and ask questions. When it was time to do an exploratory test session (after we had written the automated tests together), I found myself liking everything about the software. I was already biased in a positive sense.

Developer B was sloppier: not writing unit tests, not even starting up the application locally before ‘handing it over to test’. Leaving aside how un-Agile that approach is, at that point I was already expecting his work to be crappy. Even when, over time, developer B was coached by other developers and improved his way of working, I still tested his code with more suspicion. I wonder how many bugs I missed in developer A’s work because of the halo effect.

‘What you see is all there is’ (WYSIATI)

This one has really scary implications for testing. WYSIATI is “an essential design feature of the associative machine. It represents only activated ideas. Information that is not retrieved from memory might as well not exist.” I basically read this as: you cannot test what you do not think about. That sounds completely logical to me, but it is often not how testing is sold! We casually use terms like ‘100% coverage’, ‘completely tested’ and the like. We might test a product extensively and thoroughly, but that doesn’t mean we catch all the errors. That is the part of testing I have personally always struggled with. There is a trade-off between time and risk. You have to make the leap of faith towards going live, even though you know there is always a chance (or is it a certainty?) that a (horrible) bug is lurking around the corner.
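To make the ‘100% coverage’ point a bit more concrete, here is a minimal sketch (the function, the tests and the bug are all invented for illustration, not taken from a real project): the tests execute every line of the code, so a coverage tool would happily report 100%, yet a bug is still waiting in an input nobody thought to try.

```python
def shipping_cost(weight_kg, express=False):
    """Return the shipping cost for a parcel (hypothetical example)."""
    base = 5.0 + 1.5 * weight_kg
    if express:
        base *= 2
    return round(base, 2)


def test_shipping_cost():
    # These two cases execute every line, so line coverage is 100%.
    assert shipping_cost(2) == 8.0
    assert shipping_cost(2, express=True) == 16.0


# And yet: shipping_cost(-4) quietly returns -1.0. Nobody thought of
# negative weights, so no test represents them. WYSIATI in action --
# you cannot test what you do not think about.
```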

The Availability Heuristic

This heuristic is described as “a mental shortcut that relies on immediate examples that come to a given person’s mind when evaluating a specific topic, concept, method or decision.” It is about how easy it is to think of examples: the easier that is, the more you believe that what you think is true. You can convince yourself of a ‘truth’ simply because you cannot think of evidence that proves your hypothesis wrong. For me, the availability heuristic comes into play when you decide whether you have tested enough. It happens when you have been testing an application, or part of one, for a while and you are not finding interesting things anymore; you are not learning anything new about it. You might decide: I have tested enough; I am confident I have found all the bugs. You cannot think of other tests or test techniques at that moment. Put in the context of the availability heuristic: it is no longer easy to think of examples (test ideas), so you are convinced you have done enough.

Again and again, I have found this to be dangerous territory. It has happened to me that I was confident about the quality of a new part of an application, only to come back into the office the next day (with new energy and new ideas) and find an absolutely terrifying bug. A bug that, had I transported myself back to the day before, I would not have thought could exist, simply because that example (that specific test idea) didn’t come to mind easily. To counter the availability heuristic, you need the power of collaboration.

I want to make an extra mention of other types of heuristics. I love heuristics, but there is a danger hidden in them. The more experienced you become, the more you may come to rely on them: it gets easier to recognise patterns in certain situations, and your associative memory is triggered a lot when you are an experienced tester. Just be aware that this associative memory is strongly linked to the confirmation bias. Stay open-minded. Stay curious about other people’s views on things. That can help you avoid the confirmation bias to a certain extent.

Conclusion

System 2 is in charge of doubting and unbelieving (Page 80, Kahneman), which is very important in testing! So when the stakes are high, try to involve System 2 thinking. Be kind to yourself, though: it simply is not possible to always involve System 2; you would get very tired. Just realise you are vulnerable to different manifestations of the confirmation bias. Take a step back once in a while and ask someone else to pair up with you.

PS: have you read “Thinking, Fast and Slow” yet? No? Run to the (internet) bookstore, off with ya! See you next time!
