TestCon Europe 2019
Nicole van der Hoeven
Tricentis Flood, The Netherlands
Freshly migrated to the Netherlands from sunny Australia, Nicole brings with her seven years of experience in software testing, both functional and nonfunctional. As a performance testing consultant, she’s worked with clients from a wide range of industries to make sure their systems can handle heavy load. In her role at Tricentis Flood, she talks daily to companies from all over the world and helps them plan, script, execute and analyse their way into performance success using the power of Flood and load testing on the cloud.
Nicole writes about challenges in load testing and shiny new open-source tools on the Flood blog, DZone and Opensource.com, and she also has several tutorial videos on the Tricentis Academy YouTube channel. She recently participated in a webinar on How to Stop Software Fails and is passionate in her belief that well-tested software, multilingualism, and Hungarian lángos can change the world.
Cognitive Biases in Performance Testing
A cognitive bias is a mental shortcut that we rely on to solve a problem, and our human tendency to rely on these shortcuts rather than tackling a problem systematically can have significantly detrimental effects when performance testing software. The human brain has two systems of thinking: fast thinking and slow thinking. Fast thinking is the intuitive, pattern-seeking system of thought; slow thinking is the concentrated, methodical thinking that arises when we face a problem for which we have no previous patterns. Relying too heavily on either system while testing software can lead to cognitive biases.
Nonfunctional requirements tend to produce the anchoring effect: having a number causes us to focus on that number at the expense of the application's actual performance. Biases can also arise when analysing data, such as when we rely on statistics like averages and percentiles instead of visualising the spread and distribution of the load testing results. Other biases that can affect test outcomes are the availability bias, authority bias, confirmation bias, and inattentional blindness, each of which will be explained with common examples from performance engineering. This talk also walks through a real-life production outage that occurred recently at Tricentis Flood in order to illustrate the cognitive biases that arose during the investigation and how they extended the duration of the production incident by five hours.
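As a minimal sketch of the point about averages, consider two hypothetical sets of response times (the numbers are invented for illustration): one steady, one with large outliers. Their averages are identical, so a summary statistic alone hides exactly the behaviour a performance tester needs to see.

```python
import statistics

# Two hypothetical sets of response times, in milliseconds (illustrative data only).
steady = [200, 210, 190, 205, 195, 200, 210, 190]  # consistent performance
spiky = [50, 60, 55, 50, 60, 55, 620, 650]         # mostly fast, with two large outliers

# The averages are identical, even though user experience differs drastically.
print(statistics.mean(steady))        # 200.0
print(statistics.mean(spiky))         # 200.0
print(max(steady) - min(steady))      # 20 ms spread
print(max(spiky) - min(spiky))        # 600 ms spread
```

Plotting the raw distributions (for example as a histogram or scatter over time) would make the two outliers in the second data set immediately visible, which is why the talk recommends visualisation over summary statistics.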
Finally, practical tips and performance testing best practices will be given for avoiding these traps of thinking, including the best graphs for visualising data, tools for exploring data rather than using it to confirm a hypothesis, techniques for applying fast thinking and slow thinking in the appropriate situations, and the importance of social context when making decisions.