
Why We Think the Way We Do: Solving the Maze of Cognitive Biases and How Researchers are Susceptible to Them 

The human brain is remarkably powerful, but it still relies on certain shortcuts to make decision-making easier and faster. Many of these shortcuts are referred to as “cognitive biases”: systematic patterns in thinking that influence how we perceive the world around us or make decisions. There’s nothing inherently wrong with using these mental shortcuts (technically termed “heuristics”) for decisions that have little or no long-term impact (e.g., choosing between Chinese and Indian takeout for dinner one night). However, when we apply them in our research projects or in making career-related decisions as a researcher, these cognitive biases can lead us astray.  

So, What Exactly Is a Cognitive Bias? 

According to Korteling and Toet (2022), “[c]ognitive biases are systematic cognitive dispositions or inclinations in human thinking and reasoning that often do not comply with the tenets of logic, probability reasoning, and plausibility.” In simpler terms, cognitive biases are mental shortcuts we take that may not be the most logical or reasonable options, but are often the simplest and fastest. The problem is that these shortcuts we use aren’t necessarily objective or even accurate.  

Why Do Cognitive Biases Arise? 

The human brain is constantly bombarded with information. If we had to rationally process and think out every decision we make while considering all available evidence, we’d get very little done. Imagine the strain of deciding every day what kind of bread to use in your sandwich or the best way to brush your teeth! For humans to be efficient and successful, they need the ability to quickly detect patterns, assign significance to different pieces of information or input, and filter out unnecessary distractors. 

Are Researchers Subject to Cognitive Bias? 

The answer is a resounding yes! No human is free of cognitive bias, and researchers are often faced with an absolute flood of information and data. According to Sullivan and Schofield (2018), cognitive bias is attracting increased attention in the field of medicine because it has been deemed a critical and widespread source of medical error. Jala et al. (2023) further explored cognitive bias during clinical decision-making in their scoping review. And cognitive biases are not limited to medical practitioners and researchers alone: Berthet’s (2022) literature review sheds light on how different cognitive biases affect decision-making among professionals in fields like management, law, and finance as well.  

So, How Many Types of Cognitive Biases are Out There? 

Researchers have identified over 100 types of cognitive biases. And if we were to delve into each, this article would never get finished! Instead, let’s look at a few of the most common cognitive biases and how they could play out among researchers.  

Anchoring 

This is the tendency to assign more importance to the first piece of information you come across, and to interpret subsequent information in light of it. For example, while searching for a journal to which you can submit your paper, you come across a journal with a very quick turnaround time. In comparison, all the other journals seem excessively slow (though they may actually be more reputable).  

Halo Effect 

Let’s say that you meet a researcher at a conference who speaks English haltingly and with a strong accent. You unconsciously view them as less knowledgeable or prominent in their field, even though they might be an award-winning scientist and on the editorial board of a prestigious journal. This phenomenon is known as the halo effect. The halo effect occurs when our perception of a single attribute of a person (e.g., physical appearance) influences how we perceive their other characteristics (e.g., intelligence, honesty, trustworthiness).  

Confirmation Bias 

Confirmation bias occurs when we pay more attention to information that confirms or supports our existing beliefs. Take, for example, a researcher who is strongly pro-life and is conducting a literature review on the psychological effects of long-term oral contraceptive use. The researcher may overemphasize studies finding that long-term oral contraceptive use is linked with depression, anxiety, etc., and downplay studies that show minimal or no such effects.  

Framing Effect 

This type of bias takes place when people have a set of options before them and tend to favor the options that are “framed” (i.e., described) positively. For example, if you’re trying to decide which disinfectant to use, you pick the one described as “Kills 99% of the germs” rather than the one claiming “Only 1% of germs survive!” 

Availability Heuristic  

This is a kind of mental shortcut wherein we estimate the probability of something based on how many examples of that thing come to mind easily. For example, a lot of your friends have participated in research studies as college students. You assume that recruiting participants will be very easy, although you’re trying to draw your sample from working professionals of all ages and demographics.  

Actor-Observer Bias 

This refers to the tendency to attribute your own actions, behaviors, or condition to external causes, and those of others to internal causes. For instance, you believe that the high loss-to-follow-up rate in your own study is due to the disease burden on the patients you’re studying. In contrast, when you see a high loss-to-follow-up rate in a paper you are peer-reviewing, you question the quality of the study design, its methodological soundness, and so on. 

Baader–Meinhof Phenomenon 

Also known as the frequency illusion, this bias occurs when you learn something new (even if it’s just the name of a new AI tool), and then you notice it all over the place. In reality, there’s no increase in occurrence. It’s just that you have started to notice this product, phenomenon, or word. If you want to understand how this phenomenon applies in research, do refer to Das’ (2021) editorial on “Covid toes”. 

Representativeness Heuristic 

At a breakfast networking event for researchers, you have to decide where to sit: next to an older, White man in a business suit or a young, Black woman in a knee-length dress with a pink cardigan. You quickly grab the seat next to the man, assuming he’s a senior scientist, journal editor, or someone “worth knowing” in the field. However, this might not be the case: the man could be the sales director of a statistical software company, while the woman could be a section editor of your research society’s flagship journal.  

Recency Bias 

When you pay more attention to recent events than to historical ones, you’re operating under “recency bias.” Basically, you place too much emphasis on whichever event is freshest in your memory. For instance, your most recent research paper received a “Minor Revisions Required” decision from your target journal. You therefore complacently assume that your next paper will receive a similarly favorable decision, and are completely floored by a “Major Revisions Required” decision.  

Negativity Bias 

You might have heard that humans are “hardwired for negativity.” This is the negativity bias: we tend to pay more attention to and remember negative stimuli or events in comparison with positive stimuli or events. For instance, you remember the one incident where SPSS closed unexpectedly and you couldn’t retrieve your analyses, as opposed to the hundreds of times it functioned smoothly.  
