Cognitive biases are mental shortcuts our brains use to make decision-making more efficient, but they can also lead to errors in judgment. These biases arise because our brains process large amounts of information quickly, using patterns and assumptions to simplify complex situations. While these shortcuts are often helpful in everyday life, they also make us more vulnerable to manipulation, particularly through phishing and other social engineering tactics.


Cognitive biases exist because our brains are wired to conserve cognitive energy and make quick decisions. However, this speed sometimes comes at the cost of accuracy. In stressful or high-pressure situations, like receiving an alarming email or phone call, these biases can make us more likely to fall for scams because we rely on instinct rather than careful analysis.


Cognitive biases can be grouped into four main categories:


  1. Too Much Information
    Our brains are constantly bombarded with information, so we filter out what seems unnecessary and focus on the most noticeable or relevant details. This can lead to overemphasizing certain data while ignoring the bigger picture.
  2. Not Enough Meaning
    In uncertain or ambiguous situations, we tend to fill in gaps with assumptions and patterns that may not exist. This can cause us to draw incorrect conclusions based on incomplete information.
  3. The Need to Act Quickly
    When faced with urgent situations, we often make fast decisions to resolve problems quickly. However, this need for speed can lead to impulsive actions that aren’t fully thought through.
  4. What We Should Remember
    Our memories aren’t perfect, and we tend to focus on information that seems important or emotionally relevant. This can result in us remembering things inaccurately, especially under stress.

Here are 10 common cognitive biases that phishing attackers exploit:


  1. Availability Heuristic
    People react more strongly to information that comes easily to mind. Attackers reference recent, widely reported events, such as data breaches and cyberattacks, to make their message feel plausible.
  2. Anchoring Bias
    People tend to rely too heavily on the first piece of information they receive, such as an initial alarming message about a supposed account breach.
  3. Negativity Bias
    Negative information, like warnings of account suspension or threats of data loss, tends to weigh more heavily on decisions than positive information.
  4. Framing Effect
    The way information is presented, whether as a gain or a loss, can sway decisions. Phishers frame a malicious request positively, for example as “protecting your account,” rather than as handing over credentials.
  5. Confirmation Bias
    People tend to favor information that confirms their preexisting beliefs. Phishers exploit this by sending messages that align with the target’s concerns, like fake security alerts.
  6. Authority Bias
    Requests from perceived authority figures, like executives or IT departments, are more likely to be followed without question.
  7. Scarcity Bias
    Limited-time offers or threats of expiring access play on people’s fear of missing out, prompting impulsive actions like clicking malicious links.
  8. Social Proof
    Messages that imply “everyone else is doing this” can manipulate individuals into complying with fake instructions, such as updating security settings.
  9. Omission Bias
    People normally judge harm caused by their own actions more harshly than harm that follows from inaction. Phishers flip this by warning that failing to act will bring the worse outcome, pressuring targets to respond.
  10. Empathy Gap
    People in a calm state underestimate how differently they will think and act when frightened or stressed, so attackers deliberately create urgency or panic to push targets into decisions they would not otherwise make.
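
Several of these biases surface as recognizable wording in the phishing message itself: urgency and scarcity phrasing, invocations of authority, threats of loss, and appeals to what everyone else is supposedly doing. As a rough illustration only, the Python sketch below scans an email body for such cue phrases; the cue lists and the flag_bias_cues function are hypothetical examples for awareness training, not a production phishing filter.

```python
import re

# Hypothetical cue phrases that map to the biases described above
# (urgency/scarcity, authority, negativity/loss, social proof).
# A real detector would rely on far richer signals than keyword matching.
CUES = {
    "urgency_scarcity": [
        r"\bexpires? (today|soon|in \d+ hours?)\b",
        r"\blimited[- ]time\b",
        r"\bimmediately\b",
        r"\bact now\b",
    ],
    "authority": [
        r"\bIT department\b",
        r"\b(CEO|CFO|executive)\b",
        r"\bsecurity team\b",
    ],
    "negativity_loss": [
        r"\baccount (suspension|suspended|locked)\b",
        r"\bdata loss\b",
        r"\bunauthorized access\b",
    ],
    "social_proof": [
        r"\beveryone (else )?(is|has)\b",
        r"\ball (employees|users|staff)\b",
    ],
}


def flag_bias_cues(message: str) -> dict:
    """Return, per bias category, the cue patterns found in the message."""
    hits = {}
    for category, patterns in CUES.items():
        found = [p for p in patterns if re.search(p, message, re.IGNORECASE)]
        if found:
            hits[category] = found
    return hits


if __name__ == "__main__":
    sample = (
        "URGENT: Account suspension in 24 hours. The IT department "
        "requires all employees to verify their password immediately "
        "using the link below."
    )
    for category, cues in flag_bias_cues(sample).items():
        print(f"{category}: {cues}")
```

Even this toy scanner shows why the biases are effective: the same few phrases keep appearing in phishing lures because they reliably trigger fast, emotional responses rather than careful analysis.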

See our Cognitive Bias Index for more complete information on cognitive biases.