-Roger Atkins, Canadian Ski Guide, Avalanche Expert, Author
Almost every avalanche accident involves a poor decision made by a human. If you didn't catch that, let us repeat ourselves: humans and their actions are most commonly the primary reason an accident occurs in the backcountry.
More specifically, it is usually a member of the party involved in the accident who triggers the avalanche that results in injury or death (Avalanche Canada, 2018; Klassen et al., 2013). That’s heavy, but it is also actionable information. Most of the time, avalanche accidents aren’t just something that happens to us; they happen because of us. Recent research has yielded provocative insight into how we make decisions in the backcountry and how errors in perception lead to accidents. This insight gives us the ability to identify these human factors and correct them BEFORE an accident occurs. These errors are commonly the result of victims underutilizing the information available to them while making critical decisions in the backcountry (Arnott, 1998). Selecting good backcountry partners can help reduce some of the impacts of social dynamics and lets us draw from a diversity of experience to help spot errors.
Even those of us with a high level of education and experience are prone to errors, especially when operating in the presence of human factors, unusual conditions, and within an arena that provides meager feedback. All of these elements are prevalent in winter mountain environments. Consider for a moment how challenging it truly is to gain expertise in this area. Avalanches are most frequent for only half the year, meaning your window of opportunity to interact with, observe, and learn from them is short. They only provide feedback when they actually occur, which means that avalanches NOT occurring IS NOT quality feedback. Finally, the medium is constantly changing. If you can only recreate on the weekends, you are not getting the full picture. If our perceptions and decision-making skills are based on our understanding of patterns and cause and effect, what does it mean if we are never sure we are getting quality feedback?
-Confucius
Humans have evolved complex cognitive ways to take in and process the large and constant stream of information we receive from the world around us. Additionally, humans are social beings. How we interact with others and our need at some level to “fit in” is a powerful (and generally unconscious) driver of our behavior. We call these cognitive and social aspects that influence our behavior and ultimately our decision-making human factors.
There are many ways to identify, group, and combat human factors. This is also a constantly evolving conversation as the field of behavioral economics and the study of how our brain works grows. While this list is not exhaustive, we have grouped major human factor traps relevant to backcountry travel and decision making in five categories:
Social Pressure
Overconfidence and/or Low Self Confidence
Closed Mindedness
Shortcuts
Impaired Objectivity
Even experienced backcountry travelers succumb to these human factor traps. Regardless of avalanche knowledge or experience, watchful team members help each other identify these traps at work during the PLAN and RIDE components of The AIARE Framework. These deliberate actions can help prevent biases from driving a poor decision.
The drive to be or remain a part of a group and how we behave and define ourselves around others is powerful and usually unconscious. Peer pressure, a desire for acceptance, social proof, feelings of scarcity, or a need to express individualism can drive us to make poor decisions or keep us from seeing the information for what it is.
Peer Pressure: People are susceptible to peer pressure. It can be difficult to be the lone dissenter. Professionals such as ski patrollers and guides have additional status within the group and potential to affect decisions.
Scarcity: Also identified as a common trap by Ian McCammon, Scarcity is a trap related to the pressures of a window of opportunity or a diminishing resource (McCammon, 2002). The most common example of this is “powder fever” which is usually seen in popular backcountry areas with limited terrain. The desire to capitalize on a special, limited opportunity can cause people to make poor terrain choices.
Social Proof / Risky Shift: Social Proof is the idea that an action is correct because other people are doing it (seeing skiers on a slope of concern) (McCammon, 2002). The Risky Shift is a phenomenon in which a group may accept a higher level of risk than each individual might choose alone (Stoner, 1961). These two traps relate to what has been called the “herding instinct” – the illusion of safety in numbers. Avalanches are commonly triggered by the 3rd, 4th, or 5th person rather than the first one down.
Acceptance: McCammon calls this the tendency to engage in activities that we think will get us noticed or accepted by our peers, or by people whose respect we seek (McCammon, 2002). Alain de Botton refers similarly to “Status Anxiety,” or the desire for status in modern society and the anxiety resulting from a focus on how one is perceived by others (de Botton, 2004). It is easy to see how this pressure can become a trap that influences people to make poor backcountry decisions.
Individualism: People sometimes have a compulsion to feel uniquely individual (skiing alone is one example). Those who do not embrace a team mentality often show an inability to communicate effectively, a lack of empathy for other group members, and an unwillingness to listen to the group. This leads to a lack of cohesion in the team and can provoke tension and poor choices.
We can be quite poor at judging our own abilities (see the Dunning-Kruger effect), and as novices we may not have the experience to differentiate between real and perceived risk. We can either overestimate or underestimate the risk, which causes us to build a plan for the wrong hazard. Technology, education, and skill as a rider can lead us to falsely transfer confidence from one realm to another, producing overconfidence. Conversely, low self-confidence can make us distrust our instincts and go along with a group decision we instinctively feel is wrong.
According to one study by Dale Atkins, overconfidence was the leading human factor attributed to fatal avalanche accidents by people with some level of formal avalanche training (Atkins, 2000). Overconfidence is a dangerous trap as it generally results in more risky behavior.
Overconfidence Effect: This effect is a well-established bias in which a person’s subjective confidence in their judgments is greater than their objective accuracy. Numerous studies demonstrate that this bias can adversely affect backcountry decisions.
Actual vs. Perceived Risk: There is a gap between perception and reality. Since decisions can only be based on perceptions, this trap can lead to miscalculation of risk and poor terrain choices.
Technology: In the modern world, technology has made the once inconceivable possible. People sometimes demand more from their avalanche safety equipment, electronics, and snow study tools than that technology is actually able to provide. This can lead to a misperception of risk.
Education: “A large percentage of people caught in avalanches had formal avalanche training” (McCammon, 2000). A little knowledge can offer just enough confidence to overreach on decisions. It takes a lot of experience on top of training to make consistently good decisions, and what experts come to realize is that it is rare to be very confident when it comes to forecasting avalanches.
Abilities Outperforming Experience: Skiers and snowboarders can become expert riders as teenagers within the boundaries of a ski resort. Sometimes, it is hard for them to imagine that they might only be beginners at backcountry decision making, even though they are capable of great feats of mountain athleticism. Confidence in physical abilities has a tendency to translate into overconfidence in terrain decisions.
Low Self-confidence: Low self-confidence can lead people to distrust their instincts and allow them to agree with a decision that they intuitively feel is wrong. In some cases, people with little formal training or group members with less experience than the leader may observe or become aware of significant data that is crucial to the decision being made. These people are often unwilling to challenge or question the “experienced” leader in the group or the status quo even when they have information or knowledge that others do not.
These are cognitive biases that stem from the brain’s strategy of not being pulled around by every new piece of information that enters it. This means entrenched (even if incorrect) beliefs take time to change, even in light of new information. We tend to place more importance on information that we’ve acquired recently, hear more frequently, or that is more readily available. The key is that these biases are unconscious: simply knowing about them doesn’t mean we will change them. Using processes such as The AIARE Framework helps slow down group decision making and allows for recognition of when these biases are at work.
The filters listed below affect the ability to observe, process, and respond to information, resulting in a deceptively incomplete picture (McClung & Schaerer, 2006). (These excerpts come from The Avalanche Handbook).
Conservatism: “Failure to change (or changing slowly) one’s own mind in the light of new information or evidence” (McClung & Schaerer, 2006). It takes time or a great effort to make decisions based on what we now know versus what we used to know, even when we have new knowledge at our disposal.
Recency: In one’s mind, recent events dominate those in the past, which may be downgraded or ignored. This trap can allow more recent information to override more relevant information from the past. For example, a rider might base terrain choices on recent habits, rather than modifying the approach to match a successful strategy used in similar snowpack conditions not seen for 3 years.
Frequency: Again, in one’s mind, more frequent events dominate those that are less frequent. This is a trap because smaller storm events tend to be more frequent than larger ones, but larger ones can present higher danger.
Availability: This trap involves making decisions based on past events easily recalled by memory to the exclusion of other relevant information. The availability of memories to be recalled may cause unusual or exceptional events to be treated as more common and may bias the decision maker to disregard other important data.
Prior Experience: People tend to see problems in terms of their own background or experience. For example, one can imagine that a snowboarder with experience gained riding a resort terrain park might have a different approach to terrain use than an experienced backcountry snowmobiler.
The human brain efficiently processes a constant incoming stream of information and stimuli. Most of the time, the brain’s methods are an effective way to keep from being overwhelmed by information and the effort involved in consciously making millions of decisions each day. These decision-making shortcuts are called heuristics. Heuristics can get us into trouble when we mistakenly apply a shortcut developed for one situation to what is actually a completely different situation.
Stress and Logistics Pressure: Feelings of stress and pressure can complicate decision making. Uncorrected errors often result in increased stress, as do unanticipated conditions or scenarios. Time applies pressure. When stressed or under pressure the tendency is to take shortcuts to change the immediate scenario.
“Rules of Thumb” or Habits: Habits tend to shortcut thoughtful evaluation. Independent rules of thumb may be functional at times, but they often oversimplify the problem. Good terrain selection is a complex process that demands unique assessment for each situation. Dependence on rules will lead to a decrease in accuracy, and errors can be fatal.
Decisions from Few Observations: Observations take time and energy to gather. Consider if the quality/quantity of observations represents reality, or simply convenient support for the group’s desire to not find instability. For example, “I don’t see any avalanches; it must be good to go!”
Back to the Barn: The urge to simply “get it over with” and return to safety, food, and shelter is powerful. Commonly, people make poor decisions late in the day, when they are tired and nearly home.
Expert Halo: People with more experience or knowledge tend to be perceived as experts. Group members often shortcut their own cognitive processes and allow someone they perceive as more competent to dominate the decision making.
We all live in our own subjective realities. We can never be truly objective, but recognizing that our subjective reality colors our perception helps us be more open to the idea that we may actually be missing evidence, and more open to the input of others in our group. Avalanches provide a pretty low-feedback environment. We may only see the evidence that supports our desires (the snow looks really good) or may interpret non-events as evidence of our good decision making. The key is that we can’t make ourselves more objective simply by recognizing that subjectivity happens. We need team members to provide a mirror and point out our blind spots to contradictory evidence.
Search for Supportive Evidence: Bruce Tremper says that people often say, “I’ll believe it when I see it,” when actually it is the other way around (Tremper, 2001). People tend to see what they already believe to be true. People tend to gather facts that lead to certain conclusions and disregard facts that threaten them.
Blue Sky / Euphoria: Avalanche accidents tend to occur during blue sky days following storms (Tremper, 2001). Experiencing a day with great snow conditions can cloud judgment because of the hormones released in the throes of euphoria.
Familiarity / Non-event Feedback Loop: McCammon pointed out that many accidents happen in familiar terrain (McCammon, 2002). People often feel comfortable in familiar areas. They let their guard down or base their current decisions on past experience. The trap here relates to the “Non-event Feedback Loop” in decision making. When backcountry decisions result in no avalanche, people may believe that they made the best choice. The traveler may have been simply “lucky.” It may be only a matter of time before acquired habits that seem adequate result in an accident.
Optimism: This is a bias also known as “wishful thinking,” and has been referred to as “Commitment” by McCammon (McCammon, 2002). The more someone prefers an action, the more likely they are to decide to do it. Being optimistic in the face of facts we don’t want to acknowledge would be like rearranging the deck chairs on the Titanic as it goes down.