Decision Bias & Software Design: Why It Matters
Understanding how prejudice and bias can stand in the way of rational design decisions in software development
The Oxford dictionary defines bias as “prejudice in favor of or against one thing, person, or group compared with another, usually in a way considered to be unfair.” Many organizations invest heavily in educating their associates on workplace bias, including biases based on race, creed, religion, sex, disability, or age. But those same organizations often overlook the cognitive biases present in our daily decision-making processes. You may wonder what could possibly go wrong because of this. Let’s take the software industry as an example.
In the software industry, there is a strong feedback loop around everything we do that helps us innovate on new ideas. Product designers, owners, software architects, and developers all try to interpret this feedback, localize it to their scope of work, and derive a solution. Imagine what could go wrong if just one of these participants makes a decision in a biased fashion. If left unaddressed, these decision biases can seep into how we make decisions in our work, leading to unforeseen consequences, ultimately impacting our customers, and resulting in costly repairs and refactoring over time. To combat these biases, an organization needs a sound, data-driven approach to decision making.
If you own software design decisions for your line of business or organization, it is important to ensure that your design decisions are reviewed thoroughly, by you and your peers, and that a proper mitigation strategy is implemented to keep the impact of any decision bias to a minimum.
In this article, I will highlight some of the top decision biases and design biases that are most prevalent in the field of software design and development and that team leaders need to be aware of. I will then introduce some tactics for handling and counteracting these biases. My hope is that with this information, you will be better equipped with the tools needed to create robust design suggestions, especially around architecture and UX design.
Why are biases so common?
A recent article in Forbes magazine on cognitive bias stated:
“The human brain was originally built for survival. We had to make a quick decision between fight or flight to avoid being killed. In this instance, our ancestors needed some information processing shortcuts. The option chosen was probably the one that worked the last time the situation occurred. So, cognitive bias helped keep enough of us alive to be pondering the process today.”
This basic thought process served our ancestors for thousands of years and honed our fight-or-flight responses. After all, an unbiased thought process requires intense computation and time to derive an outcome. Since time is a luxury in many survival situations, relying on shortcuts let our ancestors form an immediate understanding and act on it, aiding their survival. This simple cognitive response led to many core biases making their way into how we process information and make decisions.
Fast forward to the present day. When we design software architectures, the design is often initiated by a business need accompanied by a stringent time constraint. This adds pressure to architect applications quickly, which can impact the ability to gather all the relevant information and potential options beforehand. While many design flaws can be corrected by iterative development models in the software application development process, the real problem lies in the decision space.
So what are the biases we need to be aware of when working on software designs? In this article I have highlighted a few important decision and design biases that I have come across in my journey. I have also provided suggestions on how to identify, analyze, and remediate each of these biases. This article is not a catalog of all possible biases or traps out there, but it is intended to get you thinking on this front.
Types of Biases
1. The Anchoring Bias
Logically, we gather data long before committing to a decision. However, most of us don’t question where that data comes from. We simply trust the numbers and, more often than not, base our decisions on this sometimes unverified information. Anchoring bias is therefore the most basic and common decision bias we encounter in software development.
A January 2006 article in Harvard Business Review illustrated anchoring bias in the workplace with the following example: “A marketer attempting to project the sales of a product for the coming year often begins by looking at the sales volumes for past years. The old numbers become anchors, which the forecaster then adjusts based on other factors. This approach, while it may lead to a reasonably accurate estimate, tends to give too much weight to past events and not enough weight to other factors. In situations characterized by rapid changes in the marketplace, historical anchors can lead to poor forecasts and, in turn, misguided choices.”
An example from my experience in software development is when a team of infrastructure specialists is tasked with right-sizing their infrastructure. Imagine the contrast in the proposals if the team considered one data point from the recent past versus validating the traffic patterns, response times, and infrastructure load for the last year. I also see anchoring bias in many designs where a design pattern was chosen without understanding the variables that led to certain decisions in that design. Rather than analyzing the rationale behind a feature or understanding the variables that hold a certain validity, I often see decision logs stating that the reason for choosing a design is simply “following standards.”
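To make the right-sizing example concrete, here is a minimal sketch in Python, using entirely hypothetical traffic numbers, that contrasts an anchored estimate (sizing for last week’s peak only) with an estimate built on a full year of observations:

```python
import random
import statistics

# Hypothetical daily peak request rates (requests/sec) for the past year.
random.seed(42)
yearly_peaks = [random.gauss(400, 120) for _ in range(365)]
yearly_peaks[180:183] = [900, 950, 1100]  # mid-year seasonal spikes

# Anchored estimate: size the fleet for last week's peak only.
# This misses the seasonal spikes entirely.
anchored_capacity = max(yearly_peaks[-7:])

# Data-driven estimate: size for the 99th percentile across the whole
# year, plus headroom (the 30% figure is an assumed policy, not a rule).
p99 = statistics.quantiles(yearly_peaks, n=100)[98]
sized_capacity = p99 * 1.3

print(f"anchored on last week: {anchored_capacity:.0f} req/s")
print(f"year-long p99 + 30%:   {sized_capacity:.0f} req/s")
```

The gap between the two numbers is the cost of the anchor: a single recent data point can look reasonable while silently excluding the very events the system must survive.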
In Capital One’s Canada Studio, I gathered a team of SMEs who are interested in this field and together we run a Design Dissension Session (known as 2D) where we focus not on the how of design, but on the why. Asking why in every single point in the architecture design process brings out a colorful array of reasonings from our fellow designers. This helps the team to spot any bias underneath the design decisions and provide recommendations to address them in a collaborative environment. The best part of this process is that it is engaging and not just a check-the-box activity. These are sessions where both the team participating in the activity and the team conducting it get an opportunity to learn and take away valuable insights from the discussion.
What can we do about anchoring bias?
While internalizing an approach to these biases is the best way to move forward, there has been extensive research done on how to effectively handle anchoring biases.
An article in Psychology Today titled “Outsmarting Anchor Bias in Three Simple Steps” makes the following suggestions that are easy to implement and easy to adopt:
- Acknowledge the anchor bias - Understand that you may be biased and that you should expect to find anchoring biases in your judgement. Whenever I get into a design session, I consciously remind myself that I am biased. This helps me add a point in my checklist to identify and solve for any biases that I may have introduced in my solution proposal.
- Delay your decision - Delay your decision making process and always seek multiple opinions. In the Design Dissension Sessions mentioned above, the team creates and signs off on a document that captures the common consensus on a design. We don’t mark the activity closed until each member of the team signs off on the proposed recommendation, even if that means capturing some dissenting opinions.
- Drop your own anchor - Apply your well-analyzed anchor to your decision and be open to changing the point of reference should new data emerge. In my team, when we make a design recommendation, we always complement it with a clearly defined rationale. This ensures that the design is revisited when the baseline rationale no longer holds.
The HBR article mentioned earlier includes a few other suggestions that I have found valuable in my own work:
- View from different perspectives - One of the best starting points is to define and redefine a problem statement from multiple perspectives. This provides an excellent understanding of the problem before you even dive into the solution. In our Design Dissension Sessions, we describe a problem from the perspectives of capacity, business priority, resource skill, and timeline. We then choose the one statement that resonates most closely with the presenter and work from there. The other perspectives stay in the document for archival purposes.
- Think about the problem on your own - Do this before consulting others to reduce the influence of their bias on your thought process. For example, when you participate in a meeting, you will be more productive if you have a take on the agenda before joining. This brings your genuine perspective into the discussion without the risk of someone else’s bias swaying your viewpoint.
- Widen your frame of reference - Try to be open-minded and seek information and opinions from a variety of people. In our Design Dissension Sessions, we often invite SMEs from multiple domains to get their inputs. While we keep the quorum to a minimum, say 5 members or fewer, the unique perspectives these experts bring are invaluable when we vote for a recommendation.
- Avoid anchoring your peers - Encourage your peers to bring their own thought processes into the discussion. This is crucial to arriving at an unbiased agreement. While you are at it, ensure that even when you all disagree, you agree on at least some aspect. Never let the meeting end as a zero-sum engagement.
2. The Status Quo Bias
Simply stated, status quo bias means we often do things just to adhere to a well-established process and comfortably avoid attracting scrutiny to ourselves. The HBR article offers a clear explanation: “The source of the status-quo trap lies deep within our psyches, in our desire to protect our egos from damage. Breaking from the status quo means taking action, and when we take action, we take responsibility, thus opening ourselves to criticism and to regret. Not surprisingly, we naturally look for reasons to do nothing. Sticking with the status quo represents, in most cases, the safer course because it puts us at less psychological risk.”
The world’s first automobile looked just like the buggy it replaced. Likewise, the world’s first electronic newspaper resembled the structure of the familiar printed newspaper. While adhering to a “norm” or “standard” is not wrong, we should ensure that it does not curb innovation. I have seen many solution architects derive inspiration from other well-known applications just because it was easier to explain should someone challenge their design. While going to the other end of the spectrum with completely radical, unsupported ideas may be detrimental to an organization, careful acknowledgement and analysis of this bias is critical to a well-thought-out design and decision-making process.
What can we do about status quo bias?
Note that most of the biases I am covering are interwoven with other biases, so singling them out or approaching them in isolation can be challenging. A discussion of status quo bias could be a book in itself given its countless variations. However, understanding where you stand is crucial before applying these suggestions:
- Create a decision log - When a design is proposed, create a well-articulated decision log of all key decisions made around the design. These decisions will clearly reveal why one option was chosen over another. With help from fellow associates, you can determine whether you can safely take a risk without compromising the overall intent of your decision. In our Design Dissension Sessions, the meeting opens with a decision log in which we paint multiple perspectives of the given problem statement. This gives us a clear direction to move in.
- Become an experimentation organization - Experimentation should be embedded in your everyday work. Fail fast, fail often, and learn from those failures to arrive at a mature model sooner. Encourage teams to experiment, and appreciate failure when it comes from experimentation.
- Avoid exaggerating the cost or effort involved in your second and third choices - We often tend to inflate the risks of choices we personally don’t prefer, presenting them in an unfavorable light. Try to avoid framing choices to suit your own preferences. Analyzing all options across multiple perspectives will help you arrive at an unbiased list of choices without compromising the integrity of the intended outcome.
- Always evaluate options - Evaluate options from both present and future perspectives before discarding them. Involve peers in the decision-making process to neutralize the effect of your own status quo bias. In our Design Dissension Sessions, when we make a design recommendation, we often consider the target-state vision set by other teams across the enterprise. For example, in order to move toward serverless technologies (e.g. AWS Fargate) in the future, we might motivate our teams to move to a container solution (e.g. AWS ECS) today, acclimating them to giving up the degree of control that status quo virtual server instances provide.
3. The Sunk-Cost Bias
In his book Thinking, Fast and Slow, Daniel Kahneman discusses the sunk-cost fallacy. He states, “Rather than considering the odds that an incremental investment would produce a positive return, people tend to ‘throw good money after bad’ and continue investing in projects with poor prospects that have already consumed significant resources. In part this is to avoid feelings of regret.” In other words, we tend to make choices in the present based on our past decisions in a way that justifies our actions, even when the past decisions no longer appear valid.
Rather than justifying our past actions when their outcomes have proven inefficient, we must acknowledge the error and move on. I am of the opinion that one must not sacrifice the forest for a single tree. When we consciously choose actions that benefit the community as a whole, we can identify this bias much more easily.
The HBR Journal article referenced earlier adds, “Why can’t people free themselves from past decisions? Frequently, it’s because they are unwilling, consciously or not, to admit to a mistake. Acknowledging a poor decision in one’s personal life may be purely a private matter, involving only one’s self-esteem, but in business, a bad decision is often a very public matter, inviting critical comments from colleagues or bosses. If you fire a poor performer whom you hired, you’re making a public admission of poor judgment. It seems psychologically safer to let him or her stay on, even though that choice only compounds the error.”
This desire to protect one’s self esteem often leads to bad decisions all over again. This is by far the easiest bias to detect, yet I have rarely seen people address this openly. As a leader, it is important for one to tackle this bias head-on and ensure that our decisions do not negatively impact the customers and the communities that we serve.
What can you do about sunk cost bias?
Sunk cost bias takes different forms in different job duties. It is therefore important to internalize the proposed recommendations specific to your role:
- Watch out for this bias among your teams - Be on the lookout for sunk-cost biases in the recommendations given by teams or reports. Encourage reassignment of responsibilities wherever appropriate so teams can expand their viewpoints. By ensuring that team members rotate job duties, and therefore have less need to justify past actions, you can help mitigate this bias.
- Don’t cultivate a failure-fearing environment for your team - I strongly recommend the book Teaming: How Organizations Learn, Innovate, and Compete in the Knowledge Economy by Amy C. Edmondson. It lays out great pointers on setting up an experimentation culture within teams and motivating the idea of failing fast. The ideas in this book resonated strongly with me, because when your team is in experimentation mode, the need to justify past actions amidst failure no longer prevails; failure is the path forward.
- Get insights from a completely different expert - Seek out input from someone not associated with the intent or solution.
- Challenge your ideas from different perspectives - Intentionally seek out new frames of reference to see how solutions might change, should certain constraints no longer exist.
4. The Confirming Evidence Bias
Have you ever searched for information just to support an internal hunch? You are not alone. As a leader, there are times when you have to make strategic decisions on high-risk problems. This is a gnarly game where the stakes are high. If you’re not mindful of it, rather than evaluating the data, your mind will prepare you for a fight-or-flight response by seeking out supporting evidence. Following is an excerpt from the HBR article mentioned earlier.
“There are two fundamental psychological forces at work here. The first is our tendency to subconsciously decide what we want to do before we figure out why we want to do it. The second is our inclination to be more engaged by things we like than by things we dislike—a tendency well documented even in babies. Naturally, then, we are drawn to information that supports our subconscious leanings.”
Another classic example of confirming-evidence bias comes from an SDLC perspective. We often see unit tests that attempt to confirm that the code works rather than probe the scenarios that could reveal failures. Developers tend to test their programs with data consistent with the intended behavior, not with inputs that will expose gaps. Often this gap is bridged via peer review, yet understanding it while coding will raise our code quality to new heights.
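As an illustration, consider this hypothetical `apply_discount` function (invented for this sketch, not from any real codebase). The first test confirms the behavior the developer had in mind; the later ones deliberately try to disconfirm it:

```python
def apply_discount(price: float, percent: float) -> float:
    """Return the price after a percentage discount (hypothetical example)."""
    return price - price * percent / 100

# Confirmation-biased testing: data chosen to agree with intended behavior.
assert apply_discount(100.0, 10) == 90.0   # passes, and tells us little

# Disconfirming tests: inputs chosen to expose gaps rather than confirm hopes.
# Both of these also "pass" -- which is exactly the problem, because the
# function accepts nonsensical inputs without complaint.
assert apply_discount(100.0, 150) == -50.0  # >100% discount yields a negative price
assert apply_discount(-20.0, 10) == -18.0   # a negative price passes through silently
```

A developer seeking only confirming evidence stops at the first assertion; one who hunts for disconfirming inputs discovers that the function needs validation before it ships.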
What can you do about confirming-evidence bias?
Confirming evidence bias can be tackled when you arrive at a consensus with your team or peers who understand the problem statement and the proposed solution. There are several techniques that can be applied to mitigate this bias:
- Don’t accept evidence without questioning - Always check to see whether you are examining all evidence with equal rigor. Avoid the tendency to accept confirming evidence without question.
- Bring in contradictory ideas - Try to challenge your thoughts by spending time considering contradictory ideas.
- Lean on data - Lean on data such as statistics and forecasts when making a decision.
- Avoid asking leading questions - While discussing a problem with others, avoid asking leading questions that merely invite confirmation of your views. For example, instead of asking a closed-ended question like “Can this design scale?”, use an open-ended one like “Could you explain the scalability aspect of this component?” Asking neutral, open-ended questions like this will give you more data to analyze and align on.
5. The Framing Trap
The frame within which a problem statement is formed can adversely affect the outcome of a solution. The psychologists Daniel Kahneman and Amos Tversky observed that people tend to take a question in the frame in which it was given to them rather than rephrasing it. This trap is one of the most dangerous cognitive bias traps and is often the hardest to detect.
Additionally, this trap can amplify all the traps we have discussed up to this point. For example, when a problem is framed in terms of gains versus losses, observers apply their internal risk evaluation differently when settling on a solution: when a problem is posed in a gain frame, observers tend to be risk-averse, but when the same problem is posed as avoiding a loss, they show signs of taking higher risks.
What can you do about the framing trap?
This is the most common type of trap that I encounter during design walkthroughs. The outcome of a design is limited to the frame from which the problem is viewed. Often this frame is put in place by the stakeholder or by the designers themselves. Framing traps are easy to detect, but take a lot of time to address since finding the right frame of reference is often challenging. A poorly framed problem can undermine even the best effort to address the bias:
- Do not accept the original frame - Don’t accept the frame in which the question is posed. Try to re-frame it from multiple references. In our Design Dissension Sessions, we start by capturing the problem statements through multiple frames of reference. This helps us choose a statement that resonates with the stakeholder and proceed from there.
- Create a neutral frame - Try posing the problem in a neutral way. If that solution seems ideal, the stakeholder might be convinced to revisit the problem statement. In our discussions, if we review a design that comes with a set of constraints, we try removing the constraints in one of our options just to show how the solution would look in a constraint-free world. Note that this viewpoint gives more reason to choose an ideal solution that plays within the boundary of the constraints, and it is in no way a replacement choice in our recommendation.
- Find the frame from which your peers view the problem - When others recommend a solution, try to understand the frame from which they viewed the problem statement.
In this article, we learned how biases can impact decisions that we make on behalf of our organization or our customers. We covered a few key decision biases and design biases that can impact the quality of the decisions we make in our day-to-day tasks as software developers and architects. We also learned how each of these biases can manifest and a little about why they developed in the first place. Lastly, we covered some recommended approaches that could help us mitigate these biases in our work.
Topics like this are often considered less important in the field of technology. While the stated problem of biases is psychological in nature, overcoming them should form the first step in decision-making processes anywhere a human component is involved. Therefore, it is necessary to understand how we work as humans to ensure we give our best while solving our industry’s challenges.
If you have made it this far, thank you for your time. I encourage you to do more research and share it with others. If you are interested in discussing this topic further, feel free to reach out to me on Twitter (@IamSrivatssan) or Linkedin (srivatssan).