Using neuroscience to improve decision making.
By Larry J. Bloom
Board directors are facing a rising problem. They are increasingly responsible for compliance, governance, and corporate responsibility, which are essential to organizational decision making. Yet decision making also depends on the efforts of real people. And real people are flawed.
Can the law or internal policies change the way that individuals receive and filter information, or is the human component of risk management an inevitable limitation? The field of neuroscience is starting to shed light on the importance of understanding the workings of the human brain when developing strategies and making decisions. It suggests that improving decision quality requires a better understanding of the human element.
The most sweeping corporate governance reform in any country in five decades, the Sarbanes-Oxley Act (SOX) of 2002 was overwhelmingly passed by an emotional United States Congress in the wake of major corporate and accounting scandals and the popping of the internet/telecom equity bubble. The bill was enacted in reaction to events that had cost investors billions of dollars. The act’s express purpose was to ensure diligent and responsible corporate behavior.
Five years later, it was clear that SOX had played no noticeable role in preventing the unsuitably risky corporate behaviors that triggered the global financial crisis. It might even have had the unintended consequence of worsening the 2007-2010 financial crisis.
Legislators predictably put in place new rules for financial reform and corporate risk management. The 2,319-page Dodd-Frank Wall Street Reform and Consumer Protection Act, or Dodd-Frank Act, represents the most comprehensive financial regulatory reform measures taken since the Great Depression. Among other things, it places greater oversight responsibility on directors for strategy and risk.
But the landscape seems to remain the same. Just recently, JPMorgan Chase’s Jamie Dimon appeared on NBC’s Meet the Press and owned up to his company’s mistakes following the disclosure of its $2 billion (and rising) loss. “We made a terrible, egregious mistake and there’s almost no excuse for it,” Dimon said. The problem was due to human error.
If we want improvement, history suggests that regulatory actions and internal policies will not be enough. No matter how noble the intent, past reform efforts have been followed by new crises in corporate decision making. Each new calamity leads to a fresh round of exceedingly complex regulations. The unintended consequence is that corporate governance and compliance become less viable, and the cycle starts over. The logical conclusion is that we must look elsewhere for a solution.
The Human Element
In any business activity, we cannot fully understand one crucial factor, because it is us—the human element. From psychology to cognitive social neuroscience, research points to shortcomings in how people gather and process information and experiences in order to answer questions, solve problems, determine judgments, and make decisions. Many are simply unaware of the flaws plaguing some of their decisions.
Companies rely on people at all levels who can systematically pursue important goals, recognize and analyze significant problems, communicate essential meanings, and assess their own performances on the job. After all, employee decision making ultimately affects revenues, costs, employee loyalty, safety, reputation, and more.
Unfortunately, just as computers can have bugs, we humans have bugs in the way we think and make decisions. As a result, information and decisions can get filtered, even in good faith ways, on their way through the company power structure to the board. Left unchecked, the consequence is a workplace that unknowingly contributes to greater risk and poorer performance because important decisions are based on flaws in the way we gather and process information.
Individuals still make the key decisions, and their possible biases and surrounding environment might be more influential than any risk assessment reports. By turning to neuroscience, we can sensitize individuals to the problem as a first step in improvement.
The scientific study of the human brain is starting to provide underlying insights that can be applied in the business world. While merely a starting point, one simple fact is that we live with brains shaped by influences from many thousands of years ago, when daily survival was the name of the game. In fact, survival is often referred to as the organizing principle of the brain.
It makes sense when we consider human evolution. Our early ancestors needed to react instantly in order to survive. This evolutionary survival programming is actually quite ingenious, helping us in a million ways every day. And we don’t even have to think about it consciously.
The problem is that some of these nonconscious, programmed patterns can actually result in errors of interpretation and miscommunication. When that happens, information provided to the board can be (unbeknownst to all) corrupted, and flawed decisions can occur that could have been avoided.
Fight or Flight, Inc.
When our brain perceives a threat to survival, it triggers a fear reaction. This reaction gave primitive humans a better chance of survival. Humans with the strongest such traits survived at a greater rate, passing on these traits to future generations.
Fear pathways have been widely studied, initially through animals and more recently through brain-imaging studies in humans. Here are some key findings:
• Fear reactions are automatic and nonconscious.
• They unleash a cascade of chemicals that affect the way we feel and think.
• Neural fear circuitry takes priority over rational reflective thinking.
• Cognitive ability is reduced during a fear reaction.
• Once the fear reaction has occurred in the brain, it is difficult to turn off.
Although the fear reaction is essential to our survival, in modern humans it can also be disruptive. That’s because our brain triggers a fear reaction based on a nonconscious perception of workplace threats. Under this condition, the brain loses its ability to correctly interpret subtle clues from the environment; it reverts to familiar behaviors, loses some of its ability to perceive relationships and patterns, and tends to overreact in a phobic way. Here are some examples based upon current workplace threats:
Fairness. Fairness matters to humans, so the brain perceives unfairness as a threat. It can be so powerful that some people are willing to fight or die for causes involving justice, fairness and equality. When this occurs in the workplace, employees might unknowingly reject new facts or select and use data in a self-serving way in order to “restore fairness.”
Ambiguity. When our brain perceives uncertainty or confusion, the fear reaction is aroused. It is similar to when your computer freezes. Until it is resolved, it is difficult to focus on other things. Uncertainty registers as something that must be corrected, and people might perceive patterns in random data where none exist or underestimate their own shortcomings as the brain attempts to “feel comfortable again.”
Control. The degree of perceived control determines whether a fear reaction will be triggered. For example, not being able to make routine decisions without the perceived overinvolvement of a supervisor can easily generate fear. We might unknowingly defend decisions based solely on snap judgments, or subconsciously conform our thinking to that of the group. Left unchecked, this can impair creativity and innovation, among other things.
Trust. Decision quality depends on healthy relationships. Each time we interact, the brain nonconsciously makes a quick friend-or-foe distinction depending on the context. When the other person is perceived as competition, survival circuits can be triggered. Spinning or withholding information from the next level of management is among the possible results.
Social status. We are biologically predisposed to react to threats to our social status in the workplace; it is part of our survival programming. As a result, supervisors might have a nonconscious tendency to marginalize people who disagree with them. Similarly, people might avoid disrupting group beliefs if doing so serves to improve their social status.
As you can see, each day at work is filled with moments of perceived survival. When fear reactions occur, people are simply not in touch with their own thinking. This can unknowingly affect fact gathering, analysis, insights, judgments, decisions, and performance. Personal strategies may not be apparent, even to those who are using them.
Using this insight to improve results requires two things: 1) All employees must understand how human tendencies can impact their decisions, and 2) management must set the tone by encouraging objective self-assessment of the human element and candid communications throughout the organization.
Most important decisions rely on at least some subjective input by humans. To improve performance and avoid mistakes of the past, boards must take responsibility for creating an organization that is in touch with its thinking. When we become aware of the reality of our thinking, we step out of the thousands of years of collective human conditioning and begin to appreciate our brain for what it is.
Larry J. Bloom spent 30-plus years helping grow Bio-Lab, Inc., from a small family business into an industry-leading $700 million subsidiary of a public company. He is the author of The Cure for Corporate Stupidity: Avoid the Mind-Bugs that Cause Smart People to Make Bad Decisions. He can be reached at email@example.com.