IMAGINE TWO HYPOTHETICAL SCENARIOS. In the first scenario, military acquisitions officers have spent millions of dollars creating a weapons system, only to find that it doesn’t work. In the second scenario, everything is the same except that the acquisitions officers have not yet spent any money; they merely contemplated doing so before it became clear that the system wouldn’t work. Rationally, consideration of the system’s future should depend only on the likelihood it will work, which is identical in both scenarios. The prior investment is a sunk cost that should not affect a forward-looking decision—and yet it does. Whereas the overwhelming majority of decision makers presented with the second scenario decide not to invest any funds in the system, the majority of decision makers presented with the first scenario decide to continue investing, spending millions more dollars, potentially throwing good money after bad.
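The arithmetic behind that rational benchmark can be made concrete with a small sketch. The probability, payoff, and cost figures below are illustrative assumptions, not numbers from any real program:

```python
# Sunk-cost arithmetic with hypothetical figures.
# The forward-looking question is identical in both scenarios:
# is the expected payoff of finishing worth the ADDITIONAL cost?

p_works = 0.10           # assumed probability the system ends up working
payoff_if_works = 50e6   # assumed value of a working system, in dollars
cost_to_finish = 20e6    # money still required from today onward
sunk_cost = 100e6        # already spent in scenario 1; zero in scenario 2

# Expected value of continuing, measured from today onward:
ev_continue = p_works * payoff_if_works - cost_to_finish

# Note that sunk_cost never enters ev_continue: money already spent is
# lost whether you continue or stop, so it cancels out of the comparison.
print(f"EV of continuing: {ev_continue / 1e6:+.0f} million dollars "
      f"(continue only if positive)")
```

Under these assumed figures the expected value of continuing is negative, so the rational answer is the same in both scenarios: stop, no matter how much has already been spent.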

In national security, if we can improve the accuracy of estimates by even 10 percent, that’s a significant number of lives saved.
Jennifer Lerner

We human beings routinely make these kinds of errors in judgment, says Jennifer Lerner, Thornton F. Bradshaw Professor of Public Policy, Decision Science, and Management at the Kennedy School and a leading researcher in the field of decision science. In some situations the same error, known as “sunk-cost bias,” could have dire consequences. “The same mental intuitions can apply on the battlefield,” says Lerner. “People may say, ‘We’ve come this far, we can’t go back now.’ It’s the same sunk-cost bias, only lives are at stake instead of just dollars.”

Sunk-cost bias is only one way in which even smart people tend to make poor decisions. “We’ve conducted several experiments aimed at mitigating sunk-cost mistakes and we are also working on many other worrisome tendencies,” Lerner says. “There are probably 30 different errors and biases that even the smartest people fall victim to systematically. In the domain of national security, stakes are especially high and commanders know it. They want to set the highest possible standards. If we academics can improve the accuracy of risk perception by even 10 percent, for example, we can save lives. And in fact, we can improve the accuracy of estimates by significantly more than 10 percent.”

Lerner with Chief of Naval Operations Admiral John Richardson, the service’s top military officer, during one of her visits to the Pentagon. “Decision science is a potential game-changer,” Richardson says.

Since September, Lerner has been putting that belief into practice as the first-ever chief decision scientist for the U.S. Navy. The role aims to refine how the Navy makes choices, integrating decision science (a field closely related to behavioral economics) into its calculus. Using her understanding of the cognitive, social, structural, and emotional factors that shape decision making, Lerner is working with the Navy’s top military officer, Chief of Naval Operations (CNO) Admiral John Richardson. Together, they are seeking to improve “decision environments” by modifying the way choices are structured and by incorporating decision science into leadership training. At the same time, she is working with the Navy’s leadership to bring a more science-based approach to such longstanding behavioral challenges as eliminating discrimination in hiring and reducing sexual harassment and assault.

By integrating decision science into the service, Lerner hopes to transform the culture of how the military operates in both combat and non-combat situations. “Decision science is a potential game-changer in enabling us to protect America from attack, promote American prosperity, and preserve America’s strategic influence,” Richardson says.


Boundedly Rational

Lerner isn’t exactly central casting for the Pentagon. “Walking the halls to my Pentagon office, I tend to stick out,” she jovially acknowledges. Oftentimes, she finds herself the only woman in a room full of uniformed men. But Lerner, who grew up in a liberal activist family, was always imbued with respect for the military. Her father, who came from a low-income, Yiddish-speaking Orthodox Jewish immigrant family, struggled to fit in at Harvard as an undergraduate in the early 1950s. Harvard was a less welcoming place in those days for poor people in general and for Jews in particular, according to Lerner. When her dad enlisted in the Army in 1954, however, he finally felt accepted as an equal. “For the first time in his life, he was fully valued for what he could do, regardless of who he was,” she says. His few years in the Army had a life-changing effect and enabled him to go back to Harvard for graduate school with new confidence. “So he raised me with the knowledge that the military, while not perfect, is a meritocracy and can serve as a conduit for societal progress.”

As part of her study of decision environments, Lerner toured the Arleigh Burke-class destroyer USS Spruance, in port in San Diego, with Commander Josh Menzel.

When she was 16, Lerner was diagnosed with systemic lupus erythematosus, a chronic autoimmune disease in which the body literally attacks its own organs. For long stretches, she was confined to a hospital or a bed at home. Of necessity, Lerner says, that experience gave her “an enormous life of the mind.” In particular, it forced her to think much more intentionally than most teenagers about the decisions she made, carefully choosing what she had the time and physical ability to do. She was drawn to psychological science as an undergraduate at the University of Michigan, where she became intrigued by how people make choices. She began to study the work of such researchers as the economist Herbert Simon, who taught that humans are only “boundedly rational” and won the Nobel Prize in 1978. “No matter how smart we are, there are certain ways our brains are wired that promote systematic biases,” Lerner says. “I became utterly fascinated.”

As a doctoral student at the University of California, Berkeley, in the 1990s, Lerner was doing work on cognitive decision making. She also began to examine the role of emotions, a relatively understudied topic at the time. “Emotion signals travel more rapidly in the brain than cognition,” Lerner says. “When something happens, we respond emotionally first. There is a direct route from the sensory thalamus to the amygdala and motor neurons, bypassing the cortex, where thinking occurs.”

But even those researchers examining emotion were mostly studying only the effect of positive or negative mood on decision making. Most models at the time assumed that negative emotions, such as anger, would trigger pessimistic perceptions of risk. Lerner thought emotions were a lot more complicated than that, and drilled down to examine the effects of specific emotions, such as happiness, anger, sadness, and fear. Each of these emotions altered decision processes differently, she found, in three ways: the content of thought, the depth of thought, and the implicit goals activated. She found that anger, for instance, is defined by a cognitive sense that events are certain, predictable, and controllable. “When you are angry, you feel you know what is going on. You don’t think, ‘Well, I’m not sure if I’ve understood this carefully enough.’ Instead, you act without deep thought.” Through a series of experiments and associated statistical models, Lerner has documented that the perceptions of control and certainty triggered by anger give rise to an inflated sense of power and an underestimation of risks. Far from producing pessimism, anger triggers relative optimism about one’s ability to prevail through challenges. Thus, “in situations where you need to have a nuanced understanding of risks, anger undermines sound decision making. On the other hand, in situations where a risky choice turns out to be the best choice, an angry individual is better equipped to take necessary actions.”

Lerner (top center), sharing insights with active-duty military students. Clockwise from top: Kimberly Lahnala MC/MPA; Thomas Shannon MC/MPA; Chris Umphres PhD; Joshua Stinson MC/MPA; and Brad DeWees PhD.

To test the conventional wisdom that anger is necessarily bad for decision making, Lerner and her students designed a financial choice study in which risk-seeking choices would be rewarded. Results revealed that decision makers primed with anger earned more money in the task than did decision makers primed with a neutral emotional state. According to Lerner, cool heads do not always prevail. “Whether an emotion has a beneficial or detrimental effect on decision making depends on the nature of the decision task at hand,” she says.

Dozens of scientific papers later (on a variety of topics in decision making), Lerner’s pioneering work earned her the highest honor the U.S. government bestows on early-career scientists and engineers, the Presidential Early Career Award for Scientists and Engineers. Receiving that award from the president of the United States and the director of the National Science Foundation helped cement in Lerner’s mind the belief that science should be pursued not only for its intrinsic value but also for its value to the nation, which must build on what is discovered. The award also carries a commitment to go above and beyond the usual strategies for disseminating science, urging recipients to work directly with public leaders and agencies to make the world a better place. Having been funded by the National Science Foundation nearly continuously since her doctoral program, Lerner takes seriously her commitment to share scientific results in ways that advance our nation’s interests.

Since coming to Harvard more than a decade ago, Lerner has continued her work at the Harvard Decision Science Laboratory, which she co-founded. Dubbed “one of the most prominent American scientists” in the National Science Foundation’s “Sensational 60” list, Lerner has seen her publications cited more than 20,000 times in the scholarly literature alone. But research is not her only passion. An award-winning instructor and curriculum innovator, she also teaches undergraduate, graduate, and executive education courses. Continuing her commitment to share science with public leaders, she counts among her current students two U.S. Air Force officers studying for PhDs in decision science.


Wider Role

One of Lerner’s most popular courses at Harvard Kennedy School is “Leadership Decision Making.” The executive education version of the course attracts people in professions where stakes are high and margins for error are small; practitioners in medicine, finance, and the military are especially drawn to the content. One such student, a U.S. Navy admiral taking the executive education class, was impressed by an exercise on implicit bias that showed quantitatively how prejudice influences who gets hired. “He said, ‘The Navy needs to improve our promotion and selection procedures, because we may inadvertently be favoring people who fit our mental image of a leader,’” Lerner recalls. That officer raised the issue with Chief of Naval Operations Richardson, who asked Lerner to help examine the Navy’s hiring.

Lerner, pictured here with Admiral John Richardson during a recent visit to the Pentagon, is working to help educate Navy leadership about the breadth and applicability of decision science.

“I was super-impressed with Jenn’s command of the field, her comfort and familiarity with the military culture, and her toughness and tenacity,” Richardson says. (Lerner has maintained her role as a Harvard professor, albeit with a slightly reduced load, through research, teaching, and mentoring doctoral students.)

Together, Richardson and Lerner identified the two main areas in which the Navy will integrate decision science: helping leaders better assess and manage risk; and building the decision science curriculum for leadership development. The Risk Assessment Research Effort (RARE), a Navy initiative Lerner proposed and now heads, will study how decisions are made in risky or uncertain conditions across the entire spectrum of the institution, from an officer confronting a potentially adversarial ship at sea, to a top official making a management decision in his or her Pentagon office. In addition, Lerner has a mandate to help the Navy take a more evidence-based, scientifically guided approach to solving such longstanding challenges as improving diversity and inclusion, eliminating sexual harassment and assault, and designing reporting relationships in ways that reduce the abuse of power.

Lerner is also working across the board to help the Navy integrate the scientific method, testing ideas and learning from carefully controlled experiments. When considering a new procurement procedure to lessen time in port, for example, the Navy could randomly assign two different procedures across its various ports and compare the results to determine which procedure is really better. To address an issue such as sunk-cost bias, decision science can offer tools such as cost-benefit analysis, expected-value calculations, and reframing choices so that sunk costs are no longer salient. Similar tools can be applied to other biases. Research has shown, for example, that people tend to make much riskier choices when a decision is framed in terms of losses than when it is framed in terms of gains. By reframing a question and considering both sides of a choice, a leader can better weigh the risk.
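To make the port example concrete, here is a minimal sketch of such a randomized comparison in Python. Everything in it is hypothetical: the port names, the group sizes, and the simulated days-in-port figures stand in for real logistics records.

```python
# A minimal sketch of a randomized comparison between two procedures.
# All names and numbers are hypothetical; real data would come from
# the Navy's own logistics records.
import random
from statistics import mean

random.seed(0)
ports = [f"port_{i:02d}" for i in range(20)]

# Randomize first: shuffle, then split evenly into groups A and B.
random.shuffle(ports)
group_a, group_b = ports[:10], ports[10:]

def simulated_days_in_port(procedure: str) -> float:
    """Simulated outcome; stands in for a real measured turnaround time."""
    baseline = 12.0 if procedure == "A" else 10.5
    return random.gauss(baseline, 2.0)

results_a = [simulated_days_in_port("A") for _ in group_a]
results_b = [simulated_days_in_port("B") for _ in group_b]

print(f"Procedure A: {mean(results_a):.1f} mean days in port")
print(f"Procedure B: {mean(results_b):.1f} mean days in port")
```

In practice the comparison would use measured outcomes and a formal significance test, but the design logic is the same: randomize first, then let the outcomes rather than intuition pick the better procedure.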

We say look, if you are in an angry state or if you are in a fearful state and making a decision, keep in mind that your decision may be biased.
Jennifer Lerner

And Lerner is also helping officers better understand the effects of their own emotions on their choices. “We say look, if you are in an angry state or if you are in a fearful state and making a decision, keep in mind that your decision may be biased—anger reducing the perception of risk and fear increasing it,” says Lerner.

Lerner is enlisting other scholars from around the country and especially from within Harvard to work on initiatives and to help educate Navy leadership about the breadth and applicability of the discipline. For example, Lerner is collaborating with two Kennedy School colleagues, Michela Carlana and Dara Kay Cohen, on empirical studies that will help the Navy. And Kennedy School graduate students, including active-duty military members, are involved in research applicable to Lerner’s work.

Furthermore, Lerner is working with Richardson to make it easier for the Department of Defense to draw on academic scientists’ expertise more regularly. The present system, which requires months of paperwork processing even in the best of circumstances, needs to be revised and augmented.

Nowadays when Lerner meets with Navy leadership, she hears many of the terms she has been teaching. “Admirals have started to talk in terms of biases, expected value, and probabilistic reasoning,” Lerner says. And decision science has now been written into the Navy’s official strategic plan, an important guiding document that shapes the Navy’s course for years. With Lerner’s help, the latest plan, released in December 2018, included decision science for the first time.

“The Navy has a long history of drawing on scientific expertise in physical sciences and engineering,” Lerner says. “But we’re not going to preserve freedom across the seas just by having the best equipment. We are going to do so by smarter use of information.”

Michael Blanding is a freelance writer living in Brookline, Massachusetts.

Photos by Raychel Casey and Raymond Diaz III
