The intersection of behavioral science and political science is a growing field of academic inquiry, inspired by recent progress in the science of human behavior. Assistant Professor Todd Rogers is a behavioral scientist whose research attempts to bridge the gap between intention and action. Among other things, he has studied election participation and public support for long-range social policies.
Q: One of your most recent studies examines how voters often overlook candidates' "dodges" during debates. Please discuss this research.
Rogers: This research started by observing politicians and spokespeople in press conferences as they delivered responses that seemed unrelated to the questions they were asked. So, working with my collaborator, Mike Norton [of Harvard Business School], we undertook a series of experiments in which we examined the conditions under which speakers succeed in answering questions that are different from the ones they are actually asked. What we found was that, pretty frighteningly, observers have a very difficult time detecting when a speaker answers a question that is similar to, but different from, the question they were actually asked.
For example, if a speaker is asked “What are you going to do about illegal drug use?” and answers “We need universal health care because …,” observers fail to realize that the speaker answered the wrong question, and so they rate the speaker as just as honest and trustworthy as if the speaker had actually been asked about health care. And we found that answering the right question poorly is worse than answering the wrong question well. Delivering a stuttering flub of a response that is direct and honest is punished relative to fluently answering a totally different question.
Q: Considering the results of your research, what accounts for people missing the fact that a candidate is dodging?
Rogers: In our research we find that people are surprisingly poor at detecting when someone dodges a question. We think that one of the most fundamental mechanisms behind this is that we have limited attention. There’s been other research on this in which people asked to watch a video and count the number of times a basketball is passed fail to notice a gorilla dancing across their line of sight. It’s called inattentional blindness, and what it comes down to is that humans have limited attention. When we are listening to someone speak, the default thing we are doing with our attention and our minds is assessing what they’re saying and whether we like the person – basically the default social evaluation. So when someone offers an answer to a question that is similar to what was asked – maybe you could make the stretch that it’s related: they’re answering about health care, you asked about illegal drug use – the observer is really focusing on “Do I agree with what they’re saying? Do I like this person?” It’s an entirely different task to go backwards and ask, “He’s answering this – is it related to what was asked?” That requires a lot of cognitive energy.
Q: Do the results of this study have implications beyond candidate debates?
Rogers: I think this research has relevance across a broad range of domains. As I’ve been speaking with people about the research, I've heard of instances during job interviews when people are asked difficult questions and they are committed to being truthful, but they’d rather not answer the question. I’ve heard negotiators talk about how they have used similar techniques and it allows them to not explicitly lie but to evade answering important questions.
Our focus in this research is not just descriptive (how dodging is done) but also prescriptive (how it can be prevented). In political debates, what we’ve found is that when the text of the question is posted on the monitor, even subtle dodges get noticed by the audience. As a result of this finding, we have been advocating that the news media post questions in this manner during debates.
Q: Another study looks at the inaccuracy of "self-reporting" by would-be voters. What did you learn?
Rogers: This research was motivated by a challenge faced by pollsters and political organizations that care about identifying likely voters. The standard approach is to ask “Do you intend to vote?” Because of advances in data and technology, we’re able to match the records of these phone calls to public voting records and see who actually casts a vote and who doesn’t. We’ve found that there’s a surprisingly poor correlation between people’s self-predictions and their actual future behavior. In one study during the 2008 election, involving over 10,000 live phone interviews conducted over the five months leading up to the election, we found that 55 percent of people who said they were not going to vote actually did vote. At the same time, only about 85 percent of people who said they were absolutely going to vote actually did.
What this research shows is that self-reported answers to the question “Are you going to vote?” are often barely related to actual future behavior. Given that pollsters attempt to make claims about the “likely electorate,” this research deeply challenges their ability to identify who belongs in that category based on self-reported voting intentions.
We have recommended several strategies to address this problem. What we find is that the single greatest predictor of whether someone is going to vote is whether they have voted in the past. People who never voted in the past but say they’re going to vote mostly don’t vote. People who always voted in the past but say they’re not going to vote mostly do vote. That’s a generalization, but it’s a strong, consistent trend. So our recommendation to pollsters and others who care about identifying the likely electorate is to rely more heavily on past vote history as the guide for identifying likely voters. Even in the absence of actual records of past vote history, simply asking people whether they have voted in the past predicts future turnout as well as or better than asking whether they intend to vote.
Q: What are the larger questions that you seek to examine in your research? What future studies are you contemplating?
Rogers: The broad arc of my work is how we translate insights from behavioral economics and psychology into solutions for important social problems. Much of my research has been about how to mobilize voters. Much of my research moving forward will continue to be in the realm of politics, but I also plan to branch out into education – using the insights of psychology to develop tools and tactics that teachers can use to motivate students to attend school, to study outside of class, and to pay attention in class. It’s complementary to the line of education research that many are now pursuing, which focuses on the question “How do we identify who is a good teacher?” My approach is more focused on identifying the micro-tactics that all teachers can use to motivate kids.
Interviewed by Doug Gavel on Jan 27, 2012.