Asim Khwaja on Evidence for Policy Design

September 29, 2015
by Doug Gavel

When development economists conduct research, one of the goals is to help practitioners and policymakers create better policies in countries struggling with a host of educational, political, and financial challenges. Asim Ijaz Khwaja is the Sumitomo-Foundation for Advanced Studies on International Development Professor of International Finance and Development at Harvard Kennedy School, and co-director of Evidence for Policy Design (EPoD) at the Center for International Development. His areas of interest include economic development, finance, education, political economy, and institutions, and he is a strong advocate on behalf of evidence-based policymaking in the developing world. 

Q. Explain the theory of evidence-based policymaking. How is it making a difference in the developing world?

Khwaja: We’ve been working for the last few years on how to get evidence to really inform decision-making. One of the challenges is that the naïve view, in which you produce research and practitioners simply read and adopt it, isn’t quite true. That’s partly because the research often isn’t accessible, but even more so because the questions the researcher is working on may not be the questions that are of direct interest to the practitioner or policymaker.

And, so, we’ve started a different form of engagement which we call “smart policy design,” but it’s really a systematic way in which we can get the researcher to sit at both the problem discovery and the design table with a practitioner.

And what we find is that this process works much better: there is greater acceptance of the evidence by the practitioner, but it also really helps the researcher understand what the bigger questions are that they should be working on.

I can very briefly give you a sense of what the process looks like. It’s essentially five stages, and they’re very simple. The first stage is what we call problem discovery, where we sit together with a practitioner and really work out what the true problem they should be focusing on is, and this is a process of joint discovery. Often, the problem the researcher has in mind, or the practitioner has in mind, is not really the fundamental problem. So, we might think the problem is enrollment when really, as the discussion ensues, you realize the problem is the quality of education rather than enrollment.

Step two for us is diagnosis, which is really trying to figure out what the underlying, deeper factors are that produce the problem. Not superficial factors, but really the deep-down causal factors. So, for instance, it’s not simply that teachers aren’t showing up, which is why quality is low in, say, a public schooling setting; it’s more about the political economy situation that keeps teachers from being held accountable by the public. That could be a deeper diagnosis.

And only then do we come to step three, the solution, which is what we call design. It’s very critical for us that this shows up as step three, not as step one, because the development world is often full of solutions looking for a problem to solve, and we don’t want to be that way. We want to be problem-driven, not solution-driven. So, best practices are great, but they should only be applied if they truly solve the problem you are faced with.

Once we design the solution, it’s critical to realize that no solution is ever perfect. So, just as you would have a black box in, say, an airplane, we want to build one in; step four, which we call test, is effectively that black box. As the policy is rolling out, we want to be gathering evidence. It could be an elaborate impact evaluation, or it could be simple process monitoring, but we gather data that tells us how well the solution is working out.

And this leads to our last step, which is to refine, or recalibrate, as the data comes in; the whole idea is that you now figure out how to change the policy and which aspects of it to improve upon. The process I just described is cyclical for us, so it’s ongoing. Once you embed this kind of smart learning mechanism inside policymaking, we feel it’s the best way to sustainably bring evidence into the decision-making and policymaking process.

Q. What are some practical examples of how research taking place at EPoD is helping advance smart policy design?

Khwaja: There are several ways in which we have begun creating embedded researcher and practitioner relationships. I’ll talk briefly about some of the work my colleagues are doing and then spend a bit more time discussing one example of a project I’m working on.

In terms of my colleagues, Rohini Pande is doing some fascinating work on environmental regulation in India. She is working with regulators there to come up with novel ways to change, for instance, audit mechanisms, all with the idea that misaligned incentives and poor information cause programs to fail, and that theory and data can be used to improve feedback loops and to enable the recalibration and redesign of programs.

Similarly, another colleague, Rema Hanna, is doing really fabulous work in Indonesia, where she’s working with the Indonesian government to help change their welfare programs, in particular the targeting of one of their subsidies, and to think through basic design issues using data and testing multiple design iterations. For instance, when designing a poverty alleviation program, do you use means testing, do you run surveys to identify the poor, or do you have the community self-identify who the poor are? Do you have self-targeting mechanisms, where the poor go through a filtering process that is somewhat costly, so that only the poor would choose to do it? Each of these techniques is a different way of targeting the poor. Which one would work most effectively? So, there is some fascinating work in that area.

In my own field, I’m currently doing a lot of work in public finance and also in education, in the area of human capital acquisition. I’ll give you one example of a project along these lines. We’re working with the government to help design and evaluate a vocational training program for adults who have missed out on formal education and who are now being given the opportunity to re-tool themselves for a new workforce. In the process of doing this work, we and our government partners realized very early on that even the basic design of these large training programs had to move away from the very traditional supply-driven approach, in which you open a bunch of training slots, people come in, get trained, and leave, and you’re done.

What we realized when examining this sort of vocational training was that there was a huge mismatch between what people wanted and what was being offered. This is just one example of where evidence comes in. It wasn’t just about creating training opportunities and helping people get trained; it was about looking at what people wanted to be trained in and what employers wanted, and then matching the two. Something as simple as gathering feedback through household and employer surveys allowed us to provide data to our policy partners to improve the program. We’re now in the third year of the program, and the data gathered has allowed our partners to make multiple improvements in design, from the basic suite of courses that should be offered to the specifics of how the program is delivered.

One such example helped identify and address access issues. Our initial work revealed an apparent puzzle. When asked, over 90% of people expressed a strong interest in enrolling in specific vocational trainings. Yet even when offered programs that were free and paid additional stipends, we found that only 5% actually enrolled! The initial reaction would be to conclude that the expressed demand was “cheap talk” and not reflective of people’s real desires. Yet there was suggestive evidence that led us to suspect there might be more going on: that individuals, especially women, may face access barriers beyond just travel time and cost. Working with our policy partners, we introduced a series of design variations that attempted to address these less “monetizable” social, cultural, and psychological barriers, and in doing so we were able to raise enrollment rates to closer to 50%!

This process is ongoing, and now we’re at the stage where we’re helping our partners design not just the training elements of the program; we’re also using the data to help inform how to link trainees to the market. You might have acquired a vocational skill, you might know how to sew better, but where do you sell? Your village may not offer a marketplace for you. So, linking trained women to buyers in urban environments, solving not just the skills-acquisition constraint but also the market-access constraint, can have big impacts.

This process – whereby the next natural question emerges from the previous work you did – leads to what we hope is a more robust cycle of creating impact from what we’re doing.

Q. Is EPoD’s work relevant solely for developing economies?

Khwaja: Not at all, actually. In fact, a lot of the inspiration for our work comes from successful examples in developed economies. And the more we do this work, the more interesting it is to me that when I talk to my own colleagues at the Kennedy School who work exclusively on developed economies, some of the insights we’re now getting are actually very informative for their own debates.

What often happens in developing economies, because there is a dearth of skilled human capital, is that researchers who are adept at using data get to sit at design tables that would be much harder to get onto in developed economies. And so our opportunities to run these joint experiments are actually greater in developing countries, and they allow us to run fairly ambitious experiments. Now that we’re seeing the results of these experiments, it creates opportunities to replicate them, or to conduct better versions of them, in developed economies.

For example, in my colleague Rohini Pande’s work in India that I mentioned earlier, she and her colleagues were able to work directly with the environmental regulator in the state of Gujarat to pilot and test a reform to the auditing structure, aimed at addressing the conflict of interest inherent in any design that requires companies to hire and pay the very firms tasked with auditing them. Everyone knows that there is a hypothetical chance, or even a likelihood, of cheating occurring, but it often takes hard data and evidence to show just how bad the status quo has become.

The results of their study not only showed cheating in the status quo system, but also showed how relatively minor reforms to the payment structure can lead to more truthful audits and, most importantly, lower levels of pollution from firms. She was able to do that in India largely because of the long-standing relationship she and her colleagues have developed with that branch of the Indian government. While similar experiments may be harder to conduct in developed economies, problems such as conflicts of interest have close parallels in many other arenas. A good example is financial regulation, where financial institutions are required to directly pay third-party companies to conduct audits, the pitfalls of which we clearly saw in 2007-08. So in this, as in many other cases, the lessons learned through large field experiments in developing economies may be directly transferable to developed-economy contexts.

So I feel that, more and more, there’s a very healthy dialectic, a conversation going on about the sort of work we’re doing and the work that is possible in places like the U.S. and other developed countries.

Tags: India, International Economics

"We’ve started a different form of engagement which we call 'smart policy design' but it’s really a systematic way in which we can get the researcher to sit at both the problem discovery and the design table with a practitioner."

Asim Khwaja

