Printed from https://ideas.repec.org/p/cpr/ceprdp/11762.html

From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application

Author

Listed:
  • Duflo, Esther
  • Banerjee, Abhijit
  • Banerji, Rukmini
  • Berry, James
  • Kannan, Harini
  • Mukerji, Shobhini
  • Walton, Michael
  • Shotland, Marc
Abstract
The promise of randomized controlled trials (RCTs) is that evidence gathered through the evaluation of a specific program helps us—possibly after several rounds of fine-tuning and multiple replications in different contexts—to inform policy. However, critics have pointed out that a potential constraint in this agenda is that results from small, NGO-run “proof-of-concept” studies may not apply to policies that can be implemented by governments on a large scale. After discussing the potential issues, this paper describes the journey from the original concept to the design and evaluation of scalable policy. We do so by evaluating a series of strategies that aim to integrate the NGO Pratham’s “Teaching at the Right Level” methodology into elementary schools in India. The methodology consists of re-organizing instruction based on children’s actual learning levels, rather than on a prescribed syllabus, and has previously been shown to be very effective when properly implemented. We present RCT evidence on the designs that failed to produce impacts within the regular schooling system but helped shape subsequent versions of the program. As a result of this process, two versions of the programs were developed that successfully raised children’s learning levels using scalable models in government schools.

Suggested Citation

  • Duflo, Esther & Banerjee, Abhijit & Banerji, Rukmini & Berry, James & Kannan, Harini & Mukerji, Shobhini & Walton, Michael & Shotland, Marc, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," CEPR Discussion Papers 11762, C.E.P.R. Discussion Papers.
  • Handle: RePEc:cpr:ceprdp:11762

    Download full text from publisher

    File URL: https://cepr.org/publications/DP11762
    Download Restriction: CEPR Discussion Papers are free to download for our researchers, subscribers and members. If you fall into one of these categories but have trouble downloading our papers, please contact us at subscribers@cepr.org

    As access to this document is restricted, you may want to look for a different version below or search for a different version of it.

    References listed on IDEAS

    1. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(2), pages 531-580.
    2. Abhijit Banerjee & Sharon Barnhardt & Esther Duflo, 2015. "Movies, Margins, and Marketing: Encouraging the Adoption of Iron-Fortified Salt," NBER Chapters, in: Insights in the Economics of Aging, pages 285-306, National Bureau of Economic Research, Inc.
    3. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    4. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    5. Abhijit Vinayak Banerjee & Alice H. Amsden & Robert H. Bates & Jagdish Bhagwati & Angus Deaton & Nicholas Stern, 2007. "Making Aid Work," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262026155, April.
    6. Abhijit V. Banerjee & Shawn Cole & Esther Duflo & Leigh Linden, 2007. "Remedying Education: Evidence from Two Randomized Experiments in India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(3), pages 1235-1264.
    7. James J. Heckman & Edward J. Vytlacil, 2007. "Econometric Evaluation of Social Programs, Part II: Using the Marginal Treatment Effect to Organize Alternative Econometric Estimators to Evaluate Social Programs, and to Forecast their Effects in New Environments," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 71, Elsevier.
    8. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukherji & Marc Shotland & Michael Walton, 2016. "Mainstreaming an Effective Intervention: Evidence from Randomized Evaluations of “Teaching at the Right Level” in India," NBER Working Papers 22746, National Bureau of Economic Research, Inc.
    9. repec:hal:pseose:halshs-00840901 is not listed on IDEAS
    10. Esther Duflo & Rema Hanna & Stephen P. Ryan, 2012. "Incentives Work: Getting Teachers to Come to School," American Economic Review, American Economic Association, vol. 102(4), pages 1241-1278, June.
    11. Tahir Andrabi & Jishnu Das & Asim I. Khwaja & Selcuk Ozyurt & Niharika Singh, 2020. "Upping the Ante: The Equilibrium Effects of Unconditional Grants to Private Schools," American Economic Review, American Economic Association, vol. 110(10), pages 3315-3349, October.
    12. James Berry & Greg Fischer & Raymond Guiteras, 2020. "Eliciting and Utilizing Willingness to Pay: Evidence from Field Trials in Northern Ghana," Journal of Political Economy, University of Chicago Press, vol. 128(4), pages 1436-1473.
    13. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    14. Pascaline Dupas & Edward Miguel, 2016. "Impacts and Determinants of Health Levels in Low-Income Countries," NBER Working Papers 22235, National Bureau of Economic Research, Inc.
    15. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    16. Esther Duflo & Pascaline Dupas & Michael Kremer, 2011. "Peer Effects, Teacher Incentives, and the Impact of Tracking: Evidence from a Randomized Evaluation in Kenya," American Economic Review, American Economic Association, vol. 101(5), pages 1739-1774, August.
    17. Sylvain Chassang & Gerard Padro I Miquel & Erik Snowberg, 2012. "Selective Trials: A Principal-Agent Approach to Randomized Controlled Experiments," American Economic Review, American Economic Association, vol. 102(4), pages 1279-1309, June.
    18. Esther Duflo & Emmanuel Saez, 2003. "The Role of Information and Social Interactions in Retirement Plan Decisions: Evidence from a Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 118(3), pages 815-842.
    19. Agha Ali Akram & Shyamal Chowdhury & Ahmed Mushfiq Mobarak, 2017. "Effects of Emigration on Rural Labor Markets," NBER Working Papers 23929, National Bureau of Economic Research, Inc.
    20. Sylvie Moulin & Michael Kremer & Paul Glewwe, 2009. "Many Children Left Behind? Textbooks and Test Scores in Kenya," American Economic Journal: Applied Economics, American Economic Association, vol. 1(1), pages 112-135, January.
    21. Hausman, Jerry A. & Wise, David A. (ed.), 1985. "Social Experimentation," National Bureau of Economic Research Books, University of Chicago Press, number 9780226319407, September.
    22. Manuela Angelucci & Giacomo De Giorgi, 2009. "Indirect Effects of an Aid Program: How Do Cash Transfers Affect Ineligibles' Consumption?," American Economic Review, American Economic Association, vol. 99(1), pages 486-508, March.
    23. Abhijit Banerjee & Esther Duflo & Clément Imbert & Santhosh Mathew & Rohini Pande, 2020. "E-governance, Accountability, and Leakage in Public Programs: Experimental Evidence from a Financial Management Reform in India," American Economic Journal: Applied Economics, American Economic Association, vol. 12(4), pages 39-72, October.
    24. Abhijit V. Banerjee & Rukmini Banerji & Esther Duflo & Rachel Glennerster & Stuti Khemani, 2010. "Pitfalls of Participatory Programs: Evidence from a Randomized Evaluation in Education in India," American Economic Journal: Economic Policy, American Economic Association, vol. 2(1), pages 1-30, February.
    25. Donald B. Rubin, 1981. "Estimation in Parallel Randomized Experiments," Journal of Educational and Behavioral Statistics, , vol. 6(4), pages 377-401, December.
    26. Abhijit Banerjee & Rukmini Banerji & Esther Duflo, 2016. "Mainstreaming an Effective Intervention: Evidence from Randomized Evaluations of “Teaching at the Right Level” in India," Working Papers id:11419, eSocialSciences.
    27. Jerry A. Hausman & David A. Wise, 1985. "Social Experimentation," NBER Books, National Bureau of Economic Research, Inc, number haus85-1.
    28. repec:idb:brikps:77778 is not listed on IDEAS
    29. James Heckman & Lance Lochner & Christopher Taber, 1998. "Explaining Rising Wage Inequality: Explanations With A Dynamic General Equilibrium Model of Labor Earnings With Heterogeneous Agents," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 1(1), pages 1-58, January.
    30. Edward Miguel & Michael Kremer, 2004. "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities," Econometrica, Econometric Society, vol. 72(1), pages 159-217, January.
    31. Karthik Muralidharan & Venkatesh Sundararaman, 2015. "The Aggregate Effect of School Choice: Evidence from a Two-Stage Experiment in India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1011-1066.
    32. Susan Athey & Guido Imbens, 2016. "The Econometrics of Randomized Experiments," Papers 1607.00698, arXiv.org.
    33. Abhijit Banerjee & Dean Karlan & Jonathan Zinman, 2015. "Six Randomized Evaluations of Microcredit: Introduction and Further Steps," American Economic Journal: Applied Economics, American Economic Association, vol. 7(1), pages 1-21, January.
    34. Jerry A. Hausman & David A. Wise, 1985. "Introduction to "Social Experimentation"," NBER Chapters, in: Social Experimentation, pages 1-10, National Bureau of Economic Research, Inc.
    35. Barrera-Osorio, Felipe & Linden, Leigh L., 2009. "The use and misuse of computers in education : evidence from a randomized experiment in Colombia," Policy Research Working Paper Series 4836, The World Bank.
    36. Lant Pritchett & Justin Sandefur, 2015. "Learning from Experiments When Context Matters," American Economic Review, American Economic Association, vol. 105(5), pages 471-475, May.
    37. Abhijit Banerjee & Sylvain Chassang & Erik Snowberg, 2016. "Decision Theoretic Approaches to Experiment Design and External Validity," NBER Working Papers 22167, National Bureau of Economic Research, Inc.
    38. Rachael Meager, 2015. "Understanding the Impact of Microcredit Expansions: A Bayesian Hierarchical Analysis of 7 Randomised Experiments," Papers 1506.06669, arXiv.org, revised Jul 2016.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eduard Marinov, 2019. "The 2019 Nobel Prize in Economics," Economic Thought journal, Bulgarian Academy of Sciences - Economic Research Institute, issue 6, pages 78-116.
    2. Committee, Nobel Prize, 2019. "Understanding development and poverty alleviation," Nobel Prize in Economics documents 2019-2, Nobel Prize Committee.
    3. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    4. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    5. Annie Duflo & Jessica Kiessel & Adrienne Lucas, 2020. "Experimental Evidence on Alternative Policies to Increase Learning at Scale," NBER Working Papers 27298, National Bureau of Economic Research, Inc.
    6. Benjamin A. Olken, 2020. "Banerjee, Duflo, Kremer, and the Rise of Modern Development Economics," Scandinavian Journal of Economics, Wiley Blackwell, vol. 122(3), pages 853-878, July.
    7. Karthik Muralidharan & Abhijeet Singh & Alejandro J. Ganimian, 2019. "Disrupting Education? Experimental Evidence on Technology-Aided Instruction in India," American Economic Review, American Economic Association, vol. 109(4), pages 1426-1460, April.
    8. Peters, Jörg & Langbein, Jörg & Roberts, Gareth, 2016. "Policy evaluation, randomized controlled trials, and external validity—A systematic review," Economics Letters, Elsevier, vol. 147(C), pages 51-54.
    9. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    10. Tessa Bold & Mwangi Kimenyi & Germano Mwabu & Alice Ng'ang'a & Justin Sandefur, 2013. "Scaling-up What Works: Experimental Evidence on External Validity in Kenyan Education," CSAE Working Paper Series 2013-04, Centre for the Study of African Economies, University of Oxford.
    11. Baylis, Kathy & Ham, Andres, 2015. "How important is spatial correlation in randomized controlled trials?," 2015 AAEA & WAEA Joint Annual Meeting, July 26-28, San Francisco, California 205586, Agricultural and Applied Economics Association.
    12. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    13. Pascaline Dupas & Edward Miguel, 2016. "Impacts and Determinants of Health Levels in Low-Income Countries," NBER Working Papers 22235, National Bureau of Economic Research, Inc.
    14. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    15. Jenny Aker, 2013. "Scaling Up What Works: Experimental Evidence on External Validity in Kenyan Education," Working Papers 321, Center for Global Development.
    16. Francesca Marchetta & Tom Dilly, 2019. "Supporting Education in Africa: Opportunities and Challenges for an Impact Investor," Working Papers hal-02288103, HAL.
    17. Susan Athey & Guido Imbens, 2016. "The Econometrics of Randomized Experiments," Papers 1607.00698, arXiv.org.
    18. Abhijit Banerjee & Sylvain Chassang & Erik Snowberg, 2016. "Decision Theoretic Approaches to Experiment Design and External Validity," NBER Working Papers 22167, National Bureau of Economic Research, Inc.
    19. Pritchett, Lant, 2023. "“Rely (only) on the rigorous evidence” is bad advice," LSE Research Online Documents on Economics 119818, London School of Economics and Political Science, LSE Library.

    More about this item

    Keywords

    Education; India;

    JEL classification:

    • I20 - Health, Education, and Welfare - - Education - - - General
    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • O12 - Economic Development, Innovation, Technological Change, and Growth - - Economic Development - - - Microeconomic Analyses of Economic Development
    • O35 - Economic Development, Innovation, Technological Change, and Growth - - Innovation; Research and Development; Technological Change; Intellectual Property Rights - - - Social Innovation


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:cpr:ceprdp:11762. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the person in charge (email available below). General contact details of provider: https://www.cepr.org.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.