2016 Federal Index


Repurpose for Results

In FY16, did the agency shift funds away from any practice, policy, or program that consistently failed to achieve desired outcomes? (Note: Meeting this criterion requires both Agency and Congressional action.)

Score
8
Administration for Children and Families (HHS)
  • In FY12, ACF established the Head Start Designation Renewal System, requiring Head Start ($9.2 billion in FY16) grantees to compete for grants moving forward if they fail to meet criteria related to service quality, licensing and operations, and fiscal and internal controls. The 2007 Head Start Reauthorization Act made all Head Start grants renewable, five-year grants. At the end of each five-year term, grantees that are running high-quality programs have their grants renewed, but grantees that fall short of standards are now required to compete to renew their grants. Grantees whose ratings on any of the three domains of the Classroom Assessment Scoring System (an assessment of adult-child interactions linked to improved outcomes) fall below a certain threshold, or rank in the lowest 10 percent of grantees, must also compete.
  • ACF’s FY17 budget request (p. 418) proposes to eliminate funding for Abstinence Education grants because the program is not focused on funding evidence-based models.
Score
3
Corporation for National and Community Service
  • In FY13-FY14, Mile High United Way, a grantee of the Social Innovation Fund (SIF), ended funding relationships with three of its sub-grantees that were not able to conduct rigorous evaluations of their programs. In FY15, United Way for Southeastern Michigan, also a SIF grantee, ended its funding relationship with one of its sub-grantees for the same reason. These actions are consistent with the SIF National Assessment findings, which recognize the role SIF has played in fostering evidence-based grant making among its grantees.
Score
8
Millennium Challenge Corporation
  • MCC has established a Policy on Suspension and Termination that describes the process and procedures for suspending and terminating MCC assistance when partner countries are not living up to their commitments. MCC has suspended or terminated a compact partnership, in part or in full, seven times out of the 33 compacts approved to date, and has suspended partner country eligibility to develop a compact an additional four times (most recently with the suspension of Tanzania in March 2016). In 2012, MCC suspended Malawi’s Compact due to a pattern of actions by the Government of Malawi that was inconsistent with the democratic governance criteria that MCC uses for selection. However, the Government of Malawi took a number of decisive steps to improve the human rights environment and to ensure that laws and institutions support democratic rights and processes. These steps, along with the resumption of sound economic policy, led to the reinstatement of Malawi’s Compact later in 2012.
  • MCC also consistently monitors the progress of Compact programs and makes changes as necessary. For example, the Electronic Tax Information System (eTIS), an activity under the Revenue Administration Reform Project in the Philippines, was reduced in scope in FY15 due to time and completion risks. This proactive approach allowed MCC to judiciously reallocate funds to finance additional subprojects under the Kalahi-CIDSS Community-Driven Development Project (K-C) and further maximize the project’s benefits.
Score
7
U.S. Agency for International Development
  • USAID uses rigorous evaluations to maximize its investments. A recent independent study found that 71 percent of USAID evaluations have been used to modify and/or design USAID projects. Below are a few examples where USAID has shifted funds and/or changed programming decisions based on performance:
  • Mozambique: Many donors working in the education sector in Mozambique were using traditional reading programs to improve early grade reading. USAID recently designed an impact evaluation to test whether reading interventions alone, or reading interventions paired with school management support, led to improved reading outcomes. A mid-term impact evaluation found that pairing reading instruction interventions with school management support improved reading outcomes more than reading instruction alone, and was more cost effective. Based on these findings, USAID/Mozambique changed the way it works in the country, and the findings prompted the Government of Mozambique to request that this approach be scaled from 120 schools to 1,060 new schools. More information can be found in the recently published report on USAID evaluation practice.
  • Armenia: A 2013 mid-term evaluation of USAID/Armenia’s flagship health program revealed a number of significant design and implementation flaws, which prompted the Mission to terminate the program early, saving USG resources. Since then, USAID/Armenia has redesigned its health portfolio to focus on areas where it can make a difference and leave a positive legacy as it phases out of the sector.
  • Latin America and Caribbean Bureau: USAID’s Latin America and Caribbean (LAC) Bureau no longer funds expensive out-of-country scholarship programs such as the Scholarship for Education and Economic Development (SEED) and other precursors. A 2013 evaluation of the Latin America and Caribbean region’s Higher Education Scholarships Program looked at the cost-effectiveness of providing students with scholarships to study at US institutions and determined that USAID could provide improved training opportunities for many more poor youth by focusing resources on improving the quality of LAC regional or in-country training institutions. This finding informed a redesign of the program and the issuing of a new Request for Applications (RFA).
  • Indonesia: In 2013, USAID/Indonesia changed the geographic targeting of a forestry program based on a USAID-commissioned evaluation that found the program was spread across too many geographic locations and could be more effective by focusing on fewer. This example can be found in the recently published independent study on evaluation use at USAID.
  • In 2011, a USAID-commissioned evaluation of a USAID/Afghanistan road rehabilitation program found that cooperative agreements and grants are less effective implementing mechanisms than contracts in terms of implementing partner accountability to USAID, particularly for infrastructure activities. In part as a result of this evaluation’s findings, in 2013 USAID released a new operating policy, entitled “USAID Implementation of Construction Activities,” that mandates the use of contracts rather than grant or cooperative agreement mechanisms for projects that involve construction.
Score
7
U.S. Department of Education
  • Since 2010, ED has worked with Congress to eliminate 50 programs, saving more than $1.2 billion. These include programs like Even Start (see A-72 to A-73) (-$66.5 million in FY11) and Mentoring Grants (see p. G-31) (-$47.3 million in FY10), which the Department recommended eliminating based on evidence about their effectiveness.
  • ED also tries to shift program funds to support more effective practices by prioritizing the use of evidence as an entry requirement in its grant competitions. For ED’s grant competitions where there is evaluative data about current or past grantees, or where new evidence has emerged independent of grantee activities, ED typically reviews such data to shape the grant competition design of future projects. For example, an impact evaluation of the Teacher Incentive Fund (TIF) will inform ED’s FY16 competition design for TIF, including by focusing applicants’ attention on practices more likely to be effective.
  • Additionally, ED uses evidence in competitive programs to encourage the field to shift away from less effective practices and toward more effective practices. For example, ESSA’s Education Innovation and Research (EIR) program – the successor to i3 – supports the creation, development, implementation, replication, and scaling up of evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students.
Score
7
U.S. Dept. of Housing & Urban Development
  • HUD’s FY17 budget request includes a new formula for funding Housing Choice Voucher Administrative Fees that shifts funding away from inappropriately compensated public housing agencies and increases overall funding according to evidence about the actual costs of maintaining a high-performing voucher program.
  • The Administration’s FY17 request recommends shifting support from homeless interventions shown to have limited effectiveness toward housing vouchers that were proven effective in the Family Options study.
Score
6
U.S. Department of Labor
  • DOL’s evidence-based strategy is focused on program performance improvement and on expansion of strategies and programs for which there is evidence of positive impact from rigorous evaluations. The department takes all possible action to improve performance before considering funding reductions or program termination. However, DOL does use program performance measures to make decisions about future funding. For example, there is currently a proposal to close a Job Corps Center because of its chronic low performance. Closure of this center would allow DOL to shift limited program dollars to centers that will better serve students by providing the training and credentials they need to achieve positive employment and educational outcomes. In a Federal Register notice published in March 2016, DOL requested public comments on this proposal. Additionally, all discretionary grant performance is closely monitored, and performance data have been used to take corrective action and to make decisions about continued funding.

Visit Results4America.org