2016 Federal Index
Use of Evidence in 5 Largest Competitive Grant Programs
Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY16?
Score: 7
Administration for Children and Families (HHS)
- In FY16 the 5 largest competitive grant programs are: 1) Head Start ($9,168,095,000); 2) Unaccompanied Children Services ($948,000,000); 3) Early Head Start-Child Care Partnerships ($915,799,422); 4) Transitional and Medical Services ($490,000,000); and 5) Preschool Development Grants ($250,000,000).
- ACF’s template (see p. 14 in Attachment C) for grant announcements includes two options, requiring grantees to either 1) collect performance management data that contributes to continuous quality improvement and is tied to the project’s logic model, or 2) conduct a rigorous evaluation for which applicants must propose an appropriate design specifying research questions, measurement, and analysis.
- In FY12, ACF established the Head Start Designation Renewal System requiring Head Start ($9.2 billion in FY16) grantees to compete for grants moving forward if they failed to meet criteria related to service quality, licensing and operations, and fiscal and internal control.
- ACF’s Personal Responsibility Education Program ($75 million in FY16) includes three individual discretionary grant programs that support evidence-based competitive grants that teach youth about abstinence and contraception to prevent pregnancy and sexually transmitted infections.
- To receive funds through ACF’s Community Based Child Abuse Prevention (CBCAP) program, states must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” CBCAP defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs determined to fall within specific program parameters are considered “evidence-informed” or “evidence-based” practices (EBP), as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds (most recently 89.4% in FY14) directed towards evidence-based and evidence-informed practices.
Score: 9
Corporation for National and Community Service
- CNCS is operating three competitive grant programs in FY16: 1) AmeriCorps State and National program (excluding State formula grant funds) ($386 million in FY16); 2) Senior Corps RSVP program ($49 million in FY16); and 3) the Social Innovation Fund (SIF) ($50 million in FY16).
- SIF provides competitive grants to non-profit grant-making organizations to help them grow promising, evidence-based solutions that address pressing economic opportunity, healthy futures, and youth development issues in low-income communities. The FY14-16 Omnibus Appropriations Acts have allowed CNCS to invest up to 20% of SIF funds each year in Pay for Success initiatives. Two Pay for Success competitions are planned for FY16, which will invest both the FY15 and FY16 appropriations (approximately $11.6 million at minimum).
- CNCS’s AmeriCorps State and National Grants Program (excluding State formula grant funds) application (see pp. 10-14) allocated up to 27 points out of 100 to organizations that submit applications supported by performance and evaluation data in FY16. Specifically, up to 15 points can be assigned to applications with theories of change supported by relevant research literature, program performance data, or program evaluation data; and up to 12 points (a 4-point increase from FY15) can be assigned for an applicant’s incoming level of evidence, with the highest number of points awarded to strong levels of evidence. These categories of evidence are modeled closely on the levels of evidence defined in the Social Innovation Fund.
- In FY16, Senior Corps RSVP grantees seeking funding (see p. 1) through the administrative renewal process are encouraged to fulfill the National Performance Measures requirement by committing a certain number of volunteers to serve in an evidence-based health education program. A total of $500,000 (just above 1% of program funds) is allocated to support organizations in implementing evidence-based interventions or to evaluate programs.
Score: 8
Millennium Challenge Corporation
- MCC awards all of its agency funds through two competitive grant programs: the Compact and Threshold programs (whose budgets for FY16 were $667 million and $30 million, respectively). Both require demonstrable, objective evidence to support the likelihood of success in order to be awarded funds. For country partner selection, MCC uses twenty different indicators within the categories of economic freedom, investing in people, and ruling justly to determine country eligibility for program assistance. These indicators (see MCC’s FY2016 Guide to the Indicators) are collected by independent third parties. When considering granting a second compact, MCC considers 1) the degree to which there is evidence of strong political will and management capacity, 2) the degree to which the country has exhibited commitment and capacity to achieve program results, and 3) the degree to which the country has implemented the compact in accordance with MCC’s core policies and standards.
- Following country selection, MCC conducts a constraints analysis (CA) to identify the most binding constraints to private investment and entrepreneurship that hold back economic growth. The results of this analysis enable the country, in partnership with MCC, to select compact or threshold activities most likely to contribute to sustainable poverty-reducing growth. Due diligence, including feasibility studies where applicable, is conducted for each potential investment. MCC also performs a Cost-Benefit Analysis to assess the potential impact of each project and estimates an Economic Rate of Return (ERR). MCC projects generally have an ERR above 10% at project inception, and MCC recalculates ERRs at compact closeout in order to test original assumptions and assess the cost effectiveness of MCC programs. In connection with the ERR, MCC conducts a Beneficiary Analysis, which seeks to describe precisely which segments of society will realize the project benefits. It is most commonly used to assess the impact of projects on the poor, but it has broader applicability that allows for the estimation of impact on populations of particular interest, such as women, the aged, children, and regional or ethnic sub-populations. In line with MCC’s M&E policy, MCC projects are required to submit quarterly Indicator Tracking Tables showing progress toward projected targets. MCC also requires independent evaluations of every project to assess progress in achieving outputs and outcomes throughout the lifetime of the project and beyond.
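For context on the ERR threshold above, the following is a standard cost-benefit formulation rather than language drawn from MCC documents: the ERR is the discount rate r* at which the discounted stream of a project’s estimated benefits equals the discounted stream of its costs over the analysis horizon T, where B_t and C_t denote projected benefits and costs in year t (illustrative notation):

Σ_{t=0}^{T} (B_t − C_t) / (1 + r*)^t = 0

A project clears MCC’s hurdle when r* exceeds 10% at inception, and recalculating r* at closeout with realized rather than projected values is what allows MCC to test its original assumptions.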
Score: 8
U.S. Agency for International Development
- USAID is committed to using evidence of effectiveness in all of its competitive contracts, cooperative agreements, and grants, which comprise the majority of the Agency’s funding. USAID has rebuilt its planning, monitoring, and evaluation framework to produce and use evidence through the introduction of a new Program Cycle, which systematizes the use of evidence across all decision-making regarding grants and all of USAID’s work. The Program Cycle is USAID’s particular framing and terminology to describe a common set of processes intended to achieve more effective development interventions and maximize impacts. The Program Cycle acknowledges that development is not static and is rarely linear, and therefore stresses the need to assess and reassess through regular monitoring, evaluation, and learning. Thus the different components of the Program Cycle mutually reinforce each other, with learning and adapting integrated throughout. The Program Cycle encourages planning and project management innovations to increase the cost-effectiveness and lasting impact of development cooperation.
- In 2013, USAID reformed its policy for awarding new contracts to elevate past performance to comprise 20 to 30 percent of the non-cost evaluation criteria. For assistance awards, USAID conducts a “risk assessment” to review an organization’s ability to meet the goals and objectives outlined by the Agency. This can be found in ADS 303, section 303.3.9. Contractor performance is guided by USAID operational policy ADS 302, section 302.3.8.7. As required in FAR Subpart 42.15, USAID must evaluate contractor performance using the Contractor Performance Assessment Reporting System (CPARS). Information in CPARS, while not available to the public, is available for Contracting Officers across the Government to use in making determinations of future awards.
- USAID has also instituted a policy called the Acquisition and Assistance Review and Approval Document (AARAD) process, under which all contracts, grants, and cooperative agreements over $75 million are reviewed by the Administrator prior to being awarded and all awards over $25 million are reviewed by the relevant Assistant Administrator. The AARAD review covers several key factors, including: Policy Relevance, Commitment to Sustainable Results, Feasibility, and Value for Money. This policy ensures that results, evidence, and long-term strategies are incorporated into all of USAID’s major programs. In addition, it ensures senior-level accountability on USAID’s biggest programs. This policy is outlined in ADS 300. USAID guidance for competitive grants is also available online.
- The Development Innovation Ventures program ($22.4 million in FY16) provides funding for proof of concept through rigorous evaluation of innovative solutions, and scale-up funding when a solution is proven to work. DIV’s approach is unique in three ways:
- DIV recognizes that good ideas can come from anywhere, so they welcome a wide range of potential partners to propose their concepts for high-impact development.
- Borrowing from the experience of venture capital, DIV takes advantage of a staged financing model: it pilots promising new ideas with small amounts of money and scales only those solutions that rigorously demonstrate their impact.
- DIV emphasizes a high standard of evidence, including the use of impact evaluations and randomized controlled trials whenever possible.
Score: 8
U.S. Department of Education
- ED’s five largest competitive grant programs in FY16 include: 1) TRIO ($900 million); 2) GEAR UP ($323 million); 3) Teacher Incentive Fund ($230 million); 4) Charter Schools Grants ($333 million); and 5) Preschool Development Grants ($250 million).
- The Evidence Planning Group (EPG) advises program offices on ways to incorporate evidence in grant programs, including use of evidence as an entry requirement or priority to encourage the use of practices where there is evidence of effectiveness, and/or an exit requirement or priority to build new evidence. For the past several years, ED has reported publicly on Performance.gov on its Agency Priority Goal (APG) focused on directing an increasing percentage of funds available for new competitive awards towards projects that are supported by evidence. In FY15, ED spent 29% of its funding available for new discretionary awards on projects that are supported by promising, moderate, or strong evidence, based on EDGAR evidence levels, surpassing both the FY15 and FY16 targets for that APG.
- While not all of ED’s FY16 decisions have been finalized yet, ED has announced the following FY16 competitions, which include the use of evidence beyond a logic model: 1) Alaska Native and Native Hawaiian Serving Institutions, 2) Asian American and Native American Pacific Islander-Serving Institutions Program, 3) College Assistance Migrant Program, 4) Educational Technology, Media, and Materials for Individuals with Disabilities—Stepping-up Technology Implementation, 5) High School Equivalency Program, 6) Hispanic-Serving Institutions – Science, Technology, Engineering, or Mathematics, 7) National Professional Development, 8) Native American-Serving Nontribal Institutions Program, 9) Technical Assistance and Dissemination To Improve Services and Results for Children With Disabilities, and 10) TRIO Talent Search.
- The Investing in Innovation (i3) program ($120 million in FY16) provides competitive grants to local school districts and non-profit organizations that have demonstrated positive impacts so they can innovate, expand, and scale evidence-based activities to improve student achievement, although details for the FY16 competition have not yet been announced. ESSA authorizes an Education Innovation and Research (EIR) Grants program as i3’s successor.
- Additionally, ESSA requires that ED give priority to applicants demonstrating strong, moderate, or promising levels of evidence within the following seven competitive grant programs: Literacy Education for All, Results for the Nation; Supporting Effective Educator Development; School Leader Recruitment and Support; Statewide Family Engagement Centers; Promise Neighborhoods; Full-Service Community Schools; and Supporting High-Ability Learners and Learning.
- ESSA authorizes the Supporting Effective Educator Development program, which awards grants to applicants with a demonstrated record of improving student outcomes while giving priority to applicants demonstrating strong, moderate, or promising evidence of effectiveness (as described above). ESSA also authorizes the Replication and Expansion of High-Quality Charter Schools program, which awards grants to applicants based on their demonstrated success in improving student outcomes.
- ED’s FY17 budget, which for P-12 programs is based on ESSA, prioritizes funding evidence-based activities. For example, the budget includes $180 million for the EIR program, an increase of $60 million over the FY16 enacted level for its predecessor, the i3 program. ED also proposes building new evidence to increase the effectiveness of the Magnet Schools Assistance Program. Requests such as $100 million for the First in the World (FITW) program, $30 million for an HBCU/MSI Innovation for Completion Fund competitive grant program, and up to $20 million to develop a TRIO Demonstration Initiative in consultation with the TRIO community demonstrate ED’s commitment to building and using evidence to improve college access and completion.
Score: 7
U.S. Dept. of Housing & Urban Development
- In FY16 HUD’s major competitive grant programs are: 1) Homeless Assistance ($1.9 billion); 2) Disaster Assistance/National Disaster Resilience Competition ($300 million); 3) Choice Neighborhoods Grants program ($125 million); 4) Service Coordinators program ($77 million); and 5) Family Self-Sufficiency Program Coordinators ($73 million).
- The National Disaster Resilience Competition used evidence about disaster resilience, including benefit/cost analysis, to ensure that disaster funding improves communities’ ability to withstand and recover more quickly from future disasters, hazards, and shocks rather than simply recreating the same vulnerabilities.
- Decisions regarding the design, funding, and implementation of all HUD competitive grant programs are evidence-based, as specified in funding criteria in HUD’s FY16 Notice of Funding Availability (NOFA). The “Achieving Results and Program Evaluation” factor (see p. 13) provides funding priority for applicants that demonstrate effective use of evidence in identifying or selecting the practices, strategies, or programs proposed in the application, and requires all grantees to cooperate in HUD-funded research and evaluation studies (see p. 14). Another factor, “Past Performance,” provides: “In evaluating applications for funding HUD will take into account an applicant’s past performance in managing funds, including, but not limited to…. meeting performance targets as established in Logic Models or other performance evaluation tools approved as part of the grant agreement….” (see p. 14). The “Achieving Results and Program Evaluation” and “Past Performance” factors are two of the five factors considered, which together total 100 points. The maximum achievable score, with priority points and bonus points, is 106.
Score: 7
U.S. Department of Labor
- In FY16, the five largest competitive grant programs awarded were: 1) American Apprenticeship Initiative ($175 million), 2) Face Forward Grants Program ($59 million), 3) Disability Employment Initiative ($60 million), 4) Homeless Veterans Reintegration Program ($35 million), and 5) the Workforce Innovation Fund/Pay for Success 2016 ($35 million in FY16). All have national evaluations designed by DOL’s Chief Evaluation Office (CEO) and the relevant agencies, and two also require grantees to use a portion of their funds for high-quality evaluations, for which incentive and priority points were awarded in the competitive selection process.
- DOL includes rigorous evaluation requirements in all competitive grant programs, involving one of the following: 1) full participation in a national evaluation as a condition of grant receipt; 2) an independent third-party local or grantee evaluation with priority incentives for rigorous designs (e.g., tiered funding, scoring priorities, bonus scoring for evidence-based interventions or multi-site rigorous tests); or 3) full participation in a national evaluation as well as a rigorous grantee (or local) evaluation. The $10 million Linking to Employment Assistance Pre-Release Grant program to improve employment for formerly incarcerated individuals serves as an example of the requirement to participate in a national evaluation as a condition of the grant.
- The Trade Adjustment Assistance Community College and Career Training (TAACCCT) grant program ($2 billion in FY12-FY14, available through FY17, including $410 million in FY16) provides grants to community colleges and other higher education institutions to develop and expand evidence-based education and training for dislocated workers changing careers. Up to 10% of each grant can be spent on evaluation. DOL has awarded $11 million for technical assistance and a national evaluation of the program.
- The Workforce Innovation Fund grants ($232 million total, including $35 million awarded in FY16) and Pay for Success ($35 million total) are awarded to rigorously test innovative training and employment strategies, with rigorous evaluations incorporated into the programming. PFS is a social investment pilot with payment based on rigorous randomized controlled trial impacts.