2016 Federal Index

U.S. Department of Education


Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY16?

  • ED’s Institute of Education Sciences (IES), with a budget of $618 million in FY16, supports research and conducts evaluations of ED’s major programs, including impact evaluations. The Director of IES and the Commissioner of the National Center for Education Evaluation and Regional Assistance (NCEE) are supported by 10 staff who oversee these evaluations. The Office of Planning, Evaluation, and Policy Development’s (OPEPD) Policy and Program Studies Service (PPSS) has a staff of 20 and serves as the Department’s internal analytics office. PPSS conducts short-term evaluations to support continuous improvement of program implementation and works closely with program offices and senior leadership to inform policy decisions with evidence. While some evaluation funding – such as that for Special Education Studies and Evaluations – is appropriated to IES ($10.8 million in FY16), most evaluations are supported by funds appropriated to ED programs. NCEE and PPSS staff work closely with program offices to design program evaluations that reflect program priorities and questions. Both IES and PPSS provide regular briefings on results to help ensure information can be used by program offices for program improvement.
  • Both IES and PPSS sit on ED’s Evidence Planning Group (EPG) with other senior staff from ED’s Office of Planning, Evaluation, and Policy Development (OPEPD) and the Office of Innovation and Improvement (OII). EPG reviews and advises programs and Department leadership on how evidence can be used to improve programs. Senior officials from IES, OII, and PPSS are part of ED’s leadership structure and weigh in on major policy decisions. They play leading roles in forming the Department’s annual budget requests, recommending grant competition priorities (including evidence priorities), and providing technical assistance to Congress to ensure that evidence informs policy design.
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY16?

  • ED’s Institute of Education Sciences (IES) supports research and conducts evaluations of ED’s major programs. IES’s evaluation policies are set by the IES Standards and Review Office, which addresses issues of scientific quality, integrity, and timely release of reports. Relatedly, the National Board for Education Sciences, IES’s advisory board, has approved policies for peer review, which are implemented by the Standards and Review Office.
  • EPG works with program offices and ED leadership on the development of ED’s annual evaluation plan. This happens through the Department’s annual spending plan process and through identification of high-priority evaluations for use of the pooled evaluation authority. IES and PPSS work with programs to design and share results from relevant evaluations that help with program improvement.
  • ED’s current evaluations constitute its learning agenda.
  • ED’s evaluations are posted on the IES website and the PPSS website. See the FY15 Annual Performance Report and FY17 Annual Performance Plan for a list of ED’s current evaluations. IES publicly releases findings from all of its completed, peer-reviewed evaluations on the IES website and also in the Education Resources Information Center (ERIC).
  • ED supports research through IES’s National Center for Education Research (NCER), which makes grants for prekindergarten through postsecondary research, and IES’s National Center for Special Education Research (NCSER), which sponsors a comprehensive program of special education research designed to expand the knowledge and understanding of infants, toddlers, children, and young adults with disabilities. IES also manages the Regional Educational Laboratory (REL) program, which supports districts, states, and boards of education throughout the United States in using research in decision making.

Did the agency invest at least 1% of program funds in evaluations in FY16? (Note: Meeting this criterion requires both Agency and Congressional action.)

  • There are a variety of ways that ED supports evaluations as well as evaluation technical assistance and capacity-building. In FY15 and FY16, ED had the authority to reserve up to 0.5% of ESEA funds – except Title I funds, Title III funds, and funds for programs that already have an evaluation provision – to evaluate ESEA programs (which RFA estimates at $41.3 million for FY15). In FY15, ED pooled $8.8 million to conduct evaluations that will build new evidence about the following programs: ESEA Title I, Part A; the migrant education program; and the Indian Education LEA Grants Program; and also provided continued support for program evaluations on ESEA Title I, Part A; ESEA Title I, Part D; and ESEA Title III, which began with FY14 pooled funding. The Every Student Succeeds Act (ESSA) of 2015, which reauthorized ESEA, continues the pooling authority and includes Title III as an allowable program from which to pool funds. ESSA also authorizes $710,000 for an evaluation of Title I for FY17-FY20. ED spent over $60 million on program evaluations in FY15.
  • In addition, many ED programs are authorized to support national activities, including program evaluations, and some programs encourage their grantees to conduct project-level evaluations. One of the key lessons from i3 has been that high-quality technical assistance for grantees on project-level evaluations is critical to producing credible information on project outcomes. In FY15, i3 invested more than $4 million of its appropriation in evaluation technical assistance – virtually no other discretionary grant program has the authority or means to fund such a robust vehicle for technical assistance. ED, with the expertise of IES, has begun to pilot less expensive approaches to evaluation technical assistance for programs like First in the World ($1.5 million) and Supporting Effective Educator Development (~$800,000), which also tasks its grantees with producing rigorous project-level evaluations.
  • According to RFA estimates, overall spending on evaluation ($60 million in FY15) and evaluation technical assistance and capacity-building ($6.3 million in FY15) represents approximately 0.1% of ED’s $67.1 billion discretionary budget in FY15.
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY16?

  • ED develops a four-year strategic plan and holds quarterly data-driven progress reviews of the goals and objectives established in the plan, as required by the Government Performance and Results Act Modernization Act (GPRAMA) of 2010. ED’s FY14-18 Strategic Plan includes a goal on the continuous improvement of the United States education system with objectives focused on enhancing the use of data, research, evaluation, and technology (see pp. 37-43). GPRAMA also requires agencies to develop agency priority goals (APGs) and submit information on those goals to OMB on a quarterly basis. APGs reflect the top near-term performance priorities that agency leadership aims to accomplish within a two-year period. ED established an APG on enabling evidence-based decision-making (see Performance.gov for quarterly reporting on the APGs) and, in March 2016, decided to continue its work on this APG for FY16-17. Once established, the metrics for the APGs are included in the strategic plan. For example, strategic objective 5.3 in the Department’s current four-year strategic plan, which is part of the continuous improvement goal referenced above, includes the metrics for the evidence APG. Although many of the metrics in the strategic plan are annual, the Department uses the quarterly reviews to discuss available data and milestones achieved.

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY16?

  • ED has several resources to support the collection, analysis, and use of high-quality data in ways that protect privacy. IES’s National Center for Education Statistics (NCES) serves as the primary federal entity for collecting and analyzing data related to education. Almost all of ED’s K-12 statistical and programmatic data collections are now administered by NCES via EDFacts. NCES also collects data through national and international surveys and assessments. Administrative institutional data and statistical sample survey data for postsecondary education are collected through NCES in collaboration with the office of Federal Student Aid (FSA). NCES data are made publicly available online and can be located in the ED Data Inventory. Some data are publicly accessible, while others are available only through restricted-use data licenses. ED’s Office for Civil Rights conducts the Civil Rights Data Collection (CRDC) on key education and civil rights issues in our nation’s public schools. Additionally, the Data Strategy Team helps to coordinate data activities across the Department, and the Disclosure Review Board, the Family Policy Compliance Office (FPCO), the EDFacts Governing Board, and the Privacy Technical Assistance Center all help to ensure the quality and privacy of education data.
  • ED has made concerted efforts to improve the availability and use of its data. With the release of the new College Scorecard, the Department now provides newly combined data in a tool that helps students choose a school that is well-suited to their needs, affordably priced, and consistent with their educational and career goals. Additionally, the College Scorecard promotes the use of open data by providing the underlying data in formats that researchers and developers can use. This effort is a model for future releases of education data, and it led to ED’s new effort, InformED, to improve Department capacity to release data in innovative and effective ways that improve public use of data. InformED was part of the FY17 budget request (see p. 78).
  • ED has several data sharing agreements with other agencies. For example, ED and the U.S. Department of the Treasury match Federal Student Aid data with administrative Internal Revenue Service tax records to calculate earnings information by postsecondary institution for the College Scorecard consumer tool. This agreement allows ED to annually update and publish data on mean earnings, median earnings, and the fraction not working among all students who received Title IV aid (i.e., federal grants and loans). ED and the U.S. Department of Labor are engaged in a joint federal/state workgroup that is developing support for data sharing at the state level through the new State Wage Interchange System (SWIS) under the Workforce Innovation and Opportunity Act (WIOA). For calculating the Gainful Employment (GE) debt-to-earnings metric, the Department of Education obtains from the Social Security Administration (SSA) annual earnings of students who completed a GE program. ED submits the Social Security numbers of students who received Title IV aid (i.e., federal grants and loans) to SSA in order to calculate the higher of mean and median earnings for each program.
  • Additionally, ED administers the Statewide Longitudinal Data System (SLDS) program ($34.5 million in FY16), which provides grants to states to develop their education-related data infrastructure and use these data for education improvement.
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY16?

  • ED’s evidence standards for its grant programs, as outlined in the Education Department General Administrative Regulations (EDGAR), build on ED’s What Works Clearinghouse™ (WWC) evidence standards. ED often includes these evidence standards in its discretionary grant competitions to direct funds to applicants proposing to implement projects that have evidence of effectiveness and/or to build new evidence through evaluation (see Question #8 below for more detail). Additionally, in 2013, IES and the National Science Foundation issued a joint report that describes six types of research studies that can generate evidence about how to increase student learning. These principles are based, in part, on the research goal structure and expectations of IES’s National Center for Education Research (NCER) and National Center for Special Education Research (NCSER). NCER and NCSER communicate these expectations through their Requests for Applications and webinars that are archived on the IES website and available to all applicants.
  • ED’s What Works Clearinghouse™ (WWC) identifies studies that provide credible and reliable evidence of the effectiveness of a given practice, program, or policy (referred to as “interventions”), and disseminates summary information and reports on the WWC website. The WWC has reviewed more than 11,325 studies that are available in a searchable database.

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY16?

  • ED’s Investing in Innovation (i3) program is the Department’s signature innovation program for K–12 public education. While the larger i3 grants are focused on validating and scaling evidence-based practices, the smaller i3 grants are designed to encourage innovative approaches to persistent challenges. These “Development” grants are the most prevalent type of i3 grant, comprising 105 of the 157 i3 grants to date, and 7 of the 13 new i3 grants made in FY15. In order to spur similar types of innovation in higher education, the Department funded its second cohort of grantees under its First in the World (FITW) program in FY15. The Department made 18 FITW grants in FY15, the vast majority of which (16 of 18) were in the “Development” category.
  • ED is participating in the Performance Partnership Pilots for Disconnected Youth initiative. These pilots give state, local, and tribal governments an opportunity to test innovative new strategies to improve outcomes for low-income disconnected youth ages 14 to 24, including youth who are in foster care, homeless, young parents, involved in the justice system, unemployed, or who have dropped out or are at risk of dropping out of school.
  • The White House Social and Behavioral Sciences Team has conducted several behavioral science studies related to ED’s work, including looking at the impact of text message reminders for students about key tasks related to college matriculation, such as completing financial aid paperwork, and about notices to student borrowers about income-driven repayment plans.
  • ED is currently implementing the Experimental Sites Initiative to test the effectiveness of statutory and regulatory flexibility for participating institutions disbursing Title IV student aid.
  • ED has entered into an agreement with the University of Utah’s Policy Innovation Lab to support a full-time Pay for Success fellow at ED. With the additional expertise provided by this fellow, ED is deepening its capacity and developing ways to use Pay for Success to expand effective educational programs and promote innovation.
  • The IES Research Grants Program supports the development and iterative testing of new, innovative approaches to improving education outcomes. IES organizes its research grants around a goal structure: “Goal 2: Development and Innovation” supports the development of new education curricula; instructional approaches; professional development; technology; and practices, programs, and policies that are implemented at the student, classroom, school, district, state, or federal level to improve student education outcomes.
Use of Evidence in 5 Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY16?

  • ED’s five largest competitive grant programs in FY16 are: 1) TRIO ($900 million); 2) GEAR UP ($323 million); 3) Teacher Incentive Fund ($230 million); 4) Charter Schools Grants ($333 million); and 5) Preschool Development Grants ($250 million).
  • The Evidence Planning Group (EPG) advises program offices on ways to incorporate evidence in grant programs, including use of evidence as an entry requirement or priority to encourage the use of practices where there is evidence of effectiveness, and/or an exit requirement or priority to build new evidence. For the past several years, ED has reported publicly on Performance.gov on its Agency Priority Goal (APG) focused on directing an increasing percentage of funds available for new competitive awards towards projects that are supported by evidence. In FY15, ED spent 29% of its funding available for new discretionary awards on projects that are supported by promising, moderate, or strong evidence, based on EDGAR evidence levels, surpassing both the FY15 and FY16 targets for that APG.
  • While not all of ED’s FY16 decisions have been finalized yet, ED has announced the following FY16 competitions, which include the use of evidence beyond a logic model: 1) Alaska Native and Native Hawaiian Serving Institutions, 2) Asian American and Native American Pacific Islander-Serving Institutions Program, 3) College Assistance Migrant Program, 4) Educational Technology, Media, and Materials for Individuals with Disabilities— Stepping-up Technology Implementation, 5) High School Equivalency Program, 6) Hispanic-Serving Institutions – Science, Technology, Engineering, or Mathematics, 7) National Professional Development, 8) Native American-Serving Nontribal Institutions Program, 9) Technical Assistance and Dissemination To Improve Services and Results for Children With Disabilities, and 10) TRIO Talent Search.
  • The Investing in Innovation (i3) program ($120 million in FY16) provides competitive grants to local school districts and non-profit organizations that have demonstrated positive impacts to innovate, expand, and scale evidence-based activities to improve student achievement, although details for the FY16 competition have not been announced. ESSA authorizes an Education Innovation and Research (EIR) Grants program.
  • Additionally, ESSA requires that ED give priority to applicants demonstrating strong, moderate, or promising levels of evidence within the following seven competitive grant programs: Literacy Education for All, Results for the Nation; Supporting Effective Educator Development; School Leader Recruitment and Support; Statewide Family Engagement Centers; Promise Neighborhoods; Full-Service Community Schools; and Supporting High-Ability Learners and Learning.
  • ESSA authorizes the Supporting Effective Educator Development program, which awards grants to applicants with a demonstrated record of improving student outcomes while giving priority to applicants demonstrating strong, moderate, or promising evidence of effectiveness (as described above). ESSA also authorizes the Replication and Expansion of High-Quality Charter Schools program, which awards grants to applicants based on their demonstrated success in improving student outcomes.
  • ED’s FY17 budget, which for P-12 programs is based on ESSA, prioritizes funding evidence-based activities. For example, the budget includes $180 million for the EIR program, an increase of $60 million over the FY16 enacted level for its predecessor, the i3 program. ED also proposes building new evidence to increase the effectiveness of the Magnet Schools Assistance Program. Requests like $100 million for the FITW program, $30 million for the HBCU/MSI Innovation for Completion Fund competitive grant program, and the use of up to $20 million to develop a TRIO Demonstration Initiative, in consultation with the TRIO community, demonstrate ED’s commitment to building and using evidence to improve college access and completion.
Use of Evidence in 5 Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY16? (Note: Meeting this criterion requires both Agency and Congressional action.)

Repurpose for Results

In FY16, did the agency shift funds away from any practice, policy, or program which consistently failed to achieve desired outcomes? (Note: Meeting this criterion requires both Agency and Congressional action.)

  • Since 2010, ED has worked with Congress to eliminate 50 programs, saving more than $1.2 billion. These include programs like Even Start (see A-72 to A-73) (-$66.5 million in FY11) and Mentoring Grants (see p. G-31) (-$47.3 million in FY10), which the Department recommended eliminating based on concerns grounded in evidence.
  • ED also tries to shift program funds to support more effective practices by prioritizing the use of entry requirements. For ED’s grant competitions where there are evaluative data about current or past grantees, or where new evidence has emerged independent of grantee activities, ED typically reviews such data to shape the grant competition design of future projects. For example, an impact evaluation of the Teacher Incentive Fund (TIF) will inform ED’s FY16 competition design for TIF, including focusing applicants’ attention on practices more likely to be effective.
  • Additionally, ED uses evidence in competitive programs to encourage the field to shift away from less effective practices and toward more effective practices. For example, ESSA’s Education Innovation and Research (EIR) program – the successor to i3 – supports the creation, development, implementation, replication, and scaling up of evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students.