2016 Federal Index


U.S. Department of Labor

Leadership
Score: 9

Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY16?

  • DOL’s Chief Evaluation Officer is a senior official responsible for all activities of the Chief Evaluation Office (CEO) and for coordinating evaluations Department-wide. CEO includes 15 full-time staff and contractors, plus 1-2 detailees at any given time. The CEO is responsible for the Departmental Program Evaluation appropriation ($10 million in FY16) and the Department’s evaluation set-aside funds ($30 million in FY16). In FY16, the CEO will directly oversee an estimated $40 million in evaluation funding and will collaborate with DOL agencies on evaluations carried out with a further $15 million to evaluate Employment and Training Administration (ETA) pilots, demonstrations, and research, as well as evaluations of large grant programs including the Performance Partnership Pilots (P3), the American Apprenticeship Initiative (AAI), the Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grant Program, and Reentry Programs for Ex-Offenders. The CEO also participates actively in the performance review process, during which each operating agency meets with the Deputy Secretary to review progress on the annual performance goals required under the Government Performance and Results Act (GPRA). The CEO’s role is to incorporate evidence and evaluation findings as appropriate, to identify knowledge gaps that might be filled by evaluations, and to convey evidence that can inform policy, program, and performance decisions.
Evaluation & Research
Score: 9

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s), and did it publicly release the findings of all completed evaluations in FY16?

  • DOL has a formal Evaluation Policy Statement that sets out the principles governing all program evaluations in the Department, including methodological rigor, independence, transparency, ethics, and relevance. In addition, the Chief Evaluation Office publicly communicates the standards and methods expected in DOL evaluations through formal procurement statements of work.
  • DOL also develops, implements, and publicly releases an annual Evaluation Plan (i.e., a Department-level learning agenda), as does each of DOL’s 17 operating agencies. The agency learning agendas form the basis for DOL’s Evaluation Plan. The 2016 Evaluation Plan was released for public comment in the Federal Register and is posted on the CEO website.
  • All DOL reports and findings are publicly released and posted on the CEO website. DOL agencies also post and release their reports.
Resources
Score: 8

Did the agency invest at least 1% of program funds in evaluations in FY16? (Note: Meeting this criterion requires both Agency and Congressional action.)

  • In FY16, DOL’s CEO will directly oversee an estimated $40 million in evaluation funding. Additionally, CEO will collaborate with DOL agencies on evaluations carried out with approximately $15 million to evaluate Employment and Training Administration (ETA) pilots, demonstrations, and research, as well as evaluations of large grant programs including, for example, the Performance Partnership Pilots (P3), the American Apprenticeship Initiative (AAI), the Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grant Program, and Reentry Programs for Ex-Offenders. The combined amount of $55 million represents approximately 0.44% of DOL’s FY16 discretionary budget of $12.4 billion (see the calculation following this list). (For many of the largest programs, however, up to 5% of their budgets is dedicated to program evaluation and related activities.)
  • DOL’s Chief Evaluation Office directly funds evaluations and also combines CEO funds with agency funds to jointly sponsor some evaluations. The largest discretionary programs can use program funds for evaluations and technical assistance, often up to 5% by statute. For example, three separate rounds of grants funded by H-1B worker visa fees, totaling about $400 million in FY16, support training for particular populations (such as high school students transitioning to work and long-term unemployed workers) and apprenticeship training, and between 3% and 7% of these grant funds (at least $25 million) is expected to be invested in evaluations in FY16. As another example, in FY14 and FY15, up to 5% of the funds available for workforce innovation activities were used for technical assistance and evaluations related to the projects carried out with these funds. The legislation further provided that the Secretary may authorize awardees to use a portion of awarded funds for evaluation, upon the Chief Evaluation Officer’s approval of an evaluation plan. In addition, several DOL agencies have separate evaluation appropriations, and DOL studies funded through individual agencies are coordinated with DOL’s CEO.
  • The Administration’s FY14-FY17 budget requests recommended allowing the U.S. Secretary of Labor to set aside up to 1% of all operating agencies’ budgets for evaluations, coordinated by CEO. In FYs 2012-2015, Congress authorized the Secretary to set aside up to 0.5% of these funds for evaluations, in addition to the separate evaluation funds that exist in many DOL agencies. In FY16, Congress authorized DOL to set aside up to 0.75% of operating agency budgets for evaluations.
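
As a quick check, the 0.44% figure cited above follows directly from the amounts already given (roughly $55 million in evaluation funding against a $12.4 billion discretionary budget):

$$\frac{\$55\text{ million}}{\$12{,}400\text{ million}} \approx 0.0044 \approx 0.44\%$$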
Performance Management / Continuous Improvement
Score: 9

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY16?

  • DOL’s Performance Management Center (PMC) is responsible for the Department’s extensive performance management system, which includes over 400 measures whose results are reviewed quarterly by the Deputy Secretary. PMC’s activities are intended to improve DOL’s program performance through data-driven analysis, sharing best practices, and implementing activities associated with the GPRA Modernization Act of 2010. Using a PerformanceStat-type reporting and dashboard system, PMC coordinates quarterly meetings between the Deputy Secretary and each agency head to review performance results and analysis of the priority performance measures contributing to DOL’s strategic goals, to make commitments related to performance improvement, and to follow up on the progress of previous performance improvement commitments. PMC also oversees the strategic planning process and analyzes performance data in collaboration with agencies to achieve continuous performance improvement. CEO actively participates in the quarterly performance reviews to incorporate findings from evaluations as appropriate.
  • One of the most important roles DOL’s CEO plays is to connect performance management and evaluation by facilitating interaction between program and evaluation analysts. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s CEO, include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The quarterly GPRA meetings with the Deputy Secretary routinely include specific discussions about improving performance and about findings from recent evaluations that suggest opportunities for improvement.
  • To promote the use of evidence-based strategies, DOL’s Employment and Training Administration (ETA) also continues to manage the Workforce Systems Strategies website, which identifies a range of potential strategies, informed by research evidence and peer exchanges, to support grantees in providing effective services to customers.
Data
Score: 9

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY16?

  • DOL’s Bureau of Labor Statistics (BLS), with a budget of approximately $600 million in FY16, serves as the principal Federal agency responsible for measuring labor market activity, working conditions, and price changes in the U.S. BLS has 111 Cooperative Agreements with 50 States and 4 Territories for labor market and economic data sharing, 505 “letters of agreement” with academics on data usage for statistical research, and data sharing agreements with the Bureau of Economic Analysis and the Census Bureau.
  • DOL’s Employment and Training Administration (ETA) has agreements with 52 States and Territories for data sharing and exchange of wage data for performance accountability purposes.
  • DOL’s CEO, Employment and Training Administration (ETA), and Veterans Employment and Training Service (VETS) have worked with the U.S. Department of Health and Human Services (HHS) to develop a secure mechanism for obtaining and analyzing earnings data from the National Directory of New Hires. In the past year, DOL entered into interagency data sharing agreements with HHS and obtained data to support 9 job training and employment program evaluations (the Reemployment Assistance Demonstration Evaluation with unemployment insurance claimants, Young Parents Demonstration Evaluation, Enhanced Transitional Jobs Program Evaluation, YouthBuild Evaluation, Workforce Investment Act Evaluation, Green Jobs/Health Care Demonstration Evaluation, Re-entry for Ex-Offenders Evaluation, Transition Assistance Program evaluation for separating active duty military personnel, and the Job Training Scorecard Feasibility Study).
  • Enforcement activity data for firms, drawn from DOL’s five labor enforcement agencies (including the Mine Safety and Health Administration, the Wage and Hour Division, the Occupational Safety and Health Administration, and the Employee Benefits Security Administration), are posted online and accessible through the Department’s enforcement database.
  • The privacy provisions for BLS and DOL’s Employment and Training Administration (ETA) are publicly available online.
  • In FY16, DOL expanded efforts to improve the quality of and access to data for evaluation and performance analysis through the Data Analytics Unit in DOL’s CEO office and through new pilots beginning in BLS to access and exchange state labor market and earnings data for statistical and evaluation purposes. The Data Analytics Unit has also developed the Data Exchange and Analysis Platform (DEAP), with high processing capacity and privacy protections, to share, link, and analyze program and survey data across DOL programs and agencies and with other agencies. DEAP is currently available for internal use; public access will be available in the future.
  • The Workforce Innovation and Opportunity Act (WIOA) calls for aligned indicators of performance for WIOA-authorized programs. DOL’s Employment and Training Administration has worked within DOL and with the U.S. Department of Education to pursue the deepest WIOA alignment possible, including indicator definitions, data elements, and specifications, to improve the quality and analytic value of the data. DOL chose to include several additional DOL programs in this process, which will result in unprecedented alignment of data and definitions for 13 federal programs (11 DOL and 2 Education). DOL and ED have issued the proposed rule for public comment and will finalize it in late spring 2016; they have also issued the related Information Collection Requests for public comment and expect to finalize those requests before that date.
  • ETA continues funding and technical assistance to states under the Workforce Data Quality Initiative to longitudinally link earnings, workforce, and education data. ETA and DOL’s Veterans Employment and Training Service have also modified state workforce program reporting requirements to include data items for a larger set of grant programs, which will improve access to administrative data for evaluation and performance management purposes. An example of the expanded data reporting requirements is the Homeless Veterans Reintegration Program FY16 grants.
Common Evidence Standards / What Works Designations
Score: 9

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY16?

  • DOL uses the Cross-agency Federal Evidence Framework for evaluation planning and dissemination.
  • DOL’s Clearinghouse for Labor Evaluation and Research (CLEAR) is an internet-based evidence clearinghouse of evaluation reports that reviews study designs, methodologies, and findings according to specific standards developed by technical work groups. Each study is scored and given a “causal evidence rating” according to the scoring rubric in the standards. CLEAR is a user-friendly, searchable website that includes academic quality reviews for each study in the system, appropriate for academic researchers, potential evaluation contractors submitting technical proposals, program practitioners seeking information on “what works,” policy makers, and the general public.
  • DOL uses the CLEAR evidence guidelines and standards when discretionary program grants are awarded using evidence-informed or evidence-based criteria. The published guidelines and standards are thus used both in grants for evidence-based program demonstrations and in the structured evidence reviews conducted in CLEAR. Requests for proposals also indicate that the CLEAR standards apply to all CEO evaluations. In addition, DOL has a “Department Evaluation Policy Statement” that formalizes the principles governing all program evaluations in the Department, including methodological rigor, independence, transparency, ethics, and relevance. CEO publicly communicates the standards and methods expected in all DOL evaluations, and the standards are incorporated into formal procurement statements of work, with scoring for awards based on the standards.
  • Additionally, DOL collaborates with other agencies (HHS, ED-IES, NSF, CNCS) on refining cross-agency evidence guidelines and developing technological procedures to link and share reviews across clearinghouses. The Interagency Evidence Framework conveys the categories of evaluations, the quality review of evaluation methodologies and results, and the use of evaluation findings. The framework is accepted Department-wide.
Innovation
Score: 7

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY16?

  • DOL is participating in the Performance Partnership Pilots (P3) for innovative service delivery to disconnected youth, which not only allows waivers and the blending and braiding of federal funds but also gives bonus points in application reviews for proposing “high tier” evaluations. DOL is the lead agency for the evaluation of P3. DOL’s CEO and ETA prepared an evaluation technical assistance webinar for P3 grantees in 2014, which will be repeated for the next round of grantees in 2016. Beginning in FY16, the national P3 evaluation contractor is also providing evaluation technical assistance to grantees on methodological design issues and on data and management information systems.
  • DOL has initiated six behavioral insights tests (three in unemployment insurance, two in OSHA, and one in EBSA on pension contributions), plus two behavioral insights tests of different messaging to encourage voluntary compliance that are embedded in larger experimental evaluations (in OSHA and unemployment insurance). The behavioral tests are being conducted in FY16; initial findings will be released in April 2016 and posted on the CEO website.
Use of Evidence in 5 Largest Competitive Grant Programs
Score: 7

Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY16?

  • In FY16, the five largest competitive grant programs awarded were: 1) the American Apprenticeship Initiative ($175 million); 2) the Face Forward Grants Program ($59 million); 3) the Disability Employment Initiative ($60 million); 4) the Homeless Veterans Reintegration Program ($35 million); and 5) the Workforce Innovation Fund/Pay for Success 2016 ($35 million in FY16). All have national evaluations designed by CEO and the relevant agencies, and two also require grantees to use a portion of their funds for high-quality evaluations, for which incentive and priority points were awarded in the competitive selection process.
  • DOL includes rigorous evaluation requirements in all competitive grant programs, involving: 1) full participation in a national evaluation as a condition of grant receipt; 2) an independent third-party local or grantee evaluation, with priority incentives for rigorous designs (e.g., tiered funding, scoring priorities, bonus scoring for evidence-based interventions or multi-site rigorous tests); or 3) full participation in a national evaluation as well as rigorous grantee (or local) evaluations. The $10 million Linking to Employment Assistance Pre-Release Grant program to improve employment for formerly incarcerated individuals serves as an example of the requirement to participate in a national evaluation as a condition of the grant.
  • The Trade Adjustment Assistance Community College and Career Training (TAACCCT) grant program ($2 billion in FY12-14, available through FY 2017, including $410 million in FY 2016) provides grants to community colleges and other higher education institutions to develop and expand evidence-based education and training for dislocated workers changing careers. Up to 10% of each grant can be spent on evaluation, and DOL has awarded $11 million for technical assistance and a national evaluation of the program.
  • The Workforce Innovation Fund grants ($232 million total, including $35 million awarded in FY 2016) and Pay for Success grants ($35 million total) are awarded to rigorously test innovative training and employment strategies, with rigorous evaluations incorporated into the programming. Pay for Success is a social investment pilot in which payment is based on impacts demonstrated through rigorous randomized controlled trials.
Use of Evidence in 5 Largest Non-Competitive Grant Programs
Score: 7

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY16? (Note: Meeting this criterion requires both Agency and Congressional action.)

  • In FY16, the 5 largest non-competitive grant programs at DOL are in the Employment and Training Administration. All allocate funding by statute, and all include performance metrics (e.g., unemployment insurance payment integrity, WIOA common measures) tracked quarterly: 1) the Unemployment Insurance state grants ($2.6 billion in FY 2016); 2) the Employment Security program state grants ($680 million in FY 2016); and 3-5) three programs authorized under the Workforce Innovation and Opportunity Act (WIOA): the Youth Workforce Investment program ($873 million in FY 2016), the Adult Employment and Training program ($816 million in FY 2016), and the Dislocated Workers Employment and Training program ($1.2 billion in FY 2016).
  • WIOA includes evidence and performance provisions beginning in Program Year 2016 that: (1) increase the amount of WIOA funds states can set aside and distribute directly from 5-10% to 15% and authorize them to invest these funds in Pay for Performance initiatives; (2) authorize states to invest their own workforce development funds, as well as non-federal resources, in Pay for Performance initiatives; (3) authorize local workforce investment boards to invest up to 10% of their WIOA funds in Pay for Performance initiatives; and (4) authorize states and local workforce investment boards to award Pay for Performance contracts to intermediaries, community-based organizations, and community colleges.
Repurpose for Results
Score: 6

In FY16, did the agency shift funds away from any practice, policy, or program which consistently failed to achieve desired outcomes? (Note: Meeting this criterion requires both Agency and Congressional action.)

  • DOL’s evidence-based strategy is focused on program performance improvement and on expansion of strategies and programs for which there is evidence of positive impact from rigorous evaluations. The Department takes all possible action to improve performance before considering funding reductions or program termination. However, DOL does use program performance measures to make decisions about future funding. For example, there is currently a proposal to close a Job Corps Center because of its chronically low performance; closure of this center would allow DOL to shift limited program dollars to centers that will better serve students by providing the training and credentials they need to achieve positive employment and educational outcomes. In a Federal Register notice published in March 2016, DOL requested public comments on this proposal. Additionally, all discretionary grant performance is closely monitored and has been used to take corrective action and make decisions about continued funding.