2021 Federal Standard of Excellence


U.S. Department of Labor

Total Score: 69

Leadership
Score: 9

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY21?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Chief Evaluation Officer serves as the U.S. Department of Labor (DOL) Evaluation Officer. The Chief Evaluation Officer oversees DOL’s Chief Evaluation Office (CEO), housed within the Office of the Assistant Secretary for Policy (OASP), and coordinates Department-wide evaluations; the office’s staff and leadership interpret research and evaluation findings and identify their implications for programmatic and policy decisions.
  • CEO is directly appropriated $8.04 million and may receive up to 0.75% of funds from statutorily specified program accounts, at the discretion of the Secretary. In FY19, that set-aside was 0.03% of funds, or $3.3 million, bringing the spending total to $11.34 million. The FY20 figure is not yet known because the Secretary has not determined the set-aside amount.
  • CEO includes nine full-time staff plus a small number of contractors and one to two detailees at any given time. This staff level is augmented by staff from research and evaluation units in other DOL agencies, such as the Employment and Training Administration (ETA), which has six FTEs dedicated to research and evaluation activities; the CEO coordinates extensively with these units on developing a learning agenda, managing studies, and disseminating results.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • DOL has a designated Chief Data Officer. Building on efforts initiated before the OPEN Government Data Act, the Secretary released a Secretary’s Order (02-2019) directing the Department to create a Chief Data Officer position and a data governance board to help realize the strategic value of data and to establish, coordinate, and manage policy, processes, and standards for data management. The Chief Data Officer chairs DOL’s data governance body and leads data governance, open data, and related efforts to collect, manage, and use data in ways that best support program administration and foster data-informed decision-making and policymaking.
  • DOL has arranged for two permanent staff to support governance and open data efforts as well as compliance with the Evidence Act, the Federal Data Strategy, and DOL’s data governance goals. 
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, performance improvement officer, and other related officials in order to support, improve, and evaluate the agency’s major programs?
  • DOL, through a Secretary’s Order, has created a structure that coordinates and leverages key roles within the organization to accomplish objectives like those in the Evidence Act. The Secretary’s Order mandates collaboration among the Chief Data Officer, Chief Performance Officer, Chief Evaluation Officer, Chief Information Officer, and Chief Statistical Officer. This has allowed DOL’s Evidence Officials to coordinate more closely through both regular and ad hoc meetings. For example, in FY19, all four Evidence Officials reviewed DOL agency learning agendas and Evidence Act reports.
  • The Secretary’s Order mandates a collaborative approach to reviewing IT infrastructure and data asset accessibility; developing modern solutions for managing, disseminating, and generating data; coordinating statistical functions; supporting evaluation, research, and evidence generation; and supporting all aspects of performance management, including assurances that data are fit for purpose.
  • DOL continues to leverage existing governance structures: the Chief Evaluation Officer plays a role in forming DOL agencies’ annual budget requests, makes recommendations about including evidence in grant competitions, and provides technical assistance to Department leadership to ensure that evidence informs policy design. Several mechanisms facilitate this: the Chief Evaluation Officer traditionally participates in quarterly performance meetings with DOL leadership and the Performance Management Center (PMC), and reviews agency operating plans and works with agencies and the PMC to coordinate performance targets, measures, and evaluation findings; quarterly meetings are held with agency leadership and staff as part of the Learning Agenda process; and meetings are held as needed to strategize around new priorities or legislative requirements.
Evaluation & Research
Score: 7

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY21?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • DOL has an Evaluation Policy that formalizes the principles that govern all program evaluations in the Department, including methodological rigor, independence, transparency, ethics, and relevance. The policy represents a commitment to using evidence from evaluations to inform policy and practice. The policy states that “evaluations should be designed to address DOL’s diverse programs, customers, and stakeholders; and DOL should encourage diversity among those carrying out the evaluations.”
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
  • The Chief Evaluation Office (CEO) develops, implements, and publicly releases an annual DOL evaluation plan. The plan is based on agency learning agendas as well as the Department’s Strategic Plan priorities, statutory requirements for evaluations, and Secretarial and Administration priorities. As of August 2021, the Department is seeking public input and comment on the draft FY22-26 strategic and evaluation plans. The evaluation plan includes the studies the CEO intends to undertake in the next year using set-aside dollars; appropriations language requires the Chief Evaluation Officer to submit a plan to the U.S. Senate and House Committees on Appropriations outlining the evaluations the office will carry out using dollars transferred to the CEO, and the DOL evaluation plan serves that purpose. The CEO also works with agencies to undertake evaluations and evidence-building strategies to answer other questions of interest identified in learning agendas but not undertaken directly by the CEO.
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
  • In FY21, the Department is developing its annual evaluation plan, building from individual agency learning agendas to create a combined document. DOL has leveraged its existing practices and infrastructure to develop the broad, four-year prospective research agenda, or Evidence-Building Plan, required by the Evidence Act. Both documents will outline the process for internal and external stakeholder engagement.
  • The draft FY22-26 Evidence-Building Plan identifies “Equity in Employment and Training Programs” and “Barriers to Women’s Employment” as priority areas.
2.4 Did the agency publicly release all completed program evaluations?
  • All DOL program evaluation reports and findings funded by the CEO are publicly released and posted in the completed reports section of the CEO website. DOL agencies, such as the Employment and Training Administration (ETA), also post and release their own research and evaluation reports. Some program evaluations include data and results disaggregated by race, ethnicity, and gender, among other characteristics, where possible. DOL’s website also provides accessible summaries and downloadable one-pagers for each study. CEO is also ramping up additional methods of communicating and disseminating CEO-funded studies and findings, and published its first quarterly newsletter in September 2020.
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 3115, subchapter II (c)(3)(9))
  • The U.S. Department of Labor’s (DOL) Chief Evaluation Office (CEO) has sponsored an assessment of DOL’s baseline capacity to produce and use evidence, with the aim of helping the Department and its agencies identify key next steps to improve evidence capacity. CEO developed technical requirements and contracted with the American Institutes for Research (AIR)/IMPAQ International, LLC (IMPAQ) research team to design and conduct this independent, third-party assessment. The assessment included the 16 DOL agencies in the Department’s Strategic Plan and reflects data collected through a survey of targeted DOL staff, focus groups with selected DOL staff, and a review of selected evidence documents.
  • DOL’s Evaluation Policy affirms the agency’s commitment to high-quality, methodologically rigorous research funded through independent research activities. Further, CEO staff have expertise in research and evaluation methods as well as in DOL programs, policies, and the populations they serve. The CEO also employs technical working groups, whose members have deep technical and subject-matter expertise, on the majority of evaluation projects. The CEO leveraged the FY20 learning agenda process to create an interim Capacity Assessment, per Evidence Act requirements, and is conducting a more detailed assessment of individual agencies’ capacity, as well as DOL’s overall capacity, in these areas for publication in 2022.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
  • DOL employs a full range of evaluation methods to answer key research questions of interest, including, when appropriate, impact evaluations. Among DOL’s active portfolio of approximately 50 projects, study types range from rigorous evidence syntheses to implementation studies, quasi-experimental outcome studies, and impact studies. Current DOL studies with a random assignment component include an evaluation of a Job Corps demonstration pilot, the Cascades Job Corps College and Career Academy. An example of a multi-arm randomized controlled trial is the Reemployment Services and Eligibility Assessments evaluation, which assessed a range of strategies to reduce unemployment insurance duration and improve employment and wage outcomes.
Resources
Score: 5

Did the agency invest at least 1% of program funds in evaluations in FY21?

3.1 ____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY21 budget.
  • The Department of Labor invested $8.04 million on evaluations, evaluation technical assistance, and evaluation capacity-building, representing 0.072% of the agency’s $11.1 billion discretionary budget in FY21.
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
  • CEO is directly appropriated $8.04 million and may receive up to 0.75% of funds from statutorily specified program accounts, at the discretion of the Secretary. In FY19, that set-aside was 0.03% of funds, or $3.3 million, bringing the spending total to $11.34 million. The FY20 figure is not yet known because the Secretary has not determined the set-aside amount. CEO also collaborates with DOL program offices and other federal agencies on additional evaluations carried out by other offices and/or supported by funds appropriated to other agencies or programs. In FY19, CEO oversaw approximately $9.94 million in evaluation and evidence-building activities; in FY18, approximately $21 million; and in FY17, an estimated $40 million.
  • This amount represents only the dollars directly appropriated or transferred to the CEO. Additionally, many DOL evaluations and research studies are supported by funds appropriated to DOL programs and/or are carried out by other offices within DOL. In some programs, such as the America’s Promise grant evaluation and the Reentry Grant Evaluation, evaluation set-asides exceed 1% (2.9% and 2.8%, respectively).
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
  • Grantees and programs that participate in DOL evaluations receive technical assistance related to evaluation activities and implementation, such as through the Evaluation and Research Hub (EvalHub). DOL agencies, like ETA, are also making a concerted effort to help states and local areas build evaluation capacity to meet the program evaluation requirements of the Workforce Innovation and Opportunity Act and the Reemployment Services and Eligibility Assessment (RESEA) program through tools such as RESEA program evaluation technical assistance (RESEA EvalTA). A suite of evaluation technical assistance resources is being developed throughout FY20, including webinars and other tools and templates to help states understand, build, and use evidence. DOL’s evaluation technical assistance webinar series for states has been posted online to the RESEA community of practice; the series will ultimately include 11 webinars, and to date most of the posted webinars have been viewed 2,000 to 4,000 times. Additional RESEA EvalTA products are being developed and will be posted on the RESEA community of practice, the DOL Chief Evaluation Office’s website, and in CLEAR, as appropriate.
Performance Management / Continuous Improvement
Score: 10

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY21?

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • Using a performance reporting and dashboard system linked to component agencies’ annual operating plans, the Performance Management Center (PMC) coordinates quarterly reviews of each agency’s program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s CEO, include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual Strategic Reviews with leadership include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
  • In March 2021, DOL held the agency’s first Summer Data Equity Challenge, awarding $10,000 to $30,000 to researchers studying the impact of DOL policies and programs on traditionally underserved communities. Awardees will use data to find gaps in DOL’s knowledge and, ideally, propose practical solutions to fill those gaps and reduce disparities in outcomes.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • DOL’s performance reporting and dashboard system supports quarterly reviews of each agency’s program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements. These performance reviews connect to DOL’s broader performance and evaluation activities. DOL’s Office of the Chief Information Officer (OCIO) developed a new dashboard last year for agency leadership use only, the CXO Dashboard, which provides instant access to key administrative data so leaders can interactively assess progress on performance and make data-driven decisions.
  • DOL leverages a variety of continuous learning tools, including the learning agenda approach to conceptualize and make progress on substantive learning goals for the agency, as well as DOL’s Performance Management Center’s (PMC) Continuous Process Improvement (CPI) Program, which supports agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
Data
Score: 5

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY21?

5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • DOL’s open government plan was last updated in 2016, and subsequent updates are being considered after the formal release of the Federal Data Strategy and the Evidence Act.  
  • DOL also has open data assets aimed at developers and researchers who desire data-as-a-service through application programming interfaces hosted by both the Office of Public Affairs and the Bureau of Labor Statistics (BLS). Each of these has clear documentation, is consistent with the open data policy, and offers transparent, repeatable, machine-readable access to data on an as-needed basis. The Department is currently developing a new API v3 which will expand the open data offerings, extend the capabilities, and offer a suite of user-friendly tools. 
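  • To illustrate the data-as-a-service model described above, the sketch below shows one way a researcher might query the Bureau of Labor Statistics Public Data API (v2) for a published time series. This is a minimal sketch, not DOL guidance: the series ID and year range are illustrative assumptions, and larger or more frequent requests may require a (free) registration key.

      # Minimal sketch: pull a published BLS time series via the Public Data API v2.
      # The series ID below (CPI-U, all items) is an illustrative example only.
      import requests

      payload = {
          "seriesid": ["CUUR0000SA0"],  # any published BLS series ID works here
          "startyear": "2020",
          "endyear": "2021",
      }
      resp = requests.post(
          "https://api.bls.gov/publicAPI/v2/timeseries/data/",
          json=payload,
          timeout=30,
      )
      resp.raise_for_status()
      # Walk the JSON response and print each observation in the series.
      for series in resp.json()["Results"]["series"]:
          for obs in series["data"]:
              print(series["seriesID"], obs["year"], obs["period"], obs["value"])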
  • The Department has consistently sought to make as much data about its activities available to the public as possible. Examples include DOL’s Public Enforcement Database, which makes available records of activity from the worker protection agencies, and the Office of Labor-Management Standards’ online public disclosure room.
  • The Department also has multiple restricted-use access systems which go beyond what would be possible with simple open-data efforts. BLS has a confidential researcher access program, offering access under appropriate conditions to sensitive data. Similarly, the Chief Evaluation Office (CEO) has stood up a centralized research hub for evaluation study partners to leverage sensitive data in a consistent manner to help make evidence generation more efficient.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • The Department has conducted extensive inventories over the last ten years, in part to support common activities such as IT modernization, White House Office of Management and Budget (OMB) data calls, and the general goal of transparency through data sharing. These inventories form the current basis of DOL’s planning and administration. Some sections of the Evidence Act have led to a different federal posture with respect to data, such as the requirement that data be open by default and considered shareable absent a legal requirement to the contrary or a risk that release would create a disclosure risk. Led by the Chief Data Officer and the DOL Data Board, the Department is currently re-evaluating its inventories and its public data offerings in light of this requirement and revisiting the issue across all its programs. Because this is a critical prerequisite to developing open data plans, as well as data governance and data strategy frameworks, the agency hopes to complete a revised inventory during FY21.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • DOL’s CEO, Employment and Training Administration (ETA), and Veterans Employment and Training Service (VETS) have worked with the U.S. Department of Health and Human Services (HHS) to develop a secure mechanism for obtaining and analyzing earnings data from the Directory of New Hires. Since FY20, DOL has entered into interagency data sharing agreements with HHS and obtained data to support 10 job training and employment program evaluations.
  • Since FY20, the Department continued to expand efforts to improve the quality of and access to data for evaluation and performance analysis through the Data Analytics Unit in CEO, and through new pilots beginning in BLS to access and exchange state labor market and earnings data for statistical and evaluation purposes. 
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • The State Wage Interchange System (SWIS) is a mechanism through which states can exchange wage data on an interstate basis with other states in order to satisfy performance-related reporting requirements under the Workforce Innovation and Opportunity Act (WIOA), as well as for other permitted purposes specified in the agreement. The SWIS agreement includes the U.S. Department of Labor’s Adult, Dislocated Worker, and Youth programs (Title I) and Employment Service program (Title III); the Department of Education’s Adult Education and Family Literacy Act program (Title II) and programs authorized under the Carl D. Perkins Career and Technical Education Act of 2006 (as amended); and the Vocational Rehabilitation program (Title IV). The Departments have established agreements with all 50 states, the District of Columbia, and Puerto Rico.
  • ETA continues to fund and provide technical assistance to states under the Workforce Data Quality Initiative to link earnings and workforce data with education data to support state program administration and evaluation. These grants support the development and expansion of longitudinal databases and enhance states’ ability to share performance data with stakeholders. The databases include information on programs that provide training and employment services, along with similar information obtained in the service delivery process.
  • ETA is also working to assess the completeness of self-reported demographic data to inform both agency-level equity priorities and future technical assistance efforts for states and grantees to improve the completeness and quality of this information. ETA has also incorporated into funding opportunity announcements (FOAs) the requirement to make any data on credentials transparent and accessible through the use of open linked data formats.
  • ETA has been working with the Department’s OCIO to build a new case management system for its national and discretionary grantees, known as the Grants Performance Management System (GPMS). In addition to supporting case management by grantees, GPMS supports these grantees in meeting WIOA-mandated performance collection and reporting needs and enables automation to ensure programs can continue to meet updated WIOA requirements. ETA is working to integrate GPMS with the Workforce Integrated Performance System (WIPS) as programs onboard into GPMS, to seamlessly calculate and report WIOA primary indicators of performance and other calculations in programs’ quarterly performance reports (QPRs).
Common Evidence Standards / What Works Designations
Score: 9

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY21?

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • DOL’s Clearinghouse for Labor Evaluation and Research (CLEAR) evidence guidelines, which describe quality standards for different types of studies, are applied to all independent evaluations, including third-party evaluations of DOL programs, that are determined eligible for CLEAR’s evidence reviews across different topic areas. Requests for proposals also indicate that these CLEAR standards should be applied to all Chief Evaluation Office (CEO) evaluations when considering which designs are the most rigorous and appropriate for answering specific research questions.
  • In addition, the DOL Evaluation Policy identifies principles and standards for evaluation planning and dissemination. DOL also collaborates with other agencies (the U.S. Department of Health and Human Services (HHS), the U.S. Department of Education’s Institute of Education Sciences (IES), the National Science Foundation (NSF), and the Corporation for National and Community Service (CNCS)) to develop technological procedures to link and share reviews across clearinghouses.
6.2 Did the agency have a common evidence framework for funding decisions?
  • DOL uses the CLEAR evidence guidelines and standards to make decisions about discretionary program grants awarded using evidence-informed or evidence-based criteria. The published guidelines and standards are used to identify evidence-based programs and practices, to review studies to assess the strength of their causal evidence, and to conduct structured evidence reviews in particular topic areas or timeframes that inform agencies about which strategies appear promising and where gaps exist.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • DOL’s CLEAR is an online evidence clearinghouse. CLEAR’s goal is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly, so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies and outcome analyses, implementation studies, and causal impact studies. For causal impact studies, CLEAR assesses the strength of the design and methodology in studies that look at the effectiveness of particular policies and programs. CLEAR’s study summaries and icons, found in each topic area, can help users quickly and easily understand what studies found and how much confidence to have in the results.
  • CLEAR’s search tool allows users to find studies based on target population, including race and other demographic characteristics.   
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • DOL promotes the utilization of evidence-based practices in a variety of ways. For example, the Employment and Training Administration (ETA) maintains a user-friendly database, Workforce System Strategies (WSS), that highlights the use of evidence-based interventions, and a community of practice, the Evaluation and Research Hub (EvalHub). WSS is a comprehensive database of over 1,400 profiles that summarize a wide range of findings from reports, studies, technical assistance tools, and guides that support program administration and improvement. The EvalHub is a community of practice created to support evidence- and evaluation-capacity-building efforts within state workforce development programs. In another effort to promote evidence-based practices, ETA has supported an Applied Data Analytics program offered through the Coleridge Initiative for multiple teams from state workforce agencies. DOL agencies, like ETA, are also making a concerted effort to help states build evaluation capacity to meet the program evaluation requirements of the Reemployment Services and Eligibility Assessment (RESEA) program through tools such as RESEA program evaluation technical assistance (RESEA EvalTA).
Innovation
Score: 5

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY21?

7.1 Did the agency have staff dedicated to leading its innovation efforts to improve the impact of its programs?
  • The Department of Labor’s Chief Innovation Officer is responsible for efforts to use innovative technologies, partnerships, and practices to accelerate the Department’s mission. The Chief Innovation Officer reports to the Deputy Secretary and also serves as the Senior Advisor for Delivery for the Department.
  • DOL’s Chief Data Officer and Chief Evaluation Office (CEO) Data Analytics team developed a secure data analysis platform accessible to all DOL staff, pre-loaded with common statistical packages and offering the capability to access and merge various administrative data for analysis. DOL also supports staff in executing web-based A/B testing and other behaviorally informed trials through a shared service, the Granicus GovDelivery communications tool, which includes free technical support. This tool enhances the Department’s ability to communicate with the public, such as through targeted email campaigns, and to adjust these communications, informed by testing and data, to increase engagement on relevant topics. The CEO has also developed toolkits and detailed resources to help staff effectively design behaviorally informed tests, shared on its new Behavioral Interventions website.
7.2 Did the agency have policies, processes, structures, or programs to promote innovation to improve the impact of its programs?
  • The CEO uses a variety of communication tools to share rigorous research results, lessons learned, promising practices, and other implications of its research. These include internal briefings from independent contractors and researchers; a brown-bag series that features evidence-based promising practices and results shared by DOL staff, for DOL staff; and an external expert seminar series featuring new findings or innovations in relevant areas of work. CEO staff consistently use research findings in the development of new research, and DOL agencies use findings to design and guide new discretionary grant programs, to refine performance measures for grantees, and to make decisions on compliance and enforcement practices.
  • DOL is strongly committed to promoting innovation in its policies and practices. For example, the Employment and Training Administration’s (ETA) competitive funding routinely supports innovative programming, since grantees typically bundle various program services and components to best meet the needs of the people they serve in their local contexts. A particularly good example is the Administration’s high-priority area of apprenticeship. In FY19, ETA issued nearly $184 million in Scaling Apprenticeship Through Sector-Based Strategies grants to public-private partnerships to expand apprenticeships into healthcare, information technology, and other industries. In FY20, ETA awarded nearly $100 million in Apprenticeship: Closing the Skills Gap grants. Additionally, ETA and CEO are conducting an evaluation of the American Apprenticeship Initiative, which since 2016 has provided $175 million in grants to 45 grantees across the nation.
  • In addition, CEO’s Behavioral Insights team works continuously with a number of DOL agencies to identify and assess the feasibility of studies in which insights from behavioral science can be used to improve the performance and outcomes of DOL programs. The Wage and Hour Division’s (WHD) Transformation Team is one example where continuous improvement efforts are driving innovation; its work has identified potential areas where behavioral interventions and trials may inform program improvement. CEO is also working across agencies, including WHD, ETA, the Women’s Bureau, the Veterans Employment and Training Service (VETS), the Office of Federal Contract Compliance Programs (OFCCP), and the Bureau of International Labor Affairs (ILAB), to identify other areas where behavioral science can improve program performance and outcomes.
  • DOL has also built capacity for staff innovation through the Performance Management Center’s Continuous Process Improvement (CPI) Program, an agency-wide opportunity that trains and certifies agency staff in Lean Six Sigma (LSS) methodologies through real-time execution of DOL process improvement projects. The program includes classroom sessions that prepare participants for LSS Black Belt certification examinations, including the American Society for Quality (ASQ) certification as well as DOL’s own.
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
  • DOL, through the annual Learning Agenda process, systematically identifies gaps in the use of evidence. Its innovation efforts focus on filling known gaps via dissemination, further research, or quick-turnaround assessments, like those offered to the Department by CEO’s Behavioral Insights Program.
  • DOL typically couples innovation with rigorous evaluation to learn from experiments. For example, DOL is participating in the Performance Partnership Pilots (P3) for innovative service delivery for disconnected youth, which not only includes waivers and the blending and braiding of federal funds but also gives bonus points in application reviews for proposing “high tier” evaluations. DOL is the lead agency for the evaluation of P3; a final report is available on the CEO’s completed studies website.
  • DOL routinely uses Job Corps’ demonstration authority to test and evaluate innovative and promising models to improve outcomes for youth. Currently, the CEO is sponsoring a rigorous impact evaluation examining the effectiveness of one of these pilots, the Cascades Job Corps College and Career Academy, with results expected in FY22.
Use of Evidence in Competitive Grant Programs
Score: 8

Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY21?

8.1 What were the agency’s five largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY21, the five largest competitive grant programs were:
    1. Senior Community Service Employment Program (Approximately $405 million in continuation funds: eligible applicants are non-profit organizations, federal agencies, and tribal organizations)
    2. State Apprenticeship Expansion, Equity and Innovation Grants (Approximately $99 million: eligible applicants are states)
    3. National Farmworker Jobs Program (Approximately $94 million in continuation funds: eligible applicants are entities with an understanding of the problems of eligible migrant and seasonal farmworkers; a familiarity with the agricultural industries and the labor market needs of the proposed service area; and the ability to demonstrate a capacity to administer and deliver a diversified program of workforce investment activities)
    4. YouthBuild (Approximately $89 million: eligible applicants are public and private non-profit agencies)
    5. Pathway Home (Approximately $61 million: eligible applicants are non-profit organizations with 501(c)(3) status; public institutions of higher education; nonprofit postsecondary education institutions; state or local governments; any Indian or Native American entity eligible for grants under section 166 of WIOA; and for-profit businesses and business-related nonprofit organizations)
  • During the summer of 2021, ETA held a series of stakeholder listening sessions focused on grant equity in an effort to establish a baseline understanding of potential barriers to greater equity in the mix of grant applicants, peer reviewers, awardees, and communities served. This information will help inform future grantmaking decisions.
8.2 Did the agency use evidence of effectiveness to allocate funds in the five largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • The Employment and Training Administration’s (ETA) YouthBuild applicants are awarded points based on past performance; ETA views these metrics as important to demonstrating successful career outcomes for youth. As a pre-apprenticeship program that prepares young people for the construction industry and other in-demand industries, YouthBuild supports the evidence-based national strategy of apprenticeship. Other competitive grant programs that score applications on past performance and the use of evidence-informed strategies are the Senior Community Service Employment Program and the National Farmworker Jobs Program.
8.3 Did the agency use its five largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • All five of DOL’s largest grant programs may be involved in evaluations designed by the Chief Evaluation Office (CEO) and the relevant DOL agencies. In each case, DOL required or encouraged (through language in the funding announcement and proposal review criteria) grantees to use evidence-based models or strategies in grant interventions and/or to participate in an evaluation, especially to test new interventions that theory or research suggests are promising.
  • For example, DOL is conducting an evaluation of the Pathway Home grant program. This evaluation will build knowledge about the grant models and include the development of a feasibility and design options paper for implementation and impact evaluations. Additionally, DOL has recently launched a multi-year implementation study of the Senior Community Service Employment Program, as well as other workforce programs for older workers, to build the evidence base on these programs and identify future research options. The contract includes options for more rigorous evaluations as appropriate.
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant programs (besides its five largest grant programs)?
  • DOL requires demonstrated effectiveness in the allocation of funds as well as a commitment to building new evidence as a condition of receiving funds; both carry equal weight because many DOL-funded programs lack a body of evidence sufficient to support only those that are already evidence-based. For example, recent Employment and Training Administration (ETA) competitive grant programs have required: (1) a demonstration that an approach is evidence-based or promising for receipt of funds (e.g., the Reentry Funding Opportunity Announcement) or for the potential to receive additional funds (e.g., TechHire); (2) an independent third-party local or grantee evaluation, with priority incentives for rigorous designs (e.g., tiered funding, scoring priorities, bonus scoring for evidence-based interventions or multi-site rigorous tests); or (3) full participation in an evaluation as well as rigorous grantee (or local) evaluations (e.g., the American Apprenticeship Initiative and the Strengthening Community College Training Grants). Additionally, applicants for the Bureau of International Labor Affairs’ (ILAB) competitive funding opportunities are required to conduct and/or participate in evaluations as a condition of award.
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?  
  • In 2015, DOL funded an evaluation of the 36-month Linking Employment Activities Pre-Release (LEAP) Program, which included an implementation study of LEAP pilot programs that provided jail-based American Job Centers (AJCs) for individuals preparing to re-enter society after time in jail. The evaluation identified many promising practices for offering both pre- and post-release services; its findings were published in 2018 (see the Final Report and Issue Brief Compendium). In 2020, DOL funded the 42-month Pathway Home Pilot Project and an accompanying evaluation that builds on lessons learned from the LEAP program by providing workforce services to incarcerated individuals pre- and post-release. For example, the Pathway Home grants require participants to maintain the same caseworker pre- and post-release, a practice the LEAP Implementation Study suggested as promising.
  • DOL funded a national evaluation of the Trade Adjustment Assistance Community College and Career Training (TAACCCT) grant program, a $1.9 billion initiative consisting of four rounds of grants from 2011 to 2018. The grants were awarded to institutions of higher education (mainly community colleges) to build their capacity to provide workforce education and training programs. The implementation study assessed grantees’ implementation of strategies to better connect and integrate education and workforce systems, address employer needs, and transform training programs and services for adult learners. A synthesis identified key implementation and impact findings based on a review of evaluation reports completed by grantees’ third-party evaluators. The outcomes study examined the training, employment, earnings, and self-sufficiency outcomes of nearly 2,800 participants from nine grants in Round 4. Findings from these studies provide evidence-based practices and insights that are being applied to the new Strengthening Community College Initiative Funding Opportunity Announcement, as well as future DOL investments.
8.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • DOL has a formal Evaluation Policy. Guidance on using funds to conduct and/or participate in program evaluations and/or to strengthen evaluation capacity-building efforts can be found in each grant funding opportunity, and participation is a condition of many grants. The “Special Program Requirements” section of the respective grant funding opportunity notifies grantees of this responsibility. Generally, this section states: “As a condition of grant award, grantees are required to participate in an evaluation, if undertaken by DOL. The evaluation may include an implementation assessment across grantees, an impact and/or outcomes analysis of all or selected sites within or across grantees, and a benefit/cost analysis or assessment of return on investment. Conducting an impact analysis could involve random assignment (which involves random assignment of eligible participants into a treatment group that would receive program services or enhanced program services, or into control group(s) that would receive no program services or program services that are not enhanced).”
  • DOL may require applicants to collect data elements to aid the evaluation. As part of the evaluation, and as a condition of award, grantees must agree to: (1) make records on participants, employers, and funding available to the evaluation contractor; (2) provide access to program operating personnel, participants, and operational and financial records, and any other pertinent documents needed to calculate program costs and benefits; (3) in the case of an impact analysis, facilitate the assignment by lottery of participants to program services (including the possible increased recruitment of potential participants); and (4) follow evaluation procedures as specified by the evaluation contractor under the direction of DOL, including after the period of operation. After award, grantees will receive detailed guidance on ETA’s evaluation methodology, including requirements for data collection, and will receive technical assistance to support their participation in these activities.
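  • To make the assignment-by-lottery requirement concrete, the sketch below shows a simple way such a lottery could be run. It is an illustrative sketch only: the participant IDs, the 50/50 split, and the fixed seed are assumptions, not DOL’s or any evaluation contractor’s actual procedure.

      # Illustrative lottery: randomly assign eligible participants to a treatment
      # group (receives program services) or a control group (does not).
      # Participant IDs, the 50/50 split, and the seed are assumptions.
      import random

      def assign_by_lottery(participants, treatment_share=0.5, seed=2021):
          rng = random.Random(seed)  # fixed seed keeps the draw reproducible/auditable
          pool = list(participants)
          rng.shuffle(pool)
          cutoff = int(len(pool) * treatment_share)
          return pool[:cutoff], pool[cutoff:]  # (treatment, control)

      treatment, control = assign_by_lottery(f"participant-{i}" for i in range(1, 101))
      print(len(treatment), "assigned to treatment;", len(control), "to control")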
Use of Evidence in Non-Competitive Grant Programs
Score: 7

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY21?

9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY21, the five largest non-competitive grant programs were:
    1. Adult Employment and Training Activities ($862,649,000; eligible grantees: city, county, and/or state governments);
    2. Youth Activities ($921,130,000; eligible grantees: city, county, and/or state governments);
    3. Dislocated Worker Employment and Training formula grants ($1,061,553,000; eligible grantees: city, county, and/or state governments);
    4. UI State Administration ($2,365,816,000; eligible grantees: city, county, and/or state governments);
    5. Employment Security Grants to States ($670,052,000; eligible grantees: city, county, and/or state governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • A signature feature of the Workforce Innovation and Opportunity Act (WIOA) (Pub. L. 113-128) is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to states’ role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, but there are additional requirements regarding coordination (with other state agencies and federal evaluations under WIOA), dissemination, and provision of data and other information for federal evaluations.
  • WIOA’s evidence and performance provisions: (1) increased the amount of WIOA funds states can set aside and distribute directly from 5–10% to 15% and authorized them to invest these funds in Pay for Performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in Pay for Performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in Pay for Performance initiatives; and (4) authorized states and local workforce investment boards to award Pay for Performance contracts to intermediaries, community-based organizations, and community colleges.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Section 116(e) of WIOA describes how the state, in coordination with local workforce boards and state agencies that administer the programs, shall conduct ongoing evaluations of activities carried out in the state under these state programs. These evaluations are intended to promote, establish, implement, and utilize methods for continuously improving core program activities in order to achieve high-level programs within, and high-level outcomes from, the workforce development system. 
  • The Employment and Training Administration sponsors WorkforceGPS, a community point of access that supports workforce development professionals in their use of evaluations to improve state and local workforce systems. Professionals can access a variety of resources and tools, including an Evaluation Peer Learning Cohort to help leaders improve their research and evaluation capacities. WorkforceGPS includes links to resources on evaluation readiness assessment, evaluation design, and performance data, all focused on improving the public workforce system.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • Reemployment Services and Eligibility Assessments (RESEA) funds must be used for interventions or service delivery strategies demonstrated to reduce the average number of weeks of unemployment insurance a participant receives by improving employment outcomes. The law provides for a phased implementation of the new program requirements over several years. In FY19, DOL awarded $130 million to states to conduct RESEA programs that met these evidence of effectiveness requirements. Beginning in FY23, states must also use no less than 25% of RESEA grant funds for interventions with a high or moderate causal evidence rating that show a demonstrated capacity to improve outcomes for participants; this percentage increases in subsequent years until after FY26, when states must use no less than 50% of such grant funds for such interventions. 
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Institutional Analysis of American Job Centers: The goal of the evaluation was to understand and systematically document the institutional characteristics of American Job Centers (AJCs) and to identify variations in service delivery, organizational structure, and administration across AJCs.
  • Career Pathways Descriptive and Analytical Study: WIOA requires DOL to “conduct a multistate study to develop, implement, and build upon career advancement models and practices for low-wage healthcare providers or providers of early education and child care.” In response, DOL conducted the Career Pathways Design Study to develop evaluation design options that could address critical gaps in knowledge related to the approach, implementation, and success of career pathways strategies generally, and in early care and education specifically. The Chief Evaluation Office (CEO) has recently begun the second iteration of this study, which builds on the evaluation design work CEO completed in 2018 to build evidence about the implementation and effectiveness of career pathways approaches and to meet the WIOA statutory requirement to conduct a career pathways study. It will include a meta-analysis of existing impact evaluation results and will examine how workers advance over time through multiple, progressively higher levels of education and training, and associated jobs, within a pathway, as well as the factors associated with their success.
  • Analysis of Employer Performance Measurement Approaches: The goal of the study was to examine the appropriateness, reliability, and validity of proposed measures of effectiveness in serving employers required under WIOA. It included knowledge development to understand and document the state of the field, an analysis and comparative assessment of measurement approaches and metrics, and the dissemination of findings through a report as well as research and topical briefs. Though the authors did not find an overwhelming case for adopting either one measure or several, adopting more than one measure offers the advantage of capturing more aspects of performance and may make results more actionable for the different Title I, II, III, and IV programs; alternatively, a single measure has the advantage of clarity about how state performance is assessed and requires fewer resources for record keeping.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The Employment and Training Administration’s (ETA) RESEA grantees may use up to 10% of their grant funds for evaluations of their programs. ETA released specific evaluation guidance to help states understand how to conduct evaluations of their RESEA interventions with these grant funds. The goal of the agency guidance, along with the evaluation technical assistance being provided to states and their partners, is to build states’ capacity to understand, use, and build evidence.
  • Section 116 of WIOA establishes performance accountability indicators and performance reporting requirements to assess the effectiveness of states and local areas in achieving positive outcomes for individuals served by the workforce development system’s core programs. Section 116(e) of WOIA requires states to “employ the most rigorous analytical and statistical methods that are reasonably feasible, such as the use of control groups” and requires that states evaluate the effectiveness of their WOIA programs in an annual progress which includes updates on (1) current or planned evaluation and related research projects, including methodologies used; (2) efforts to coordinate the development of evaluation and research projects with WIOA core programs, other state agencies and local boards; (3) a list of completed evaluation and related reports with publicly accessible links to such reports; (4) efforts to provide data, survey responses, and timely visits for Federal evaluations; (5) any continuous improvement strategies utilizing results from studies and evidence-based practices evaluated. States are permitted to use WOIA grant funds to perform the necessary performance monitoring and evaluations to complete this report.
Repurpose for Results
Score: 4

In FY21, did the agency shift funds away from or within any practice, policy, interventions, or program that consistently failed to achieve desired outcomes?

10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
  • The Employment and Training Administration’s (ETA) prospective YouthBuild and Job Corps grant applicants are selected, in part, based on past performance. These programs consider an entity’s record of demonstrated effectiveness in achieving critical outcomes for youth.
  • Reforming Job Corps provides an example of efforts to repurpose resources based on rigorous analysis of available data. As part of this reform effort, DOL’s FY20 budget request ends the Department of Agriculture’s (USDA) involvement in the program, unifying responsibility in DOL. Workforce development is not a core USDA role, and the 25 centers it operates are overrepresented in the lowest-performing cohort of centers.
  • A rigorous 2012 evaluation of the Trade Adjustment Assistance (TAA) Program demonstrated that workers who participated in the program had lower earnings than the comparison group at the end of a four-year follow-up period, in part because they were more likely to participate in long-term job training programs rather than immediately reentering the workforce. However, this training was not targeted to in-demand industries and occupations; as Mathematica’s evaluation of the TAA program found, only 37% of participants became employed in the occupations for which they trained. In the FY21 budget request, the Department addresses these issues by continuing to propose a reauthorization of the TAA program that focuses on apprenticeship, on-the-job training, and other earn-as-you-learn strategies that ensure participants train for relevant occupations.
  • DOL’s FY20 budget request eliminates funding for the Senior Community Service Employment Program (SCSEP). SCSEP has a goal of transitioning half of participants into unsubsidized employment within the first quarter after exiting the program, but has struggled to achieve even this modest goal.
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
  • The Department’s Employment and Training Administration sponsors WorkforceGPS, a community point of access that supports workforce development professionals in their use of evaluations to improve state and local workforce systems.
  • Professionals can access a variety of resources and tools, including a learning cohort community to help leaders improve their research and evaluation capacities. WorkforceGPS includes links to resources on assessment readiness, evaluation design, and performance data, all focused on improving the public workforce system.