2021 Federal Standard of Excellence


Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY21? (Example: Performance stat systems, frequent outcomes-focused data-informed meetings)

Score
6
Millennium Challenge Corporation
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • MCC is committed to using high-quality data and evidence to drive its strategic planning and program decisions. The Monitoring and Evaluation plans for all programs and tables of key performance indicators for all projects are available online by compact and threshold program and by sector, for use by both partner countries and the general public. Prior to investment, MCC performs a Cost-Benefit Analysis to assess the potential impact of each project and estimates an Economic Rate of Return (ERR). MCC uses a 10% ERR hurdle to prioritize and fund the projects with the greatest opportunity for impact. MCC then recalculates ERRs at investment closeout, drawing on MCC’s monitoring data (among other data and evidence), to test original assumptions and assess the cost-effectiveness of MCC programs. To complete the evidence loop, MCC now includes evaluation-based cost-benefit analysis as part of its independent final evaluations: evaluators reanalyze the MCC-produced ERR and associated project assumptions five or more years after investment close to understand whether and how the projected benefits actually accrued. These evaluation-based ERRs deepen understanding of the long-term effects and sustainable impact of MCC’s programs. A simplified illustration of the ERR calculation appears below.
  • In addition, MCC produces periodic reports that capture the results of MCC’s learning efforts in specific sectors and translate that learning into actionable evidence for future programming. Once MCC has a critical number of evaluations in a given sector, the agency endeavors to draw portfolio-wide learning from that sector in the form of Principles into Practice reports. In FY21, MCC published a new Principles into Practice report on its research related to learning in the water, sanitation, and hygiene sector: Lessons from Evaluations of MCC Water, Sanitation, and Hygiene Programs. MCC is also working on forthcoming Principles into Practice reports on general education and on its evidence-based scorecard selection process.
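  • For illustration only, the short Python sketch below shows the basic mechanics described above: an ERR is the discount rate at which the net present value (NPV) of a project’s stream of net benefits equals zero, and that rate can then be compared with MCC’s 10% hurdle. The cash flows and the simple bisection solver are hypothetical and are not drawn from MCC’s actual models or data.

      # Hypothetical illustration of an ERR check against a 10% hurdle rate.
      # The cash flows and solver are invented for this sketch; they are not MCC's.

      def npv(rate, net_benefits):
          """Net present value of a stream of annual net benefits (benefits minus costs)."""
          return sum(b / (1 + rate) ** t for t, b in enumerate(net_benefits))

      def err(net_benefits, lo=-0.99, hi=10.0, tol=1e-6):
          """Discount rate at which NPV equals zero, found by bisection."""
          for _ in range(200):
              mid = (lo + hi) / 2
              if npv(lo, net_benefits) * npv(mid, net_benefits) <= 0:
                  hi = mid          # sign change lies in [lo, mid]
              else:
                  lo = mid          # sign change lies in [mid, hi]
              if hi - lo < tol:
                  break
          return (lo + hi) / 2

      # Hypothetical project: a $100 investment in year 0, then ten years of net benefits.
      flows = [-100] + [18] * 10
      rate = err(flows)
      print(f"Estimated ERR: {rate:.1%}")   # roughly 12% for these made-up flows
      print("Clears 10% hurdle" if rate >= 0.10 else "Below 10% hurdle")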
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • MCC continues to implement and expand a new reporting system that enhances MCC’s credibility around results, transparency, learning, and accountability. The Star Report and its associated quarterly business process capture key information to provide a framework for results and to improve MCC’s ability to promote and disseminate learning and evidence throughout the compact and threshold program lifecycle. For each compact and threshold program, evidence is collected on performance indicators, evaluation results, partnerships, sustainability efforts, and learning, among other elements. Critically, this information is available in a single report after each program ends: each country’s Star Report is published roughly seven months after its program is completed.
  • Continual learning and improvement are central to MCC’s operating model. MCC monitors progress toward compact and threshold program results on a quarterly basis, using performance indicators specified in the Monitoring and Evaluation (M&E) Plan for each country’s investments. The M&E Plans specify indicators at all levels (process, output, and outcome) so that progress toward final results can be tracked. Every quarter, each partner country submits an Indicator Tracking Table that shows actual performance of each indicator relative to the baseline established before the activity began and the performance targets established in the M&E Plan. Key performance indicators and their accompanying data by country are updated every quarter and published online. MCC management and the relevant country team review these data in a formal Quarterly Performance Review meeting to assess whether results are being achieved and integrate this information into project management and implementation decisions.
  • In FY21, MCC also launched a new interactive sector-level learning product: Sector Results and Learning pages. These interactive web pages promote learning and inform program design by consolidating the latest monitoring data, independent evaluation results, and lessons from the key sectors in which MCC invests. Critically, this information is now publicly available in one place for the first time. An interactive learning database allows practitioners to efficiently retrieve past learning to inform new programs. MCC has published Sector Results and Learning pages for the water, sanitation, and hygiene (WASH) and transportation sectors; pages focused on agriculture and irrigation, education, energy, and land will become available throughout 2021.
Score
8
U.S. Department of Education
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ED’s current FY18-22 Strategic Plan includes two parallel objectives, one for P-12 and one for higher education (Strategic Objectives 1.4 and 2.2, respectively), that focus on supporting agencies and educational institutions in the identification and use of evidence-based strategies and practices. The OPEPD ELG co-chair is responsible for both strategic objectives. An FY22-26 Strategic Plan is under development and will be released in February 2022.
  • All Department Annual Performance Reports (for the most recent fiscal year) and Annual Performance Plans (for the upcoming fiscal year) are located on ED’s website.
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • The Grants Policy Office in the Office of Planning, Evaluation and Policy Development (OPEPD) works with offices across ED to ensure alignment with the Secretary’s priorities, including evidence-based practices. The Grants Policy Office looks at where ED and the field can continuously improve by building stronger evidence, making decisions based on a clear understanding of the available evidence, and disseminating evidence to decision-makers. Specific activities include: strengthening the connection between the Secretary’s policies and grant implementation from design through evaluation; supporting a culture of evidence-based practices; providing guidance to grant-making offices on how to integrate evidence into program design; and identifying opportunities where ED and the field can improve by building, understanding, and using evidence. The Grants Policy Office collaborates with offices across the Department on a variety of activities, including reviews of efforts used to determine grantee performance.
  • ED is focused on efforts to disaggregate outcomes by race and other demographics and to communicate those results to internal and external stakeholders. For example, in Q1 of FY21, the Office of the Chief Data Officer (OCDO) launched the Education Stabilization Fund (ESF) Transparency Portal at covid-relief-data.ed.gov, allowing ED to track performance, hold grantees accountable, and provide transparency to taxpayers and oversight bodies. The portal was updated in June 2021 to include Annual Performance Report (APR) data from CARES Act grantees, allowing ED and the public to monitor support for students and teachers and track grantee progress. The portal displays key data from the APRs, summarizing how CARES Act funds were used by states and districts from March through September 2020, and by institutions of higher education from March through December 2020.
  • The APR forms for the next data collection in FY22 give ESF grantees the opportunity to further disaggregate the data collected on ESF funds. For example, the Elementary and Secondary School Emergency Relief (ESSER) form asks for counts of students who participated in various activities to support learning recovery or acceleration for subpopulations disproportionately impacted by the COVID-19 pandemic. Categories include students with one or more disabilities, low-income students, English language learners, students in foster care, migratory students, students experiencing homelessness, and five race/ethnicity categories.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • As part of its performance improvement efforts, senior career and political leadership convene in Quarterly Performance Review (QPR) meetings. As part of the QPR process, the Performance Improvement Officer leads senior career and political officials in a review of ED’s progress toward its two-year Agency Priority Goals and four-year Strategic Goals. In each QPR, assembled leadership reviews metrics that are “below target” and brainstorms potential solutions, and celebrates progress toward goals that are “on track” for the current fiscal year.
  • The Department conducted after-action reviews after the FY19 and FY20 competition cycles to reflect on successes of the year as well as opportunities for improvement. The reviews resulted in process updates for FY21. In addition, the Department updated an optional internal tool to inform policy deliberations and progress on the Secretary’s policy priorities, including the use of evidence and data.
Score
10
U.S. Agency for International Development
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • USAID partners with the U.S. Department of State to jointly develop and implement clear strategic goals, strategic objectives, and performance goals, which are articulated in the FY18-22 U.S. Department of State – USAID Joint Strategic Plan (JSP). USAID and the Department of State have commenced development of the FY22-26 JSP to incorporate new strategic themes. As part of the planning process, issues of racial equity, diversity, and inclusion are being considered under both management-oriented and programmatic goals. The FY22-26 JSP will include a section on evidence-building, and USAID’s and the Department of State’s respective learning agendas will be included in the annex.
  • The Agency measures progress towards its own strategic goals, strategic objectives, and performance goals using data from across the Agency, including the annual Performance Plans and Reports (PPRs) completed by operating units, and uses that information to report on performance externally through the Annual Performance Plan/Annual Performance Report (APP/APR) and the Agency Financial Report.
  • To aggregate and track performance in key sectors, USAID works with the U.S. Department of State to develop and manage over 100 standard foreign assistance indicators that have common definitions and defined collection methods. Once the data are finalized, USAID publishes illustrative indicator data on a publicly available website known as Dollars to Results. Finally, USAID reports on Agency Priority Goal (APG) and Cross-Agency Priority (CAP) Goal progress on www.performance.gov.
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • Most of USAID’s innovation and co-created programs, and those done in partnership, reflect a data-driven “pay for results” model, in which milestones are agreed upon by all parties and payments are made when milestones are achieved. This means that, for some programs, if a milestone is unmet, funds may be reapplied to an innovation or intervention that is achieving results. This rapid, iterative performance model means that USAID more quickly learns what is not working and can move resources away from it and toward what is working.
  • Approaches such as prizes, Grand Challenges, and ventures can also be constructed as “pay for results only,” using interventions such as Development Impact Bonds so that USAID pays only for outcomes, not for inputs or attempts. The Agency believes this model will pave the way for much of USAID’s work to be aligned with a “pay for results” approach. USAID is also piloting the use of the impact per dollar of cash transfers as a minimum standard of cost-effectiveness for applicable program designs; most innovations funded at USAID have a clear “cost per impact” ratio. A simplified illustration of this kind of benchmark comparison appears below.
  • Additionally, USAID Missions develop Country Development Cooperation Strategies (CDCSs) with clear goals and objectives and a Performance Management Plan (PMP) that identifies expected results, performance indicators to measure those results, plans for data collection and analysis, and regular review of performance measures to use data and evidence to adapt programs for improved outcomes. USAID also promotes data-informed operations performance management to ensure that the Agency achieves its development objectives and aligns resources with priorities. USAID uses its Management Operations Council to conduct an annual Strategic Review of progress toward achieving the strategic objectives in the Agency’s strategic plan.
  • To improve linkages and break down silos, USAID continues to develop and pilot the Development Information Solution (DIS), an enterprise-wide management information system that will enable USAID to collect, manage, and visualize performance data across units, along with budget and procurement information, to more efficiently manage and execute programming. USAID is currently deploying the performance management module worldwide, with almost half of its operating units now live in the system.
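  • As a simplified, purely hypothetical illustration of the cash-transfer benchmark idea described above, the Python sketch below compares an intervention’s impact per dollar against the impact per dollar of a cash transfer. The outcome metric and all figures are invented and do not reflect USAID data or methodology.

      # Hypothetical sketch of an impact-per-dollar comparison against a cash-transfer benchmark.
      # All figures and the outcome metric are invented for illustration.

      def impact_per_dollar(total_impact, total_cost):
          """Outcome units generated per dollar spent."""
          return total_impact / total_cost

      # Benchmark: a cash-transfer program delivering 500 outcome units for $100,000.
      cash_benchmark = impact_per_dollar(500, 100_000)

      # Candidate intervention: 900 outcome units for $150,000.
      candidate = impact_per_dollar(900, 150_000)

      print(f"Cash benchmark: {cash_benchmark:.4f} units per dollar")
      print(f"Candidate:      {candidate:.4f} units per dollar")
      print("Meets the minimum standard" if candidate >= cash_benchmark else "Falls below the minimum standard")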
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • USAID’s Program Cycle policy (ADS 201.3.2.18) requires that Missions conduct at least one portfolio review per year that focuses on progress toward strategy-level results. Missions must also conduct a CDCS mid-course stocktaking at least once during the course of implementing their Country Development Cooperation Strategy, which typically spans five years.
  • USAID developed an approach called Collaborating, Learning, and Adapting (CLA) to explicitly ensure adaptation through learning. It is incorporated into USAID’s Program Cycle guidance (ADS 201.3.5.19), which states: “Strategic collaboration, continuous learning, and adaptive management link together all components of the Program Cycle.” Through CLA, USAID ensures its programming is coordinated with others, grounded in a strong evidence base, and iteratively adapted to remain relevant throughout implementation.
  • The CDO’s team maintains an internal dashboard, shared with the Evaluation Officer and Statistical Official, to help track progress against milestones on an ongoing basis. This helps ensure that data needs are being met and that intended results are being achieved.
  • In addition to this focus within its programming, USAID has two senior bodies that oversee Enterprise Risk Management and meet regularly to improve the accountability and effectiveness of USAID programs and operations through holistic risk management. USAID tracks progress toward strategic goals and annual performance goals during data-driven reviews at Management Operations Council meetings. Through input from the Management Operations Council, an annual Agency-wide customer service survey, and other analysis, USAID regularly identifies opportunities for operational improvements at all levels of the Agency as part of its operational learning agenda as well as the Agency-wide learning agenda. The initial set of learning questions under the Agency learning agenda included four questions focused on operational aspects of the Agency’s work, which influence everything from internal policy, design, and procurement processes to program measurement and staff training. As the Agency Learning Agenda is being revised, the focus on including key operational questions to support continuous improvement remains.
Score
6
Administration for Children and Families (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • Every four years, HHS updates its Strategic Plan, which describes its work to address complex, multifaceted, and evolving health and human services issues. ACF was an active participant in the development of the FY18-FY22 HHS Strategic Plan, which includes several ACF-specific objectives. HHS is starting the process of developing an updated FY22-FY26 HHS Strategic Plan, and ACF will be an active participant in this process. ACF regularly reports on progress associated with the current objectives as part of the FY21 HHS Annual Performance Plan/Report, including the ten performance measures from ACF programs that support the Plan. ACF performance measures primarily support Goal Three: “Strengthen the Economic and Social Well-Being of Americans Across the Lifespan.” ACF supports Objective 3.1 (Encourage self-sufficiency and personal responsibility, and eliminate barriers to economic opportunity), Objective 3.2 (Safeguard the public against preventable injuries and violence or their results), and Objective 3.3 (Support strong families and healthy marriage, and prepare children and youth for healthy, productive lives) by reporting annual performance measures. ACF is also an active participant in the HHS Strategic Review process, an annual assessment of progress on the subset of ten performance measures that ACF reports as part of the HHS Strategic Plan.
  • In April 2021, the Assistant Secretary for ACF announced the launch of an ambitious agency-wide equity agenda and named the Associate Commissioner of the Administration on Children, Youth and Families as lead for implementing the Executive Order on Advancing Racial Equity and Support for Underserved Communities Through the Federal Government.
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • ACF’s Office of Planning, Research, and Evaluation (OPRE) currently reviews all ACF funding opportunity announcements and advises program offices, in accordance with their respective legislative authorities, on how best to integrate evidence into program design. Similarly, program offices have applied ACF research to inform their program administration. For example, ACF developed the Learn, Innovate, Improve (LI2) model, a systematic, evidence-informed approach to program improvement, which has since informed targeted technical assistance (TA) efforts for the TANF program and the evaluation requirement for the child support demonstration grants.
  • ACF programs also regularly analyze and use data to improve performance. For example, two ACF programs (Health Profession Opportunity Grants & Healthy Marriage and Responsible Fatherhood programs) have developed advanced web-based management information systems (PAGES and nFORM, respectively) that are used to track grantee progress, produce real-time reports so that grantees can use their data to adapt their programs, and record grantee and participant data for research and evaluation purposes.
  • ACF also uses the nFORM data to conduct the HMRF Compliance Assessment and Performance (CAPstone) Grantee Review: a process by which federal staff and technical assistance providers assess grantee progress toward and achievement in meeting programmatic, data, evaluation, and implementation goals. The results of the CAPstone process guide federal directives and future technical assistance.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
5
AmeriCorps
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • AmeriCorps has submitted two drafts of the FY22-26 strategic plan (goals and objectives) to OMB and will finalize it by the end of calendar year 2021.  
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • Several data collection efforts were undertaken in FY21 for the primary purpose of continuous improvement and learning.
    • The Office of the Chief Risk Officer conducted a risk assessment survey and an entity-level controls survey. The risk assessment survey was conducted to identify risks associated with managing the increase in agency funding under the American Rescue Plan, and the findings were used to develop risk mitigation strategies. Similarly, findings from the entity-level controls survey will be used to improve the organizational business processes needed to achieve the agency’s mission.
    • The Office of Research and Evaluation completed an internal evaluation focused on the successes and challenges of its new regional structure and staff positions. Lessons learned after a year of implementation will be used to improve organizational communication processes and have led to the decision to invest in a third-party workforce analysis.
    • The agency conducted a staff survey focused on issues of diversity, equity, and inclusion in the workforce. These findings, as well as the Federal Employee Viewpoint Survey findings, will be used by the CEO and Chief of Staff to develop improvements in the agency’s work environment.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • The agency’s Chief Financial Officer has established a standing cross-agency meeting with the Offices of Human Capital, Information Technology, Risk, and Research and Evaluation, as well as the department heads (Chief of Staff, Chief Operating Officer, and Chief of Program Operations), to ensure that the agency has a data-driven performance management system. Once the agency has approved performance indicators (as part of its final strategic plan), this meeting will be used to assess progress against strategic goals and their corresponding performance targets.
Score
10
U.S. Department of Labor
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • Using a performance and budget reporting system linked to component agencies’ annual operating plans, DOL’s Performance Management Center (PMC) coordinates quarterly reviews of each agency’s program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s Chief Evaluation Officer, include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual Strategic Reviews with leadership include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
  • In March 2021, DOL launched the agency’s first Summer Data Equity Challenge, awarding $10,000 to $30,000 to researchers studying the impact of DOL policies and programs on traditionally underserved communities. Awardees will use data to find gaps in DOL’s knowledge and, ideally, propose practical solutions to fill those gaps and reduce disparities in outcomes.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • DOL’s performance reporting and dashboard system supports quarterly reviews of each agency’s program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements. These performance reviews connect to DOL’s broader performance and evaluation activities. DOL’s Office of the Chief Information Officer (OCIO) also developed a new dashboard last year, the CXO Dashboard, for agency leadership use only; it provides instant access to key administrative data so leadership can interactively assess progress on performance and make data-driven decisions.
  • DOL leverages a variety of continuous learning tools, including the learning agenda approach to conceptualize and make progress on substantive learning goals for the agency, as well as the PMC’s Continuous Process Improvement (CPI) Program, which supports agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the Department and grows the cadre of CPI practitioners through Lean Six Sigma training.
Score
10
U.S. Dept. of Housing & Urban Development
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • HUD’s FY18–22 Strategic Plan, as amended by HUD’s FY21 Annual Performance Plan, defines strategic objectives, priority outcome goals, and program metrics supporting each objective. Progress on program metrics is tracked through the Annual Performance Plan. In FY21, HUD began to develop a new Strategic Plan that will reflect enhanced support for evidence-building through integration with the Learning Agenda and structured Capacity Assessment as provided by the Evidence Act.
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • HUD uses data and evidence extensively to improve outcomes and return on investment. The primary means are PD&R’s investments in data collection, program demonstrations and evaluations, and research guided by a multi-year learning agenda; HUD’s extensive use of outcome-oriented performance metrics in the Annual Performance Plan; and senior staff oversight and monitoring of key outcomes and initiatives through quarterly performance management meetings, which will be supported by a new CFO performance management module under development.
  • A HUD initiative to modernize technologies for using data to improve outcomes includes elements of intelligent automation and artificial intelligence, using advanced data analytics and visualization, and building electronic records management, intelligent data extraction, and electronic forms.
  • In 2019, HUD expanded the Standards for Success data collection and reporting framework for discretionary grant programs to cover Resident Opportunities and Self-Sufficiency Service Coordinator (ROSS) grants, Multifamily Housing Service Coordinator grants, and Multifamily Housing Budget-Based Service Coordinator Sites. The framework supports better outcomes by providing a more standardized performance measurement framework, better alignment with Departmental strategies, and more granular reporting to support analytics.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • HUD’s senior staff support continuous improvement and oversight and monitoring of key outcomes and initiatives through quarterly performance management meetings. These processes are supported by ongoing, significant investments in evidence-building as documented in the Annual Performance Plan and the iterative process of developing the Research Roadmap learning agenda, as well as development of a new performance management module by the Chief Financial Officer. Monitoring and analysis based on administrative data have a symbiotic and complementary relationship with structured evaluation and program demonstrations.
  • HUD’s Office of Policy Development and Research also hosts ongoing Knowledge Collaboratives designed to support continuous learning and improve performance. Examples include a Data Knowledge Collaborative, an RCT Knowledge Collaborative, and a Knowledge Collaborative on Equity in Evaluation, as well as a new inter-office user group that shares information and tools for using statistical software effectively. For example, a recent meeting of the RCT Knowledge Collaborative considered the topics of research preregistration and multiple hypothesis testing. The agenda included discussions of the concepts of preregistration and multiple hypothesis testing, the steps that HUD PD&R could take to encourage (or require) preregistration of its research studies, and guidance it could develop for contracted researchers regarding the multiple comparisons problem. The Knowledge Collaborative on Equity in Evaluation also recently worked on revising HUD’s Evaluation Policy to incorporate considerations of equity throughout.
Score
7
Administration for Community Living (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • ACL employs a moderate approach, spanning multiple parts of the agency, to analyzing evidence to find ways to improve return on investment. In FY20, as part of its ongoing effort to ensure that agency funds are used effectively, ACL funded a three-year contract, focused on ACL’s Administration on Aging, to identify approaches to measure how and to what extent parts of the Aging Network leverage Older Americans Act funds to increase their available resources, as well as how the Aging Network uses resources to measure and improve the quality of services available and provided. NIDILRR conducts research as part of its new employment research agenda to continue development of return-on-investment models that can be used by Vocational Rehabilitation agencies to optimize the services they provide. In addition, in January 2021 ACL announced a new phase of the Innovative Technology Solutions for Social Care Referrals challenge competition, adding to the competitions launched in 2020 (Innovative Solutions to Address the Direct Support Professional Crisis, the Mental Health Challenge, and the Disability Employment Challenge). The goal of all the prize competitions is to encourage effective and efficient methods for meeting ACL’s mission and improving services to its target populations. ACL also recently published the results of a study measuring the economic value of volunteerism for Older Americans Act programs.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • As part of ACL’s performance strategy and learning agenda approach, Office of Performance and Evaluation (OPE) staff present performance data to ACL leadership several times a year. In addition, ACL leadership reviews performance data as part of the budget justification process that informs program funding decisions. OPE staff conduct annual meetings with ACL staff to report performance measure data and results and to discuss methods for incorporating performance and evaluation findings into funding and operational decision-making. As part of annual evaluation planning efforts, OPE staff consult with ACL center directors to identify evaluation priorities and review proposed evaluation approaches to ensure that the evaluation questions identified will provide information useful for program improvement. Two projects started in late 2020 with the goal of improving agency performance: a study of how the services provided by ACL grantees influence the social determinants of health (SDOH) and an evaluation of how ACL supports grantee use of the evidence-based programs required under Title IIID of the Older Americans Act. In 2021, ACL began using the National Standards for Culturally and Linguistically Appropriate Services (CLAS) in Health and Health Care to inform its evaluation framework. Specifically, ACL funded a project to explore the extent to which ACL grantees employ CLAS Standards in their service delivery processes, particularly their responsiveness to cultural practices, language and communication needs, LGBTQ+ needs, and health literacy. ACL also funded a study to examine the use and financial value of volunteers to its programs; in addition to a final report, ACL developed an effective practice guide to help grantees use volunteers effectively.
Score
8
Substance Abuse and Mental Health Services Administration (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • The SAMHSA Strategic Plan FY2019-FY2023 outlines five priority areas with goals and measurable objectives to carry out the vision and mission of SAMHSA. For each priority area, an overarching goal and series of measurable objectives are described followed by examples of key performance and outcome measures SAMHSA will use to track progress. 
  • In addition, SAMHSA collects Disparity Impact Statements (DIS) from grantees to ensure SAMHSA programs are inclusive of underserved racial and ethnic minority populations in their services, infrastructure, prevention, and training grants. These populations have been underrepresented in SAMHSA grants. 
  • The DIS is based on the framework of Access, Use and Outcomes: 
    • Access: Who are the subpopulations being served by the program? 
    • Use: What types of services does each subpopulation get? 
    • Outcomes: Given the specified outcomes of the program, how do these vary by subpopulations? 
  • The DIS is a Secretarial Priority from the Department of Health & Human Services’ Action Plan to Reduce Racial and Ethnic Health Disparities. The objective is to “Assess and heighten the impact of all HHS policies, programs, processes, and resource decisions to reduce health disparities. HHS leadership will assure that: … (c) Program grantees, as applicable, will be required to submit health disparity impact statements as part of their grant applications.” The Secretarial Priority focused on underserved racial and ethnic minority populations (e.g., Black/African American; Hispanic/Latino; Asian American, Native Hawaiian and Pacific Islander; and American Indian/Alaska Native). SAMHSA’s Office of Behavioral Health Equity also includes LGBTQ+ populations as an underserved, disparity-vulnerable group.  
  • The standard Government Performance and Results Act (GPRA) data collected by the grantee are used to inform the access, use, and outcomes questions. No new data are collected for the DIS. Disaggregating the data by subpopulation helps identify gaps in who is included in the grant, differences in the services provided across subpopulations, and how outcomes differ across subpopulations, as illustrated in the sketch below.
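  • As a purely hypothetical illustration of what disaggregating performance data by subpopulation can look like in practice, the Python sketch below groups invented client records by subpopulation and summarizes access, use, and outcomes. The column names and figures are made up and do not reflect SAMHSA’s actual GPRA or SPARS data.

      # Hypothetical sketch: disaggregating invented client records by subpopulation
      # to answer the DIS access/use/outcomes questions. Not SAMHSA's actual data or schema.
      import pandas as pd

      records = pd.DataFrame({
          "subpopulation": [
              "Black/African American", "Hispanic/Latino", "Black/African American",
              "American Indian/Alaska Native", "Hispanic/Latino",
          ],
          "services_received": [3, 1, 2, 4, 2],
          "improved_outcome": [1, 0, 1, 1, 0],   # 1 = improvement reported at follow-up
      })

      summary = records.groupby("subpopulation").agg(
          clients_served=("improved_outcome", "size"),     # Access: who is being served
          avg_services=("services_received", "mean"),      # Use: services per subpopulation
          improvement_rate=("improved_outcome", "mean"),   # Outcomes: share improving
      )
      print(summary)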
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • The Office of Evaluation, in partnership with SAMHSA Program Centers, oversees the identification of a set of performance indicators to monitor SAMHSA programs in collaboration with program staff, and the development of periodic program profiles for use in agency planning, program change, and reporting to departmental and external organizations. SAMHSA’s Performance Accountability and Reporting System (SPARS) serves as the mechanism for collecting performance data from agency grantees. SAMHSA Program Center staff examine data entered into SPARS on a regular, real-time basis to manage grant programs and improve outcomes. Data in SPARS are available as .csv files, via reports, or through data visualizations (bar charts, etc.).
  • In FY21, SAMHSA staff and grantees were able to view demographic data to compare clients by race, ethnicity, gender, age, and other characteristics over time. On an annual basis, SAMHSA produces SPARS-informed program and topical profiles to examine a program’s performance. These profiles, to be shared with grantees for FY21, include outcomes disaggregated by race and other demographics as well as changes in behavior associated with clients’ time in the grant program.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • Since April 2020, the Center for Behavioral Health Statistics and Quality’s (CBHSQ) Office of Evaluation has offered weekly technical assistance and training on data analysis, performance management, and evaluation. These one-hour sessions offer opportunities for SAMHSA Program Center staff and CBHSQ to share challenges and opportunities faced by grantees and to strategize solutions. The sessions also offer an opportunity for cross-center collaboration and process improvement, as project officers share and learn from officers managing programs in other centers. These cross-center meetings allow CBHSQ to learn about challenges in the field, technological challenges in using SPARS, and opportunities to make the system more user-friendly. Project officers often share grantee questions and concerns for discussion and joint problem solving, and SAMHSA collects these questions to include in FAQ documents.
  • Since April 2020, the Office of Evaluation has also offered a Center-specific webinar every Wednesday covering selected grant programs and data visualizations, providing a targeted approach to capacity building. These webinars are designed for performance management of discretionary grants, but Center and agency leadership are invited and have attended; the sessions are internal to SAMHSA and not open to political leaders. For example, on the second Wednesday of each month, the Office of Evaluation focuses on the Center for Substance Abuse Treatment’s grant programs to offer a deep dive into performance while also discussing efforts to increase data quality. The third Wednesday focuses on the Center for Substance Abuse Prevention, and the fourth Wednesday focuses on programs funded through the Center for Mental Health Services.
  • SAMHSA has been modernizing the SPARS system to include data visualization and more useful performance management reports. The annual program profiles offer another opportunity for SAMHSA staff to work collaboratively to better understand the challenges facing grantees and allow for modifications and the development of technical assistance for continuous quality improvement. These two-page resource documents provide a snapshot of descriptive data on client-level demographics and activities completed in the previous year.