2021 Federal Standard of Excellence


Administration for Children and Families (HHS)

71
Score
9
Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY21?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Deputy Assistant Secretary for Planning, Research, and Evaluation at the Office of Planning, Research, and Evaluation (OPRE) serves as the Administration for Children and Families’ (ACF) Chief Evaluation Officer. The Deputy Assistant Secretary oversees OPRE, which supports evaluation and other learning activities across the agency. ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation oversaw a research and evaluation budget of approximately $175 million in FY21. OPRE has 70 federal staff positions; OPRE staff are experts in research and evaluation methods and data analysis as well as ACF programs, policies, and the populations they serve. In August 2019, the Department of Health and Human Services’ (HHS) Assistant Secretary for Planning and Evaluation was named the Chief Evaluation Officer of HHS.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • The HHS Chief Information Officer serves as the HHS Chief Data Officer. In August 2019, the HHS Chief Information Officer was named the acting Chief Data Officer of HHS. In September 2019, the Assistant Secretary for Children and Families designated the Deputy Assistant Secretary for Planning, Research, and Evaluation as the primary ACF member to serve on the HHS Data Council, the body responsible for advising the HHS Chief Data Officer on implementation of Evidence Act activities across HHS.
  • Additionally, in 2016, ACF established a Division of Data and Improvement (DDI) providing federal leadership and resources to improve the quality, use, and sharing of ACF data. The Director of DDI reports to the Deputy Assistant Secretary for Planning, Research, and Evaluation and oversees work to improve the quality, usefulness, interoperability, and availability of data and to address issues related to privacy and data security and data sharing. DDI has 12 federal staff positions and an FY21 budget of approximately $7.5M (not including salaries).
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, performance improvement officer, and other related officials in order to support, improve, and evaluate the agency’s major programs?
  • As of September 2019, ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation serves as the primary ACF representative to HHS’ Leadership Council, Data Council, and Evidence and Evaluation Council, the HHS bodies responsible for implementing Evidence Act activities across HHS. These cross-agency councils meet regularly to discuss agency-specific needs and experiences and to collaboratively develop guidance for department-wide action.
  • Within ACF, the 2016 reorganization that created the Division of Data and Improvement (DDI) endowed ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation with oversight of the agency’s strategic planning; performance measurement and management; research and evaluation; statistical policy and program analysis; synthesis and dissemination of research and evaluation findings; data quality, usefulness, and sharing; and application of emerging technologies to improve the effectiveness of programs and service delivery. ACF reviews program office performance measures and associated data three times per year in sync with the budget process; OPRE has traditionally worked with ACF program offices to develop research plans on an annual basis and has worked to integrate the development of program-specific learning agendas into this process. In addition, OPRE holds both regular and ad hoc meetings with ACF program offices to discuss research and evaluation findings, as well as other data topics.
Score
10
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY21?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • ACF’s evaluation policy confirms ACF’s commitment to conducting evaluations and using evidence from evaluations to inform policy and practice. ACF seeks to promote rigor, relevance, transparency, independence, and ethics in the conduct of evaluations. ACF established the evaluation policy in 2012 and published it in the Federal Register on August 29, 2014. In late 2019, ACF released a short video about the policy’s five principles and how ACF uses them to guide its work.
  • As ACF’s primary representative to the HHS Evidence and Evaluation Council, the ACF Deputy Assistant Secretary for Planning, Research, and Evaluation co-chairs the HHS Evaluation Policy Subcommittee, the body responsible for developing an HHS-wide evaluation policy. HHS released its Department-wide evaluation policy in 2021.
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
  • In accordance with OMB guidance, ACF contributed to the HHS-wide evaluation plan. The Office of Planning, Research, and Evaluation (OPRE) also annually identifies questions relevant to the programs and policies of ACF and proposes a research and evaluation spending plan to the Assistant Secretary for Children and Families. This plan focuses on activities that OPRE plans to conduct during the following fiscal year.
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
  • In accordance with OMB guidance, HHS is developing an HHS-wide evidence-building plan. To develop this document, HHS asked each sub-agency to submit examples of its priority research questions, potential data sources, anticipated approaches, challenges and mitigation strategies, and stakeholder engagement strategies. Drawing from its existing program-specific learning agendas and research plans, ACF contributed example priority research questions and anticipated learning activities for inclusion in the HHS evidence-building plan. The HHS evidence-building plan is set to be released in early 2022 as a part of the HHS strategic plan.
  • In 2020, ACF released a research and evaluation agenda, describing research and evaluation activities and plans in nine ACF program areas with substantial research and evaluation portfolios: Adolescent Pregnancy Prevention and Sexual Risk Avoidance, Child Care, Child Support Enforcement, Child Welfare, Head Start, Health Profession Opportunity Grants, Healthy Marriage and Responsible Fatherhood, Home Visiting, and Welfare and Family Self-Sufficiency.
  • In addition to fulfilling requirements of the Evidence Act, ACF has supported and continues to support systematic learning and stakeholder engagement activities across the agency. For example:
    • Many ACF program offices have developed or are currently developing detailed program-specific learning agendas to systematically learn about and improve their programs—studying existing knowledge, identifying gaps, and setting program priorities. For example, ACF and HRSA have developed a learning agenda for the MIECHV program, and ACF is supporting ongoing efforts to build a learning agenda for ACF’s Healthy Marriage and Responsible Fatherhood (HMRF) programming.
    • ACF will continue to release annual portfolios that describe key findings from past research and evaluation work and how ongoing projects are addressing gaps in the knowledge base to answer critical questions in the areas of family self-sufficiency, child and family development, and family strengthening. In addition to describing key questions, methods, and data sources for each research and evaluation project, the portfolios provide narratives describing how evaluation and evidence-building activities unfold in specific ACF programs and topical areas over time, and how current research and evaluation initiatives build on past efforts and respond to remaining gaps in knowledge.
    • ACF works closely with many stakeholders to inform priorities for its research and evaluation efforts and solicits their input through conferences and meetings such as the Research and Evaluation Conference on Self-Sufficiency, the National Research Conference on Early Childhood, and the Child Care and Early Education Policy Research Consortium Annual Meetings; meetings with ACF grantees and program administrators; engagement with training and technical assistance networks; surveys, focus groups, interviews, and other activities conducted as a part of research and evaluation studies; and through both project-specific and topical technical working groups, including the agency’s Family Self-Sufficiency Research Technical Working Group. ACF’s ongoing efforts to engage its stakeholders will be described in more detail in ACF’s forthcoming description of its learning activities.
2.4 Did the agency publicly release all completed program evaluations?
  • ACF’s evaluation policy requires that “ACF will release evaluation results regardless of findings…Evaluation reports will present comprehensive findings, including favorable, unfavorable, and null findings. ACF will release evaluation results timely–usually within two months of a report’s completion.” ACF has publicly released the findings of all completed evaluations to date. In 2020, OPRE released over 130 research publications. OPRE publications are publicly available on the OPRE website.
  • Additionally, ACF develops and uses research and evaluation methods that are appropriate for studying diverse populations, taking into account historical and cultural factors and planning data collection with disaggregation and subgroup analyses in mind. Whenever possible, ACF projects report on subgroups. Recent examples include the Parents and Children Together (PACT) Evaluation substudy of program strategies and adaptations used by selected responsible fatherhood programs serving Hispanic fathers, and the American Indian and Alaska Native Head Start Family and Child Experiences Survey (AI/AN FACES), which has been fielded to capture information on the characteristics, experiences, and development of Head Start children and families in Region XI, which predominantly serves AI/AN children and families. In February 2021, OPRE released a brief on Methods, Challenges, and Best Practices for Conducting Subgroup Analysis.
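The disaggregation described above can be illustrated with a minimal sketch. The field names and data here are hypothetical, purely for illustration; this is not an ACF tool or procedure:

```python
from collections import defaultdict

def subgroup_means(records, group_field, outcome_field):
    """Disaggregate an outcome by subgroup: mean outcome per group."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for rec in records:
        group = rec[group_field]
        sums[group] += rec[outcome_field]
        counts[group] += 1
    return {group: sums[group] / counts[group] for group in sums}

# Hypothetical participant records with a subgroup label and an outcome
records = [
    {"subgroup": "region_a", "outcome": 1.0},
    {"subgroup": "region_a", "outcome": 3.0},
    {"subgroup": "region_b", "outcome": 2.0},
]
print(subgroup_means(records, "subgroup", "outcome"))
```

In practice, subgroup analyses of the kind the brief discusses also require attention to sample sizes and statistical power within each subgroup, not just point estimates.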
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 3115, subchapter II (c)(3)(9))
  • In accordance with OMB guidance, ACF is contributing to an HHS-wide capacity assessment, which is set to be released in early 2022 as a part of the HHS strategic plan. To support these and related efforts, OPRE launched the ACF Evidence Capacity Support project in 2020. The Evidence Capacity project supports ACF’s efforts to build and strengthen programmatic and operational evidence capacity, including supporting learning agenda development and the development of other foundational evidence through administrative data analysis. Given the centrality of data capacity to evidence capacity, ACF has also been partnering with the HHS Office of the Chief Data Officer (OCDO) to develop and pilot test a tool to conduct an HHS-wide data capacity assessment, consistent with Title II Evidence Act requirements. To modernize ACF’s data governance and related capacity specifically, ACF launched the ACF Data Governance Consulting and Support project. The Data Governance Support project is providing information gathering, analysis, consultation, and technical support to ACF and its partners to strengthen data governance practices within ACF offices, and between ACF and its partners at the federal, state, local, and tribal levels.
  • ACF has also sought to build capacity to support culturally responsive evaluation, including sponsorship of the National Research Center on Hispanic Children & Families and the Tribal Early Childhood Research Center, and development of “A Roadmap for Collaborative and Effective Evaluation in Tribal Communities.” ACF also has a new grant opportunity for an African American Children and Families Research Center, which is intended to lead and support research on the needs of African American populations served by ACF and on promising approaches to promote social and economic well-being among low-income African American populations. The center is further intended to provide leadership on culturally competent research that can inform policies concerning low-income African American populations and to foster significant scholarship regarding the needs and experiences of the diverse African American population throughout the nation. ACF also continues to support the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts as follows:
  • Coverage: ACF conducts research in areas where Congress has given authorization and appropriations. Programs for which ACF is able to conduct research and evaluation using dedicated funding include Temporary Assistance for Needy Families, Health Profession Opportunity Grants, Head Start, Child Care, Child Welfare, Home Visiting, Healthy Marriage and Responsible Fatherhood, Personal Responsibility Education Program, Sexual Risk Avoidance Education, Teen Pregnancy Prevention, Runaway and Homeless Youth, Family Violence Prevention Services, and Human Trafficking services. These programs represent approximately 85% of overall ACF spending.
  • Quality: ACF’s Evaluation Policy states that ACF is committed to using the most rigorous methods that are appropriate to the evaluation questions and the populations with whom research is being conducted and feasible within budget and other constraints, and that rigor is necessary not only for impact evaluations, but also for implementation/process evaluations, descriptive studies, outcome evaluations, and formative evaluations; and in both qualitative and quantitative approaches.
  • Methods: ACF uses a range of evaluation methods. ACF conducts impact evaluations as well as implementation and process evaluations, cost analyses and cost benefit analyses, descriptive and exploratory studies, research syntheses, and more. ACF also develops and uses methods that are appropriate for studying diverse populations, taking into account historical and cultural factors and planning data collection with disaggregation and subgroup analyses in mind. ACF is committed to learning about and using the most scientifically advanced approaches to determining effectiveness and efficiency of ACF programs; to this end, OPRE annually organizes meetings of scientists and research experts to discuss critical topics in social science research methodology and how innovative methodologies can be applied to policy-relevant questions.
  • Effectiveness: ACF’s Evaluation Policy states that ACF will conduct relevant research and disseminate findings in ways that are accessible and useful to policymakers, practitioners, and the diverse populations that ACF programs serve. OPRE engages in ongoing collaboration with ACF program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions such as ACF regulations and funding opportunity announcements. For example, when ACF’s Office of Head Start significantly revised its Program Performance Standards–the regulations that define the standards and minimum requirements for Head Start services–the revisions drew from decades of OPRE research and the recommendations of the OPRE-led Secretary’s Advisory Committee on Head Start Research and Evaluation. Similarly, ACF’s Office of Child Care drew from research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of funds dedicated to improving the quality of programs, and other information to inform the regulations accompanying the reauthorization of the Child Care and Development Block Grant.
  • Independence: ACF’s Evaluation Policy states that independence and objectivity are core principles of evaluation and that it is important to insulate evaluation functions from undue influence and from both the appearance and the reality of bias. To promote objectivity, ACF protects independence in the design, conduct, and analysis of evaluations. To this end, ACF conducts evaluations through the competitive award of grants and contracts to external experts who are free from conflicts of interest, and the Deputy Assistant Secretary for Planning, Research, and Evaluation, a career civil servant, has authority to approve the design of evaluation projects and analysis plans and to approve, release, and disseminate evaluation reports.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
  • ACF’s Evaluation Policy states that in assessing the effects of programs or services, ACF evaluations will use methods that isolate to the greatest extent possible the impacts of the programs or services from other influences and that for causal questions, experimental approaches are preferred. As of April 2021, at least 20 ongoing OPRE projects included one or more random assignment impact evaluations. To date in FY21, OPRE has released RCT impact findings related to Health Profession Opportunity Grants and TANF job search assistance strategies.
  • OPRE’s template for research contracts includes a standard task for stakeholder engagement, which states that “involving stakeholders in the evaluation may increase understanding, acceptance, and utilization of evaluation findings… Where appropriate, stakeholders should have the opportunity for input at multiple phases of a project…accomplished in a transparent way while safeguarding the objectivity and independence of the study.” Four OPRE projects focused on early childhood programs that serve American Indian and Alaska Native (AIAN) families are exemplars of using a stakeholder-engaged approach at each stage of the research cycle to understand and co-create knowledge: Tribal Early Childhood Research Center (TRC), AIAN Family and Child Experiences Survey (FACES) 2015, Multi-Site Implementation Evaluation of MIECHV with AIAN Families (MUSE), and AIAN FACES 2019. Additionally, a planned solicitation for FY22, Advancing Contextual Analysis and Methods of Participant Engagement in OPRE (CAMPE), will explore how OPRE can further incorporate participatory methods and analysis of contextual factors into research and evaluation projects.
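At its core, the random assignment design preferred in ACF’s Evaluation Policy splits eligible participants into treatment and control groups by chance, then compares mean outcomes. The following minimal sketch illustrates the idea with hypothetical data; it is not an ACF procedure, and real impact evaluations add stratification, covariate adjustment, and significance testing:

```python
import random

def randomize(participant_ids, seed=42):
    """Randomly split participants into two equal-sized study arms,
    as in a simple two-arm randomized controlled trial (RCT)."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"treatment": set(ids[:half]), "control": set(ids[half:])}

def impact_estimate(outcomes, groups):
    """Estimate program impact as the difference in mean outcomes
    between the treatment and control groups."""
    treat = [outcomes[i] for i in groups["treatment"]]
    ctrl = [outcomes[i] for i in groups["control"]]
    return sum(treat) / len(treat) - sum(ctrl) / len(ctrl)

# Hypothetical example: 100 participants, outcomes recorded after the program
groups = randomize(range(100))
outcomes = {i: (2.0 if i in groups["treatment"] else 1.0) for i in range(100)}
print(impact_estimate(outcomes, groups))
```

Because assignment is random, the two groups are comparable in expectation, which is what allows the difference in means to be read as the program’s impact rather than a pre-existing difference.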
Score
7
Resources

Did the agency invest at least 1% of program funds in evaluations in FY21?

3.1 ____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY21 budget.
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
  • In FY21, the Administration for Children and Families had an evaluation budget of approximately $209 million, a $1 million increase from FY20.
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
Score
6
Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY21?

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • Every four years, HHS updates its Strategic Plan, which describes its work to address complex, multifaceted, and evolving health and human services issues. ACF was an active participant in the development of the FY18-FY22 HHS Strategic Plan, which includes several ACF-specific objectives. HHS is starting the process of developing an updated FY22-FY26 HHS Strategic Plan, and ACF will be an active participant in this process. ACF regularly reports on progress associated with the current objectives as part of the FY21 HHS Annual Performance Plan/Report, including the ten performance measures from ACF programs that support this Plan. ACF performance measures primarily support Goal Three: “Strengthen the Economic and Social Well-Being of Americans Across the Lifespan.” ACF supports Objective 3.1 (Encourage self-sufficiency and personal responsibility, and eliminate barriers to economic opportunity), Objective 3.2 (Safeguard the public against preventable injuries and violence or their results), and Objective 3.3 (Support strong families and healthy marriage, and prepare children and youth for healthy, productive lives) by reporting annual performance measures. ACF is also an active participant in the HHS Strategic Review process, which is an annual assessment of progress on the subset of ten performance measures that ACF reports on as part of the HHS Strategic Plan.
  • In April 2021, the Assistant Secretary for ACF announced the launch of an ambitious agency-wide equity agenda and named the Associate Commissioner of the Administration on Children, Youth and Families as lead for the implementation of the Executive Order on Advancing Racial Equity and Support for Underserved Communities Through the Federal Government.
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • OPRE currently reviews all ACF funding opportunity announcements and advises program offices, in accordance with their respective legislative authorities, on how to best integrate evidence into program design. Similarly, program offices have applied ACF research to inform their program administration. For example, ACF developed the Learn Innovate Improve (LI2) model–a systematic, evidence-informed approach to program improvement–which has since informed targeted TA efforts for the TANF program and the evaluation requirement for the child support demonstration grants.
  • ACF programs also regularly analyze and use data to improve performance. For example, two ACF programs (Health Profession Opportunity Grants & Healthy Marriage and Responsible Fatherhood programs) have developed advanced web-based management information systems (PAGES and nFORM, respectively) that are used to track grantee progress, produce real-time reports so that grantees can use their data to adapt their programs, and record grantee and participant data for research and evaluation purposes.
  • ACF also uses the nFORM data to conduct the HMRF Compliance Assessment and Performance (CAPstone) Grantee Review: a process by which federal staff and technical assistance providers assess grantee progress toward and achievement in meeting programmatic, data, evaluation, and implementation goals. The results of the CAPstone process guide federal directives and future technical assistance.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
6
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY21?

5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • ACF’s Interoperability Action Plan was established in 2017 to formalize ACF’s vision for effective and efficient data sharing. Under this plan ACF and its program offices will develop and implement a Data Sharing First (DSF) strategy that starts with the assumption that data sharing is in the public interest. The plan states that ACF will encourage and promote data sharing broadly, constrained only when required by law or when there are strong countervailing considerations.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • In 2020, ACF released a Compendium of ACF Administrative and Survey Data Resources. The Compendium documents administrative and survey data collected by ACF that could be used for evidence-building purposes. It includes summaries of twelve major ACF administrative data sources and seven surveys. Each summary includes an overview, basic content, available documentation, available data sets, restrictions on use, capacity to link to other data sources, and examples of prior research. It is a joint product of the Office of Planning, Research, and Evaluation (OPRE) in ACF and the Office of the Assistant Secretary for Planning and Evaluation (ASPE), U.S. Department of Health and Human Services.
  • In addition, in 2019 OPRE compiled the descriptions and locations of hundreds of OPRE-archived datasets that are currently available for secondary analysis and made this information available on a single webpage. OPRE continues to regularly update this website with current archiving information. OPRE regularly archives research and evaluation data for secondary analysis, consistent with the ACF evaluation policy, which promotes rigor, relevance, transparency, independence, and ethics in the conduct of evaluation and research. This consolidated web page serves as a one-stop resource that makes it easier for potential users to find and use the data that OPRE archives for secondary analysis.
  • In 2020 ACF launched the ACF Data Governance Consulting and Support project, which is providing information gathering, analysis, consultation, and technical support to ACF and its partners to strengthen data governance practices within ACF offices, and between ACF and its partners at the federal, state, local, and Tribal levels. Initial work will focus particularly on data asset tracking and metadata management, among other topics.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • ACF has multiple efforts underway to promote and support the use of documented data for research and improvement, including making numerous administrative and survey datasets publicly available for secondary use and actively promoting the archiving of research and evaluation data for secondary use. These data are machine readable, downloadable, and de-identified as appropriate for each data set. For example, individual-level data for research is held in secure restricted use formats, while public-use data sets are made available online. To make it easier to find these resources, ACF released a Compendium of ACF Administrative and Survey Data and consolidated information on archived research and evaluation data on the OPRE website.
  • Many data sources that may be useful for data linkage for building evidence on human services programs reside outside of ACF. In 2020, OPRE released the Compendium of Administrative Data Sources for Self-Sufficiency Research, describing promising administrative data sources that may be linked to evaluation data in order to assess long-term outcomes of economic and social interventions. It includes national, federal, and state sources covering a range of topical areas. It was produced under contract by MDRC as a part of OPRE’s Assessing Options to Evaluate Long-Term Outcomes (LTO) Using Administrative Data project.
  • Additionally, ACF is actively exploring how enhancing and scaling innovative data linkage practices can improve our understanding of the populations served by ACF and build evidence on human services programs more broadly. For instance, the Child Maltreatment Incidence Data Linkages (CMI Data Linkages) project is examining the feasibility of leveraging administrative data linkages to better understand child maltreatment incidence and related risk and protective factors. Also, in August 2021, OPRE published a brief presenting findings from the 2019 TANF Data Innovation Needs Assessment. This survey of state TANF agencies was designed to understand state strengths and challenges in linking and analyzing administrative data for program improvement. Findings from the Needs Assessment informed technical assistance provided to states through ACF’s TANF Data Collaborative. Information from the brief may be helpful to states, policymakers, and other funders in helping to support states in linking data for the purpose of evidence building.
  • ACF actively promotes archiving of research and evaluation data for secondary use. OPRE research contracts include a standard clause requiring contractors to make data and analyses supported through federal funds available to other researchers and to establish procedures and parameters for all aspects of data and information collection necessary to support archiving information and data collected under the contract. Many datasets from past ACF projects are stored in archives including the ACF-funded National Data Archive on Child Abuse and Neglect (NDACAN), the ICPSR Child and Family Data Archive, and the ICPSR data archive more broadly. OPRE has funded grants for secondary analysis of ACF/OPRE data; examples in recent years include secondary analysis of strengthening families datasets and early care and education datasets. In 2019 ACF awarded Career Pathways Secondary Data Analysis Grants to stimulate and fund secondary analysis of data collected through the Pathways for Advancing Careers and Education (PACE) Study, Health Professions Opportunity Grants (HPOG) Impact Study, and HPOG National Implementation Evaluation (NIE) on questions relevant to career pathways programs’ goals and objectives. Information on all archived datasets that are currently available for secondary analysis is available on OPRE’s website.
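The record linkage described above typically joins two administrative extracts on a shared identifier that has been replaced with a salted one-way hash, so the linked file carries no raw IDs. The sketch below illustrates this common pattern with hypothetical field names and data; it is not ACF’s actual linkage method, and production systems add far more rigorous disclosure controls:

```python
import hashlib

def pseudonymize(identifier, salt):
    """Replace a direct identifier with a salted SHA-256 hash.
    The salt must be kept secret and consistent across both datasets."""
    return hashlib.sha256((salt + identifier).encode("utf-8")).hexdigest()

def link(records_a, records_b, salt):
    """Link two datasets (lists of dicts, each with an 'id' field) on the
    pseudonymized identifier. Returns merged records keyed by hash; only
    records present in records_a are retained (a left-anchored join)."""
    merged = {}
    for rec in records_a:
        key = pseudonymize(rec["id"], salt)
        merged[key] = {k: v for k, v in rec.items() if k != "id"}
    for rec in records_b:
        key = pseudonymize(rec["id"], salt)
        if key in merged:
            merged[key].update({k: v for k, v in rec.items() if k != "id"})
    return merged

# Hypothetical extracts: a program enrollment file and an earnings file
enrollment = [{"id": "123", "enrolled": True}]
earnings = [{"id": "123", "earnings": 1000.0}, {"id": "999", "earnings": 5.0}]
print(link(enrollment, earnings, "demo-salt"))
```

Note that the merged output drops the raw identifier from every record, which is part of why hashed keys are a common de-identification step before linked data are shared with researchers.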
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • ACF receives privacy and security guidance from both the ACF and HHS Office of the Chief Information Officer (OCIO). Between these two offices, there are several policies and practices in place to assure all ACF data are protected and all incidents are handled appropriately. The requirements are supported by auditing mechanisms and a privacy and security training program.
  • In 2014, ACF developed a Confidentiality Toolkit that explains the rules governing confidentiality of ACF data connected to many programs, provides examples of how confidentiality requirements can be addressed, and includes sample memoranda of understanding and data sharing agreements. In 2020, ACF launched the ACF Privacy and Confidentiality Analysis and Support project. This project is currently in the process of updating the Toolkit for recent changes in statute, and to provide real-world examples of how data has been shared across domains—which frequently do not have harmonized privacy requirements—while complying with all relevant privacy and confidentiality requirements (e.g., FERPA, HIPAA). These case studies will also include downloadable, real-world tools that have been successfully used in the highlighted jurisdictions. In addition, the project is exploring creating and maintaining a compendium of existing privacy and confidentiality laws for use by ACF staff.
  • ACF also takes appropriate measures to safeguard the privacy and confidentiality of individuals contributing data for research throughout the archiving process, consistent with ACF’s core principle of ethics. Research data may be made available as public use files when the data would not likely lead to harm or to the re-identification of an individual, or through restricted access. Restricted access files are de-identified and made available to approved researchers either through secure transmission and download, virtual data enclaves, physical data enclaves, or restricted online analysis.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • ACF undertakes many program-specific efforts to support state, local, and tribal efforts to use human services data while protecting privacy and confidentiality. For example, ACF’s TANF Data Innovation Project supports innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. This work includes encouraging and strengthening state integrated data systems, promoting proper payments and program integrity, and enabling data analytics for TANF program improvement. Similarly, in 2020 OPRE awarded Human Services Interoperability Demonstration Grants to Georgia State University and Kentucky’s Department of Medicaid Services. These grants are intended to expand data sharing efforts by state, local, and tribal governments to improve human services program delivery, and to identify novel data sharing approaches that can be replicated in other jurisdictions. ACF anticipates awarding another round of Interoperability Demonstration grants in FY22. Also, in 2019, OPRE, in partnership with ASPE, began a project to support states in linking Medicaid and child welfare data at the parent-child level to support outcomes research. Under this project, HHS will work with two to four states to enhance capacity to examine outcomes for children and parents who are involved in state child welfare systems and who may have behavioral health issues. Of particular interest are outcomes for families that may have substance use disorders, such as opioid use disorder. Specifically, this project seeks to develop state data infrastructure and increase the available de-identified data for research in this area.
Score
8
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY21?

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • ACF has established a common evidence framework adapted for the human services context from the framework for education research developed by the U.S. Department of Education and the National Science Foundation. The ACF framework, which includes the six types of studies delineated in the ED/NSF framework, aims to (1) inform ACF’s investments in research and evaluation and (2) clarify expectations for potential grantees and others regarding different types of studies.
6.2 Did the agency have a common evidence framework for funding decisions?
  • While ACF does not have a common evidence framework across all funding decisions, certain programs do use a common evidence framework for funding decisions. For example:
  • The Family First Prevention Services Act (FFPSA) enables states to use funds for certain evidence-based services. In April 2019, ACF published the Prevention Services Clearinghouse Handbook of Standards and Procedures, which provides a detailed description of the standards used to identify and review programs and services in order to rate programs and services as promising, supported, and well-supported practices.
  • The Personal Responsibility Education Program Competitive Grants were funded to replicate effective, evidence-based program models or substantially incorporate elements of projects that have been proven to delay sexual activity, increase condom or contraceptive use for sexually active youth, and/or reduce pregnancy among youth. Through a systematic evidence review, HHS selected 44 models that grantees could use, depending on the needs and age of the target population of each funded project.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.), including information on what works where, for whom, and under what conditions?
  • ACF sponsors several user-friendly tools that disseminate and promote evidence-based interventions. Several evidence reviews of human services interventions have disseminated and promoted evidence-based interventions by rating the quality of evaluation studies and presenting results in a user-friendly searchable format. Current evidence reviews include: 1) Home Visiting Evidence of Effectiveness (HomVEE), which provides an assessment of the evidence of effectiveness for early childhood home visiting models that serve families with pregnant women and children from birth to kindergarten entry; 2) the Pathways to Work Evidence Clearinghouse, a user-friendly website that reports on “projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects” and allows users to search for interventions based upon characteristics of the clients served by the intervention; and 3) ACF’s Title IV-E Prevention Services Clearinghouse, whose easily accessible and searchable website allows users to find information about mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services designated as “promising,” “supported,” and “well-supported” practices by an independent systematic review.
  • Additionally, most ACF research and evaluation projects produce and widely disseminate short briefs, tip sheets, or infographics that capture high-level findings from the studies and make information about program services, participants, and implementation more accessible to policymakers, practitioners, and other stakeholders. For example, the Pathways for Advancing Careers and Education (PACE) project released a series of nine short briefs to accompany the implementation and early impact reports that were released for each of the nine PACE evaluation sites.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • ACF’s evaluation policy states that it is important for evaluators to disseminate research findings in ways that are accessible and useful to policymakers, practitioners, and the communities that ACF serves, and that OPRE and program offices will work in partnership to inform potential applicants, program providers, administrators, policymakers, and funders by disseminating evidence from ACF-sponsored and other good-quality evaluations. OPRE research contracts include a standard clause requiring contractors to develop a dissemination plan during early project planning to identify key takeaways, target audiences, and strategies for most effectively reaching the target audiences. OPRE’s dissemination strategy is also supported by a commitment to plain language; OPRE works with its research partners to ensure that evaluation findings and other evidence are clearly communicated. OPRE also has a robust dissemination function that includes the OPRE website (with a new blog), an OPRE e-newsletter, and a social media presence on Facebook, Twitter, Instagram, and LinkedIn.
  • OPRE also hosts an annual “Evaluation and Monitoring 101” training for ACF staff to help agency staff better understand how to design, conduct, and use findings from program evaluation and performance monitoring, ultimately building the capacity of agency staff and program offices to use evaluation research and data analysis to improve agency operations.
  • OPRE biennially hosts two major conferences, the Research and Evaluation Conference on Self-Sufficiency (RECS) and the National Research Conference on Early Childhood (NRCEC), to share research findings with researchers and with program administrators and policymakers at all levels. OPRE also convenes the Network of Infant and Toddler Researchers (NITR), which brings together applied researchers with policymakers and technical assistance providers to encourage research-informed practice and practice-informed research, and the Child Care and Early Education Policy Research Consortium (CCEEPRC), which brings together researchers, policymakers, and practitioners to discuss what is being learned from research that can help inform policy decisions for ACF, states, territories, localities, and grantees and to consider the next steps in early care and education (ECE) research.
  • The Children’s Bureau (CB) sponsors the recurring National Child Welfare Evaluation Summit to bring together partners from child welfare systems and the research community to strengthen the use of data and evaluation in child welfare; disseminate information about effective and promising prevention and child welfare services, programs, and policies; and promote the use of data and evaluation to support sound decision-making and improved practice in state and local child welfare systems. ACF also sponsors several:
Score
6
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY21?

7.1 Did the agency have staff dedicated to leading its innovation efforts to improve the impact of its programs?
  • In late 2019, ACF stood up a customer experience initiative to enhance ACF’s delivery and administration of human services. This initiative focuses on ways to improve the experiences of both grantees and ACF employees. In 2020, ACF named a Chief Experience Officer (CEO) to lead these efforts. To date, the CEO has led efforts to understand and improve upon the experiences of ACF grantees receiving funding from multiple HHS operating divisions, evaluate and address the challenges that organizations face in applying for competitive grants, and develop an internal tool for ACF teams to assess and improve upon their capability to provide excellent technical assistance to ACF grantees. In 2021, ACF launched an Innovation Incubator initiative which began with a series of three Human-Centered Design trainings offered to ACF employees to equip staff with the skills and resources to identify problems, brainstorm ideas for improvement, and pilot solutions using an empathetic, “people-first” mindset. Participating staff also have access to the ACF Innovators community, a shared platform which supports interoffice idea generation and collaboration. ACF also has an ongoing Human-centered Design for Human Services (HCD4HS) project to explore the application of Human-centered Design across its service delivery programs at the federal, state, and local levels.
7.2 Did the agency have policies, processes, structures, or programs to promote innovation to improve the impact of its programs?
  • ACF’s mission to “foster health and well-being by providing federal leadership, partnership and resources for the compassionate and effective delivery of human services” is undergirded by six values: dedication, professionalism, integrity, stewardship, respect, and excellence. ACF’s emphasis on excellence, “exemplified by innovations and solutions that are anchored in available evidence, build knowledge and transcend boundaries,” drives the agency’s support for innovation across programs and practices.
  • ACF’s customer experience initiative, for example, is supporting the development of innovative practices for more efficient and responsive agency operations, improving how ACF understands and meets the needs of grantees and strengthening grantees’ capacity for service delivery. For example, ACF, in partnership with HRSA, convened a gathering of grantees who receive Head Start grants from ACF and federally qualified Health Center grants from HRSA to create opportunities for grantees to learn from one another and share best practices. ACF also helped a grantee analyze their data across both Head Start and Health Center programs to make operational improvements to their program.
  • ACF also administers select grant programs (through innovation projects, demonstration projects, and waivers to existing program requirements) that are designed to both implement and evaluate innovative interventions, whether as part of an ACF-sponsored evaluation or through an individual evaluation accompanying implementation of that innovation. For example:
  • ACF projects that support innovation include:
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
Score
7
Use of Evidence in Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY21?

8.1 What were the agency’s five largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY21, ACF’s five largest competitive grant programs were:
    1. Head Start ($10.7 billion; eligible applicants: public or private non-profit organizations, including community-based and faith-based organizations, or for-profit agencies);
    2. Unaccompanied Children Services ($1.3 billion; eligible applicants: private non-profit and for-profit agencies);
    3. Preschool Development Grants ($275 million; eligible applicants: states);
    4. Healthy Marriage Promotion and Responsible Fatherhood Grants ($148.8 million; eligible applicants: states, local governments, tribal entities, and community-based organizations, both for-profit and not-for-profit, including faith-based);
    5. Transitional Living Program Runaway and Homeless Youth ($116.8 million; eligible applicants: community-based public and private organizations).
8.2 Did the agency use evidence of effectiveness to allocate funds in the five largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • ACF reviewed performance data from the 2015 cohort of Healthy Marriage and Responsible Fatherhood grantees (using the nFORM system) to set priorities, interests, and expectations for HMRF grants that were awarded in 2020. For example, because nFORM data indicated that organizations were more likely to meet enrollment targets and engage participants when they focused on implementing one program model, ACF’s 2020 FOA, which led to 113 HMRF grant awards in September 2020, mentioned specific interest in grantee projects “that implement only one specific program model designed for one specific youth service population” (p. 12).
  • In its award decisions, ACF gave “preference to those applicants that were awarded a Healthy Marriage or Responsible Fatherhood grant between 2015 and 2019, and that (a) [were] confirmed by ACF to have met all qualification requirements under Section IV.2, The Project Description, Approach, Organizational Capacity of this FOA; and (b) [were] confirmed by ACF to have received an acceptable rating on their semi-annual grant monitoring statements during years three and four of the project period. [ACF gave] particular consideration to applicants that: (1) designed and successfully implemented, through to end of 2019, an impact evaluation of their program model, and that the impact evaluation was a fair impact test of their program model and that was not terminated prior to analysis; or (2) successfully participated in a federally-led impact evaluation” (p. 17).
  • ACF also evaluated HMRF grant applicants based upon their capacity to conduct a local impact evaluation and their proposed approach (for applicants required or electing to conduct local evaluations); their ability to provide a reasonable rationale and/or research base for the proposed program model(s) and curricula; and their inclusion of a Continuous Quality Improvement Plan clearly describing the organizational commitment to data-driven approaches for identifying areas for program improvement, testing potential improvements, and cultivating a culture and environment of learning and improvement, among other things. Further, the Compliance and Performance reviews (CAPstone) entail a thorough review of each grantee’s performance. The Office of Family Assistance (OFA) sends a formal set of questions about grantee performance that the grant program specialists and TA providers answer ahead of time; OFA, OPRE, and the TA provider then convene meetings to discuss each grantee’s performance at length, using nFORM data and the answers to those formal questions.
  • The Head Start Designation Renewal System (DRS) determines whether Head Start/Early Head Start grantees are delivering high-quality comprehensive services to the children and families that they serve. These determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. When the DRS deems grantees to be underperforming, grantees are denied automatic renewal of their grant and must apply for funding renewal through a standard open competition process. In the most recent funding opportunity language, grantees who are re-competing for Head Start funds must include a description of any violations, such as deficiencies, areas of non-compliance, and/or audit findings, in their record of Past Performance (p. 28). Applicants may describe the actions they have taken to address these violations. According to Head Start policy, in competitions to replace or potentially replace a current grantee, the responsible HHS official will give priority to applicants that have demonstrated capacity in providing effective, comprehensive, and well-coordinated early childhood education and development services and programs (see section 1304.20: Selection among applicants).
  • ACF manages the Runaway and Homeless Youth Training and Technical Assistance Center (RHYTTAC), the national training and technical assistance entity that provides resources and direct assistance to the Runaway and Homeless Youth (RHY) grantees and other youth-serving organizations eligible to receive RHY funds. RHYTTAC disseminates information about and supports grantee implementation of high-quality, evidence-informed, and evidence-based practices. In the most recent RHYTTAC grant award, applicants were evaluated based on their strategy for tracking RHY grantee uptake and implementation of evidence-based or evidence-informed strategies. Additionally, as described in the FY21 Transitional Living Program funding opportunity announcement, successful applicants must train all staff and volunteers on evidence-informed practices and provide case management services that include the development of service and treatment plans employing evidence-informed strategies (pp. 4 and 47).
  • ACF also evaluates Unaccompanied Children Services, Preschool Development Grants, and Runaway and Homeless Youth grant applicants based upon: their proposed program performance evaluation plan; how their data will contribute to continuous quality improvement; and their demonstrated experience with comparable program evaluation, among other factors.
8.3 Did the agency use its five largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • ACF’s template (see p. 14 in Attachment C) for competitive grant announcements includes standard language that funding opportunity announcement drafters may select to require grantees to either 1) collect performance management data that contributes to continuous quality improvement and is tied to the project’s logic model, or 2) conduct a rigorous evaluation for which applicants must propose an appropriate design specifying research questions, measurement, and analysis.
  • As a condition of award, Head Start grantees are required to participate fully in ACF-sponsored evaluations, if selected to do so. As such, ACF has an ongoing research portfolio that is building evidence in Head Start. Research sponsored through Head Start funding over the past decade has provided valuable information not only to guide program improvement in Head Start itself, but also to guide the field of early childhood programming and early childhood development. Dozens of Head Start programs have collaborated with researchers in making significant contributions in terms of program innovation and evaluation, as well as the use of systematic data collection, analysis and interpretation in program operations.
  • ACF’s 2020 Healthy Marriage and Responsible Fatherhood (HMRF) Grants established required evidence activities by scope of grantee services (p. 4). For example, large scope services (requesting funding between $1M and $1.5M) “must propose a rigorous impact evaluation (i.e., randomized-controlled trial (RCT) or high-quality, quasi-experimental design (QED) study)…and must allocate at least 15%, but no more than 20%, of their total annual funding for evaluation” (p. 19). Regardless of their scope of services, all 2020 HMRF grantees must plan for and carry out continuous quality improvement activities (p. 18) and conduct a local evaluation (p. 18) or participate in a federally led evaluation or research effort (p. 22). ACF has an ongoing research portfolio building evidence related to Strengthening Families, Healthy Marriage, and Responsible Fatherhood, and has conducted randomized controlled trials with grantees in each funding round of these grants.
  • The 2003 Reauthorization of the Runaway and Homeless Youth Act called for a study of long-term outcomes for youth who are served through the Transitional Living Program (TLP). In response, ACF is sponsoring a study that will capture data from youth at program entry and at intermediate- and longer-term follow-up points after program exit and will assess outcomes related to housing, education, and employment. ACF is also sponsoring a process evaluation of the 2016 Transitional Living Program Special Population Demonstration Project.
  • Additionally, Unaccompanied Children Services (p. 33), Preschool Development Grants (p. 30), and Runaway and Homeless Youth (p.24) grantees are required to develop a program performance evaluation plan.
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant programs (besides its five largest grant programs)?
  • ACF’s Personal Responsibility Education Program includes three individual discretionary grant programs that fund programs exhibiting evidence of effectiveness, innovative adaptations of evidence-based programs, and promising practices that teach youth about abstinence and contraception to prevent pregnancy and sexually transmitted infections.
  • To receive funding through ACF’s Sexual Risk Avoidance Education (SRAE) program, applicants must cite evidence published in a peer-reviewed journal and/or a randomized controlled trial or quasi-experimental design to support their chosen interventions or models.
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?  
  • As mentioned above, ACF is conducting a multi-pronged evaluation of the Health Profession Opportunity Grants Program (HPOG). Findings from the first cohort of HPOG grants influenced the funding opportunity announcement for the second round of HPOG (HPOG 2.0) funding. ACF used findings from the impact evaluation of the first cohort of HPOG grants to provide insights to the field about which HPOG program components are associated with stronger participant outcomes. For example, based on the finding that many participants engaged in short-term training for low-wage, entry-level jobs, the HPOG 2.0 FOA more carefully defined the career pathways framework, described specific strategies for helping participants progress along a career pathway, and identified and defined key HPOG education and training components. Applicants were required to more clearly describe how their program would support career pathways for participants. Based on an analysis indicating limited collaboration with healthcare employers, the HPOG 2.0 FOA required applicants to demonstrate the use of labor market information, consult with local employers, and describe their plans for employer engagement. Based on the finding that many programs were screening out applicants with low levels of basic literacy, reading, and numeracy skills, the HPOG 2.0 FOA also placed more emphasis on the importance of providing basic skills education and assessing barriers, so that programs would be accessible to more clients rather than only those most prepared to benefit.
  • ACF’s Personal Responsibility Education Innovative Strategies Program (PREIS) grantees must conduct independent evaluations of their innovative strategies for the prevention of teen pregnancy, births, and STIs, supported by ACF training and technical assistance. These rigorous evaluations are designed to meet the HHS Teen Pregnancy Prevention Evidence-Based Standards and are expected to generate lessons learned so that others can benefit from these strategies and innovative approaches.
8.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • ACF’s template (see p. 14 in Attachment C) for competitive grant announcements includes standard language instructing grantees to conduct evaluation efforts. Program offices may use this template to require grantees to collect performance data or conduct a rigorous evaluation. Applicants are instructed to include third-party evaluation contracts in their proposed budget justifications.
  • ACF’s 2020 Healthy Marriage and Responsible Fatherhood (HMRF) Grants established required evidence activities by scope of grantee services (p. 4). For example, large scope services (requesting funding between $1M and $1.5M) “must propose a rigorous impact evaluation (i.e., randomized-controlled trial (RCT) or high-quality, quasi-experimental design (QED) study)…and must allocate at least 15%, but no more than 20%, of their total annual funding for evaluation” (p. 19). Regardless of their scope of services, all 2020 HMRF grantees must plan for and carry out continuous quality improvement activities (p. 18) and conduct a local evaluation (p. 18) or participate in a federally led evaluation or research effort (p. 22).
  • ACF’s 2018 Preschool Development Grants funding announcement notes that “it is intended that States or territories will use a percentage of the total amount of their [renewal] grant award during years two through four to conduct the proposed process, cost, and outcome evaluations, and to implement a data collection system that will allow them to collect, house, and use data on the populations served, the implementation of services, the cost of providing services, and coordination across service partners.”
  • ACF’s rules (section 1351.15) allow Runaway and Homeless Youth grant awards to be used for “data collection and analysis.”
  • Regional Partnership Grants (RPG) (p. 1) require a minimum of 20% of grant funds to be spent on evaluation elements. ACF has supported the evaluation capacity of RPG grantees by providing technical assistance for data collection, performance measurement, and continuous quality improvement; implementation of the cross-site evaluation; support for knowledge dissemination; and provision of group TA via webinars and presentations.
  • Community Collaboratives to Strengthen and Preserve Families (CCSPF) grants (p. 7) require a minimum of 10% of grant funds to be used on data collection and evaluation activities. ACF has supported the evaluation capacity of CCSPF grantees by providing technical assistance for developing research questions, methodologies, and process and outcome measures; implementing grantee-designed evaluations and continuous quality improvement activities; analyzing evaluation data; disseminating findings; and supporting data use in project and organizational decision-making processes.
  • ACF also provides evaluation technical assistance to:
    • Support grantees participating in federal evaluations (e.g., projects supporting grantees from Health Profession Opportunity Grants 2.0 and Tribal Health Profession Opportunity Grants 2.0); and
Score
6
Use of Evidence in Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY21?

9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • The Family First Prevention Services Act (FFPSA) (Division E, Title VII of the Bipartisan Budget Act of 2018), funded under the Foster Care budget, newly enables States to use Federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. FFPSA requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds.
  • Most of ACF’s non-competitive grant programs are large block grants without the legislative authority to use evidence of effectiveness to allocate funds. Several programs do have performance-based payment incentive programs, however. For example, the Adoption and Legal Guardianship Incentive Payments program, most recently reauthorized through FY21 as part of the Family First Prevention Services Act (Social Security Act §473A), provides incentive payments to jurisdictions for improved performance in both adoptions and legal guardianship of children in foster care. Additionally, the Child Support program (p. 6) has an outcome-based performance management system established by the Child Support Performance and Incentive Act of 1998 (CSPIA; Social Security Act §458). Under CSPIA, states are measured in five program areas: paternity establishment, support order establishment, current support collections, cases paying towards arrears, and cost effectiveness. This performance-based incentive and penalty program is used to reward states for good or improved performance and to impose penalties when state performance falls below a specified level and has not improved.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • TANF Grant Program: The TANF statute gives HHS responsibility for building evidence about the TANF program: “Evaluation of the Impacts of TANF – The Secretary shall conduct research on the effect of State programs funded under this part and any other State program funded with qualified State expenditures on employment, self-sufficiency, child well-being, unmarried births, marriage, poverty, economic mobility, and other factors as determined by the Secretary.” (413(a)). Since FY17, Congress has designated 0.33% of the TANF Block Grant for research, evaluation, and technical assistance related to the TANF Block Grant.
  • ACF has a long-standing and ongoing research portfolio in service of building evidence for the TANF Grant Program. ACF conducts research and evaluation projects in collaboration with TANF grantees, typically in areas where TANF grantees are facing challenges, innovating, or carrying out demonstration projects. This ongoing work includes building evidence around career pathways training programs, subsidized employment approaches, job search approaches, and employment coaching interventions. These are all program approaches used by state and county TANF grantees to meet their employment goals. ACF widely disseminates information from its research and evaluation activities to TANF grantees and provides extensive training and technical assistance.
  • ACF’s TANF Data Innovation (TDI) project, launched in 2017, supports the innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. In 2019, the TANF Data Collaborative (TDC), an initiative of the TDI project, conducted a needs assessment survey of all states and is now supporting a TANF agency pilot program with eight pilot sites. To support state and local efforts and build strategic partnerships, pilot agencies are receiving funding and intensive training and technical assistance.
  • Child Care Development Block Grant Program: While the Child Care Development Block Grant Act (p. 34) does not allocate funding for States to independently build evidence, the Act allows for up to 0.5% of CCDBG funding for a fiscal year to be reserved for HHS to conduct research and evaluation of the CCDBG grant program and to disseminate the key findings of those evaluations widely and on a timely basis. ACF manages this ongoing research portfolio to build evidence for the Child Care and Development Block Grant Program (CCDBG), conducting research and evaluation projects in collaboration with CCDBG grantees, typically in areas where CCDBG grantees are facing challenges, innovating, or carrying out demonstration projects. Major projects in recent years include the National Survey of Early Care and Education; assessment of evidence on ratings in Quality Rating and Improvement Systems (QRIS); and several research partnerships between CCDF lead agencies and researchers. ACF widely disseminates information from its research and evaluation activities to CCDF grantees and provides extensive training and technical assistance.
  • Foster Care and Related Child Welfare Grant Programs: ACF administers several foster care and related child welfare grant programs that do not possess the funding authority for States to conduct independent evidence-building activities. Some of these programs have set-asides for federal research; the Foster Care Independence Act of 1999, for instance, sets aside 1.5% of the John H. Chafee Foster Care Program for Successful Transition to Adulthood program (Chafee program) for evaluations of promising independent living programs.
  • As such, ACF has an ongoing research portfolio on the Title IV-E foster care grant program and related grant programs. ACF conducts research and evaluation in collaboration with child welfare grantees, typically focusing on areas in which grantees are facing challenges, innovating, or conducting demonstrations. Examples include strategies for preventing maltreatment, meeting service needs, and improving outcomes for children who come to the attention of child welfare. Major projects include the National Survey of Child and Adolescent Well-Being (NSCAW) and a Supporting Evidence Building in Child Welfare project to increase the number of evidence-supported interventions grantees can use to serve the child welfare population.
  • ACF has begun conducting formative evaluations of independent living programs of potential national significance in preparation for possible future summative evaluations. This work builds on the multi-site evaluation of foster youth programs, a rigorous, random assignment evaluation of four programs funded under the Chafee program completed in 2011.
  • Also, ACF’s Community-Based Child Abuse Prevention (CBCAP) formula grants, with a focus on supporting community-based approaches to prevent child abuse and neglect, are intended to inform the use of other child welfare funds more broadly.
  • Child Support Enforcement Research and Evaluation Grant Program: Section 1115 of the Social Security Act provides unique authority for research and evaluation grants to child support enforcement grantees to “improve the financial well-being of children or otherwise improve the operation of the child support program.” For instance, ACF awarded Digital Marketing Grants to test digital marketing approaches and partnerships to reach parents who could benefit from child support services, and to create or improve two-way digital communication and engagement with parents.
  • ACF continues to manage a broad child support enforcement research portfolio and administers a variety of research and evaluation components to understand more about cost and program effectiveness. Research and evaluation within the portfolio have consisted of (1) supporting large multi-state demonstrations that include random assignment evaluations (described in criteria question 7.4), (2) funding a supplement to the Census Bureau’s Current Population Survey, and (3) supporting research activities of other government programs and agencies by conducting matches of their research samples to the National Directory of New Hires (NDNH).
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • States applying for funding from ACF’s Community Based Child Abuse Prevention (CBCAP) grant program must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” The Children’s Bureau defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs determined to fall within specific program parameters will be considered to be “evidence-informed” or “evidence-based” practices (EBP), as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds directed towards evidence-based and evidence-informed practices.
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • In Section 413 of the Social Security Act, where Congress gives HHS primary responsibility for building evidence about the TANF program, Congress also commissions HHS to develop “a database (which shall be referred to as the “What Works Clearinghouse of Proven and Promising Projects to Move Welfare Recipients into Work”) of the projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects” (§413(g)). In April 2020, ACF officially launched the Pathways to Work Evidence Clearinghouse, a user-friendly website that shares the results of the systematic review and provides web-based tools and products to help state and local TANF administrators, policymakers, researchers, and the general public make sense of the results and better understand how this evidence might apply to questions and contexts that matter to them.
  • Additionally, ACF has continued to produce findings from numerous randomized controlled trials providing evidence on strategies that TANF agencies can use such as subsidized employment, coaching, career pathways, and job search strategies. Ongoing ACF efforts to build evidence for what works for TANF recipients and other low-income individuals include the Building Evidence on Employment Strategies for Low-Income Families (BEES) project and the Next Generation of Enhanced Employment Strategies (NextGen) project; these projects are evaluating the effectiveness of innovative programs designed to boost employment and earnings among low-income individuals.
  • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of dollars to improve the quality of programs, and more to inform regulations related to Child Care and Development Block Grant reauthorization.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Community-Based Child Abuse Prevention (CBCAP) programs are authorized as part of the Child Abuse Prevention and Treatment Act (CAPTA). CAPTA promotes the use of evidence-based and evidence-informed programs and practices that effectively strengthen families and prevent child abuse and neglect. This includes efforts to improve the evaluation capacity of states and communities to assess the progress of their programs and collaborative networks in enhancing the safety and well-being of children and families. The 2020 Program Instruction for the CBCAP grant program states that CBCAP funds made available to states must be used for the financing, planning, community mobilization, collaboration, assessment, information and referral, startup, training and technical assistance, information management, and reporting and evaluation costs of establishing, operating, or expanding community-based and prevention-focused programs and activities designed to strengthen and support families and prevent child abuse and neglect, among other things.
  • The Child Care and Development Block Grant Act of 2014 requires states to spend not less than 7%, 8%, and 9% of their CCDF awards (“quality funds”) for years 1–2, 3–4, and 5+ after the 2014 CCDBG enactment, respectively (see 128 STAT. 1987), on activities to improve the quality of child care services provided in the state, including:
    • 1B: Supporting the training and professional development of the child care workforce through…incorporating the effective use of data to guide program improvement (see 128 STAT. 1988)
    • 3: Developing, implementing, or enhancing a quality rating system for child care providers and services, which may support and assess the quality of child care providers in the State (A) and be designed to improve the quality of different types of child care providers (C) (see 128 STAT. 1988)
    • 7: Evaluating and assessing the quality and effectiveness of child care programs and services offered in the State, including evaluating how such programs positively impact children (see 128 STAT. 1990)
  • ACF requires all CCDF lead agencies to annually report on how their CCDF quality funds were expended, including the activities funded and the measures used by states and territories to evaluate progress in improving the quality of child care programs and services. ACF released a Program Instruction for state and territorial lead agencies to provide guidance on reporting the authorized activities for the use of quality funds.
Score
6
Repurpose for Results

In FY21, did the agency shift funds away from or within any practice, policy, intervention, or program that consistently failed to achieve desired outcomes?

10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
  • The Family First Prevention Services Act of 2018 allows federal matching funds only for evidence-based prevention services offered by states, thereby incentivizing states to shift their spending away from non-evidence-based approaches.
  • For ACF’s Child and Family Services Reviews (CFSRs) of state child welfare systems, states determined not to have achieved substantial conformity in all the areas assessed must develop and implement a Program Improvement Plan addressing the areas of nonconformity. ACF supports the states with technical assistance and monitors implementation of their plans. States must successfully complete their plans to avoid financial penalties for nonconformance.
  • The ACF Head Start program significantly expanded its accountability provisions with the establishment of five-year Head Start grant service periods and the Head Start Designation Renewal System (DRS). The DRS was designed to determine whether Head Start and Early Head Start programs are providing high quality comprehensive services to the children and families in their communities. Where they are not, grantees are denied automatic renewal of their grant and must apply for funding renewal through an open competition process. Those determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. Data from ACF’s Head Start Family and Child Experiences Survey (FACES) and Quality Features, Dosage, Thresholds and Child Outcomes (Q-DOT) study were used to craft the regulations that created the DRS and informed key decisions in its implementation. This included where to set minimum thresholds for average CLASS scores, the number of classrooms within programs to be sampled to ensure stable program-level estimates on CLASS, and the number of cycles of CLASS observations to conduct. At the time the DRS notification letters were sent out to grantees in 2011, there were 1,421 non-tribal active grants, and of these, 453 (32%) were required to re-compete (p.19).
  • Findings from the evaluation of the first round of the Health Profession Opportunity Grants (HPOG) program influenced the funding opportunity announcement for the second round of HPOG funding. Namely, the scoring criteria used to select HPOG 2.0 grantees incorporated knowledge gained about challenges experienced in the HPOG 1.0 grant program. For example, based on those challenges, applicants were asked to clearly demonstrate–and verify with local employers–an unmet need in their service area for the education and training activities proposed. Applicants were also required to provide projections for the number of individuals expected to begin and complete basic skills education. Grantees must submit semi-annual and annual progress reports to ACF to show their progress in meeting these projections. If they have trouble doing so, grantees are provided with technical assistance to support improvement or are put on a corrective action plan so that ACF can more closely monitor their steps toward improvement.
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
  • In an effort to create operational efficiencies and increase grantee capacity for mission-related activities, ACF implemented a process in 2019 in which the grants management office completes annual risk modeling of grantee financial administrative datasets, which helps identify organizations that would benefit from targeted technical assistance. The grants management office provides technical assistance to these grantees to improve their financial management and help direct resources toward effective service delivery.
  • As mentioned in 10.1, states reviewed by a Child and Family Services Review (CFSR) and determined not to have achieved substantial conformity in all the areas assessed must develop and implement a Program Improvement Plan addressing the areas of nonconformity. ACF supports the states with technical assistance and monitors implementation of their plans. ACF also provides broad programmatic technical assistance to support grantees in improving their service delivery, including the Child Welfare Capacity Building Collaborative. The Collaborative is designed to help public child welfare agencies, Tribes, and courts enhance and mobilize the human and organizational assets necessary to meet Federal standards and requirements; improve child welfare practice and administration; and achieve safety, permanency, and well-being outcomes for children, youth, and families. ACF also sponsors the Child Welfare Information Gateway, a platform connecting child welfare, adoption, and related professionals as well as the public to information, resources, and tools covering topics on child welfare, child abuse and neglect, out-of-home care, adoption, and more.
Back to the Standard

Visit Results4America.org