2021 Federal Standard of Excellence


Use of Evidence in Non-Competitive Grant Programs*

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY21? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

Score
10
Millennium Challenge Corporation
  • MCC does not administer non-competitive grant programs.
Score
7
U.S. Department of Education
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • ED worked with Congress in FY16 to ensure that evidence played a major role in ED’s large non-competitive grant programs in the reauthorized ESEA. As a result, section 1003 of ESSA requires states to set aside at least 7% of their Title I, Part A funds for a range of activities to help school districts improve low-performing schools. School districts and individual schools are required to create action plans that include “evidence-based” interventions that demonstrate strong, moderate, or promising levels of evidence.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • ESEA requires a National Assessment of Title I (Improving the Academic Achievement of the Disadvantaged). In addition, Title I Grants require state education agencies to report on school performance, including those schools identified for comprehensive or targeted support and improvement.
  • Federal law (ESEA) requires states receiving funds from 21st Century Community Learning Centers to “evaluate the effectiveness of programs and activities” that are carried out with federal funds (section 4203(a)(14)), and it requires local recipients of those funds to conduct periodic evaluations in conjunction with the state evaluation (section 4205(b)). 
  • The Office of Special Education Programs (OSEP), the implementing office for IDEA grants to states, has revised its accountability system to shift the balance from a system focused primarily on compliance to one that puts more emphasis on results through the use of Results Driven Accountability.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • Section 4108 of ESEA authorizes school districts to invest “safe and healthy students” funds in Pay for Success initiatives. Section 1424 of ESEA authorizes school districts to invest their Title I, Part D funds (Prevention and Intervention Programs for Children and Youth Who Are Neglected, Delinquent, or At-Risk) in Pay for Success initiatives; under section 1415 of the same program, a State agency may use funds for Pay for Success initiatives.
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • States and school districts are implementing the requirements in Title I of the ESEA regarding using evidence-based interventions in school improvement plans. Some States are providing training or practice guides to help schools and districts identify evidence-based practices.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • In 2016, ED released non-regulatory guidance to provide state educational agencies, local educational agencies (LEAs), schools, educators, and partner organizations with information to assist them in selecting and using “evidence-based” activities, strategies, and interventions, as defined by ESSA, including carrying out evaluations to “examine and reflect” on how interventions are working. However, the guidance does not specify that federal non-competitive funds can be used to conduct such evaluations.
Score
7
U.S. Agency for International Development
  • USAID does not administer non-competitive grant programs (relative score for criteria #8 applied)
Score
6
Administration for Children and Families (HHS)
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • The Family First Prevention Services Act (FFPSA) (Division E, Title VII of the Bipartisan Budget Act of 2018), funded under the Foster Care budget, newly enables States to use Federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. FFPSA requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds.
  • Most of ACF’s non-competitive grant programs are large block grants without the legislative authority to use evidence of effectiveness to allocate funds. Several programs do have performance-based payment incentive programs, however. For example, the Adoption and Legal Guardianship Incentive Payments program, most recently reauthorized through FY21 as part of the Family First Prevention Services Act (Social Security Act §473A), provides incentive payments to jurisdictions for improved performance in both adoptions and legal guardianship of children in foster care. Additionally, the Child Support program (p. 6) has an outcome-based performance management system established by the Child Support Performance and Incentive Act of 1998 (CSPIA; Social Security Act §458). Under CSPIA, states are measured in five program areas: paternity establishment, support order establishment, current support collections, cases paying towards arrears, and cost effectiveness. This performance-based incentive and penalty program is used to reward states for good or improved performance and to impose penalties when state performance falls below a specified level and has not improved.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • TANF Grant Program: The TANF statute gives HHS responsibility for building evidence about the TANF program: “Evaluation of the Impacts of TANF – The Secretary shall conduct research on the effect of State programs funded under this part and any other State program funded with qualified State expenditures on employment, self-sufficiency, child well-being, unmarried births, marriage, poverty, economic mobility, and other factors as determined by the Secretary.” (§413(a)). Since FY17, Congress has designated 0.33% of the TANF Block Grant for research, evaluation, and technical assistance related to the TANF Block Grant.
  • ACF has a long-standing and ongoing research portfolio in service of building evidence for the TANF Grant Program. ACF conducts research and evaluation projects in collaboration with TANF grantees, typically in areas where TANF grantees are facing challenges, innovating, or carrying out demonstration projects. This ongoing work includes building evidence around career pathways training programs, subsidized employment approaches, job search approaches, and employment coaching interventions. These are all program approaches used by state and county TANF grantees to meet their employment goals. ACF widely disseminates information from its research and evaluation activities to TANF grantees and provides extensive training and technical assistance.
  • ACF’s TANF Data Innovation (TDI) project, launched in 2017, supports the innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. In 2019, the TANF Data Collaborative (TDC), an initiative of the TDI project, conducted a needs assessment survey of all states and is now supporting a TANF agency pilot program with eight pilot sites. To support state and local efforts and build strategic partnerships, pilot agencies are receiving funding and intensive training and technical assistance.
  • Child Care and Development Block Grant Program: While the Child Care and Development Block Grant Act (p. 34) does not allocate funding for States to independently build evidence, the Act allows for up to 0.5% of CCDBG funding for a fiscal year to be reserved for HHS to conduct research and evaluation of the CCDBG grant program and to disseminate the key findings of those evaluations widely and on a timely basis. ACF manages this ongoing research portfolio to build evidence for the Child Care and Development Block Grant (CCDBG) program, conducting research and evaluation projects in collaboration with CCDBG grantees, typically in areas where CCDBG grantees are facing challenges, innovating, or carrying out demonstration projects. Major projects in recent years include the National Survey of Early Care and Education; assessment of evidence on ratings in Quality Rating and Improvement Systems (QRIS); and several research partnerships between CCDF lead agencies and researchers. ACF widely disseminates information from its research and evaluation activities to CCDF grantees and provides extensive training and technical assistance.
  • Foster Care and Related Child Welfare Grant Programs: ACF administers several foster care and related child welfare grant programs that do not possess the funding authority for States to conduct independent evidence-building activities. Some of these programs have set-asides for federal research; the Foster Care Independence Act of 1999, for instance, sets aside 1.5% of the John H. Chafee Foster Care Program for Successful Transition to Adulthood program (Chafee program) for evaluations of promising independent living programs.
  • As such, ACF has an ongoing research portfolio on the Title IV-E foster care grant program and related grant programs. ACF conducts research and evaluation in collaboration with child welfare grantees, typically focusing on areas in which grantees are facing challenges, innovating, or conducting demonstrations. Examples include strategies for prevention of maltreatment, meeting service needs, and improving outcomes for children who come to the attention of child welfare. Major projects include the National Survey of Child and Adolescent Well-Being (NSCAW) and a Supporting Evidence Building in Child Welfare project to increase the number of evidence-supported interventions grantees can use to serve the child welfare population.
  • ACF has begun work on conducting formative evaluations of independent living programs of potential national significance in preparation for possible future summative evaluations. This work builds on the multi-site evaluation of foster youth programs, a rigorous, random assignment evaluation of four programs funded under the Chafee program completed in 2011.
  • Also, ACF’s Community-Based Child Abuse Prevention (CBCAP) formula grants, with a focus on supporting community-based approaches to prevent child abuse and neglect, are intended to inform the use of other child welfare funds more broadly.
  • Child Support Enforcement Research and Evaluation Grant Program: Section 1115 of the Social Security Act provides unique authority for research and evaluation grants to child support enforcement grantees to “improve the financial well-being of children or otherwise improve the operation of the child support program.” For instance, ACF awarded Digital Marketing Grants to test digital marketing approaches and partnerships to reach parents that could benefit from child support services, and create or improve two-way digital communication and engagement with parents.
  • ACF continues to manage a broad child support enforcement research portfolio and administers a variety of research and evaluation components to understand more about cost and program effectiveness. Research and evaluation within the portfolio have consisted of (1) supporting large multi-state demonstrations which include random assignment evaluations (described in criteria question 7.4), (2) funding a supplement to the Census Bureau’s Current Population Survey, and (3) supporting research activities of other government programs and agencies by conducting matches of their research samples to the NDNH.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • States applying for funding from ACF’s Community Based Child Abuse Prevention (CBCAP) grant program must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” The Children’s Bureau defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs determined to fall within specific program parameters will be considered to be “evidence-informed” or “evidence-based” practices (EBP), as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds directed towards evidence-based and evidence-informed practices.
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • In Section 413 of the Social Security Act, where Congress gives HHS primary responsibility for building evidence about the TANF program, Congress also commissions HHS to develop “a database (which shall be referred to as the “What Works Clearinghouse of Proven and Promising Projects to Move Welfare Recipients into Work”) of the projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects” (§413(g)). In April 2020, ACF officially launched the Pathways to Work Evidence Clearinghouse, a user-friendly website that shares the results of the systematic review and provides web-based tools and products to help state and local TANF administrators, policymakers, researchers, and the general public make sense of the results and better understand how this evidence might apply to questions and contexts that matter to them.
  • Additionally, ACF has continued to produce findings from numerous randomized controlled trials providing evidence on strategies that TANF agencies can use such as subsidized employment, coaching, career pathways, and job search strategies. Ongoing ACF efforts to build evidence for what works for TANF recipients and other low-income individuals include the Building Evidence on Employment Strategies for Low-Income Families (BEES) project and the Next Generation of Enhanced Employment Strategies (NextGen) project; these projects are evaluating the effectiveness of innovative programs designed to boost employment and earnings among low-income individuals.
  • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of dollars to improve the quality of programs, and more to inform regulations related to Child Care and Development Block Grant reauthorization.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Community-Based Child Abuse Prevention (CBCAP) programs are authorized as part of the Child Abuse Prevention and Treatment Act (CAPTA). CAPTA promotes the use of evidence-based and evidence-informed programs and practices that effectively strengthen families and prevent child abuse and neglect. This includes efforts to improve the evaluation capacity of states and communities to assess the progress of their programs and collaborative networks in enhancing the safety and well-being of children and families. The 2020 Program Instruction for the CBCAP grant program states that CBCAP funds made available to states must be used for the financing, planning, community mobilization, collaboration, assessment, information and referral, startup, training and technical assistance, information management and reporting, and evaluation costs for establishing, operating, or expanding community-based and prevention-focused programs and activities designed to strengthen and support families and prevent child abuse and neglect, among other things.
  • The Child Care and Development Block Grant Act of 2014 requires states to spend not less than 7, 8, and 9% of their CCDF awards (“quality funds”) for years 1-2, 3-4, and 5+ after the 2014 CCDBG enactment, respectively (see 128 STAT. 1987), on activities to improve the quality of child care services provided in the state, including:
    • 1B: Supporting the training and professional development of the child care workforce through…incorporating the effective use of data to guide program improvement (see 128 STAT. 1988)
    • 3: Developing, implementing, or enhancing a quality rating system for child care providers and services, which may support and assess the quality of child care providers in the State (A) and be designed to improve the quality of different types of child care providers (C) (see 128 STAT. 1988)
    • 7: Evaluating and assessing the quality and effectiveness of child care programs and services offered in the State, including evaluating how such programs positively impact children (see 128 STAT. 1990)
  • ACF requires all CCDF lead agencies to annually report on how their CCDF quality funds were expended, including the activities funded and the measures used by states and territories to evaluate progress in improving the quality of child care programs and services. ACF released a Program Instruction for state and territorial lead agencies to provide guidance on reporting the authorized activities for the use of quality funds.
Score
3
AmeriCorps
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY21, the five largest non-competitive grant programs are:
    1. AmeriCorps State formula grants program ($152,482,034; eligible grantees: states);
    2. AmeriCorps National Civilian Community Corps (NCCC) ($32.5 million; eligible grantees: nonprofit organizations);
    3. AmeriCorps VISTA ($93 million; eligible grantees: nonprofit organizations, state, tribal, and local governments, institutions of higher education);
    4. Senior Corps Foster Grandparents ($118 million; eligible grantees: nonprofit organizations, local governments);
    5. Senior Corps Senior Companion Program ($50 million; eligible grantees: nonprofit organizations, local governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • In FY18, Senior Corps Foster Grandparents and Senior Companion Program embedded evidence into their grant renewal processes by offering supplemental funding, “augmentation grants,” to grantees interested in deploying volunteers to serve in evidence-based programs. More than $3.3 million of Senior Corps program dollars were allocated, over three years, toward new evidence-based programming augmentations. Grantees will be operating with their augmentations through fiscal year 2021.
  • In a survey completed in FY20, Senior Corps grantees reported that 4,043 volunteer stations and 20,320 volunteers (10% of all volunteers) were engaged in evidence-based programming.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • In FY19, Senior Corps completed an evaluation with an independent firm to produce case studies and comparative analyses of select grantees that received an evidence-based programming augmentation to understand successes, challenges, and other issues. This report is being used to inform Senior Corps’ approach to replicating this augmentation initiative, as well as the training/technical assistance needs of grantees.
  • Senior Corps and the Administration for Community Living have continued a dialogue about how to build and broaden the evidence base for various programs designed for older adults, particularly for aging and disability evidence-based programs and practices. AmeriCorps previously utilized ACL’s list of evidence-based programs for its augmentation grants and is encouraging Senior Corps grantees to move toward more evidence-based programming.
  • For FY20, Senior Corps continued funding five demonstration grants, totaling $2,579,475, which authorize organizations to implement the Senior Corps program model with certain modifications to standard AmeriCorps policies. Demonstration grants allow Senior Corps to analyze potential policy changes.
  • AmeriCorps NCCC invested in a Service Project Database that provides staff access to data on all NCCC projects completed since 2012. The database thematically organizes projects, classifies project frameworks, and categorizes the outcomes of these service initiatives. NCCC is investing in an evaluation of NCCC’s impact. This research project was initiated in FY18 and is focused on evaluating member retention, studying how NCCC develops leadership skills in its members and teams, and the program’s ability to strengthen communities. Finally, NCCC will continue to invest in research grants to better understand the outcomes of its disaster response efforts.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • AmeriCorps only administers five non-competitive grant programs, as described above.
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Senior Corps and the Office of Research and Evaluation completed a longitudinal evaluation of the Foster Grandparents and Senior Companion Programs in FY19 that demonstrated the positive health outcomes associated with volunteering. A 50-year retrospective review of the research conducted on Senior Corps programs was completed at the end of FY19 and was posted on the Evidence Exchange in FY20.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • AmeriCorps does not prohibit the use of formula dollars for evaluation, but each State Commission may have its own guidelines. Further, formula grantees receiving over $500,000 must conduct evaluations using their grant funds.
Score
7
U.S. Department of Labor
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY21, the five largest non-competitive grant programs are:
    1. Adult Employment and Training Activities ($862,649,000; eligible grantees: city, county, and/or state governments);
    2. Youth Activities ($921,130,000; eligible grantees: city, county, and/or state governments);
    3. Dislocated Worker Employment and Training formula grants ($1,061,553,000; eligible grantees: city, county, and/or state governments);
    4. UI State Administration ($2,365,816,000; eligible grantees: city, county, and/or state governments);
    5. Employment Security Grants to States ($670,052,000; eligible grantees: city, county, and/or state governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • A signature feature of the Workforce Innovation and Opportunity Act (WIOA) (Pub. L. 113-128) is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to states’ role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, but there are additional requirements regarding coordination (with other state agencies and federal evaluations under WIOA), dissemination, and provision of data and other information for federal evaluations.
  • WIOA’s evidence and performance provisions: (1) increased the amount of WIOA funds states can set aside and distribute directly from 5-10% to 15% and authorized them to invest these funds in Pay for Performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in Pay for Performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in Pay for Performance initiatives; and (4) authorized states and local workforce investment boards to award Pay for Performance contracts to intermediaries, community based organizations, and community colleges.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Section 116(e) of WIOA describes how the state, in coordination with local workforce boards and state agencies that administer the programs, shall conduct ongoing evaluations of activities carried out in the state under these state programs. These evaluations are intended to promote, establish, implement, and utilize methods for continuously improving core program activities in order to achieve high-level programs within, and high-level outcomes from, the workforce development system. 
  • The Employment and Training Administration sponsors the WorkforceGPS, which is a community point of access to support workforce development professionals in their use of evaluations to improve state and local workforce systems. Professionals can access a variety of resources and tools, including an Evaluation Peer Learning Cohort to help leaders improve their research and evaluation capacities. The WorkforceGPS includes links to resources on evaluation assessment readiness, evaluation design, and performance data, all focused on improving the public workforce system.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • Reemployment Services and Eligibility Assessments (RESEA) funds must be used for interventions or service delivery strategies demonstrated to reduce the average number of weeks of unemployment insurance a participant receives by improving employment outcomes. The law provides for a phased implementation of the new program requirements over several years. In FY19, DOL awarded $130 million to states to conduct RESEA programs that met these evidence of effectiveness requirements. Beginning in FY23, states must also use no less than 25% of RESEA grant funds for interventions with a high or moderate causal evidence rating that show a demonstrated capacity to improve outcomes for participants; this percentage increases in subsequent years until after FY26, when states must use no less than 50% of such grant funds for such interventions. 
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Institutional Analysis of American Job Centers: the goal of the evaluation was to understand and systematically document the institutional characteristics of American Job Centers (AJCs), and to identify variations in service delivery, organization structure, and administration across AJCs. 
  • Career Pathways Descriptive and Analytical Study: WIOA requires DOL to “conduct a multistate study to develop, implement, and build upon career advancement models and practices for low-wage healthcare providers or providers of early education and child care.” In response, DOL conducted the Career Pathways Design Study to develop evaluation design options that could address critical gaps in knowledge related to the approach, implementation, and success of career pathways strategies generally, and in early care and education specifically. The Chief Evaluation Office (CEO) has recently begun the second iteration of this study. The purpose of this project is to build on the evaluation design work CEO completed in 2018 to build evidence about the implementation and effectiveness of career pathways approaches and meet the WIOA statutory requirement to conduct a career pathways study. It will include a meta-analysis of existing impact evaluation results as well as examine how workers advance through multiple, progressively higher levels of education and training, and associated jobs, within a pathway over time, and the factors associated with their success.
  • Analysis of Employer Performance Measurement Approaches: the goal of the study was to examine the appropriateness, reliability and validity of proposed measures of effectiveness in serving employers required under WIOA. It included knowledge development to understand and document the state of the field, an analysis and comparative assessment of measurement approaches and metrics, and the dissemination of findings through a report, as well as research and topical briefs. Though the authors did not find an overwhelming case for adopting either one measure or several measures, adopting more than one measure offers the advantage of capturing more aspects of performance and may make results more actionable for the different Title I, II, III, and IV programs. Alternatively, a single measure has the advantage of clarity on how state performance is assessed and fewer resources devoted to record keeping.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The Employment and Training Administration’s (ETA) RESEA grantees may use up to 10% of their grant funds for evaluations of their programs. ETA released specific evaluation guidance to help states understand how to conduct evaluations of their RESEA interventions with these grant funds. The goal of the agency guidance, along with the evaluation technical assistance being provided to states and their partners, is to build states’ capacity to understand, use, and build evidence.
  • Section 116 of WIOA establishes performance accountability indicators and performance reporting requirements to assess the effectiveness of states and local areas in achieving positive outcomes for individuals served by the workforce development system’s core programs. Section 116(e) of WIOA requires states to “employ the most rigorous analytical and statistical methods that are reasonably feasible, such as the use of control groups” and requires states to evaluate the effectiveness of their WIOA programs in an annual progress report that includes updates on (1) current or planned evaluation and related research projects, including methodologies used; (2) efforts to coordinate the development of evaluation and research projects with WIOA core programs, other state agencies, and local boards; (3) a list of completed evaluation and related reports with publicly accessible links to such reports; (4) efforts to provide data, survey responses, and timely site visits for federal evaluations; and (5) any continuous improvement strategies utilizing results from studies and evidence-based practices evaluated. States are permitted to use WIOA grant funds to perform the performance monitoring and evaluations necessary to complete this report.
Score
4
U.S. Dept. of Housing & Urban Development
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • Although the funding formulas are prescribed in statute, evaluation-based interventions are central to each program. HUD used evidence from a 2015 Administrative Fee study of the costs that high-performing PHAs incur in administering an HCV program to revise its approach to providing administrative fees that incentivize PHAs to improve outcomes in leasing and housing mobility. HUD has also used the results of its Landlord Taskforce to provide guidance to PHAs on working effectively with landlords and to propose policy and fee changes to ensure strong landlord participation in the new Emergency Housing Voucher program funded through the American Rescue Plan. In allocating $5 billion in Emergency Housing Voucher funding to PHAs, HUD developed an allocation formula that considered (among other factors) evidence of PHA capacity to implement the program effectively and quickly.
  • HUD’s funding of public housing is being radically shifted through the evidence-based Rental Assistance Demonstration (RAD), which enables access to private capital to address the $26 billion backlog of capital needs funding. Based on the demonstrated success of RAD, for FY20 HUD proposed to transfer $95 million from the Operating Fund and Capital Fund to the Tenant-Based Rental Assistance fund to support RAD conversions. For FY21 HUD is proposing to remove the cap on the number of public housing developments to be converted to Section 8 contracts. HUD is beginning to evaluate RAD’s impacts on children. HUD is also conducting a Rent Reform demonstration and a Moving To Work (MTW) demonstration to test the efficiencies of changing rent rules and their effects on tenant outcomes.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Evidence-building is central to HUD’s funding approach through the use of prospective program demonstrations. These include the Public Housing Operating Fund’s Rental Assistance Demonstration (RAD), the Public Housing Capital Grants’ Rent Reform demonstration, and the Housing Choice Voucher program’s Moving To Work (MTW) demonstration grants. As Congress moved to expand MTW flexibilities to additional public housing authorities (PHAs), HUD sought authority to randomly assign cohorts of PHAs to provide the ability to rigorously test specific program innovations.
  • Program funds are provided to operate demonstrations through the HCV account, Tenant-Based Rental Assistance. These include the Tribal HUD-VA Supportive Housing (Tribal HUD-VASH) demonstration of providing permanent supportive housing to Native American veterans and the FSS-Family Unification Program demonstration that tests the effect of providing vouchers to at-risk young adults who are aging out of foster care.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • HUD-Veterans Affairs Supportive Housing (HUD-VASH) vouchers are allocated in part on the administrative performance of housing agencies as measured by their past utilization of HUD-VASH vouchers in HUD’s Voucher Management System (Notice PIH-2019-15 (HA)). The performance information helps ensure that eligible recipients are actually able to lease units with the vouchers that HUD funds. The HUD-VASH Exit Study documented that 87,864 VASH vouchers were in circulation in April 2017, contributing substantially to the 47-percent decline in the number of homeless Veterans since 2010.
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • To address a severe backlog of capital needs funding for the nation’s public housing stock, the Rental Assistance Demonstration was authorized in 2011 to convert the properties to project-based Section 8 contracts to attract an infusion of private capital. The 2019 final report on the RAD evaluation showed that conversions successfully raised $12.6 billion of funding, an average of $121,747 per unit to improve physical quality and stabilize project finances. Based on the program’s successes, the limit on the number of public housing conversions was increased to 455,000 units in 2018, nearly half of the stock, and HUD has been proposing to eliminate the cap. Additionally, HUD extended the conversion opportunity to legacy multifamily programs through RAD 2.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Communities receiving HUD block grant funding through Community Development Block Grants, HOME block grants, and other programs are required to consult local stakeholders, conduct housing needs assessments, and develop needs-driven Consolidated Plans to guide their activities. They then provide Consolidated Annual Performance and Evaluation Reports (CAPERs) to document progress toward their Consolidated Plan goals in a way that supports continued community involvement in evaluating program efforts.
  • HUD’s Community Development Block Grant program, which provides formula grants to entitlement jurisdictions, increases local evaluation capacity. Specifically, federal regulations (Section 24 CFR 570.200) authorize CDBG recipients (including city and state governments) to use up to 20% of their CDBG allocations for administration and planning costs that may include evaluation-capacity building efforts and evaluations of their CDBG-funded interventions (as defined in 570.205 and 570.206).
Score
4
Administration for Community Living (HHS)
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • Because these are formula grants, the funding amounts distributed to the States and tribal organizations are not determined using evidence-based application processes. Rather, the States and tribal organizations are responsible for directing the funds to evidence-based programs and organizations.
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • Authorizing legislation for ACL’s largest non-competitive grant programs makes consideration of evidence-based programming a requirement of funding. The Developmental Disabilities Assistance and Bill of Rights Act of 2000 allows for the withholding of funding if “(1) the Council or agency has failed to comply substantially with any of the provisions required by section 124 to be included in the State plan, particularly provisions required by paragraphs (4)(A) and (5)(B)(vii) of section 124(c), or with any of the provisions required by section 125(b)(3); or (2) the Council or agency has failed to comply substantially with any regulations of the Secretary that are applicable.” As a condition of funding, non-competitive grantees are required to “determine the extent to which each goal of the Council was achieved for that year” and report that information to ACL.
  • States that receive Older Americans Act Home and Community-Based Supportive Services Title III-D funds are required to spend those funds on evidence-based programs to improve health and well-being, and reduce disease and injury. In order to receive funding, states must utilize programs that meet ACL’s definition of evidence-based or are defined as evidence-based by another HHS operating division. Under the Older Americans Act, caregiver support programs are required to track and report on their use of evidence-based caregiver support services.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • ACL’s Nutrition Services provides grants for innovations in nutrition programs and services. These research projects must have the potential for broad implementation and demonstrate potential to improve the quality, effectiveness, and outcomes of nutrition service programs by documenting and proving the effectiveness of these interventions and innovations. They must also target services to underserved older adults with the greatest social and economic need, and to individuals at risk of institutional placement, to permit such individuals to remain in home and community-based settings. Consistent with the Administrator’s focus on identifying new ways to efficiently improve direct service programs, ACL is using its 1% Nutrition authority to fund $3.5 million for nutrition innovations and to test ways to modernize how meals are provided to a changing senior population. One promising demonstration that has drawn widespread attention, carried out by the Georgia State University Research Foundation (entitled Double Blind Randomized Control Trial on the Effect of Evidence-Based Suicide Intervention Training on the Home-Delivered and Congregate Nutrition Program through the Atlanta Regional Commission), is an effort to train volunteers who deliver home-delivered meals to recognize and report indicators of suicidal intent and other mental health issues so that they can be addressed.
  • Under Home and Community-Based Services, FY12 Congressional appropriations included an evidence-based requirement for the first time. OAA Title III-D funding may be used only for programs and activities demonstrated to be evidence-based. The National Council on Aging maintains a tool to search for evidence-based programs that are approved for funding through OAA Title III-D.
  • ACL’s Caregiver Support Services builds evidence in a number of areas. These include a national survey of caregivers of older adult clients, gathering and reporting best practices regarding grandparents raising grandchildren, adapting and scaling evidence-based programs for children and older adults with disabilities through the RESILIENCE Rehabilitation Research and Training Center, and other similar efforts.
  • State Councils on Developmental Disabilities design five-year state plans that address new ways of improving service delivery. To implement the state plans, Councils work with different groups in many ways, including funding projects to show new ways that people with disabilities can work, play, and learn, and seeking information from the public and from state and national sources.
  • State Protection & Advocacy Systems encompass multiple avenues of protection and advocacy including specialization in individuals with developmental disabilities, assistive technology, voting accessibility, individuals with traumatic brain injury, and technical assistance. The Developmental Disabilities Assistance and Bill of Rights Act of 2000 requires Administration on Intellectual and Developmental Disabilities (AIDD) grantees to report annually on progress achieved through advocacy, capacity building, and systemic change activities.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • The 2020 reauthorization of the Older Americans Act requires that Assistive technology programs are “aligned with evidence-based practice;” that person-centered, trauma informed programs “incorporate evidence-based practices based on knowledge about the role of trauma in trauma victims’ lives;” and that a newly authorized Research, Demonstration, and Evaluation Center for the Aging Network increases “the repository of information on evidence based programs and interventions available to the aging network, which information shall be applicable to existing programs and interventions, and help in the development of new evidence-based programs and interventions.”
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Since 2017, ACL has awarded Innovations in Nutrition grants to 11 organizations to develop and expand evidence-based approaches to enhance the quality and effectiveness of nutrition programming. ACL is currently overseeing five grantees for innovative projects that will enhance the quality, effectiveness, and outcomes of nutrition services programs provided by the national aging services network. The grants total $1,197,205 for this year with a two-year project period. Through this grant program, ACL aims to identify innovative and promising practices that can be scaled across the country and to increase the use of evidence-informed practices within nutrition programs. 
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • All funding opportunity announcements published by ACL include language about generating and reporting evidence about their progress towards the specific goals set for the funds. Grantee manuals include information about the importance of and requirements for evaluation. The National Ombudsman Resource Center, funded by ACL, provides self-evaluation materials for Long-Term Care Ombudsman Programs (LTCOP) funded under Title VII of the Older Americans Act.
Score
7
Substance Abuse and Mental Health Services Administration (HHS)
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • The allocation of these state grants is based on formulas that incorporate data from national data sets (including SAMHSA’s National Survey on Drug Use and Health), population estimates, and estimates of substance use and mental health disorders.
  • The Substance Abuse Prevention and Treatment Block Grant requires recipients to describe and report on the evidence-based prevention programs they implement. Through the SABG, states should “identify, implement, and evaluate evidence-based programs, practices, and policies that have the ability to reduce substance use and improve health and well-being in all communities.” Funds can also be used for training in program implementation fidelity. The application also asks recipients about the Evidence-Based Workgroup that helps identify evidence-based strategies and programs for implementation.
  • The Community Mental Health Services Block Grant requires recipients to describe in their plans the use of not less than 10% of MHBG funds to carry out evidence-based programs that address the needs of individuals with early serious mental illness, including psychotic disorders, regardless of the age of the individual at onset. Recipients are required to describe any existing and implemented evidence-based practices, how the state promotes evidence-based practices, and details on data collection and program implementation strategies.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Information on how to use funds for data collection and evaluation is covered in the Block Grant application. Grantees are encouraged to allocate grant funds for data collection, data analysis, and program evaluation. Some grantees hire external evaluators using grant funds to assist them in the evaluation process. In FY21, SAMHSA updated its application manual to include a section on developing goals and measurable objectives (see p. 38). Specifically, the document states, “To be able to effectively evaluate your project, it is critical that you develop realistic goals and measurable objectives. This chapter will provide information on developing goals and measurable objectives. It will also provide examples of well-written goals and measurable objectives.”
  • Grantees in non-competitive grant programs are required to submit quantitative data to SAMHSA using the reporting systems associated with their grants. For example, State Mental Health Agencies receive non-competitive grants and compile and report annual data collected through SAMHSA’s Community Mental Health Block Grant. More information on the Uniform Reporting System (URS) can be found online. In this way, non-competitive grant programs not only allow data to be shared for research and evaluation but also enable grantees to explore data from other state grantees.
  • In the FY20-21 Block Grant Application, SAMHSA asks states to base their administrative operations and service delivery on principles of Continuous Quality Improvement/Total Quality Management (CQI/TQM). These CQI processes should identify and track critical outcomes and performance measures, based on valid and reliable data, consistent with the NBHQF, which will describe the health and functioning of the mental health and addiction systems. The CQI processes should continuously measure the effectiveness of services and supports and ensure that they continue to reflect this evidence of effectiveness. The state’s CQI process should also track programmatic improvements using stakeholder input, including the general population and individuals in treatment and recovery and their families. In addition, the CQI plan should include a description of the process for responding to emergencies, critical incidents, complaints, and grievances.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • The majority of SAMHSA grants are competitively awarded. SAMHSA has only four non-competitive grant programs, which are included above.
  • As stated in the Block Grant application (p. 45): “States may implement models that have demonstrated efficacy, including the range of services and principles identified by National Institute of Mental Health (NIMH) via its Recovery After an Initial Schizophrenia Episode (RAISE) initiative. Utilizing these principles, regardless of the amount of investment, and by leveraging funds through inclusion of services reimbursed by Medicaid or private insurance, states should move their system to address the needs of individuals with a first episode of psychosis (FEP). RAISE was a set of NIMH sponsored studies beginning in 2008, focusing on the early identification and provision of evidence-based treatments to persons experiencing FEP. The NIMH RAISE studies, as well as similar early intervention programs tested worldwide, consist of multiple evidence-based treatment components used in tandem as part of a Coordinated Specialty Care (CSC) model, and have been shown to improve symptoms, reduce relapse, and lead to better outcomes. State shall expend not less than 10% of the MHBG amount the State receives for carrying out this section for each fiscal year to support evidence-based programs that address the needs of individuals with early serious mental illness, including psychotic disorders, regardless of the age of the individual at onset. In lieu of expending 10% of the amount the State receives under this section for a fiscal year as required a state may elect to expend not less than 20% of such amount by the end of such succeeding fiscal year.”
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • State Opioid Response (SOR) grantees have implemented evidence-based practices focused on safe prescribing, naloxone, and medication for opioid use disorder to help support and build knowledge around the use of these EBPs.
  • Currently, SOR grantees are using funds to examine fentanyl test strips. This recent change under the Biden-Harris Administration will help to build knowledge on the utility of these evidence-based practices.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Information on how to use funds for data collection and evaluation is covered in the Block Grant application. Grantees are encouraged to allocate grants funds for data collection, data analysis and program evaluation. Some grantees hire external evaluators using grant funds to assist them in the evaluation process. 