2021 Federal Standard of Excellence


AmeriCorps

Overall Score: 71

Leadership
Score: 8

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY21?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Director of the Office of Research & Evaluation serves as the AmeriCorps evaluation officer. The Director of Research and Evaluation (R&E) oversees R&E’s FY21 $4 million budget and a staff of 13. On average, the agency has invested ~$1 million in Office of Research and Evaluation staff over the past eight years. More information about R&E can be found here.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • AmeriCorps hired a new Chief Data Officer (CDO) in FY21. The CDO will address long-standing data priorities, including building the agency’s data analytics capacity and developing a process/structure to ensure coordination and collaboration across data integrity/management, data for performance, and data for research and evaluation.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, performance improvement officer, and other related officials in order to support, improve, and evaluate the agency’s major programs?
  • AmeriCorps has a Research & Evaluation Council that meets monthly to assess progress in implementing the agency’s learning agenda and evaluation plan. Members of the Council include the Director of R&E, the CIO/CDO, the Chief of Staff, the Chief of Program Operations, and the Chief Operating Officer.
Evaluation & Research
Score: 8

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY21?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • AmeriCorps has an evaluation policy that presents five key principles that govern the agency’s planning, conduct, and use of program evaluations: rigor, relevance, transparency, independence, and ethics.
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
  • In FY19, AmeriCorps finalized and posted a five-year, agency-wide strategic evaluation plan. AmeriCorps is in the process of updating its learning agenda (strategic evidence plan) to align with the agency’s FY22-26 strategic plan.
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
  • AmeriCorps uses the terms learning agenda, evaluation plan, and strategic evidence-building plan synonymously. AmeriCorps has a strategic evidence plan that includes an evergreen learning agenda. The plan has been updated and submitted to OMB for review and comment. In addition, the draft document has been shared with the AmeriCorps State and National State Commissions, which will have an opportunity to provide feedback for the remainder of 2021. The agency is also devising a plan to engage external stakeholders in commenting on the revised learning agenda.
2.4 Did the agency publicly release all completed program evaluations?
  • All completed evaluation reports are posted to the Evidence Exchange, an electronic repository for evaluation studies and other reports. This virtual repository was launched in September 2015. 
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 3115, subchapter II (c)(3)(9))
  • A comprehensive portfolio of research projects has been built to assess the extent to which AmeriCorps is achieving its mission. As findings emerge, future studies are designed to continuously build the agency’s evidence base. R&E relies on scholarship in relevant fields of academic study; a variety of research and program evaluation approaches, including field, experimental, and survey research; multiple data sources, including internal and external administrative data; and different statistical analytic methods. AmeriCorps relies on partnerships with universities and third-party research firms to ensure independence and access to state-of-the-art methodologies. AmeriCorps supports its grantees with evaluation technical assistance and courses to ensure their evaluations are of the highest quality, and requires grantees receiving $500,000 or more in annual funding to engage an external evaluator. These efforts have resulted in a robust body of evidence that national service (1) provides positive benefits to participants, (2) strengthens nonprofit organizations, and (3) enables national service programs to effectively address local issues (along with a suite of AmeriCorps resources for evaluations).
  • While AmeriCorps is a non-CFO Act agency, and therefore not required to comply with the Evidence Act (including the mandated evidence capacity assessment), the agency is procuring a third party to support analysis of its evaluation, research, statistical, and analysis workforce capacity.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
  • AmeriCorps uses the research design most appropriate for addressing the research question. When experimental or quasi-experimental designs are warranted, the agency uses them and encourages its grantees to use them, as noted in the agency evaluation policy: “AmeriCorps is committed to using the most rigorous methods that are appropriate to the evaluation questions and feasible within statutory, budget and other constraints.” As of September 2021, AmeriCorps has received 46 grantee evaluation reports that use experimental design and 140 that use quasi-experimental design.
Resources
Score: 10

Did the agency invest at least 1% of program funds in evaluations in FY21?
(Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, rigorous evaluations, including random assignments)

3.1 ____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY21 budget.
  • AmeriCorps invested $8,500,000 on evaluations, evaluation technical assistance, and evaluation capacity-building, representing 1.0% of the agency’s $843,115,000 FY21 operating budget (see the arithmetic check below).
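  A quick arithmetic check of the two figures cited above (using only the stated $8,500,000 investment and the stated $843,115,000 operating budget, nothing else assumed) confirms the reported share:

  $$\frac{8{,}500{,}000}{843{,}115{,}000} \approx 0.0101 \approx 1.0\%$$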
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
  • Congress allocated $4,000,000 to AmeriCorps for its FY21 evaluation budget, the same amount allocated in the previous fiscal year.
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
  • R&E funds a contractor to provide AmeriCorps grantees with evaluation capacity building support ($500,000 of the $4,000,000 evaluation budget). R&E staff are also available to State Commissions for their evaluation questions and make resources (e.g., research briefs summarizing effective interventions, online evaluation planning and reporting curricula) available to them and the general public. AmeriCorps awards investment fund grants to State Commissions ($8.5 million in FY21), of which approximately one-third will be used for data and evidence capacity building activities based on prior year activities.
Performance Management / Continuous Improvement
Score: 5

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY21?

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • AmeriCorps has submitted two drafts of the FY22-26 strategic plan (goals and objectives) to OMB and will finalize it by the end of calendar year 2021.  
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • Several data collection efforts were undertaken in FY21 for the primary purpose of continuous improvement and learning.
    • The Office of the Chief Risk Officer conducted a risk assessment survey and an entity level controls survey. The risk assessment survey was conducted to identify any risks associated with managing the American Rescue Plan increases in agency funding. The findings were used to develop risk mitigation strategies. Similarly, findings from the entity level controls survey will be used to make improvements in organizational business processes needed to achieve the agency’s mission.
    • The Office of Research and Evaluation completed an internal evaluation focused on the successes and challenges of its new regional structure and staff positions. Lessons learned after a year of implementation will be used to improve organizational communication processes and have led to the decision to invest in a third-party workforce analysis.
    • The agency conducted a staff survey focused on issues of diversity, equity, and inclusion in the workforce. These findings, as well as the Federal Employee Viewpoint Survey findings, will be used by the CEO and Chief of Staff to develop improvements in the agency’s work environment.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • The agency’s Chief Financial Officer has established a standing cross-agency meeting with the Offices of Human Capital, Information Technology, Risk, and Research and Evaluation, as well as the Department Heads (Chief of Staff, Chief Operating Officer, and Chief of Program Operations), to ensure that the agency has a data-driven performance management system. Once the agency has approved performance indicators (as part of its final strategic plan), this meeting will be used to assess progress against strategic goals and their corresponding performance targets.
Data
Score: 6

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY21?

5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • AmeriCorps has three policies related to managing the agency’s data assets: Information Technology Management (policy 381), Information Technology Governance (policy 382) and Information Technology Data Management (policy 383).
  • The CIO, the CDO, and the Director of Research and Evaluation/Evaluation Officer will be working together in FY21 to reconstitute and reconvene the agency’s Data Council and determine what kind of charter/agency policy may be needed to establish the role of the Council with regard to managing the agency’s data assets. In essence, the role of the Council, under the direction of the CDO, will be to prioritize data asset management issues such as creating an annual Fact Sheet (so all externally facing numbers have a single authoritative source), creating a more user-friendly interface for the agency’s data warehouse/data inventory, and keeping the agency’s open data platform current.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • The agency’s Information Technology Data Management Policy addresses the need to have a current and comprehensive data inventory. The agency has an open data platform.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • AmeriCorps has a data request form and an MOU template so that anyone interested in accessing agency data may use the protocol to request data. In addition, public data sets are accessible through the agency’s open data platform. The agency’s member exit survey data was made publicly available for the first time in FY19. In addition, nationally representative civic engagement and volunteering statistics are available, through a data sharing agreement with the Census Bureau, on an interactive platform. The goal of these platforms is to make these data more accessible to all interested end-users.
  • The Portfolio Navigator pulls data from the AmeriCorps data warehouse for use by the agency’s Portfolio Managers and Senior Portfolio Managers. The goal is to use this information for grants management and continuous improvement throughout the grant lifecycle.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • The agency has a new Privacy Policy (policy 153) that was signed in FY20 and posted internally. The Information Technology Data Governance Policy addresses data security. The agency conducts Privacy Impact Assessments (privacy reviews of each of AmeriCorps’ largest electronic systems), which are then published online (click on the first 3 listings or PRISM).
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • AmeriCorps provides assistance to grantees, including governments, to help them access agency data. For example, AmeriCorps provides assistance on using the AmeriCorps Member Exit Survey data to State Service Commissions (many of which are part of state government) and other grantees as requested and through briefings integrated into standing calls with these entities.
Common Evidence Standards / What Works Designations
Score: 7

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY21? (Example: What Works Clearinghouses)

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • AmeriCorps uses the same standard scientific research methods and designs for all of its studies and evaluations, following the model used by clearinghouses such as the Department of Education’s What Works Clearinghouse, the Department of Labor’s Clearinghouse for Labor Evaluation and Research, and the Department of Health and Human Services’ Home Visiting Evidence of Effectiveness project.
6.2 Did the agency have a common evidence framework for funding decisions?
  • AmeriCorps has a common evidence framework for funding decisions in the Senior Corps and AmeriCorps State and National programs. This framework, which is articulated in the AmeriCorps State and National program notice of funding, includes the following evidence levels: pre-preliminary, preliminary, moderate, and strong.
6.3 Did the agency have a user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • The AmeriCorps Evidence Exchange is a virtual repository of reports and resources intended to help AmeriCorps grantees and other interested stakeholders find information about evidence- and research-based national service programs. Examples of the types of resources available in the Evidence Exchange include research briefs that describe the core components of effective interventions such as those in the areas of education, economic opportunity, and health.
  • R&E also creates campaigns and derivative products to distill complex report findings and increase their utility for practitioners (for example, this brief on a study about the health benefits of Senior Corps). R&E has categorized reports according to their research design, so that users can easily search for experimental, quasi-experimental, or non-experimental studies, and those that qualify for strong, moderate, or preliminary evidence levels.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • AmeriCorps has an agency-wide approach to promoting the use of evidence-based practices by the field and employs a variety of strategies, including evidence briefs, broad-based support to national service organizations, and targeted technical assistance to grantees:
    • R&E creates campaigns and derivative products to distill complex report findings and increase their utility for practitioners (for example, this brief on a study about the health benefits of Senior Corps).
    • AmeriCorps has created user-friendly research briefs that describe the core components of effective interventions in the areas of education, economic opportunity, and health. These briefs are designed to help grantees (and potential grantees) adopt evidence-based approaches.
    • R&E funds a contractor to provide AmeriCorps grantees with evaluation capacity building support; R&E staff are also available to State Commissions for their evaluation questions and make resources (e.g., research briefs summarizing effective interventions, online evaluation planning and reporting curricula) available to them and the general public.
    • AmeriCorps funds and participates in grantee conferences that include specific sessions on how to incorporate evidence and data into national service programs.
    • As part of the AmeriCorps State and National FY20 application process, AmeriCorps provided technical assistance to grantees on using evidence-based practices through webinars and calls.
    • R&E and AmeriCorps conducted a process evaluation of grantees with varied replication experiences to produce a series of products designed to help grantees implement evidence-based interventions (including a forthcoming article in The Foundation Review).
    • Senior Corps continues to encourage and support the use of evidence-based programs, as identified by HHS’s Administration for Community Living, by its grantee organizations.
Innovation
Score: 5

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY21?
(Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with rigorous evaluation requirements)

7.1 Did the agency have staff dedicated to leading its innovation efforts to improve the impact of its programs?
  • The Office of Research and Evaluation hired a Learning Officer in FY21. The Learning Officer was previously a program officer for the Social Innovation Fund. The Learning Officer and other colleagues in the Office of Research and Evaluation led the development of a concept paper for a National Service Equity, Evidence, and Innovation Fund. The purposes of the proposed fund include:
    • To apply a developmental, tiered grantmaking process to grow DEI organizations’ access to and use of National Service in community solutions by either:
      • Replicating or expanding current Social Innovation Fund (SIF) moderate or strong evidence-based models augmented with National Service in underrepresented communities, or
      • Taking current evidence-based and evidence-informed National Service models (ASN, AmeriCorps Seniors) and scaling culturally appropriate solutions that work into underrepresented communities
    • To ensure that evidence building is integrated into the process to guide and document progress.
  • The concept is informed by four agency programs – their expertise in process and award management, and the body of evidence they have generated – combined with the entrepreneurial business ideas of incubator and accelerator designs. The agency plans to include this idea in its FY23 budget request and will use FY22 to further refine the ideas.
7.2 Did the agency have policies, processes, structures, or programs to promote innovation to improve the impact of its programs?
  • AmeriCorps’s Evidence Exchange includes a suite of scaling products to help grantees replicate evidence-based interventions.
  • AmeriCorps continued to learn from its evidence-based planning grant program which “awards evidence-based intervention planning grants to organizations that develop new national service models seeking to integrate members into innovative evidence-based interventions.” AmeriCorps continued to learn from its research grantees, who receive grant funds to engage community residents and leaders in the development of new and innovative national service projects (more information available here). In addition to national service project development, these grants foster civic engagement through community research teams and build community capacity for using research to identify and understand local issues as well as identify possible solutions. Examples of these research-to-action projects include:
    • A researcher at the University of Nevada worked with NCCC Pacific leaders to craft a series of local projects, like building sidewalks and community cleanups, emerging from her community-based participatory research (CBPR) project with youth scientists working together to understand slow violence in their own communities as well as among people experiencing homelessness in the area. The partnership between NCCC and the University of Nevada is strong, and the local government is supportive of the work in the region. Early in 2020, housing had been donated by a local community organization and the NCCC team was assigned and scheduled to begin on April 21st, but the work was placed on hold due to the COVID-19 pandemic. Recently, the PI and ADP have resumed discussions about rescheduling the team’s arrival.
    • Researchers at Virginia Polytechnic Institute and State University (Virginia Tech) and Virginia Commonwealth University have brought together community partners and stakeholders in Martinsville, VA to address the local opioid crisis. They are using an evidence-based stakeholder engagement approach (SEED) that has led to successful outcomes in Martinsville. In Year 2, they collaborated with the Minnesota AmeriCorps state commission, ServeMinnesota, to replicate this project and approach with a focus on deploying AmeriCorps volunteers to meet unmet service needs around the opioid crisis in Minneapolis. Because of the success in rural Virginia and Martinsville, this approach will be further replicated in another town in rural Virginia and another town outside Minneapolis.
    • A researcher at Mississippi State University was collaborating with the NCCC Southern Campus to draft a concept paper for an NCCC team when COVID-19 struck; both agreed to table ideas for FAFSA support and ACT preparation until the crisis had subsided.
    • A researcher at Drew University successfully collaborated with a former senior New Jersey state government official on a concept paper for VISTAs to work with their community partner, Family Promise, to build the organization’s capacity to work with local landlords and people experiencing homelessness – a role identified through their grant-funded research. They were awarded two VISTAs.
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
  • As part of the evaluation of the Social Innovation Fund (SIF), which was designed to identify and rigorously test innovative approaches to social service problems, AmeriCorps continues to receive evaluation reports from grantees. As of May 2020, AmeriCorps has received 129 final SIF evaluation reports, of which 31 (24%) used experimental designs and 74 (57%) used quasi-experimental designs. Further, the evidence-based planning grant program and the research grant program both seek to generate innovative national service models. The planning grants require an evaluation plan. The research grants use evidence to inform action planning and solutions.
  • The Office of Research and Evaluation initiated a third-party evaluation of the 2018 grant program cohort (all using community-based participatory action research method) to identify outcomes, including the outcomes of national service projects developed through participatory research. The evaluation is part of a larger Program Lifecycle Evaluation project and is one of the bundled evaluations initiated in FY21. 
Use of Evidence in Competitive Grant Programs
Score: 13

Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY21?
(Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring for evidence; Pay for Success provisions)

8.1 What were the agency’s five largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY21, AmeriCorps administered two competitive grant programs:
    1. AmeriCorps State and National program (excluding State formula grant funds) ($253,704,774; eligible grantees: nonprofit organizations, state governments, tribal governments, local governments, institutions of higher education);
    2. Senior Corps RSVP program ($51,355,000; eligible grantees: nonprofit organizations, local governments).
  • The Social Innovation Fund (SIF) grants were integrated into the Office of Research and Evaluation in FY19.
8.2 Did the agency use evidence of effectiveness to allocate funds in the five largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • The AmeriCorps State and National grants program (excluding State formula grant funds) allocated up to 44 out of 100 points to organizations that submitted applications supported by performance and evaluation data in FY21. Specifically, up to 24 points can be assigned to applications with theories of change supported by relevant research literature, program performance data, or program evaluation data; and up to 20 points can be assigned for an applicant’s incoming level of evidence and the quality of the evidence. Further, in 2020 AmeriCorps prioritized the funding of specific education, economic opportunity, and health interventions with moderate or strong levels of evidence.
  • Since AmeriCorps’ implementation of a scoring process that assigns specific points for level of evidence, the percentage of grant dollars allocated to strong, moderate, preliminary, and no evidence categories has shifted over time (see chart below), such that more FY20 grant dollars were awarded to applicants with strong and moderate levels of evidence for proposed interventions, and fewer grant dollars were awarded to applicants with little to no evidence of effectiveness. Note that 68% of FY21 grant dollars versus 51% of FY20 grant dollars were invested in interventions with a strong or moderate evidence base.

  • In FY18, Senior Corps RSVP embedded evidence into their grant renewal processes by offering supplemental funding, “augmentation grants,” to grantees interested in deploying volunteers to serve in evidence-based programs. More than $3.3 million of Senior Corps program dollars were allocated, over three years, toward new evidence-based programming augmentations. Grantees will be operating with their augmentations through fiscal year 2021.
  • In a survey completed in FY20, Senior Corps grantees reported that 4,043 volunteer stations and 20,320 volunteers (10% of all volunteers) were engaged in evidence-based programming.
8.3 Did the agency use its five largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • AmeriCorps State and National grantees are required to evaluate their programs as part of the grant’s terms and conditions. Grantees receiving more than $500,000 are required to conduct an independent, external evaluation (see p. 23 of the FY21 notice of funding for a description of these requirements).
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant programs (besides its five largest grant programs)?
  • AmeriCorps administers only two competitive grant programs, described above. 
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?  
8.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • AmeriCorps State and National grantees, including city, county, tribal, and state governments, are required to use their AmeriCorps funds to evaluate their programs. In FY21, AmeriCorps awarded $8.5 million for the Commission Investment Fund that supports State Commissions, which are typically housed within state government; approximately one-third of these grants will focus on building the capacity of State Commissions and their grantees to collect and use performance and evaluation data.
  • AmeriCorps’s Evidence Exchange includes a suite of scaling products to help grantees replicate evidence-based interventions.
Use of Evidence in Non-Competitive Grant Programs
Score: 3

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY21? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY21, the five largest non-competitive grant programs are:
    1. AmeriCorps State formula grants program ($152,482,034; eligible grantees: states);
    2. AmeriCorps National Civilian Community Corps (NCCC) ($32.5 million; eligible grantees: nonprofit organizations);
    3. AmeriCorps VISTA ($93 million; eligible grantees: nonprofit organizations, state, tribal, and local governments, institutions of higher education);
    4. Senior Corps Foster Grandparents ($118 million; eligible grantees: nonprofit organizations, local governments);
    5. Senior Corps Senior Companion Program ($50 million; eligible grantees: nonprofit organizations, local governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • In FY18, Senior Corps Foster Grandparents and Senior Companion Program embedded evidence into their grant renewal processes by offering supplemental funding, “augmentation grants,” to grantees interested in deploying volunteers to serve in evidence-based programs. More than $3.3 million of Senior Corps program dollars were allocated, over three years, toward new evidence-based programming augmentations. Grantees will be operating with their augmentations through fiscal year 2021.
  • In a survey completed in FY20, Senior Corps grantees reported that 4,043 volunteer stations and 20,320 volunteers (10% of all volunteers) were engaged in evidence-based programming.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • In FY19, Senior Corps completed an evaluation with an independent firm to produce case studies and comparative analyses of select grantees that received an evidence-based programming augmentation to understand successes, challenges, and other issues. This report is being used to inform Senior Corps’ approach to replicating this augmentation initiative, as well as the training/technical assistance needs of grantees.
  • Senior Corps and the Administration for Community Living have continued a dialogue about how to build and broaden the evidence base for various programs designed for older adults, particularly for aging and disability evidence-based programs and practices. AmeriCorps previously utilized ACL’s list of evidence-based programs for its augmentation grants and is encouraging Senior Corps grantees to move toward more evidence-based programming.
  • For FY20, Senior Corps continued funding five demonstration grants, totaling $2,579,475, which authorize organizations to implement the Senior Corps program model with certain modifications to standard AmeriCorps policies. Demonstration grants allow Senior Corps to analyze potential policy changes.
  • AmeriCorps NCCC invested in a Service Project Database that provides staff access to data on all NCCC projects completed since 2012. The database thematically organizes projects, classifies project frameworks, and categorizes the outcomes of these service initiatives. NCCC is also investing in an evaluation of NCCC’s impact; this research project was initiated in FY18 and is focused on evaluating member retention, studying how NCCC develops leadership skills in its members and teams, and assessing the program’s ability to strengthen communities. Finally, NCCC will continue to invest in research grants to better understand the outcomes of its disaster response efforts.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • AmeriCorps only administers five non-competitive grant programs, as described above.
9.5 What are the agency’s strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Senior Corps and the Office of Research and Evaluation completed a longitudinal evaluation of the Foster Grandparents and Senior Companion Programs in FY19 that demonstrated the positive health outcomes associated with volunteering. A 50-year retrospective review of the research conducted on Senior Corps programs was completed at the end of FY19 and was posted on the Evidence Exchange in FY20.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • AmeriCorps does not prohibit the use of formula dollars for evaluation, but each State Commission may have its own guidelines. Further, formula grantees receiving more than $500,000 must conduct evaluations using their grant funds.
Repurpose for Results
Score: 6

In FY21, did the agency shift funds away from or within any practice, policy, interventions, or program that consistently failed to achieve desired outcomes?

10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
  • The AmeriCorps State and National program denied funding to eleven FY21 applicants that requested $3,667,772 for new or recompete funding because they did not demonstrate evidence for the proposed program. These funds were reallocated and invested in applications with a demonstrated evidence base.
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
  • AmeriCorps launched a grant management tool (the “Portfolio Navigator”) that allows Portfolio Managers to access data about grantee organizations in real time to facilitate improved oversight and support.
  • The AmeriCorps Office of Research and Evaluation continued to invest $500,000 in evaluation technical assistance support, which is available to all competitive AmeriCorps State and National grantees seeking to improve their ability to demonstrate their effectiveness empirically. FY19 investments targeted to grantees struggling to achieve outcomes continued in FY20. More specifically, in FY21, the following ongoing support was provided to lower-performing grantees using reallocated FY19 program dollars:
    • Two grantees required intensive evaluation technical assistance (TA) and are being closely monitored by AmeriCorps State and National. To ensure that these two grantees are on track with implementing their evaluation plans, the grantees identified several milestones for their evaluations and, with support, have made progress in FY20.
    • Tribal grantees have faced a variety of challenges in developing and implementing their evaluation plans. Through evaluation TA support, AmeriCorps hopes that the tribal grantees will receive the additional assistance needed to improve their plans. During FY20, 11 tribal grantees received TA to improve the quality of data collection and evaluation plans.