January 29, 2024
Building the Foundation Needed to Achieve the Evidence Act’s Potential for Transforming Federal Policymaking
By Kelly Bidwell (GSA), Joseph Clift (USDA/FNS), Mary Hyde (AmeriCorps), and Lauren Supplee (HHS/ACF/OPRE)
The five-year anniversary of the Foundations for Evidence-Based Policymaking Act of 2018 (the Evidence Act) provides a point of reflection on the journey to ensure the Federal government centers its work in evidence building and use. Transforming Federal policymaking and practice by embedding evidence into agency operations does not happen overnight, but the Evidence Act and subsequent guidance provide a roadmap for Federal action.
Though our titles vary, each of us serves as the lead for a well-established Federal evaluation office situated within a different agency across the government. All of our evaluation offices predate passage of the Evidence Act, and many have existed for 20 to 30 years. Our collective experience demonstrates that building evidence, and reaching a steady state of integrating evidence into decision making, takes time and sustained investment in financial and human capital. Below, we share key enablers of success for building and strengthening evaluation offices across the Federal government.
A Quick Overview of Our Organizational Histories
Some of our evaluation offices go back almost 30 years.
- The Office of Planning, Research and Evaluation (OPRE) at the U.S. Department of Health and Human Services' (HHS) Administration for Children and Families (ACF) was established in 1995, shortly after ACF's creation. Initially, the office focused on welfare, employment, and family self-sufficiency, but it has expanded over time to cover many ACF programs.
- AmeriCorps' Office of Research and Evaluation (ORE) began around 30 years ago, at AmeriCorps' founding, to embed research, evaluation, and evidence building into the work of the agency from the start. AmeriCorps considers relevant evidence for the four domains the agency seeks to impact (see AmeriCorps' 2023 State of the Evidence Report).
- The research and evaluation function at the U.S. Department of Agriculture's (USDA) Food and Nutrition Service (FNS) was supporting evaluation research in the 1980s, and as early as 20 years ago, in the interest of transparency, FNS began publishing its annual Research and Evaluation Plans on its website.
- Finally, the youngest of our offices, the General Services Administration's (GSA) Office of Evaluation Sciences (OES), began in 2015 with the goal of helping Federal agencies answer priority questions with rigorous evaluation methods and administrative data. OES has completed over 135 evaluations since its founding.
Enablers of Success
Through our long histories, our evaluation offices have learned what facilitates success and how to establish the organizational structures to support it. Based on empirical literature studying when, why, and how evidence is used in organizations, ACF OPRE developed an evidence capacity framework with five key dimensions of evidence capacity: culture, infrastructure, engagement, human capital, and leadership. The examples below highlight ways each of our offices institutionalizes these enablers of evidence use. We hope these examples help other evaluation offices set a vision for the future.
Culture is about how the organization uses evidence to support its work. Among our offices, we have integrated evidence into organizational culture through activities such as:
- Embedding an evidence review into agency review processes for draft Notice of Funding Opportunity announcements and templates, Notices of Proposed Rulemaking, reports to Congress, updates to policy guidance, and budget formulation documents; and
- Creating permanent positions (e.g., learning officers and team leads) whose holders work closely with program partners to ensure evidence is integrated into regular program operations and external communications.
Infrastructure consists of evidence-related tools, budget, routines, and processes that enable the building and use of evidence. A strong evidence infrastructure requires elements such as:
- Financial resources to create tools and resources. While the exact dollar amount needed is hard to pin down, Results for America has historically advocated that at least 1% of an agency's budget be reserved for evidence building. In a recent survey of the Evaluation Officer Council, 60% reported having less than $5 million available at their agencies for significant evaluations, and nearly half had less than $1 million. Our experience suggests evaluation offices need more funds and authority, and can hopefully build that support over time. As one example, AmeriCorps ORE's appropriated budget has fluctuated over the years, ranging from $2 million to $12 million, and, when combined with matches from awardees, AmeriCorps does meet the threshold of 1% of operational funds spent on research and evaluation.
- Funding authorities and flexibilities. Some of our offices (OPRE and FNS) benefit from statutory set-asides for research and evaluation or specific line items for evidence building, including support for the staff who conduct the work. For example, while OPRE has conducted research on economic self-sufficiency from its inception, it was not until the Consolidated Appropriations Act of 2017 that language specified that 0.33 percent of the Temporary Assistance for Needy Families (TANF) program appropriation shall be available for research, technical assistance, and evaluation. Another funding flexibility that OPRE recently gained was language providing multi-year authority for funds allocated to research, evaluation, and statistical activities. Because most evaluation projects take longer than one year to complete, this flexibility is invaluable to building a strong portfolio of work.
- Strategic evidence planning processes. Our offices have leveraged Learning Agendas to guide decision making, support the creation of relevant evidence, and inform future evidence investments.
- Evaluation governance structure. AmeriCorps established the Research and Evaluation Council as a governance structure to ensure enterprise-wide engagement in evidence building and use. The Council has authority over the agency resources invested in building and using evidence and is composed of the agency's most senior leaders and representatives of key program and business offices.
- Integrating open science practices. With the aim of supporting the broader evidence community and ensuring process transparency, OES publicly shares, and regularly updates, its evaluation process and methodological resources.
Engagement focuses on the systems and processes to promote collaboration and communication about evidence. Research on evidence use consistently points to the importance of people – evidence is shared and used within the context of trusted relationships. Engagement can be supported through:
- Building evidence in close partnership with the intended users of that evidence, which requires funding, time, leadership, and humility on both sides to establish and sustain these partnerships. As one example, OPRE and OES have staff whose job responsibilities include building relationships, listening to the needs of partners, jointly engaging in budget planning, and sharing relevant findings back with partners.
- Including program staff and leadership in dissemination through opportunities like briefings, newsletters, or webinars.
Human Capital ensures staff have the skills and expertise to build and understand evidence. Some ways we have strengthened human capital include:
- Staffing offices with master's- and doctoral-trained social scientists, either as FTEs or as external experts. As one example, OES has harnessed external talent through a competitive fellowship program, bringing on over 175 external experts over the last 10 years, largely from academic institutions, on rotational terms at GSA to contribute to priority evaluations in their areas of expertise.
- Ensuring sufficient staffing. A recent survey conducted by the Evaluation Officer Council found a median of 3.5 FTEs dedicated to supporting Evidence Act activities for an entire Federal department, whereas some of our offices have between 20 and 70 FTEs to oversee research, evaluation, and data projects. We hope that, over time, these offices can continue to grow as ours have.
- Building program and policy staff capacity to understand and use evidence. For example, OPRE invests significant effort in building program and policy staff capacity through efforts such as our Program Managers Guide to Evaluation, our Evaluation 101, our Data Community of Practice, and our continuous quality improvement work with grantees. FNS hosts trainings for new staff on research and scientific integrity and supports development opportunities for staff to attend training classes through external programs.
Leadership concerns whether and how leaders foster an evidence culture, including providing the resources and processes to integrate evidence into everyday work.
- One step in this direction is OPM's recently added goal of building a data-fluent senior executive service, including incorporating data fluency into the Federal Executive Core Qualification framework by which all executives are assessed. Additionally, a partnership between the OMB Evidence Team and the Federal Executive Institute is offering a new Evidence-Based Data-Driven Decision-Making course for senior executive service members.
- While many of our offices had the functions of an Evaluation Officer prior to the Act, OMB guidance further clarified and solidified this key role in agencies and sub-agencies. Fewer of our offices at the time had the corollary Chief Data Officer (CDO) role, and the designation of CDOs, along with the clarification of their roles and responsibilities, has been instrumental in advancing evidence work.
Bringing it All Together
Spurred on by the Evidence Act, the field of evidence creation and use continues to grow and evolve. Though agencies across the Federal government have responded to the call by establishing Evaluation Officers, publishing Learning Agendas and Annual Evaluation Plans, and taking steps to build their evaluation and evidence-building capacity, many have accomplished this with little to no additional investment in staffing or budgets. We recognize that it will take time for newer offices, especially those that have emerged only in the last five years, to get established. Hopefully, the lessons shared here can help accelerate their progress by building awareness of, and an appreciation for, the key enablers of long-term success.