With Ohio’s Election Day on March 17th, it is a good reminder of the roles we all play in our democracy and public policy process. Below are some lessons we at Scale learned as evaluators working with policymakers. We initially shared some of these thoughts as a contributor to the American Evaluation Association’s AEA365 blog in 2017.
Lessons Learned:
• Legislative processes may influence your evaluation design and timeline. Publicly sponsored projects may have reporting deadlines written into legislation or their funding streams may be subject to annual budgeting reviews. Projects sponsored by private philanthropy may also be influenced by the legislative cycle as findings may be helpful to craft or change public policy.
• Policymakers may get data and information from a variety of sources. It was common for a policymaker to have visited a program site or talked extensively with program champions. Program critics may also be vocal to policymakers. External criticism may be based on program perceptions (rooted in experiences or in ideology), or a sense of competition for resources. Your evaluation data will need to be clear and easily accessible to cut through what may be noise.
• You may need various reports of the same analysis. For one evaluation, we produced a one-pager of highlights for quick reference by high-level administrators and officials, a six-page summary of lessons to insert in a public annual report, and a full technical report with a more detailed explanation of methodology and data for staffers and stakeholders.
Suggestions for Evaluators in Public Policy:
• Spend time refining research questions related to what legislative decision-makers want to or should know regarding the project and related policies.
• Regardless of the scope of your program evaluation, identify what policies and funding streams affect the program. This understanding helps you identify who the stakeholders are and clarify their interests and constraints.
• In your evaluation design, consider legislative timelines. Think about what data you may be able to reasonably collect, analyze, and report to provide insights to legislators in line with the legislative decision-making process.
• Encourage your client to think ahead, independent of your evaluation, about productive courses of action they may take if findings are less favorable than expected. Consider building extra review time into the analysis so the client can process the data, determine how to make lessons actionable, and anticipate questions policymakers may raise about the results or the evaluation approach.
Resources:
• The National Conference of State Legislatures has a Program Evaluation Society for its state policy staff members. It is helpful to see what materials policy staff members may reference when they would like to implement or review an evaluation.
• On the federal level, the National Data Coalition has been keeping track of efforts related to the Foundations for Evidence-Based Policymaking Act, which has promoted increased use of research and evaluation in policymaking.
• You may map out stakeholder interests, including policymakers’ interests, in your evaluations in a “power/interest matrix.”