First, you and your team carried out a needs assessment that helped you identify and prioritize critical needs. Then, you identified an evidence-based strategy that would help you address those needs.
What happens next?
A deliberative, systematic and data-driven needs assessment process equipped you with objective information to identify and prioritize critical needs. Similarly, a deliberative, systematic and data-driven evaluation plan will give you objective insight into whether the strategy is working and truly helping you address those needs.
Just as there is no one-size-fits-all approach to meeting local needs, there is no one-size-fits-all approach to evaluating whether an evidence-based strategy is working for you. Where do you start and what are key considerations to keep in mind?
First and foremost, plan ahead. It is never too soon to think about how you will know whether your evidence-based strategy is working.
You already started laying the groundwork for evaluation when you carried out your needs assessment. As part of your needs assessment, you and your team talked about where you currently are on an outcome and compared it to where you want to be.
This means that you have already thought about baseline data and end goals; what you need now is a plan to understand how the evidence-based strategy is affecting progress towards your goals.
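The idea of measuring where you are against where you want to be can be sketched in a few lines of code. This is only an illustration: the function name and all numbers below are hypothetical, not drawn from any district's actual data.

```python
# Hypothetical illustration of tracking progress from a baseline toward an end goal.
# All values (baseline, current, target) are invented for the example.

def percent_of_goal_reached(baseline: float, current: float, target: float) -> float:
    """Share of the baseline-to-target gap that has been closed so far."""
    gap = target - baseline
    if gap == 0:
        return 100.0  # already at the target when the work began
    return (current - baseline) / gap * 100

# Example: suppose a proficiency rate was 62% at baseline, the goal is 75%,
# and the most recent measure is 68%.
progress = percent_of_goal_reached(baseline=62.0, current=68.0, target=75.0)
print(f"{progress:.0f}% of the way to the goal")  # prints "46% of the way to the goal"
```

A simple calculation like this will not tell you *why* progress is or is not happening, which is why the evaluation plan described below also looks at implementation.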
In planning ahead, the questions you ask yourself should address the areas outlined below.
Much like carrying out a needs assessment, evaluating whether your evidence-based strategy is working is a team effort. As you think about the team you want to assemble to carry out this work, consider the following:
- Data Analysis & Evaluation
Having the answers that you need when you need them will put you in a better position to make programmatic, staffing, scheduling, budgetary and other operational decisions. Consider the following:
- Short- and Long-Term Targets
- Data Collection & Analysis
- Opportunities for Course Correction
The data that you used to identify your critical needs can serve as a great starting point for your program evaluation. You may find that you will need to supplement this data with additional information – especially when it comes to understanding implementation. Consider the following:
- Types of Data
- Progress Towards Goals
Data analysis gives you an objective lens through which to understand whether your evidence-based strategy is working, but data alone rarely gives a clear-cut answer on next steps. Once you start to see results from data analysis and program evaluation, consider the following:
- Review and Understand Results
- Adjustment vs. Changing Strategies
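The distinction between adjusting implementation and changing strategies can be illustrated with a rough rule-of-thumb sketch. The function, thresholds, and numbers here are all hypothetical; real decisions should weigh context, implementation fidelity, and statistical evidence rather than a single cutoff.

```python
# Hypothetical sketch of a first-pass read on evaluation results.
# Thresholds and data are invented; this is a conversation starter, not a decision rule.

def summarize_result(baseline: float, current: float, short_term_target: float) -> str:
    """Classify progress against a short-term target to frame next steps."""
    if current >= short_term_target:
        return "on track: continue implementing and keep monitoring"
    if current > baseline:
        return "partial progress: consider adjustments to implementation"
    return "no progress: review whether the strategy itself should change"

# Example: baseline of 62%, short-term target of 68%, latest measure of 64%.
print(summarize_result(baseline=62.0, current=64.0, short_term_target=68.0))
# prints "partial progress: consider adjustments to implementation"
```

A result like "partial progress" is exactly the kind of finding a team would review together before deciding between adjusting the current strategy and replacing it.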
There are many ways that districts can work with research partners to learn more about what is working. Resources to help you learn more about developing research partnerships include the following:
- While targeted toward state education agencies, the Data Quality Campaign’s Roadmap for Effective Data Use and Research Partnerships between State Education Agencies and Education Researchers includes many points relevant to districts as well.
- The National Center for Education Statistics (NCES) Forum Guide to Supporting Data Access for Researchers: A Local Education Agency Perspective provides best practices and templates for data sharing with researchers.
- The National Network of Education Research-Practice Partnerships (RPP) includes an RPP Knowledge Clearinghouse.
- Research-Practice Partnerships: A Strategy for Leveraging Research for Educational Improvement in School Districts outlines three major types of research practice partnerships and provides guidance on developing these partnerships.
- The SEA of the Future: Building Agency Capacity for Evidence-Based Policymaking is written with state agencies in mind but offers many insights into the importance of carrying out education research and evaluation.
- The William T. Grant Foundation published a wide range of resources designed to educate districts, state education agencies and researchers on how to leverage research-practice partnerships.
Engaging in research partnerships sometimes involves data sharing with external partners. When planning to share data with research partners for evaluation purposes, districts should always start by talking with their legal and IT departments. As you work with staff across your district to develop a data sharing plan, resources that can help you develop that plan include:
- The Ohio Department of Education published guidance on data sharing for program evaluation. This guidance should not replace consultation with legal staff at a district but may provide helpful insights or generate important questions to ask your legal, data or program staff or research partners.
- The Ohio Department of Education Data Privacy Report (2014) provides information about how federal and state law affect data collection, sharing and reporting at the state level. The report includes information about how the state approaches data privacy, data security and data sharing.
- The DATA DRIVES School-Community Collaboration: Seven Principles of Effective Data Sharing is a resource developed by StriveTogether and Data Quality Campaign that, in addition to outlining seven principles of data sharing, provides links to case studies, sample documentation and additional data sharing resources.
- The National Center for Education Statistics (NCES) Forum Guide to Education Data Privacy (2016) is a resource designed to help states and districts better understand the steps they can take to protect student privacy.
- The U.S. Department of Education’s Protecting Student Privacy website provides a wide range of resources for education agencies interested in understanding FERPA regulations and the impact of those regulations on student data collection, sharing and reporting.