This is a guest article by Miguel Davis from Macro Connect.
Schools now have more data than ever about their students, and that should translate into better decisions about curriculum and instruction. The trouble with realizing this optimal learning experience, it seems, is that educators need the time and direction to analyze the overabundance of data points.
Advances in technology, standardized testing, student information systems, and instructional software have provided districts with an overload of information about learning trends, patterns, and student progress — many of which likely go unnoticed. Large amounts of information can easily get lost when those who need to sort through it don’t have a goal or outcome in mind. Mike Parker, assistant director of the Center for Accountability Solutions at the American Association of School Administrators, says, “We recommend not fishing blindly as you review data; if you don’t have a purpose in mind, it’s easy to get off track.”
Set relevant goals and avoid organizational complexity
To avoid data chaos, districts have begun creating and staffing positions like chief information officer, data specialist, and chief data officer to take the lead in determining what their data means. The data trend in districts is only set to grow with increased connectivity and the use of more and more devices inside the classroom.
Unfortunately, too much data collection is done in the spirit of compliance rather than with the purpose of putting the data to good use. Better uses of school-collected data include, but aren’t limited to, measuring and predicting student progress, evaluating program and instructional effectiveness, allocating resources more wisely, and promoting accountability.
Rather than school districts being data-driven, perhaps it’s best for them to be SMART, an homage to the common goal-setting acronym: specific, measurable, achievable, relevant, and time-bound. As with goal-setting, the SMART use of data is most effective in schools when users follow a structured process with the following steps:
- Set a vision.
- Communicate the vision.
- Train all stakeholders.
- Have action/reaction in execution.
The order in which these steps are completed is key to the process’s success. Collecting and sifting through data for the sole purpose of completing data-entry tasks makes it tough to gain traction or see any considerable positive results.
Define performance metrics to achieve expected results
A CIO, superintendent, or administrator, along with their committee, should ask non-data-related questions about areas where they feel they’re falling short or could improve considerably. In Data Analysis for Continuous School Improvement, author Victoria Bernhardt identifies seven main questions to ask in the initial stages of data analysis:
- What is the purpose of the school or district?
- What do you expect students to know and be able to do by the time they leave school? (standards)
- What do you expect students to know and be able to do by the end of each year? (benchmarks)
- How well will students be able to do what they want to do with the knowledge and skills they acquire by the time they leave school? (performance)
- Do you know why you are getting the results you get?
- What would your school and educational processes look like if your school was achieving its purpose, goals, and expectations for student learning?
- How do you want to use the data you will gather?
It’s also important to identify how you’re going to gather the best data to answer the questions you’re asking (if you don’t already have key performance indicators in place). Narrow the data points to the smallest number possible to keep collection simple. Often this requires drilling down to the leading performance metrics that drive end results in an area.
A common trap is relying on test scores. Standardized tests are supposed to “act like a report card for the community, demonstrating how well local schools are performing…testing is the first step to improving schools, teaching practice and educational methods through data collection,” according to the report Using Data to Improve Schools: What’s Working.
However, standardized tests aren’t frequent enough to measure the growth of individual students, and they are lagging indicators. By the time you get your results back, it’s too late to intervene. As Suzanne Bailey, a restructuring issues consultant at Tools for Schools, said, “If your data set is only test scores, you are doomed.” Data points collected far more often than a few times a year are the ones that can actually influence results.
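The advantage of frequently collected data points can be made concrete. A minimal sketch, with entirely hypothetical student names and scores, shows how weekly formative-assessment results let staff spot a downward trend and intervene months before a standardized test would surface the problem:

```python
# Hypothetical sketch: flag students whose weekly formative-assessment
# scores are trending down, so staff can intervene before end-of-year
# tests make the problem visible. Names, scores, and the threshold are
# illustrative only.

def trend(scores):
    """Least-squares slope of scores over equally spaced weeks."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den if den else 0.0

def flag_for_intervention(weekly_scores, threshold=-2.0):
    """Return students whose scores drop faster than `threshold`
    points per week (with at least three weeks of data)."""
    return [name for name, scores in weekly_scores.items()
            if len(scores) >= 3 and trend(scores) < threshold]

weekly_scores = {
    "student_a": [85, 84, 86, 85],   # stable
    "student_b": [90, 84, 78, 71],   # declining quickly
}
print(flag_for_intervention(weekly_scores))  # → ['student_b']
```

A test-score-only data set offers no equivalent: with one observation a year, there is no trend to act on until the year is already over.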
Some schools are finding success with assessment alternatives like project-based learning, and the use of platforms by students with adaptive interfaces and interactive dashboards. Systems that are updated regularly and are easy to access help build good practice around those specific data systems.
Boost engagement through clear communication
Notoriously, at the macro level, education is slow to adopt change; at the individual level, teachers are often risk-averse and fearful of change. “American education remains basically modeled on an approach hundreds of years old. Students with varying levels of ability sit in classes organized by grade level before a ‘sage on the stage’ who teaches reading, writing, arithmetic, and a bit of science. That system, at least in the US, doesn’t seem to work well enough. Among developed countries ranked by the Organization for Economic Cooperation and Development, the US is 31st in math achievement, 24th in science, and 21st in reading” (Sims). To combat organizational inertia, leadership must create the conditions for change to take root and thrive.
Encouraging discussion, feedback, and criticism is a key factor in the communication step of the process. The most fruitful conversation surrounds the why: without a well-defined why, your initiative becomes another item on a checklist for your staff. Everyone involved in a district’s data management and metrics should be on the same page about the intended results and equally confident in the how. It’s critical for the entire staff to buy into the plan before moving on to training. The how should include the time frame, how progress will be evaluated, how it affects routine, commitments to one another, and what supports will be in place.
For example, Mark Hess, Executive Director of Instruction, Technology, and Assessment at Walled Lake Consolidated School District, invited team members from across the district to evaluate and make a case for piloting a usage analytics and SSO tool called ClassLink. By his own evaluation, he felt the data it would collect would be powerful for teachers and admins, but he reasoned that without staff buy-in it would be underutilized.
Provide training for end users
Training is as essential for school staff and administration as it is in a large corporation. Adopting a new model or system for data analysis will require new habits from employees and a deep understanding of the vision at hand. Asking employees to follow a new process will only be successful if they feel empowered, supported, and motivated to carry on when you’re not watching. Training should also be an ongoing effort rather than a one-time session or introduction. Brian Benzel, a school superintendent from Spokane, Washington, summarized his findings on using data in schools with the following points:
- Start small, don’t overwhelm staff with a data dump.
- Begin with the core issues, such as student achievement markers in a single subject area.
- Listen to what the data tells about the big picture; don’t get lost in too many details.
- Work to create trust and build support by laying data on the table, so staff can examine it without fear of recrimination.
- Provide training opportunities for staff on how to use data.
- Be patient, working with what is possible in the district.
Monitor the new process and highlight positive results
Regular check-ins should serve to encourage accountability as well as prevent the process from shifting off track. If something isn’t going as planned, a quick intervention is necessary to avert poor or even inaccurate results. Remember that you chose data points that are collected regularly — so bad behaviors shouldn’t hang around long enough to become habits.
Though the new system might not produce positive results immediately, honor the progress through benchmarks and highlight district wins. People respond best to positive reinforcement and proof that what they’re doing is working. The monitoring process should not only look for the positives, however, but also weed out selected indicators that aren’t actually improving results. If some are identified, there’s nothing wrong with tweaking the process as part of continuous improvement.
In practice: making data-driven decisions
To illustrate the entire process, let’s dig into a practical example. Many a district has attempted to move the needle on student achievement by improving parental engagement, but fallen flat in execution.
Often parent-facing initiatives are doomed to fail from the vision-setting stage because the engagement metrics are misaligned with the desired end result. If the goal is student achievement, then we need to select performance indicators that capture the behaviors that most directly influence student achievement. Compare:
- The number of parent volunteer hours per child given to the school.
- The number of parent logins per child to the online gradebook.
The first metric is a strong sign of parent engagement but may be a step removed from a guardian’s involvement in monitoring and supporting a student’s academic activities. In practice, volunteer programs also create limiting factors for data collection: there are only so many opportunities to volunteer, opportunities may be seasonal in nature, and so on. On the other hand, logins to a grading system create an opportunity to set an ongoing expectation for parental interaction with a student’s achievement.
For example, Valley Christian Junior High (VCJH) asks parents to check their students’ PowerSchool accounts once a week. Each login is reasonable verification that a parent is staying up to date on how their child is progressing, what they have been working on, and what might be coming up.
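Once the metric is defined this way, it reduces to a simple weekly calculation. A minimal sketch, using an illustrative data structure rather than any actual PowerSchool export format, computes the share of students with at least one guardian login in a given week:

```python
# Hypothetical sketch: compute a weekly "parent login per child" KPI
# from a simple export of login dates. The student names and data
# layout are illustrative, not an actual PowerSchool export.
from datetime import date, timedelta

logins = {  # student -> dates on which a guardian logged in
    "child_1": [date(2024, 9, 3), date(2024, 9, 10)],
    "child_2": [date(2024, 9, 2)],
    "child_3": [],
}

def weekly_coverage(logins, week_start):
    """Share of students with at least one guardian login in the
    week starting on `week_start`."""
    week_end = week_start + timedelta(days=7)
    covered = sum(1 for dates in logins.values()
                  if any(week_start <= d < week_end for d in dates))
    return covered / len(logins)

print(round(weekly_coverage(logins, date(2024, 9, 9)), 2))  # → 0.33
```

Tracked week over week, a number like this gives leadership a single, regularly refreshed indicator of whether the engagement expectation is taking hold.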
In communicating the vision, it’s important to not shy away from how routines and norms will change. For the number of parent logins metric to be an effective one, data should be up to date. Imagine being a parent who logs in and sees that no grades have been entered for 3 weeks! It’s pretty unlikely that logging in will become a weekly ritual if a parent isn’t confident that new data will be waiting for them. That’s why buy-in is so critical at this stage and an effective leader will communicate why and how with input from their teams.
After some discussion, the how agreed upon for VCJH was simple: have the previous week’s assignments entered by 5 pm every Monday. A clear-cut deadline like this makes compliance easy to monitor, and it is much easier to train end users and create support structures around a recurring event.
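Monitoring that deadline is equally mechanical. A minimal sketch, assuming a hypothetical export of each teacher’s most recent gradebook-entry timestamp (field names are illustrative, not PowerSchool’s actual API), identifies who needs follow-up after the Monday cutoff:

```python
# Hypothetical sketch: check which teachers entered last week's
# assignments by the Monday 5 pm deadline. The teacher names and
# timestamp export are illustrative assumptions.
from datetime import datetime

DEADLINE = datetime(2024, 9, 9, 17, 0)  # Monday 5 pm

last_entry = {  # teacher -> most recent gradebook-entry timestamp
    "teacher_smith": datetime(2024, 9, 9, 14, 30),  # on time
    "teacher_jones": datetime(2024, 9, 10, 9, 0),   # after deadline
}

def needs_follow_up(entries, deadline):
    """Teachers whose latest entry landed after the deadline."""
    return sorted(t for t, ts in entries.items() if ts > deadline)

print(needs_follow_up(last_entry, DEADLINE))  # → ['teacher_jones']
```

Because the check runs every week, a missed deadline surfaces immediately rather than accumulating into a habit.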
VCJH can find and assist any teachers who are struggling to input data correctly or on time. When firing on all cylinders, this means all teachers, administrators, and parents are using up-to-date data to drive instruction and intervention at all levels. A full case study on VCJH’s results using PowerSchool for parental engagement is available.
Data alone can be more costly than valuable to an organization without the right plan in place. Without data fluency and data discipline, the KPIs, as well as the positions that manage them, are simply counterproductive. Managing and using that data to drive measurable results will be a differentiator within school systems sooner rather than later.
If you’re a school official with access to data projects, consider where past or existing initiatives missed the mark. How can the current data system reach its untapped potential? How will a fresh vision lead to better results? Establishing a vision, communicating effectively, training staff, and then implementing and monitoring the system is a sequential process; follow the steps in order for an optimal outcome.
Miguel Davis is the Digital Learning Manager and Client Solutions Director at Macro Connect, a Detroit based education technology solutions agency. His role is to support schools and businesses in achieving breakthrough performance through technology. Miguel is a former classroom teacher, district technology coach, and currently a board member of Playworks, a non-profit organization focused on improved student performance through recess and healthy play.
Want to write an article for our blog? Read our requirements and guidelines to become a contributor.
Originally published at AltexSoft’s blog “How to Use Data and Analytics to Help Schools Make Smart Decisions”