Four Insights for Designing K-12 Dashboards

School districts today are collecting more data than ever, and with that comes a growing desire to use that data to drive better outcomes for their students. The challenge is that while the data exists, in stacks of paper on desktops and digitally across different academic systems, it isn’t available in an aggregated form that empowers districts to use real-time progress monitoring to improve student achievement.

In our work with school districts around the country, we saw that some districts were starting to take the initiative internally to build dashboards that bring all of their most important metrics together in one place. Because this undertaking can be time- and labor-intensive and require significant technical resources, districts told us they would prefer to devote their resources to analyzing and acting on what they saw in the data, not on building dashboards to make it accessible in the first place. That’s where Mosaic’s District Progress Monitoring Module, our new out-of-the-box set of configurable dashboards, comes in.

User-Centered Design

When we started to design Mosaic, we conducted on-site and phone interviews in districts and schools, observed people using their current technology, and noted their data practices and conversations. We reviewed relevant industry research, talked to expert partner organizations, read strategic plans from districts nationwide, and studied state accountability systems. Using this research as a foundation, we developed a set of dashboards that address the core needs we identified. We believe that others may be grappling with similar issues as they consider how to harness the power of data to achieve better student outcomes. Whether you opt to build or buy a data-driven performance management system, we hope the knowledge we gained from the districts and schools we work with can start you on your way to developing a system that works for your district.

Key Lessons Learned

Extensive educator and administrator feedback is critical in getting the tools right. Here are four key lessons our users taught us along the way.

  1. Balance the needs of districts, schools, and teachers
    District leaders typically value top-down alignment on strategic initiatives throughout all schools. At the same time, school leaders and teachers are often focused on the unique characteristics of their schools and classrooms and may have their own definitions of success.

    Designing a dashboard system to bridge these needs requires careful attention to balancing how both the district’s strategic goals and the goals of individual school sites are reflected on the dashboard. While designing Mosaic’s District Progress Monitoring Module, we tested prototypes that aimed to strike this balance. The best designs gave school leaders and teachers the flexibility to configure their own goals within the context of the district’s high-level strategic priorities. They also included drill-downs that add color beyond the overall metric value; the top-line number is important for reporting and noticing high-level trends, but it doesn’t always tell the whole story. These designs enable district leaders, school leaders, and teachers to speak the same language, yet they also provide the level of nuance necessary to understand the story behind the data.

  2. Timing and context matter
    To help users get the most out of their data, it’s important to help them distinguish meaningful trends from normal variation and to know at what cadence it makes sense to monitor a specific metric. The approaches we recommend for addressing this design challenge include presenting data with context on what to look for and over what time periods, and, in some cases, showing only the level of granularity appropriate for a given metric (e.g., monthly for attendance rate, where daily variation is not particularly meaningful).
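    As a minimal sketch of what this kind of roll-up might look like behind a dashboard (all dates and attendance numbers below are invented for illustration), daily attendance rates can be aggregated to a monthly view before they are presented:

    ```python
    from collections import defaultdict
    from datetime import date, timedelta
    from statistics import mean

    # Hypothetical daily attendance rates for one school over two weeks
    daily = {
        date(2023, 9, 4) + timedelta(days=i): rate
        for i, rate in enumerate([0.95, 0.93, 0.96, 0.91, 0.94,
                                  0.95, 0.92, 0.96, 0.93, 0.95])
    }

    # Day-to-day variation in this metric is mostly noise, so group the
    # daily values by month and show only the monthly average
    by_month = defaultdict(list)
    for day, rate in daily.items():
        by_month[(day.year, day.month)].append(rate)

    monthly = {month: round(mean(rates), 3) for month, rates in by_month.items()}
    print(monthly)  # one aggregated value per month instead of ten daily points
    ```

    The same pattern applies in reverse for metrics where fine granularity does carry meaning: the grouping key simply changes to match the cadence at which the metric is worth monitoring.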
  3. Provide the data necessary for thoughtful benchmarking
    Schools have a lot to learn from each other, and sharing data to identify bright spots within a district can be the first step to starting conversations about best practices. That being said, if performance data is shared without recognition of the different challenges schools and students are facing, it can feel punitive and unfair—not the ideal way to start a learning conversation.

    We’ve found that giving educators and administrators the data they need to identify schools with similar student populations, and to benchmark themselves against those peers, can help spark conversations that generate collaboration and learning.
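    One simple way to think about this kind of peer matching is as a distance comparison in a small demographic feature space. The sketch below is purely illustrative (every school name, feature, and percentage is invented), not a description of how Mosaic computes similarity:

    ```python
    from math import dist

    # Hypothetical school profiles: (% economically disadvantaged,
    # % English learners, % students with disabilities)
    profiles = {
        "Lincoln Elementary":    (0.72, 0.31, 0.14),
        "Washington Elementary": (0.70, 0.28, 0.15),
        "Jefferson Elementary":  (0.25, 0.05, 0.11),
        "Adams Elementary":      (0.68, 0.33, 0.13),
    }

    def similar_schools(school, n=2):
        """Return the n schools whose student populations are closest to `school`."""
        target = profiles[school]
        others = [s for s in profiles if s != school]
        # Euclidean distance between demographic profiles; smaller = more similar
        return sorted(others, key=lambda s: dist(profiles[s], target))[:n]

    print(similar_schools("Lincoln Elementary"))
    ```

    A real implementation would weight and normalize the features, but even this simple framing makes the point: benchmarking against demographically similar schools keeps the comparison fair.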

  4. Allow users to introduce complexity at their own pace
    Not all users have the same comfort level with technology and data. It’s important for districts and schools to accommodate all levels of data and tech savviness.

    In designing Mosaic’s District Progress Monitoring Module, we addressed this design challenge by providing multiple layers of analysis: the dashboards start off with an extremely simple, top-level view of the data, with more detailed drill-downs and layers of analysis available for those who are hungry for them. Our dashboards also offer printing and export features for people who are more comfortable with paper or a PowerPoint presentation than with a web-based tool.