Schoolzilla welcomes St. Louis Public Schools!

In September 2015, Schoolzilla was honored to welcome St. Louis Public Schools (SLPS) into our customer community. We’ll be working with SLPS to implement a “one-stop-shop” KPI dashboard aligned to the district’s Transformation Plan.

By combining data from dozens of disparate sources into a live, flexible dashboard, Schoolzilla will enable SLPS to monitor its most critical inputs and outputs, celebrate and expand on successes, and quickly identify and address challenges on the road to ensuring a world-class education for all of its 25,000 students.

Under the leadership of Superintendent Dr. Kelvin Adams, SLPS is undergoing a period of historic change.

The district’s comprehensive Transformation Plan, which includes a move to a portfolio management model as well as S.M.A.R.T. goals for everything from day-to-day operations to student reading growth to staff support and retention, creates direct alignment between district-level work and individual student outcomes, focusing efforts across the district on the most strategic levers for improvement.

In order to align, monitor, and support administrators’ and educators’ work to put the Transformation Plan into action, SLPS knew they needed an easy-to-use, centralized dashboard.

SLPS mockup

They call this dashboard the “Excellent Schools Transformation Tool,” or ESTT:

“The ESTT is designed to give us live data throughout the school year to monitor progress and course correct with more conviction and specificity. This tool will be used to analyze the effectiveness of our district offices and ultimately the performance of our schools—holding us all equally responsible for providing a world-class school choice for our students.”

After a competitive RFP process, St. Louis Public Schools selected Schoolzilla, PBC as the partner to develop and maintain the ESTT. We couldn’t be more excited, or honored, to serve SLPS, their staff, students, and community.

Deputy Superintendent of Academics David Hardy said of the partnership: “We are absolutely thrilled to have Schoolzilla as a partner in our district’s transformation.  Their commitment to making sure we have the information necessary to make sustainable change for our kids is not only powerful but inspiring.  Not every partner operates the way Schoolzilla does and I wish more would!”

To test drive dashboards inspired by our partnership with St. Louis Public Schools, sign up for access to our dashboard library, and check out the “District Profile” reports.

Stopping the School-to-Prison Pipeline with Reclaiming Futures

Reclaiming Futures Logo with La Jolla Tag

Based at Portland State University, Reclaiming Futures is a national organization whose work focuses on improving juvenile justice through research-based interventions. As they broaden their impact to include working directly within K-12 school systems, they are partnering with Schoolzilla to develop a suite of research-based, interactive dashboards designed to support the needs of students who are at risk for becoming involved in the juvenile justice system. Each dashboard is designed for a particular stakeholder in students’ school lives, including teachers, counselors, principals, and other support staff.

Below is a conversation that Schoolzilla Senior Impact Manager Adam Rosenzweig had with Reclaiming Futures’ Executive Director, Evan Elkin.

AR: What’s the goal of Reclaiming Futures?

EE: Reclaiming Futures operates at the intersection of public health and social justice. Our overarching goal is to help youth-serving systems, like the juvenile justice, education, and child welfare systems, improve behavioral health outcomes and achieve greater equity for youth. 

AR: Say more about equity.

EE: When we say equity, we mean fair and equal treatment in the system and also equal access to health and well-being. We work within systems where the playing field is not level for youth of color and where there are significant negative collateral consequences associated with structural racism, and we’ve recently sharpened our focus on strategies to address these racial and ethnic disparities in the systems where we work.

AR: What is Reclaiming Futures really good at?

EE: Our strategy – and, I think, our greatest impact in the jurisdictions where we work – is to bring about cross-system and cross-silo collaboration. In the places we work, we ask our sites to form leadership teams made up of key decision makers from across a range of professional disciplines, representing key agencies and systems (judges, probation chiefs, treatment clinic directors, and so on), and then we coach, support, educate, inspire, and otherwise cajole them into reaching consensus on a set of tangible and achievable reform goals.

Those site-based interdisciplinary groups are then invited to interact with the individuals and groups from other sites across the country in what we call our “learning collaborative.”  What has emerged from this strategy is a peer community, which we have intentionally constructed to disseminate and support the work – the mission and also the practical hands-on aspects of the work. It’s quite effective as a catalyst.

AR: Why were you interested in pursuing this partnership with Schoolzilla?

EE: Recently, we started working to adapt our approach for a school setting, and we’ve created a comprehensive school reform framework that addresses school discipline, school climate, and behavioral health.

These are complex and interrelated domains that require schools to take a critical lens to their work in order to achieve and sustain tangible outcomes. The data we invite schools to look at will be quite challenging. To support this work, our sites will need a continuous data-driven feedback loop.

We chose Schoolzilla because of how in touch they are with the school experience on the ground and, honestly, because of the elegance and power of the dashboards. We knew we needed user-friendly, intuitive, and smart dashboards to achieve success with this project.

AR: What would you like educators to know about student discipline as they begin the new school year?

EE: I think the tricky thing about school discipline is appreciating the complexity of the challenge schools face in re-engineering their approach – particularly in schools with significant behavioral challenges where, most days, teachers may be just trying to keep the peace long enough to get some teaching done.

We know there is no quick fix, but we believe that attention to the root causes of misbehavior, greater mindfulness about the impact of the discipline decisions we make on the most vulnerable students, and a culture shift toward a more tolerant and inclusive approach will pay huge dividends for our schools.

AR: To help our community better understand your work, would you describe one of Reclaiming Futures’ other recent partnerships or projects?

EE: We’ve recently entered into a partnership with the W. Haywood Burns Institute and another national group based in Oakland called Impact Justice to develop a new framework and a data-centered strategy for behavioral health practitioners and their justice system partners to examine the key decision points around substance use and mental health treatment where racial bias can be introduced. We’re really excited about this project because it’s another opportunity to do work that is data driven and also critically important to the well-being of vulnerable youth.


 

We’re excited about this partnership because of its potential to advance the use of data to improve outcomes for some of the most vulnerable students in our schools. If you would like to learn more about this work, or if you have suggestions for partnerships that Schoolzilla should pursue, please contact us at partners@schoolzilla.com. To explore some of Schoolzilla’s current behavior dashboards, click here.

3 Ways to More User-Friendly Data

We recently sat down with designer Mayra Vega to get her advice on creating data dashboards that people will actually use.

She told us that getting people to use the reports you create starts with reorienting your concept of user needs from nouns to verbs.

Thinking of a need as a noun fast-tracks your thinking straight to what you will build for your user, often skipping over crucial considerations about what you are trying to help the user do in the first place.

In conceptualizing needs as verbs, you will consider how a user’s needs will be met by taking user experience into consideration. This approach will also surface deeper insights into what your user is trying to accomplish with your report.

Use the Needs Madlib below as a guide for thinking of needs as verbs.

Needs Madlib

Dig deeper into user needs with one of the three user research approaches below.

  1. Watch
    Observing your users as they navigate a report gives you a window into the obstacles they face and shows how your report can best fit their mental model or existing process. It may seem simpler to just ask users where they are having trouble, but Mayra says there is more to gain from asking users to show as well as tell. “Oftentimes what they say is not exactly what they’re doing, and when you’re watching what they’re doing, you’re learning a lot about the places where they get tripped up.”

    Key questions to keep in mind: What is the user doing? How is that helpful for the user? Why?

    Below is a picture from one of our observations, taken while we were designing a report for parent-teacher conferences. We learned that teachers prefer to print out paper copies of the report for parents to look at during the meeting. Since most teachers print their materials in black and white, not color, we decided to grayscale this report to make it printer friendly.

  2. Ask
    Merely watching your users in action is not enough. Asking “why” during your observation and interviewing them about their process afterward are integral steps for understanding their experience.
    A few things to keep in mind when interviewing…

    • Don’t ask them to solve the problem.
    • Practice lots of listening. The best interviewers listen more than they speak.
    • Don’t ask binary “yes” or “no” questions. Open-ended prompts like “Tell me about a time when…” will yield more useful insights.
    • Keep digging deeper by asking why.
    • Keep questions short and simple: no more than ten words per question, and one question at a time.
    • Interview in pairs so you capture the most insights. Have someone take notes while you pose questions.
  3. Do
    Try using the report that you have created as your intended users would, with their needs in mind. “You really want to put yourself in your users’ shoes and understand and gain empathy for the things that they do,” Mayra says, adding, “When you gain that empathy, you start noticing areas that you can help them with.”

    If you were going to create a report for a board of directors, for example, you could read the board packet to get an idea of how much information they have to read through before going to a board meeting. This approach will give you an insight into the nature of the information board members are looking at, the difficulties they face in processing it, and how you can improve that experience with your report.

Mayra gave a presentation on this topic at our last Schoolzilla User Summit. Check out her full presentation here, where she goes into more depth and discusses the prototyping and usability testing processes for making user-centered reports.

More resources for making user-centered reports:

5 Data-Driven Tips to Tackle Summer Learning Loss

The pain of summer learning loss has long been felt by educators across the country, with research on it dating back over 100 years.[1] At Schoolzilla, we wanted to understand how educators are using data to gain insight into summer learning loss and what strategies they use to combat it.

We recently sat down with Chris Haid of KIPP Chicago and Roberto Vargas of Chicago International Charter School (CICS) to see how data helped them mitigate summer learning loss at their schools. Here are some tips they shared:

  1. Conduct your fall assessments immediately

First, it’s important to get a clear picture of the loss. For this reason, Haid recommends conducting your fall tests as soon as possible. Changing the time tests are administered helped teachers at KIPP Chicago get a better picture of where students were in the fall.

“We do the fall test sooner now — in the first or second week of school — to see what’s going on, but as far as instructional planning, we rely on the spring test [from the previous year],” Haid said.

  2. Use data to inform resourcing conversations

Vargas used NWEA MAP data to create a summer learning loss dashboard for CICS that illustrated the amount of loss and which subjects it occurred in.

The dashboard gives teachers a clearer view of where students are at the end of the summer and where they should be to get back on track for the school year. It uses students’ current RIT score as well as their target growth and target RIT score. Vargas explained that, with this dashboard, teachers have been able to group students whose test scores declined and create more effective teaching strategies for those students.

“Our staff will look at it and break the data down and say, ‘What’s going on in third grade that reading is so low? How can we get more resources into the third-grade classrooms to help them improve those scores?’”

Haid agreed that having visuals that clearly demonstrate the summer learning loss problem is key. “Having some really telling visuals drove all of our school leaders and regional team to the same conclusions and helped us focus our energies on what we could do, rather than if we had a problem or not.”
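The kind of calculation behind a summer learning loss report can be sketched in a few lines. The snippet below compares each student’s spring and fall RIT scores and averages the change by grade and subject; the field names and numbers are hypothetical illustrations, not CICS’s actual schema or data:

```python
# Hypothetical sketch: quantify summer learning loss from NWEA MAP scores.
# Field names and values are illustrative, not a real Schoolzilla schema.

def summer_loss(records):
    """Return the mean fall-minus-spring RIT change per (grade, subject).

    A negative mean indicates students lost ground over the summer.
    """
    totals = {}
    for r in records:
        key = (r["grade"], r["subject"])
        change = r["fall_rit"] - r["spring_rit"]
        count, total = totals.get(key, (0, 0))
        totals[key] = (count + 1, total + change)
    return {key: total / count for key, (count, total) in totals.items()}

students = [
    {"grade": 3, "subject": "Reading", "spring_rit": 198, "fall_rit": 193},
    {"grade": 3, "subject": "Reading", "spring_rit": 202, "fall_rit": 199},
    {"grade": 3, "subject": "Math",    "spring_rit": 201, "fall_rit": 202},
]

print(summer_loss(students))
# {(3, 'Reading'): -4.0, (3, 'Math'): 1.0}
```

A strongly negative average flags a grade/subject combination, like the third-grade reading example above, where a resourcing conversation might start.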

We’ve added a version of Vargas’s summer learning loss report to our NWEA MAP analytics suite. Click here to test drive the report!

summer loss report

  3. Take full advantage of the time after spring assessments

For the administration of KIPP Chicago, it did not seem plausible that students would regress in their studies with only six to eight weeks off for summer break. But the data was conclusive — there was a definite loss.

This data prompted educators to take a look at how they were structuring class time after the year-end spring assessment. School leaders realized that if purposeful teaching dropped off after the spring assessment, students would be less engaged during the final weeks of school, essentially extending their summer break. So extra emphasis was put on more purposeful, objective-driven teaching following the spring assessment.

  4. Plan for absolute operational readiness on day one

The operations team at KIPP Chicago asked themselves: “Are we instructionally ready from day one?” They saw that there was room for improvement, Haid said. So they put a greater focus on getting all operational systems up and running — with no delay when school doors opened — so that all tools were available for teachers to dive right into purposeful teaching and counter the effects of summer learning loss.

  5. Give teachers a window into summer activities

Vargas has since built out his dashboard to collect data on summer activities. Information on any summer programming students have taken part in, and on the curriculums they may have been exposed to, is now stored in the dashboard so that teachers have a sense of their students’ level of academic engagement during the break.

Data with an impact

The results of these data-driven decisions have been powerful and tangible for KIPP Chicago and CICS.

“We saw way less summer loss this year, nearly none in almost every class,” Haid said.

“We’ve seen a lot of improvement this year in some of our schools that were struggling last year; the report has been a big help,” Vargas said.

See the summer learning loss report! 

For those of you who have made it this far, here are some free online resources designed to help students stay sharp this summer:

edX courses offered by the nation’s top universities

Learn programming with Khan Academy

SAT practice

Math practice with TenMarks

 

[1] http://www.whatkidscando.org/archives/whatslearned/WhatIfSummerLearning.pdf

Schoolzilla Gets a Fresh Face: Data Wall!

Schoolzilla’s powerful data warehouse and dashboard platform just got a makeover. Meet Data Wall, our new, more intuitive and visual interface.

The good news: Our data warehousing and visualization technology will continue to automatically pull all your data into one secure place and allow you to create customized data dashboards. The better news: Now you can more easily curate, navigate, and share your most important dashboards using Data Wall.

Click here to register for a demo account and test drive our dashboards!

Use Data Wall’s core features to:

  • Set clear and measurable priorities with Featured Reports. Mark certain reports as “featured” for your organization. These reports will show up at the top of every user’s Data Wall.

  • Allow users to personalize their experience with Favorite Reports. Any user can “favorite” the reports they find most useful to easily find them again.

  • Navigate your data with ease using Dashboard Collections. Group reports together by theme (e.g., Parent/Teacher Conference, Intervention Meetings, etc.) so that teachers, school leaders, and administrators can find the information they need when they need it.

  • Make dashboards actionable with Dashboard Descriptions. Now you can annotate visualizations to help users focus on the relevant trends in their data and think about what to do next.

Click here to register for a demo account and test drive our dashboards!

The Data Doctor Is IN

Do you work in a school district’s central office and have a spreadsheet headache you’d like help with? Schoolzilla has partnered with a funder to offer 25 complimentary 1-hour consultations where you can learn spreadsheet tools and tricks. Spend less time cleaning and organizing data and more time getting actionable insights.

Read more

Six Ways to Make Sense of Your Common Core Assessment Data

PARCC and SBAC analysis

As schools await their Common Core test results, educators, instructional leaders and data analysts across the country have been developing thoughtful ways to understand their first Smarter Balanced Assessment Consortium (SBAC) and Partnership for Assessment of Readiness for College and Careers (PARCC) data.

Schoolzilla reached out to some of these thought leaders to understand how they were planning to approach their Common Core data. This guide is based on the recommendations they shared for how schools can analyze, understand, and act on this new data.

1. Define what a “good score” is.

In the first year of a new assessment, you can’t compare your data to historical results. How can you identify the bright spots and growth areas in your data without a baseline of comparison?

First, brace for a drop in proficiency.

States that have already implemented Common Core standards-aligned exams have found that student proficiency rates have dropped significantly. Most states are expecting similar performance declines on Smarter Balanced and PARCC tests due to more challenging content and more demanding test questions. Based on the cut scores Smarter Balanced approved last fall, only 33 percent of students are projected to reach the proficiency mark in 11th grade math.

This drop in proficiency makes your scores hard to interpret. Lower scores could mean less student learning, or they could simply reflect a change in the achievement measuring stick. But in the absence of comparable historical results, there are other ways to put your data in context. You can find other points of comparison that help your families, teachers, and leaders see where your school is making progress.

Use norms.

District, state, and national norms can provide additional measuring sticks for schools. With 18 states administering Smarter Balanced and 11 states administering the PARCC tests, results will allow for more comparisons across states. Smarter Balanced released nationally normed data from its 2014 field test and is expected to do the same with the 2015 results. This data will allow analysts to visualize how their networks, districts, and schools performed relative to national averages.
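A comparison against norms can be as simple as subtracting the norm rate from each school’s rate. The sketch below illustrates the idea with invented proficiency rates; none of these figures come from actual SBAC or PARCC norm tables:

```python
# Hypothetical sketch: put school proficiency rates next to a national norm.
# All rates below are invented for illustration, not real assessment data.

norms = {"district": 0.38, "national": 0.41}
schools = {"Lincoln ES": 0.45, "Roosevelt MS": 0.33}

for name, rate in schools.items():
    gap_vs_national = rate - norms["national"]
    print(f"{name}: {rate:.0%} proficient ({gap_vs_national:+.0%} vs. national)")
# Lincoln ES: 45% proficient (+4% vs. national)
# Roosevelt MS: 33% proficient (-8% vs. national)
```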

Look to peers.

In addition to using national norms, instructional leaders told us they plan to compare their performance against high performing schools.

As Elise Darwish, Chief Academic Officer of Aspire Public Schools said, “The first thing I want to do is understand what ‘good’ schools’ data looks like with these new assessments. I’ll be looking at Aspire schools that have been strong in the past, so I can use their scores as a rough benchmark. I’ll also ask other school systems if they’ll share their results so we can compare.”

District SBAC communications report

Compare to other exams. 

Analyzing Smarter Balanced and PARCC results alongside other summative assessment sources, such as NWEA MAP and historical results from previous state assessments, will create a more comprehensive, meaningful portrait of student performance.

 

PARCC scale score report

 

2. Go deeper.

In addition to snapshots of overall student performance by scale score and achievement level, schools want detailed performance reports to show strengths and weaknesses on particular areas of each test.

Both consortia will report overall scores, as well as performance levels on the particular claims/sub-claims that make up each test. Although data at the question level will not be available at first, claim-level data will help teachers and instructional leaders understand how students did on performance tasks and higher-level content assessed by the Smarter Balanced and PARCC tests.

Looking at claim- or standard-level data for your classrooms or schools is crucial  to making your data actionable. Comparing performance across claims can help you find educators who have best practices to share. It can also help set instructional priorities for the next year, instead of feeling like you’re conducting a post-mortem that’s all about last year’s scores.
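Finding those bright spots is, at its core, a comparison across classrooms on each claim. Here is a minimal sketch of the idea; the claim names and scores are invented, not actual consortium reporting categories or results:

```python
# Hypothetical sketch: compare claim-level performance across classrooms to
# find "bright spot" educators. Claim names and scores are invented.

classrooms = {
    "Room 201": {"Concepts & Procedures": 0.72, "Problem Solving": 0.55},
    "Room 202": {"Concepts & Procedures": 0.58, "Problem Solving": 0.70},
}

def bright_spot(classrooms, claim):
    """Return the classroom with the highest average score on a given claim."""
    return max(classrooms, key=lambda room: classrooms[room][claim])

print(bright_spot(classrooms, "Problem Solving"))  # Room 202
```

The teacher who leads on a given claim becomes a candidate for sharing best practices with peers whose students struggled on that same claim.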

Classroom snapshot: claim-level report

3. Help families interpret their child’s results.

According to a recent survey conducted by the Public Policy Institute of California, a majority of public school families in California (55%) say they have heard nothing at all about the Smarter Balanced Assessment System. Only eight percent say they have heard a lot about the tests.

Teachers and data analysts we spoke to emphasized their desire to create reports that foster dialogue and help families make sense of their child’s scores on these new tests.  Denver Public Schools Parent/Student Portal Manager Juan Pablo Parodi explained: “We did a lot of user interviews with our parents and learned that assessment information is often meaningless to them because of the context in which they are presented with it.  We learned that in order for assessments to matter to parents, we needed to present the data in a simple, digestible way that allowed them to quickly grasp whether or not their student was on track, and could inform more constructive conversations with their students and teachers.”

4. Analyze achievement gaps.

One of the most striking findings from the SBAC field test data is the way that it continues to illuminate achievement gaps for students of color and low-income students. These gaps are not new, but it is striking to see them surface repeatedly and dramatically, especially in a wide, nationally representative sample. On the SBAC field test, for example, black fourth graders scored more than six-tenths of a standard deviation below the total group of fourth graders in math. Analyzing how Common Core results differ by race, income level, primary language, and special education classification can help you find and focus on the most critical achievement gaps in your student population.
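The “standard deviation” framing of a gap can be computed directly: it is the difference between the subgroup mean and the overall mean, expressed in units of the overall standard deviation. The sketch below uses fabricated scores purely to show the arithmetic; it is not the SBAC field-test data itself:

```python
# Hypothetical sketch: express a subgroup's achievement gap in standard
# deviation units. Scores below are fabricated for illustration only.
import statistics

def standardized_gap(subgroup_scores, all_scores):
    """(subgroup mean - overall mean) / overall SD; negative = below overall."""
    sd = statistics.pstdev(all_scores)
    return (statistics.mean(subgroup_scores) - statistics.mean(all_scores)) / sd

all_scores = [2400, 2450, 2500, 2550, 2600]
subgroup = [2400, 2450, 2500]
print(round(standardized_gap(subgroup, all_scores), 2))  # -0.71
```

A gap of -0.6, like the fourth-grade math figure cited above, means the subgroup’s average sits six-tenths of a standard deviation below the overall average.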

5. Compare student achievement data with classroom observations.

Common Core standards are designed to encourage educators to make their classrooms more student-centered and their instruction more rigorous. As teaching practice expert Charlotte Danielson told Education Week, “[The Common Core] requires instructional strategies on teachers’ parts that enable students to explore concepts and discuss them with each other, to question and respectfully challenge classmates’ assertions.”

Your results will provide an opportunity to ask a related question: What student and teacher behaviors are present in  classrooms where students performed best on Common Core assessments?

Lander Arrieta, a consultant who worked with Duval County on its Common Core implementation, explained, “The first thing I’d want to do is visit classrooms where students scored well and see what those teachers are doing—are they leading student-centered classrooms and facilitating rigorous conversations between students?”

If your district uses a formal evaluation system, you may want reports that compare student results with teacher evaluations on specific instructional competencies.  With or without formal observations, walk-throughs with your “bright spot” teachers can help you think about what instructional strategies you want to help everyone on your team develop.

6. Analyze data from computer-adaptive testing.

Computer-adaptive testing creates a number of valuable metrics about a student’s test-taking experience that are unavailable through paper testing. For example, data that may become available from Smarter Balanced includes the time a student spends on a question, as well as the number of times the student changed an answer. Both consortia will also gather data about the types of accommodations available to students, as well as which assistive tools a student actually used during testing.

This data can add insight into each student’s test-taking process, informing an understanding of how the new testing format impacted results, and allowing educators access to evidence about students’ testing stamina and perseverance (or lack thereof) on more rigorous question types. Although this data will not be available initially, future analysis of statistics like testing duration, time per question, and assistive tool usage holds considerable promise for educators and education leaders.

We’d love to hear your thoughts, questions, and concerns about how you plan to analyze your Common Core results as well. Stay tuned for details on how to join the Data Champion Hub—our online community for K-12 data advocates to engage in discussions and share best practices.