4 Tips for Analyzing Preliminary State Test Results

Four things to keep in mind when analyzing preliminary results from MCAS and other state tests

Ryan Knight

Did you know? State testing data is out now in Massachusetts. 

Yes, that’s right: no need to wait for a stale presentation by talking heads in September. You can dive right into MCAS data now, while it’s still instructionally relevant.

Here are the 4 things you need to know about the preliminary MCAS release.

1. Look at the difference from the state average

The preliminary data is the raw percent correct. No scaled scores or growth percentiles.

Why does that matter?

Every year, new questions are introduced into the MCAS. These questions have different levels of difficulty. The difficulty of the test therefore varies from year to year. 

Similarly, the questions on each grade’s test have different levels of difficulty.

If you compared the percent correct year-to-year or across grades, you might confuse differences in the difficulty of the tests for differences in performance.

To make it more concrete: Imagine your average percent correct was 60% in both Math and ELA. Does that mean you did the same in Math and ELA?

Nope. 

ELA raw scores are typically higher than Math raw scores on MCAS. For example, in 2019 the statewide average percent correct for 5th grade was 56% in Math versus 63% in ELA. So in Math a raw score of 60% would be 4 points above the state average, while an ELA raw score of 60% would be 3 points below the state average.

Math may have actually been better than ELA, despite the same percent correct.

The solution is to subtract the state average for each grade and subject from all of your aggregations, and compare everything as a difference from the state average.
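
If you want to see what that looks like in practice, here’s a minimal sketch in Python. The school-level numbers are hypothetical; the state averages are the 2019 grade 5 figures cited above.

```python
# A minimal sketch, not an official MCAS report: compare raw percent correct
# to the state average for each grade and subject before comparing anything else.
# School numbers are hypothetical; state numbers are the 2019 grade 5 averages
# cited above (Math 56%, ELA 63%).

school_pct_correct = {
    ("Grade 5", "Math"): 60.0,
    ("Grade 5", "ELA"): 60.0,
}

state_pct_correct = {
    ("Grade 5", "Math"): 56.0,
    ("Grade 5", "ELA"): 63.0,
}

for (grade, subject), school_avg in school_pct_correct.items():
    diff = school_avg - state_pct_correct[(grade, subject)]
    print(f"{grade} {subject}: {school_avg:.0f}% correct ({diff:+.0f} vs. state)")

# Grade 5 Math: 60% correct (+4 vs. state)
# Grade 5 ELA: 60% correct (-3 vs. state)
```

The identical 60% now reads as two different stories: ahead of the state in Math, behind in ELA.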

2. Be careful when analyzing areas of content

The best thing about seeing the results now is that you have time to make changes to next year’s curriculum.

The worst thing is that it’s hard to draw conclusions about content.

Because each question has a different level of difficulty, performance on one standard may be higher or lower simply because the questions on that standard were easier or harder.

For example, the 2019 5th grade Math MCAS had 5 questions on Geometry and 9 on Fractions. Let’s say your average percent correct was 65% in Geometry and 45% in Fractions. The Fractions score was so low! Time to blow up the Fractions unit?

Not necessarily. The Fractions questions were much harder. The state average was 70% in Geometry versus 44% in Fractions. There could actually be more room for improvement in Geometry.
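
Here’s the same difference-to-the-state idea, one level down, using the hypothetical Geometry and Fractions numbers from this example:

```python
# The same difference-to-the-state comparison, applied to content standards.
# All numbers here are the hypothetical figures from the example above.

standards = {
    "Geometry (5 questions)":  {"school": 65.0, "state": 70.0},
    "Fractions (9 questions)": {"school": 45.0, "state": 44.0},
}

for name, pct in standards.items():
    diff = pct["school"] - pct["state"]
    print(f"{name}: {pct['school']:.0f}% correct ({diff:+.0f} vs. state)")

# Geometry (5 questions): 65% correct (-5 vs. state)
# Fractions (9 questions): 45% correct (+1 vs. state)
```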

The best we can do is use the analysis to confirm or deny hunches that we already have. 

If you were suspicious that a unit didn’t cover a standard well, and then you see lower performance on that standard, maybe your hunch was right and you should think about how to adapt the unit next year.

On the other hand, if you thought a standard went just fine during class, and you see lower performance on that standard, maybe wait for more data before blowing up your approach. 

3. Remember the preliminary release is preliminary

MCAS data gets released pretty much continuously through the summer at ever-greater levels of detail, which makes MCAS analysis more of a process than an event. 

The first releases include only the machine-scored questions, not the more complex open-response questions.

Hold any conclusions loosely for now, and update them as you get more information.

My favorite release is students’ essays, which will come in June. 

Yes, that’s right – you will be able to read what your own students actually wrote on an essay, along with the text and the question.

My favorite professional development ever is sitting with teachers to go deep on released essays. We review the released text and the prompts, look at the standards assessed, discuss how the prompts reflect the standard, and score the students’ actual responses ourselves. We then look at the scores students actually earned to gauge our quality bar versus the MCAS quality bar. It’s also fun to compare the tasks and writing to samples from the curriculum.

4. Keep the results confidential, for now

Students and families don’t get their results until August or September, and school-level results are embargoed until late September. Until then, it’s best to keep results confidential. The scaling and growth norming processes are critical to fully understanding performance.

A student with a low raw percent correct may have actually made enormous growth over the prior year. Without the growth scores, we can’t tell that essential part of the story. 

Even when we do have all the information, we always need to remember that a single test is not a precise measure of individual student ability. Student scores are noisy! It’s a sample size of one – not nearly enough data to draw reliable conclusions.

MCAS results are very helpful when aggregated to the school level, kind of helpful at the grade level, maybe helpful at the classroom level, and not very helpful at the student level.
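
A quick back-of-the-envelope sketch shows why aggregation helps. Assume, purely for illustration, that an individual student’s score fluctuates by about 10 percentage points from one sitting to the next; the noise in a group’s average shrinks roughly with the square root of the group size.

```python
import math

# Simplified illustration, not a psychometric model: if one student's score
# bounces around by ~10 percentage points between sittings (a made-up figure),
# the noise in a group's average shrinks roughly with the square root of the
# group size, assuming students' fluctuations are independent.

student_level_noise = 10.0  # hypothetical standard deviation, in points

for label, n in [("one student", 1), ("a classroom", 25),
                 ("a grade level", 100), ("a school", 400)]:
    noise_in_average = student_level_noise / math.sqrt(n)
    print(f"{label}: about ±{noise_in_average:.1f} points of noise in the average")

# one student: about ±10.0 points of noise in the average
# a classroom: about ±2.0 points of noise in the average
# a grade level: about ±1.0 points of noise in the average
# a school: about ±0.5 points of noise in the average
```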

We can learn a lot, now, about our performance as educators during this very trying year. But students should not identify themselves by their scores, now or ever.

Ryan Knight

Founder & CEO @ EdLight, PBC. We believe great teaching matters most.
