Yes, it’s high time I address the issue that keeps coming up: district test scores.
At the March 23, 2023 school board meeting, a community member unknown to me spoke about her concerns regarding the district’s test scores, which show that 42% of district students are performing below grade-level norms as set by the state. Those figures are reported at IllinoisReportCard.com and are based on the SAT, which juniors are required to take during the school day in the spring. Afterwards, interim superintendent Ken Arendt broke with the board’s usual practice of not responding to public comment: visibly ruffled, he pointed to Niche.com rankings to assert that there aren’t any underlying problems and that our reputation is solid. (For clarity, Niche ranks D214 as the 13th best school district in Illinois, behind, so far as I can tell, pretty much all of our neighboring districts, including Palatine/Schaumburg 211, Barrington 220, Niles Twp 219, Evanston 202, New Trier 203, Glenbrook 225, and of course Stevenson 125.)
But a simple examination of four key summary tables shows that the concerns are warranted. I compiled these tables myself from the IllinoisReportCard.com data for each of the schools, so there is always the possibility of a typo.
Again, these are based on the SAT that all Illinois students take in the spring of their junior year. 2017 was the first year students took the SAT rather than the ACT, and there was no common administration of the SAT in 2020. The 2021 scores are somewhat unreliable because taking the test was optional, and it’s assumed the scores are overstated: the students most likely to make the effort to take the exam are those for whom the scores matter personally. However, that’s more of an issue for the state averages, where only 67% of students participated; across D214, the participation rate was 94%, not much different from a typical year.
These are also not scores, in the numerical sense of the up-to-800 per section that a student would report to colleges. Instead, when Illinois switched to the SAT in 2017, the state selected cut-off scores that they deemed to correspond to “working at grade level.” These numbers represent the percent of students reaching that score cut-off or better.
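In other words, the reported figure for each school is simply the share of test-takers whose section score reached the state’s cut-off. A minimal sketch of that calculation, with a made-up cut-off and made-up scores (neither is taken from the state’s actual benchmarks):

```python
# Illustration only: the cut-off (540) and the scores below are
# hypothetical, not Illinois' actual benchmark or real student data.

def percent_at_grade_level(scores, cutoff):
    """Return the percent of scores at or above the cut-off."""
    at_or_above = sum(1 for s in scores if s >= cutoff)
    return 100 * at_or_above / len(scores)

sample_scores = [480, 520, 540, 600, 650, 430, 700, 510]
print(percent_at_grade_level(sample_scores, 540))  # → 50.0
```

The point is that the published number collapses the whole score distribution into a single pass/fail rate, which is why it says nothing about how far below the line the students under it actually are.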
So what do we know?
- Across all schools, scores tumbled in the aftermath of the pandemic. This was as true at Prospect and Hersey as at the lower-performing schools.
- In some cases, scores had dropped even before the pandemic.
- Across all schools, low-income students perform much worse than the average.
- At the lower-performing schools, scores dropped so much, both for all students and for low-income students, that they are at or even below the state average.
- There are dramatic differences among the schools. Taking Hersey’s 2017 scores, in the 70s, as a benchmark of what is reasonably achievable, Elk Grove and Wheeling are far, far below that.
But I will add some caveats, in the interest of honesty.
The state’s “grade level” cut-offs were set at a very high level, higher even than what many colleges deem sufficient to demonstrate that a student is ready for college work (and even then, the cut-offs don’t take into account the further year of learning that happens after 11th grade).
The SAT is not really a good test for the purpose of measuring “grade level” because its purpose is different: to compare students to each other in terms of their preparedness for college-level work. The nature of the exam also means it doesn’t do a good job of measuring how far below grade level students are, because it was never intended for all students to take it in the first place. To measure students’ abilities more effectively, we’d be better off giving them the sort of placement tests that colleges use to assess whether students are ready for credit-earning classes and, if not, where they need to start instead.
There are real issues with having only 5 years of data. I am not so much concerned that the test scores are “unreliable” in terms of random variation; so many students take them each year that the scores ought not vary much from year to year due to differing class composition. However, it is reasonably well-established that students’ motivation can have a significant impact on test scores, perhaps even more so than “test prep” activities, and it seems entirely possible that relatively high scores in some years at some schools were the result of extra efforts to encourage students, especially students who take the exam not to benefit their college admissions or scholarship prospects but simply because it is mandatory.
After all, in several instances, there were sudden and dramatic one-year drops: at Buffalo Grove, ELA scores for low-income students were 42% at grade level in 2017, but dropped to 17% in 2018, and only slightly recovered. It is just as likely that the 42% was the outlier, due to particularly strong efforts to motivate students in the first year of the test, as that there was a drop related to worsening instruction. The same thing happened at Wheeling (27% to 19%) and Hersey (51% to 39%, though with a post-pandemic recovery). It also seems entirely possible that there are shifts when attitudes change, for instance when schools stop asking for SAT scores or when claims of biased tests become a self-fulfilling prophecy.
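The claim that cohort-to-cohort random variation should be small can be checked with back-of-the-envelope arithmetic. Assuming a hypothetical cohort of 500 test-takers and a 40% pass rate (both numbers are illustrative, not drawn from the district’s data), the standard error of the reported percentage is only about two points, so a 25-point one-year swing like Buffalo Grove’s is far too large to be sampling noise:

```python
import math

# Rough sampling-noise estimate for a reported pass rate.
# The cohort size (500) and rate (0.40) are assumed for illustration.

def standard_error_pct(p, n):
    """Standard error, in percentage points, of a proportion p measured over n students."""
    return 100 * math.sqrt(p * (1 - p) / n)

print(round(standard_error_pct(0.40, 500), 1))  # → 2.2
```

So if the big swings aren’t noise, they have to come from something real, whether that is motivation, instruction, or changing attitudes toward the test.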
But despite these caveats, it can’t possibly be acceptable that only 14% of Elk Grove low-income students met that “grade level” cut-off, or that only 12%/18% (ELA/Math) did at Wheeling in 2022. It’s not OK to shrug this off because Niche produces a good “score,” or because D214 as a district isn’t seeing property values tumble due to perceptions.
Now, I’ve never claimed to be a subject matter expert, but it certainly stands to reason that we need to direct more resources toward student learning. I also believe that if the district were more open about students’ needs and struggles, community members would step up and take part in volunteer programs.