The Folly of ILEARN
A 2019 replacement for Indiana’s long-criticized ISTEP and ISTEP+ standardized tests, ILEARN is administered each year to schoolchildren across our state.
Scores are released in the summer.
Catastrophic headlines follow.
News organizations across the state raise the alarm that only 30% of our kids are proficient in both English and math.
Moms For Liberty and other anti-public-education groups refresh their propaganda with the latest numbers. Here in Carmel, our Hitler-quoting Moms For Liberty chapter immediately started pounding the drum: “Over 40% of students at Carmel Clay Schools don’t meet the state’s minimum standards.” They will continue to repeat this ad nauseam, despite the fact that our schools ranked second in the state this year and have consistently been ranked in the top four spots for both ILEARN and its predecessors.
And rest assured that our lobbyist-funded legislators will be quoting the scores in their next session when they devise yet more creative ways to take money from traditional public education and shovel it to the charter and private schools that so generously donate to their campaigns.
So, let’s talk about ILEARN.
Those who have followed this page know that I have relied on ILEARN in the past. During our 2022 school board election, I got bogged down in repeatedly debunking ILEARN-fueled claims that our schools were failing. I also have relied on ILEARN scores to evaluate the performance of charter and virtual schools relative to traditional public schools.
When this year’s scores were released in July, I resigned myself to poring through the data and updating my ILEARN summaries related to Carmel Clay Schools. At a minimum, I knew our local Moms For Liberty chapter would be using them to spread misinformation in our community.
Before I started analyzing the scores, I decided to see if I could better educate myself on ILEARN itself. I discovered that the state annually releases questions that were used on that year’s test. As I began reviewing these, it quickly became apparent that the test contains some bad questions. By the time I finished reviewing every available question, I had come to doubt whether ILEARN is able to tell us anything about students' abilities or knowledge.
There are questions that have more correct answers than students are allowed to choose.
There are questions that have fewer correct answers than students are required to choose.
There are questions where the answer that satisfies the condition laid out in the question is a factually false statement.
There are questions with no correct answer.
There are even occasional questions where all of the answers are correct.
And that’s just what we can evaluate by reviewing the questions on their own: the state releases the full text of the questions but does not indicate which answers are scored as correct. I would be shocked if the test isn’t actually worse than we can tell from the questions alone.
See for Yourself
To make it as easy as possible for people to get insight into ILEARN, I’ve pulled examples of bad questions and am providing them on this site, with commentary. Altogether, it’s a long read, especially because many of the questions rely on reading an accompanying text. That said, I think you’ll find it worth your time if you care at all about public education, the current state of our schools & their students, or the state of Indiana using a farce of a test to mislead its citizens. If you want the shortest possible overview, I've put 10 of the worst questions on their own page. There are then pages organized by grade level that provide additional examples.
You can find the overviews at the following links:
All of the questions included are for English language arts. The majority of the math questions require students to input answers directly, and it is very difficult to evaluate the quality of questions that are not multiple choice. Similarly, I ignored any English language arts questions that were not in a multiple-choice format.
I also ignored any questions that required listening to a text as opposed to reading it, as it’s much easier to include written texts on this website.
After applying the above criteria, I was left with 153 questions to evaluate. Of those, 37 have been included as examples of bad questions in this evaluation. That's just under 25%, far more than enough to significantly and negatively impact the scores students receive.
I’m sure some readers will disagree with me on some of the questions I criticize, but I believe any person committed to an honest evaluation will come away convinced that ILEARN is neither a good test nor a reliable measure of our kids’ abilities. And conversely, I am sure some readers would criticize even more questions than those I’m documenting here. The point of this overview is not to comprehensively identify and address every problem, merely to expose Indiana parents, residents, and media to the fact that ILEARN has enough problems that we should not take it seriously or allow it to be weaponized to attack educators.
It's worth pointing out that there are many experts who disagree with the entire concept of standardized testing of this nature. From what I’ve seen, they make some very good points and cite compelling research. I may eventually take a closer look at that broader issue and offer some thoughts on this site. This current overview, though, focuses solely on ILEARN as opposed to the bigger picture that is K-12 standardized testing.
Lastly, I’ve had multiple Carmel residents reach out to ask when I’ll be providing an updated ILEARN overview. I have not yet decided if I will.
On the one hand, I think it's clear that ILEARN is not accurately measuring what it claims to measure. As far as using it to evaluate year-over-year gains/losses or comparisons between schools, I have not reached a conclusion as to whether it offers any insight into those areas in spite of its significant flaws. It’s certainly possible, but not a foregone conclusion.
On the other hand, given their commitment to attacking our schools, enthusiastic embrace of misinformation, and apparent indifference to things like facts and ethical behavior, it seems a near certainty that our local anti-public-education activists will continue to level ILEARN-based attacks at CCS. If we continue to see them use this broken test to misrepresent the state of our schools to uninformed voters, I may provide an updated ILEARN analysis with this year’s scores, with the huge caveat that the test is incredibly flawed.