
Chicago’s NWEA “Irregularities,” Explained

On Friday, Chicago Public Schools released a report from outgoing Inspector General Nick Schuler’s office describing irregularities in how NWEA MAP tests have been given. The investigation was launched in response to “numerous complaints to the OIG over the years about alleged NWEA cheating.”

Since the 2012-13 school year, CPS has been using NWEA’s MAP Growth test to chart student growth and attainment, and has attached a variety of high stakes to test performance, from promotion and high school admissions for students to teacher and principal evaluations to school quality ratings.

That’s not how NWEA recommends its test be used. “We don’t intend for our assessment to be high stakes,” said Christine Pitts, policy advisor for the Northwest Evaluation Association (NWEA), makers of MAP Growth. That said, she added, “We don’t control the ways superintendents use it.”

The new Inspector General report points out problems with how tests are given in CPS and suggests how to tighten test security. As a parent and long-time CPS-watcher, reading this report also tells me we need to lift the high stakes for students on the NWEA immediately and rethink our approach to accountability. I’d personally like to see CPS continue to use the NWEA MAP, but as a diagnostic tool, the way its makers intended.

Is Adult System-Gaming Or Kids’ High-Stakes Stress Behind These Irregularities?

The IG report noted two significant problems with test administration. First, although the NWEA MAP is not a timed test, more than one-quarter of the students who took it in spring 2018 took twice as long or more to finish as the national average. Second, thousands of CPS students were allowed to pause their tests five times or more. Some students paused their tests as often as 50 or 60 times. The Inspector General found that tests with longer testing times and more frequent pauses were “more likely to show unusually strong growth.” (On Friday, as Chalkbeat Chicago reported, board member Elizabeth Todd-Breland disputed that finding.)

According to the report, incidences of extremely long testing times (three times the national average) and frequent pauses were mostly concentrated in a handful of schools. However, it’s not clear whether or to what extent deliberate cheating was involved in the patterns detected. 

In an unusual move, CPS released the report with all school names still visible. This went against the IG’s own standard practices. On Friday, in an emailed statement, Schuler told WTTW, “We urge the public not to jump to conclusions about individual schools based on our report.”

However, the report does make clear where there are loopholes and potential ways to game the system. The NWEA MAP is a computer-adaptive test, which means that when a student answers a question correctly, a harder question follows. When a student answers incorrectly, an easier question follows. However, if a student pauses the test, the test replaces their current question with another one of similar difficulty. The report suggests that students and possibly teachers could be taking advantage of this feature to try to replace a question a student can’t answer with a question they can.
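The adaptive mechanism and the pause loophole the report describes can be sketched roughly as follows. This is a toy model for illustration only; the class, question bank, and selection rule are my own assumptions, not NWEA’s actual algorithm.

```python
class AdaptiveTest:
    """Toy model of a computer-adaptive test with a pause loophole."""

    def __init__(self, questions, start_difficulty=5):
        # questions: list of (difficulty, question_id) pairs
        self.questions = sorted(questions, key=lambda q: q[0])
        self.difficulty = start_difficulty
        self.current = self._pick(self.difficulty)

    def _pick(self, difficulty):
        # Choose an unused question closest to the target difficulty.
        best = min(self.questions, key=lambda q: abs(q[0] - difficulty))
        self.questions.remove(best)
        return best

    def answer(self, correct):
        # Correct answer -> harder next question; incorrect -> easier.
        self.difficulty += 1 if correct else -1
        self.current = self._pick(self.difficulty)

    def pause(self):
        # The loophole: pausing swaps the current question for another of
        # similar difficulty, with no answer recorded and no difficulty change.
        self.current = self._pick(self.difficulty)
```

In this sketch, a student who pauses on a hard-to-answer question gets a fresh question at the same difficulty without ever registering a wrong answer, which is exactly the behavior the report suggests could be exploited.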

At the same time, interviews with students in the report, especially seventh-graders taking the NWEA for high school admissions and eighth-graders taking it for promotion to high school, suggest that high-stakes pressure is pushing children to extend the time they take on the test to exorbitant lengths. From p. 23 of the report:

A seventh grader … said students would have rather seen a question time out than answer it incorrectly because they were worried about the impact the test results would have on their ability to get into certain high schools.  “We were so worried about high school. We didn’t want our score to drop. . . . I know for a fact that guessing — we would never do that,” the student explained. 

Teachers and parents I spoke with also noted that extremely high-achieving students may well spend longer on the test because the computer-adaptive nature of the test makes it highly likely they will get the hardest questions, especially in math.

Is It Cheating Or Good Instruction That’s Getting Big Gains?

As I read the report, the most disturbing chart I saw listed “clusters of high-growth students,” so high, in fact, that chances were less than one in a million they would occur in a random sample of CPS students in that grade and subject. In other words, all the chart tells you is that the growth you are seeing in those grades and subjects at those schools is not due to chance.

And you know what? That ought to be great news. It should mean, in at least some of the cases, it’s due to great instruction!

In fact, the report itself notes that that is a possibility. “High-gaining schools/grades/subjects flagged by the OIG are worth further inquiry, to determine both whether their tests were administered properly and what the schools might be doing well to help their students achieve such unusual growth.” [italics mine]

As an example, the report notes that about one-quarter of Faraday Elementary’s fourth-grade students achieved astonishingly high growth in math, without taking an unusual amount of time or pausing the test. Chicago Unheard wrote about Faraday last year, and we chose them because their high growth and attainment suggested something was going well there when it comes to instruction.

To sum up, what I’m taking away from this report is that the stakes on NWEA alone are much too high and we need multiple ways to assess student, educator and school performance to reduce the pressure, especially pressure on kids. It’s a shame a useful test like the NWEA MAP has become the symbol of all that is wrong with testing, because as a parent I’ve found its results helpful in making decisions about my own child’s education.

I wish this could be an opportunity to talk about how to improve assessment and accountability, not just double down on test security. But from CPS’s initial response—which is to hire a testing expert and release the names of schools mentioned in the report without appropriate context—I’m not very hopeful.

Maureen Kelleher

Maureen Kelleher is a senior writer and editor at Education Post, but before that she spent a decade as a reporter, blogger and policy analyst. Her work has been published across the education world, from Education Week to the Center for American Progress. Between 1998 and 2006 she was an associate editor at Catalyst Chicago, the go-to magazine covering Chicago’s public schools. There, her reporting won awards from the Annie E. Casey Foundation, the International Reading Association and the Society for Professional Journalists. A former high school English teacher, she is also the proud mom of an elementary student in the Chicago Public Schools. Find her on Twitter at @KelleherMaureen.
