Davis: The long, strange trip of reporting PARCC results
Tim Broderick, our editor of data journalism and graphics, got the word about 4 p.m. Wednesday.
The final release of state report card info -- how our children fared on a new standardized test -- scheduled for Friday, had been moved up to ... now. (Still a bit uncertain of the reason, but that's another story.)
That set into motion a flurry of activity. Tim hustled to get his online presentation ready to go. Accompanying his work was the detailed overview we had planned for Friday, produced through a herculean effort by staff writers Melissa Silverberg and Madhu Krishnamurthy. The next day, we came back with stories detailing which schools were faring best and which were struggling the most. Melissa contributed a piece chronicling how the new test, the Partnership for Assessment of Readiness for College and Careers (PARCC), had only widened the achievement gap between poor and more affluent students.
Whew. What a long, strange trip this report card season has been. Some examples:
• When the state issues its annual report card, the practice has been to give all media the reams of test scores and scads of demographics. In the past, this massive data dump usually came at the beginning of October. To give everyone a chance to plow through it, the info was embargoed until the end of the month.
• But this year the info was released in three phases: 1. The overall statewide averages, but without scores for individual schools, in September. Oh, and just from test results taken online. 2. The demographic data -- administrator/teacher salaries, spending per pupil and such -- in October. 3. The final shoe-drop: Wednesday's release of school-specific PARCC data.
Perhaps the new test contributed to this scattershot approach. To be sure, the test has had its quirks. Some examples:
• It was unpopular in many quarters: Local educators said it was rolled out too quickly, wasn't fully vetted, and returned results too late to adjust curriculum. Teachers complained the increased test-taking time was further taking away from classroom learning. Some parents supported that notion.
• Many students did not take it. We reported last month that 20,000 students statewide skipped the test. In some instances in the suburbs, this stemmed from a parent- and student-orchestrated boycott in protest of the test. In other cases, it was simply a matter of the testing not aligning with classes some high schoolers were taking. (That's another quirk: Who takes PARCC in high school is driven by the courses, rather than grade level.)
• A new category was invented with PARCC: "Approaching" standards, creating an amorphous no man's land of achievement.
All this adds up to a real challenge for educators to gauge how students are doing. You can even see the dichotomy on the front pages of this paper: On Thursday, the headline was that 70 percent of students aren't meeting standards; on Friday, we celebrated the students who were succeeding. In the rapidly diversifying and evolving suburbs, though, the benchmark for success is elusive. Among the suburban schools we reported on, the percentage of students meeting or exceeding standards ranged from 4.8 percent to 83.2 percent.
What, precisely, does that mean? To me, it's that we all have much work to do.