I can forgive state Education Secretary Ron Tomalis for his optimism that every child in Pennsylvania would score proficient or advanced on the PSSA tests by 2014. However, his adherence to this a priori nonsense and his current interpretation of PSSA test results indicate that he is mathematically challenged.
His comment that the "data" were artificially elevated for the last couple of years due to cheating and fell this year because of increased test security is, as West Mifflin Area Superintendent Dan Castagna says, "absurd" ("Pa. Districts Show Steep Drop in Test Scores," Sept. 22). Does this mean that eight years of results are garbage? How did the technical advisory committee determine this variance component due to cheating? If the cheating is as profound as Mr. Tomalis says, then that ends any discussion of using these tests to evaluate teachers!
I have been tracking PSSA scores since 2005. My focus has been only on mathematics results in the elementary grades. My first thought after a cursory review of this year's results was that the test must have been more difficult. I was not involved in administering the test and have never seen it.
However, I am puzzled by Mr. Tomalis' comment that the rate at which students left questions blank -- the omit rate -- increased in every grade and subject except 11th-grade reading from 2011 to 2012. How is it that questions were left blank? As I recall, the test was untimed, was multiple choice and carried no penalty for guessing. So who allowed students to leave questions blank?
I have seen some possible evidence of cheating in certain elementary and charter schools for the last three years, but I do not believe there is profound cheating at the district level.
The writer, who is retired, has worked as a math tutor and has proctored the PSSA exam in the past.