When Pennsylvania Education Secretary Ron Tomalis issued statewide test scores last week, he noted that a technical advisory committee had analyzed three possible reasons for drops in student performance: funding, changes in test content and tighter test security.
But the head of the committee of outside testing experts says it didn't analyze funding. Nor was the committee's analysis quite as conclusive as it appeared in the secretary's PowerPoint and comments.
In answering a question about the relationship between scores and funding at a news conference last week, Mr. Tomalis said the technical committee was asked a "very specific question about budgets: Could budgets have impacted them? They said no."
He stated: "TAC found that the only scientific cause for the drop in scores from 2011 to 2012 was the department's investigation of past testing improprieties, which has led to heightened test security measures."
Marianne Perie, facilitator of the seven-member committee and senior associate at the Center for Assessment based in Dover, N.H., said the committee only briefly talked about funding and didn't analyze it.
"I wouldn't say we ruled it out. I would say we had no comment on it. We didn't have the information or the data to make any statements," she said. "Overall, the TAC felt very strongly that the reduction in cheating accounted for part of the score drop. Tightened security could have also led to a drop in scores for a myriad of reasons. However, proving causation is difficult, and we do not have evidence at this time that a reduction in cheating and tightened security were the only factors. The TAC did not feel it had identified all of the factors."
The committee of psychometricians -- experts on standardized testing -- was making a technical review, not a policy review.
After TAC meetings in Harrisburg May 31 and June 1, Ms. Perie said, "We walked out of there not feeling satisfied we had come up with a solid explanation for the drop."
Asked to comment on the statement that funding was not analyzed, Tim Eller, spokesman for the state Department of Education, wrote in an email that the committee "had a conversation about funding; however, its conclusion was that if there was any change in funding, it would not be realized in the first year."
"Much has been made about the issue of funding," Ms. Perie said. "As a technical advisory committee, we do not typically comment on policy issues. Funding is a policy issue. We had no data with which to make any analysis of the relationship between the decrease in funding and the drop in scores."
Mr. Eller did not provide a copy of the panel's summary report, writing that it is "under review to determine what portions can be publicly shared."
Because possible cheating may have skewed results in prior years, Mr. Tomalis called the latest results "a reset point for student achievement in Pennsylvania."
School superintendents and others have challenged that statement, blaming funding decreases instead.
The Corbett administration maintains it has not decreased education funding, but school districts across the state have had nearly $1 billion less to spend both this year and last year, in part because of the end of the federal economic stimulus.
That has resulted in larger class sizes and the elimination of tutoring programs aimed at helping struggling students.
Overall scores dropped on the Pennsylvania System of School Assessment math and reading tests given in the spring to grades 3-8 and 11.
Statewide, the drop in the percentage of students scoring proficient or advanced was a few percentage points, but in some schools the drops were in the double digits.
For the 2012 tests, the state instituted stricter security procedures as a result of a cheating investigation that showed excessive erasure patterns on score sheets in some schools.
The investigation began with 48 school districts and charter schools, 30 of which were cleared. Nine, including Pittsburgh Public Schools, remain under investigation.
Mr. Tomalis said "well over 100" disciplinary complaints will be filed against public education officials over the cheating issue. There were more than 148,500 professional personnel -- including administrators and teachers -- in Pennsylvania last school year.
In the schools that were flagged for atypical erasure patterns, Ms. Perie said there was on average a "huge drop" in test scores, up to 20 times more than the drop in other schools.
"The schools that were highlighted are schools where something was going on. ... What we don't know is: Were there schools that weren't flagged and had cheating going on and they just weren't caught?"
The increased security measures included having all test administrators sign a form saying they could be criminally prosecuted if they didn't follow procedures.
The measures also recommended that students not be tested by their regular teacher.
Ms. Perie believes most of the security changes "should have had good effects, reducing cheating. There could be some possible negative consequences as well.
"We encouraged Pennsylvania to monitor what's happening with those changes."
Possibilities that merit further study include whether "teachers are so worried that they are not providing the assistance that is allowed," she said. Another possibility is that young students may be uncomfortable being tested by a stranger.
Nevertheless, she said, the changes the state implemented to reduce cheating appear to be effective.
"Scores decreasing more in schools where cheating was suspected is strong evidence of that. Higher omission rates could also be evidence that teachers were no longer completing incomplete test forms," she said.
Education writer Eleanor Chute: email@example.com or 412-263-1955.