Pittsburgh Public Schools, state use different formulas to calculate value added by schools

By a state's score, Pittsburgh Liberty K-5 in Shadyside is the highest performing school among Pittsburgh Public Schools.

But by a district score, Liberty is in the bottom half. It ranks 17th of 22 K-5 schools and 34th among 50 district schools.

The difference is the measuring stick -- what it counts and how it counts it.

Such differences increasingly matter. Not only do the numbers shape public perception of a school, but measuring sticks using student test scores are beginning to play a role in teacher evaluations statewide.

As the importance of test scores grows, so does the controversy, ranging from parents opting their children out of state tests to some questioning the ways results are used.

The range of measuring sticks could give parents and educators whiplash as multiple ways of looking at schools increasingly come into play.

"We're so used to operating in a world where the information we got was very flat. It was thumbs up or thumbs down," said Mary Wolfson, director of accountability for Pittsburgh Public Schools.

"We're just not in that world anymore. It's much more dynamic."

One measuring stick gaining importance in Pennsylvania considers whether a student grew less than, as much as or more than expected in a year on state or other tests.

Such growth measures -- known as value-added measures, value-added models or VAM -- are beginning to play a role in teacher evaluations statewide.

While the specific VAM formulas vary, VAM is based on looking at improvement, not just on whether the student was proficient on an exam.

"The importance of value added in general is it isolates the contributions of schools and teachers to students' growth. It tells us more than just student performance or student attainment," Ms. Wolfson said.

"It goes a step further. What is the contribution schools are making to student growth and achievement?"

How the VAM is calculated makes a difference.

Suzanne Lane, a University of Pittsburgh education professor, said, "There are many statistical techniques that can be used in VAMs, and the results will vary depending on the technique and whether the data meet the assumptions."

She also said models don't consider all of the factors that could affect teacher scores, so "one cannot get a pure measure of teacher effectiveness."

Aside from the absolute test scores themselves, Pennsylvania has two ways to look at student achievement: the new building level academic score on the state School Performance Profiles, and the long-standing Pennsylvania Value Added Assessment System, known as PVAAS, which uses a proprietary value-added formula.

Pittsburgh also has its own VAM for schools and individual teachers, built by Mathematica Policy Research at a cost of $750,000 over five years and paid for by a federal grant and the Bill & Melinda Gates Foundation.

Pittsburgh this year is using its VAM for a portion of its teacher evaluations, and schools across the state will begin using PVAAS for a portion in 2014-15.

The state profile academic score -- the first of which was released last fall -- combines a variety of measures, including PVAAS, absolute test scores, attendance and other factors.

Brian Gill, senior fellow at Mathematica Policy Research, said the state profile score and the district VAM are computed so differently that it "doesn't make much sense to compare the numbers."

The district sets the state average on its VAM at a score of 50, while the state average on the state profile is 77.1. All of the schools have lower district VAM scores than state profile scores.

Unlike PVAAS, Pittsburgh's own VAM is adjusted for a variety of factors, including family income, gifted status and special education status.

Unlike the state profile score, Pittsburgh's VAM does not consider proficiency levels but only growth.

If students had high test scores but showed lower-than-expected growth during a school year, the school would have a low score.
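The distinction between proficiency and growth can be sketched with a small, purely hypothetical example. The cutoff, scores and expected gain below are invented for illustration and are not the district's or state's actual formulas, which are far more elaborate:

```python
# Hypothetical illustration of proficiency vs. growth scoring.
# A proficiency measure asks: did the student clear a fixed cutoff?
# A growth measure asks: did the student improve as much as expected?

PROFICIENCY_CUTOFF = 70  # invented cutoff for this example


def proficient(score):
    """Proficiency: pass/fail against a fixed cutoff."""
    return score >= PROFICIENCY_CUTOFF


def growth(last_year, this_year, expected_gain):
    """Growth: actual year-over-year gain minus the expected gain."""
    return (this_year - last_year) - expected_gain


# A high-scoring student who barely improved:
print(proficient(92))      # True -- looks fine on proficiency
print(growth(90, 92, 5))   # -3 -- below expected growth

# A lower-scoring student who improved a lot:
print(proficient(65))      # False -- fails proficiency
print(growth(50, 65, 5))   # 10 -- well above expected growth
```

On a growth-only measure like the district VAM, the second student helps a school's score and the first hurts it, even though only the first is proficient.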

In addition, Pittsburgh considers more grades and tests in devising its VAM than are included in PVAAS or the state profile score.

In elementary schools, PVAAS has a growth measure only for grades 4 and 5, using Pennsylvania System of School Assessment tests. The state profile score likewise uses only state tests.

Pittsburgh's VAM uses those tests and grades and also measures growth in grades 2 and 3 based on the TerraNova, a national standardized test.

At the secondary level, the district's system also adds in district-developed curriculum-based assessments along with state tests for a building score.

"There isn't really a reason to expect that the scores would be closely connected to the state's SPP (profile) scores, which are intended to measure a lot of things, of which growth is only one piece," said Mr. Gill.

The state academic score can range up to 100 points -- 107 counting extra credit.

The Pittsburgh district VAM is calculated on a 99-point scale for 50 schools.

Liberty has an 83 on the state profile score but only 38 on the district VAM, a difference of 45 points.

Statistics being statistics, there often is some wiggle room.

The state doesn't calculate a standard error for its academic score, but the district VAM does use confidence intervals.

When confidence intervals are used, a district VAM building score on average could fall 17 points higher or 17 points lower and still be considered statistically the same. In the case of Liberty, its VAM score could range from 21 to 56.

Liberty and 34 other schools have scores that are not statistically significantly different from average expected growth, which is a score of 50.
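The test behind that statement can be sketched directly. This is a simplification that applies the article's average 17-point margin to every school; in the real VAM each school's margin comes from its own standard error (and rounding), which is why a score such as Sunnyside's 66 can be significant even though the flat-margin version below would not flag it:

```python
# Sketch of the confidence-interval test described above.
# A school differs "statistically significantly" from the expected
# average of 50 only if 50 falls outside its confidence interval.

EXPECTED_AVERAGE = 50
MARGIN = 17  # the article's average margin; real margins vary by school


def interval(score, margin=MARGIN):
    """Confidence interval around a school's VAM score."""
    return (score - margin, score + margin)


def differs_from_average(score, margin=MARGIN):
    """True if the expected average lies outside the interval."""
    low, high = interval(score, margin)
    return EXPECTED_AVERAGE < low or EXPECTED_AVERAGE > high


print(interval(38))               # (21, 55) -- contains 50
print(differs_from_average(38))   # False -- not statistically different
print(differs_from_average(70))   # True -- interval sits above 50
```

By this logic a school like Liberty, at 38, is indistinguishable from average growth despite sitting well below 50 on the raw scale.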

"That's actually a good thing," Ms. Wolfson said, noting that such a score represents a year's growth in a year.

Fifteen of the 50 schools with Pittsburgh VAM have results that are statistically different from average.

Only Pittsburgh Sunnyside PreK-8 is statistically higher than the expected average. Sunnyside's state profile score is 74 while its district VAM score is 66.

That's not the highest VAM in the district -- that honor belongs to Pittsburgh Whittier K-5 in Mount Washington, which has a 69 -- but it is the only one statistically significantly above the expected average.

This doesn't mean that Sunnyside necessarily did better than Whittier. Mr. Gill said the difference between 66 and 69 is "not meaningful."

The other 14 fall below: Lincoln PreK-5; Linden K-5 in Point Breeze; Spring Hill K-5; Langley K-8 in Sheraden; Mifflin PreK-8 in Lincoln Place; Montessori PreK-8 in Friendship; Morrow PreK-8 on the North Side; Classical 6-8 in Crafton Heights; South Brook 6-8 in Brookline; CAPA 6-12, Downtown; Obama 6-12 in East Liberty; Westinghouse 6-12 in Homewood; Brashear High in Beechview; and Perry High on the North Side.

Most of those schools have differences of 30 points or more between the state profile score and the district VAM, including Linden, which scored 78 on the state measure but 28 on the district VAM.

The lowest Pittsburgh VAM is 12, the score at both Morrow, which has a profile score of 52, and Brashear, which has a profile score of 61.

On the school descriptions on its website, the district translates the growth measure by subject into "above average," "about average" and "below average."

In addition to the building-level VAM score, 38 percent of district teachers will receive a teacher-level VAM score to be used as part of their evaluations, Ms. Wolfson said.

Education writer Eleanor Chute: echute@post-gazette.com or 412-263-1955.
