
Sunday, August 28, 2005


A new way of judging how well schools are doing

By Sanjay Bhatt / Seattle Times staff reporter

Your school's "value-added" data


How to find it: Every public school in Seattle has a "value-added" report that can be found on the district's Web site: www.seattleschools.org. In the left margin, click on "School Test Scores." On the next page, select your school. A new Web page for your school will list different kinds of reports. Scroll all the way down; the value-added report is the last listed.

How to read it: The district assigns color bars to make the reports more readable. Green stands for more than a year's growth; yellow, average growth; and red, less than a year's growth.

How to act on it: If your school has any areas shaded red, Seattle Public Schools data coach Allison Harris advises parents to ask their school how the staff is addressing them.

Compiled by Sanjay Bhatt

At Gatzert Elementary School, fewer than half of the fourth-graders passed a state reading test the federal government uses to separate good schools from bad. The Central Area school's pass rate in 2004 ranked among the lowest in the district and the state.

Not a place you'd want to send your kid, right?

You might be wrong — depending on how you define success.

And if you met 10-year-old Julio Barrera. Months before taking the state test, Julio and a dozen other struggling Gatzert students spent nearly every morning before school reading passages and drawing conclusions.

"They say I'm still a little bit slow," Julio says. "I'm almost normal."

Though Gatzert students may not ace the Washington Assessment of Student Learning (WASL), a relatively new tool shows the school's efforts have given students more than a year's worth of reading skills in one year — outpacing the progress of students in many "high-performing" schools.

At a time when national leaders have declared a crisis in public education, sophisticated measurement tools now offer another way to look at how effective schools are at raising student achievement over time.

That's the point of "value-added" data analysis, which Seattle Public Schools started collecting in the fall of 1999.

In the simplest terms, this analysis relies on a student's test-score history to project his or her next test score, then compares the actual score with the projected one.

If a school's scores exceed projections, it's evidence of effective instruction and rigorous curriculum. If the scores are mostly below projections, that suggests ineffective instruction. And if a school's scores are close to projected values, the district considers that a normal gain, or a year's worth of progress.
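The district's actual computations run through Sanders' proprietary mixed-model software, but the basic idea can be sketched in a few lines of code. In this toy version (everything here, from the projection rule to the tolerance value, is an invented simplification, not the district's method), a student's next score is projected as the last score plus the student's average historical gain, and a school is colored by the average gap between actual and projected scores:

```python
# Illustrative sketch only: the district's real analysis uses William
# Sanders' proprietary software. This toy version projects each student's
# next score from score history, then compares actual with projected.

def project_score(history):
    """Project the next score as the last score plus the average gain."""
    if len(history) < 2:
        return history[-1]  # no gain history; assume no change
    gains = [b - a for a, b in zip(history, history[1:])]
    return history[-1] + sum(gains) / len(gains)

def classify_school(students, tolerance=2.0):
    """Average the actual-minus-projected gap across a school's students
    and map it to the district's color scheme (tolerance is invented)."""
    gaps = [actual - project_score(history) for history, actual in students]
    mean_gap = sum(gaps) / len(gaps)
    if mean_gap > tolerance:
        return "green"   # more than a year's growth
    if mean_gap < -tolerance:
        return "red"     # less than a year's growth
    return "yellow"      # about a year's growth, a normal gain

# Example: two students, each with a score history and this year's score
students = [([400, 410, 421], 435), ([390, 396, 402], 405)]
print(classify_school(students))  # "yellow": gaps roughly cancel out
```

The real model is far more elaborate, pooling data across years, subjects and incomplete records; this sketch shows only the projection-and-comparison logic described above.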

Seattle Public Schools is using this effectiveness data to see whether teachers are helping students on the WASL, and whether schools are serving slow, average and advanced students equally.

The Seattle Times requested the district's data for all schools from 2002-04 and shared its findings with district officials, who agreed with the trends The Times identified. The analysis revealed:

• High WASL scores don't automatically mean students learned more: For example, in reading, six elementary schools — all of them in affluent neighborhoods — with above-average WASL scores gave the average student less than a year's growth.

• Districtwide, the average student in grades four and seven is gaining more than a year's growth in math and reading; in grades six, nine and 10, normal growth. But there is wide variation among schools, with high-poverty schools tending to show the most robust gains.

• High schools vary greatly: In 2004 the average 10th-grade student at three schools fell behind in reading, and at five other schools grew more than a year. Passing the WASL is a graduation requirement starting with the Class of 2008.

• The average student falls behind the year after taking the WASL, which has been given in grades four, seven and 10. In half the schools, eighth-graders didn't show a year's gain in reading and math, and in more than half the schools, fifth-graders didn't show a year's gain in math.

Those trends raise many questions: Are advanced students in some schools being challenged enough? Why are students advancing their skills in some grades and falling behind in others? Why is one high school more successful than another in taking its slowest students' skills to the next level? And does this measuring tool simply allow schools to shift the focus off low test scores?

Evaluating students' progress, not just whether they pass or fail their state's high-stakes test, has become a hot issue in statehouses and Congress since the 2001 passage of the federal No Child Left Behind Act, which mandated annual testing in reading and math by the states.

Under the law, schools that receive Title I funds — intended to support poor students — must ensure that by 2014, all students taking state tests pass them.

Schools that repeatedly fail to make "adequate yearly progress" toward this goal face escalating sanctions that affect their use of Title I funds, starting with offering families other school choices and ending with forfeiture of those funds.

Even schools that don't get Title I funds must raise the achievement of all students to avoid being considered a "failing" school. (Schools avoid sanctions if they show at least a 10 percent annual reduction in the percentage of students not passing state high-stakes tests.)
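That "safe harbor" provision boils down to simple arithmetic. Here is a minimal sketch, assuming the rule is applied to a school's overall failure rate (the law's actual calculation also runs subgroup by subgroup):

```python
def meets_safe_harbor(pct_failing_last_year, pct_failing_this_year):
    """Safe harbor: the share of students not passing must drop by at
    least 10 percent, relative to the prior year."""
    return pct_failing_this_year <= 0.9 * pct_failing_last_year

# A school where 40% failed last year must get that down to 36% or less
print(meets_safe_harbor(40.0, 36.0))  # True: sanctions avoided
print(meets_safe_harbor(40.0, 38.0))  # False: only a 5% relative drop
```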

School districts received nearly $13 billion last year in Title I federal funds, which are intended to provide extra support to 25 million K-12 students nationwide.

A recent federal study found that, in 2003 and 2004, about one in 10 of the nation's 50,000 Title I schools began facing sanctions. Among the states, the number of schools affected ranged from none in Wyoming to almost half in Georgia in 2004. (In Washington, fewer than 5 percent of Title I schools faced sanctions.)

In the midst of these national trends, many teachers reject WASL pass rates as a reliable measure of their effectiveness; for starters, year-to-year pass rates compare different groups of students.

To obtain more reliable data, Seattle has been tracking the same students' year-to-year performance on the WASL and the Iowa Test of Basic Skills from grades three through eight.

Some teachers give short shrift to that data, too, because it compares performance on two different tests. That won't be the case starting this year, when students in grades three through eight take the WASL.

"I think if we had at least some consistency in the testing instrument it might be valuable," said Pat Robertson, vice president of the Seattle Education Association. "And it would be certainly better than just comparing one school's scores to another school's scores in any particular year. It's about improvement and not strictly how are you doing at this particular point in time."

Adds Wing Luke Principal Ellen Punyon: "It's not what you come in with. It's what you gain while you're here."

Seattle a pioneer

Seattle was one of the first urban districts to embrace the so-called "value-added" approach. The district pays statistician William Sanders, who first applied value-added data to Tennessee schools, about $47,000 annually to crunch the numbers in his proprietary software. Several states, including Ohio and California, are now measuring academic progress as part of their systems for holding schools accountable.

Federal officials have taken note. Over the summer, a group of experts appointed by U.S. Education Secretary Margaret Spellings met to discuss how the No Child Left Behind rules could recognize student progress on high-stakes tests. Theodore Hershberg, public-policy professor at the University of Pennsylvania, is in the group.

Hershberg said the law unfairly sanctions schools with low student test scores by not considering how far behind some students were when they began the school year. The law, he says, also should hold accountable schools that don't give a full year's growth to high-achieving students — schools that experts call "slide 'n glide" schools.

"Our challenge in this country is to raise the bar for everybody," Hershberg said. "While that has to be done, the biggest challenge is meeting the competition globally. We can't have slide 'n glide schools in this country. High-achieving kids are just as worthy as low-achieving kids are."

In that regard, the Seattle district is doing an excellent job in math in seventh grade, said Sanders, manager of value-added research at the North Carolina-based SAS Institute: Slow and average students made the greatest gains in 2004, but advanced students also grew by more than a year.

"You often see that in inner-city schools where there's so much pressure on those educators to meet federal [adequate yearly progress] requirements ... it's about getting the lowest proficient kids to a proficient level, and they let their high-achieving kids slide back," Sanders said. "I would salute Seattle."

Spellings' group is examining the pros and cons of different ways to measure student academic progress — Sanders' model is one of them — and is expected to issue a report later this year.

Daniel McCaffrey, a senior statistician at RAND Corp. who reviewed value-added research last year, said it shouldn't be used for high-stakes decisions, such as closing schools or awarding bonuses, because there are too many unanswered questions about the approach's reliability.

For example, he worries that effectiveness ratings for schools with high student turnover could be biased because of missing student test scores. He also questions whether it's appropriate to chalk up affluent students' gains to a particular school, rather than acknowledging those students would succeed at any school.

Sanders said his statistical approach minimizes potential bias by taking advantage of all available student data. And McCaffrey's concern about affluent students isn't borne out by real-world data, he said, which show high-achieving kids in similar schools experiencing different rates of progress.

These are the kinds of issues the state is also examining as it plans early next year to adopt a way to measure academic growth, said Joe Willhoft, director of assessment at the state Office of Superintendent of Public Instruction. The Kent School District is waiting to see whether OSPI accepts value-added data before collecting its own.

"We do think parents, teachers and schools are going to want to know how individual students are doing from one year to the next," Willhoft said.

But parents don't necessarily care about how much of their child's growth was a result of the school's efforts, he said.

Marcelina Barrera does. The 42-year-old Guatemalan native wants to know the school is doing its job. She's doing everything she can as a parent to nurture high expectations for her daughter, Ana, 16, and son, Julio.

While juggling housekeeping jobs, Barrera drops Julio off and picks him up daily at school, checks in with his teachers on Gatzert's family nights and goes to the school when she can't understand his report card.

"I talk to teacher before school," said Barrera, who isn't fluent in English. "I say, 'He OK? He need more?' "

"I need harder math," Julio interjects.

Gatzert will continue the before-school intensive reading program that Julio was in last year, Principal Norma Zavala said. With fifth-graders taking the WASL next spring, she has already identified which returning students need extra academic support.

Julio's name is on the list.

Sanjay Bhatt: 206-464-3103 or sbhatt@seattletimes.com

Copyright © 2005 The Seattle Times Company
