by Rebecca Fairley Raney
 
 
 

School statistics, and a personal breakthrough

What it won: First place, investigative reporting, Press Club of Southern California, 1992.

This was the story that changed the course of my career.

At the time I started it, I had been covering schools in San Bernardino County for a year. On the beat, I got worn down by endless stories in which school officials talked about all their great programs. Though they promoted the effectiveness of those programs, they rarely had evidence to back up their claims. To me, it was all fluff, and I couldn't see the point in perpetuating it.

But one morning in early summer of 1991, I was talking with Chuck Terrell, the county Superintendent of Schools. Chuck was quite a character, and he was on a tear that day. He was mad about the Bush Administration's new national test. In fact, he was so annoyed with the whole thing that he was encouraging the county's school districts to boycott the test.

Then he said one thing that really caught my attention: He told me that neither the state nor the federal education departments would release the names of the schools that had taken the test. He was suspicious as to whether the test truly reflected California's student population.

I thought he was being melodramatic, so I checked for myself. And sure enough, when I asked for the names of the schools, I was told, essentially, that the information was classified.

Suddenly, school coverage got very interesting.

I filed requests for the names of the schools under the Freedom of Information Act, and after several months, a state official sent me the list by mistake.

Then I tried something that would fundamentally change my view on news coverage: I recruited a statistician to help me figure out whether the federal statisticians had taken a good sample of California schools. As it turned out, they had taken a perfect sample. A mirror image.
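
To see what such a representativeness check involves, here is a minimal sketch in Python. Every category label and count below is hypothetical, not the actual 1991 data; it only illustrates the kind of chi-square goodness-of-fit comparison a statistician might run between a sample's demographic mix and the state's.

```python
# Hypothetical illustration of a sample-representativeness check.
# The categories, proportions, and counts are invented; they are
# NOT the actual 1991 figures.

# Assumed statewide demographic proportions.
population_props = {"Latino": 0.35, "White": 0.45, "Black": 0.09, "Other": 0.11}

# Demographic counts across a hypothetical sample of schools.
sample_counts = {"Latino": 3440, "White": 4400, "Black": 885, "Other": 1075}

n = sum(sample_counts.values())

# Chi-square goodness-of-fit statistic: sum of (observed - expected)^2 / expected.
chi_sq = sum(
    (sample_counts[g] - n * p) ** 2 / (n * p)
    for g, p in population_props.items()
)

# Critical value for df = 4 - 1 = 3 at the 5% significance level.
CRITICAL_95_DF3 = 7.815

print(f"chi-square = {chi_sq:.3f}")
if chi_sq < CRITICAL_95_DF3:
    print("No evidence the sample's mix differs from the state's.")
else:
    print("The sample's mix differs significantly from the state's.")
```

With these invented numbers the statistic stays well under the critical value, which is what a "mirror image" sample looks like in this test.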

But in analyzing that sample, we discovered something really troubling: California's school population is so diverse, and its school performance goes to such extremes, that you cannot measure school performance with a bell curve. Statistically, the state's population is not normal. So all the testing and the school-by-school comparisons that formed the basis of the state's education policy were not statistically valid.

This finding was important in an environment in which teachers could be punished and principals could be fired over test scores. In fact, we found that in state math tests, the schools produced two distinct averages: One for the students from wealthy families, and another for the poor.
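
The problem with averaging two distinct populations can be seen in a toy simulation. The scores below are invented, not the actual CAP data; the point is only that when schools cluster into a high-scoring group and a low-scoring group, the overall mean lands in the gap between them and describes almost no real school.

```python
import statistics

# Toy illustration of a bimodal distribution of school test scores.
# All numbers are invented; they are NOT the actual CAP results.
affluent_schools = [310, 318, 322, 305, 315, 320]   # consistently high scores
poor_schools     = [210, 205, 215, 208, 212, 218]   # consistently low scores

all_schools = affluent_schools + poor_schools
mean_score = statistics.mean(all_schools)

print(f"overall mean = {mean_score:.1f}")

# The mean falls in the gap between the two clusters:
# count how many schools actually score anywhere near it.
nearby = [s for s in all_schools if abs(s - mean_score) < 30]
print(f"schools within 30 points of the mean: {len(nearby)}")
```

In this sketch the statewide-style mean sits roughly 45 points from the nearest school, so ranking any individual school against that average says little about the school itself.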

While I was working on the project, Jack Schuster, professor of education and public policy at the Claremont Graduate University, told me that my work was more advanced than the research that was being conducted by the school's doctoral candidates.

Looking at the story now, I wish I had done a better job with the writing. It was my first experience with moving from the left-brain math function to the right-brain writing function, and I've learned over time that it's very tricky to make that transfer.

But I never looked at numbers and public policy the same way again. The story started me on a path of using empirical methods to report news, and it fundamentally altered the direction of my career.

State's schools statistics flawed

California is too diverse for comparisons

The San Bernardino County Sun

Sunday, Oct. 6, 1991

(EXCERPT)

Ninety-three percent of the children at San Ysidro Middle School are Latino, and only 17 percent speak English. Most of their parents did not finish high school, and their families are poor. Test scores of the school near the Mexico border are low.

Up the coast at Niguel Hills Junior High School, 96 percent of the children are White, and most students speak English. Nearly 80 percent have parents who went to college, and many families have six-figure incomes. The Orange County school's test scores are high.

Farther north, nearly 40 percent of the children at Big Pine Elementary School are American Indians. Most students speak English. Few of their parents went to college, and many families are poor. Test scores of the school near the Sierra Nevada Mountains are mediocre.

These schools, diverse as they are, collectively mirror the makeup of schools statewide. As a result, they were among 98 in California selected by federal researchers to participate in a national math test of eighth-graders.

But a statistical analysis of the 98 schools by The Sun shows California is too ethnically and socioeconomically diverse to produce meaningful state averages.

Put another way: Even a representative sample of California schools is a bad sample.

"There really is no average school around," said Stavros Busenberg, professor of mathematics and statistics at Harvey Mudd College in Claremont, after reviewing the study's findings. "If one talks about average behavior, one assumes there are a lot of average entities around. In this case, the average numbers cannot mean much. That means that statistics based on normal distribution do not mean much."

Among the other findings in the study: Schools that primarily served impoverished areas made consistently low test scores, and schools that primarily served affluent areas made consistently high test scores.

Bill Honig, state superintendent of public instruction, agreed that "averages are not enough. We've been saying that for years."

But he said the state has tried to promote other yardsticks in gauging the performance of schools.

The study's findings could have broad implications:

- Years of state education legislation -- and billions of dollars in public fund allocations -- may have been based on faulty statistics.

- If any state education average is meaningless, then individual schools have been unfairly compared with the average.

- Evidence that test scores are more likely to measure a student's background than his potential could bring sweeping changes to how society labels schools, said Charles T. Kerchner, professor at the Claremont Graduate School of Education.

"These findings call into question our definition of what's a good school. That, to me, is the biggest education story of the decade."

- California is so diverse that it is statistically unsound for the federal government to break out the state's averages in reporting national test results, said federal researcher Michael Scriven, who worked on a team that evaluated the national test's procedures.

The heart of The Sun's study was a computer-generated statistical analysis of three years' worth of California Assessment Program test scores in the 98 schools covering math and reading, as well as the percentage of minority students within each school.

The study methods and results were reviewed by three statisticians -- Busenberg; David Drew, statistician for the Claremont Graduate School of Education; and Robert Newman, professor of psychology with specialization in policy behavior analysis at California State University, Long Beach.

The findings triggered outrage among some educators.

"These misleading scores are horrible," said Terry Moe, a Stanford University political science professor who co-authored a controversial education reform book called "Politics, Markets and America's Schools."

(CUT)

The scores have a powerful impact at every level of the education system. Principals can get fired over bad CAP scores. Realtors can sell houses with good CAP scores.

Considering The Sun's findings, testing experts now say CAP scores were misrepresented by the state. Because of the high variability in scores statewide, schools should have been placed within a range of scores instead of being branded with one number.

(CUT)

The study's findings on the correlation between socioeconomic background and test scores also back up another contention that Cox, a testing specialist, has held for years: Test scores have no place in politics.

"Test scores, per se, are not a sole indicator of program quality. To conclude something about the quality of a program based on a reading score -- if 50 percent of the students don't speak English -- it only means they don't speak English."

(CUT)

Honig uses the improvements in the state's average scores to lobby for education funding. With the improvements, he tells legislators that schools are getting better -- even though many mathematicians argue that the annual increases in the state's average test scores are not statistically significant.

Stanford professor Moe has long been a critic of that style of policymaking.

He offered schools in Palo Alto as an example. Many students there are children of Stanford professors and students. Others tend to come from affluent families.

The schools always get good test scores, but Moe said that doesn't necessarily mean the schools are doing something amazing.

"So we have good schools? Well, we have good kids. It could be that the schools are actually holding these kids back. But they get good scores. Those numbers say nothing about whether it's a good school."

(CUT)

State Sen. Bill Leonard, R-Big Bear Lake, said it is critical that California find new ways to measure schools' accountability.

Leonard said he's always been suspicious of the numbers he uses to sell his bills.

"I'm a believer in facts. But I'm not comfortable with the quality of the numbers.

"If the statistic lends us credibility, we don't check. I've checked on a few and decided not to look any further. The ones on my side were as empty as those on the other."

(CUT)

Timeline: Getting the information

Question: Which state schools took the National Assessment of Education Progress test?

Getting the answer:

- June 27 -- California State Department of Education officials say they do not have the list of schools tested. They refer the inquiry to the U.S. Department of Education in Washington.

- July 1 -- U.S. Department of Education spokeswoman Melinda Kitchell orally denies the request for the list.

- July 2 -- Kitchell sends a six-page fax transmission to cite a law which she says supports her denial of the request.

- July 3 -- The Sun files a Freedom of Information request with the U.S. Department of Education to get the names of the schools.

- July 18 -- The federal Education Department misses the first deadline to respond to the FOI request. The Sun calls the department's FOI officer, Alexia Roberts, who responds: "As you know, the courts have never penalized an agency for missing the 10-day limit." She gives no estimate on when the request will be answered.

- July 23 -- After calling Roberts four times and getting no answers, The Sun asks to talk to Roberts' supervisor.

- July 23 -- Alexia Roberts' boss, Patrick Sherrill, says The Sun's FOI request has been shuffled from division to division, but that the staff is working hard to answer the request. He also says the federal Education Department, in general, has little time to fulfill public requests for information.

- Aug. 1 -- The Sun calls Sherrill again. He gives no indication when the request will be answered. At this point, the federal Education Department has passed the extended deadline for the FOI response. The department is also in violation of the Federal Freedom of Information Act by not issuing written acknowledgment of the FOI request.

- Aug. 14 -- Sherrill is on vacation. Ira Mills, acting chief for the Federal Information Review Branch, says department attorneys have drafted a rejection of the FOI request. Mills says the department's head statistician, Gary Phillips, is blocking the release of the information. The Sun sends a letter to inform the department it is open to court action for not responding to the FOI request.

- Aug. 14 -- The Sun tries again to get the list from the state. State officials say they don't know whether they have the names. But The Sun reaches California NAEP coordinator Tej Pandey on vacation, and he directs his staff to release the list.

- Aug. 15 -- The state releases the names of the schools.

- Aug. 19 -- Pandey asks for the list back. He says the state signed an agreement with the federal government not to release the names of schools that took the test. Pandey says he has come under fire from Phillips. The Sun rejects Pandey's request.

(END)

OTHER WINNING STORIES:

1. Paradise in Peril: The California real estate crash

2. Too Violent, Too Young: Life on the street

3. School statistics, and a personal breakthrough: The secrets behind test scores

4. Winning, as an alternative to starving: Political coverage, and a miracle baby

5. Yes, traffic is funny: Life outside the carpool lane

6. Turbulent Skies: The politics of medical helicopter service

7. Nailing down the building boom
