By Yvette tenBerge
Board members, administrators, teachers and parents throughout San Diego County held their breath on Wednesday, August 15, waiting for the California Department of Education to release the 2001 Standardized Testing and Reporting (STAR) Program results. For those with children or students in the San Diego Unified School District, the wait was even more unbearable, as they waited to see whether the promises made by the school board majority, Superintendent Alan Bersin and Chancellor of Instruction Anthony Alvarado would hold water.
Anticipation turned to bewilderment, though, as parents watched Mr. Bersin present the district's interpretation of the results on television that afternoon. According to the district press release entitled "Steady Progress," an "initial analysis of the data showed that two-thirds of San Diego students exceeded the standards, met the standards or came close to meeting the English language arts standards," and that San Diego students did better than the "state overall, and better than students in other urban districts." The release goes on to state that 5,000 more students were tested in 2001, and that the district gains are made evident by comparing 1998 results with 2001 results.
One of those puzzled by the manner in which this data was presented was Frank Lucero, a math specialist who has been teaching for 24 years and who, most recently, has been responsible for training math specialists within the district and at San Diego State University. His curiosity was piqued when the San Diego Union-Tribune chose to compare 1999 scores with 2001 scores, likewise leaving out the 2000 data.
"I was intrigued that the Union-Tribune compared the 1999 scores with the 2001 scores since I was interested in comparing the 2000 scores with the 2001 scores. This would allow me to see the effect that the first year of Blueprint implementation had on our students," says Mr. Lucero, who expected to see "amazing changes" in reading scores and "some changes" in mathematics scores. These improvements seemed almost guaranteed in his opinion, since the elementary curriculum and instructional practices of the Blueprint are "so scripted that whatever changes occurred had to be the result of this Institute-mandated uniformity."
What Mr. Lucero found when he analyzed the California Department of Education information himself, though, was that the gains between the 2000 scores and the 2001 scores were not only unimpressive but that, in many cases, the scores were stagnant or had, in fact, dropped.
"What I found was contrary to what was published in the Union-Tribune. For example, when comparing the 1999 and 2001 scores for Central Elementary, there were two-digit gains across all grade levels in reading and gains in math. In 1999, the percentage of students scoring at or above the 50th percentile in reading was 27, and in 2001, the percentage was 41. This gives you an impressive, 14-point gain. In math, the 1999 scores were 55 and the 2001 scores were 66, for an 11-point gain," says Mr. Lucero. "However, when comparing 2000 and 2001 scores there is a different scenario. Reading went from 45 in 2000 to 41 in 2001, for a decline of four points, and math went from 78 in 2000 to 66 in 2001 for a 12-point decline."
Mr. Lucero also used a sample of 10 focus schools (designated low performing schools), and 10 non-focus schools to compare the 2000 data with the 2001 data in reading and math. Over 50 percent in this sample of elementary schools had a drop in the number of students scoring at or above the 50th percentile in reading, while 70 percent had a decline in students scoring at or above the 50th percentile in math. This sample is only one of the numerous charts and graphs that Mr. Lucero created, none of which showed the "steady progress" that the district claimed.
David Smollar is the Information Service Specialist for SDUSD. He states that the reason the district did not highlight the 2000 scores had little to do with deception and more to do with reaching a fairer assessment of student progress. "The reason the district did not do a year-to-year comparison is because a three-year comparison in terms of testing research is thought to have more meaning due to anomalies that happen year to year. We are not satisfied because in some cases our scores did plateau. There are places where we are flat in reading and statistically down in math," says Mr. Smollar. "We do not want to be flat, but we still feel that the trend is up."
Whatever the district's reasons for excluding the 1999 and 2000 scores from its press release, comparisons between the 2000 and 2001 scores have led many to conclude that the expensive Blueprint for Student Success, with a price tag of $98 million this year alone, is not cost effective.
Despite frequent claims by board majority members that the superintendent's goal of closing the gap between economically disadvantaged and non-economically disadvantaged students is being realized, an analysis of data from the California Department of Education proves otherwise.
Comparing the performance of economically disadvantaged students with the performance of non-economically disadvantaged students on the Stanford-9 test in 1999, 2000 and 2001 shows that the achievement gap has, indeed, widened. From second grade to eleventh grade, the gap widens by anywhere from two to 12 percentage points, depending on the grade level.
Jeff Lee is the parent of two children being educated within SDUSD and is the Co-Founder of Alliance for Quality Education, a non-profit organization dedicated to informing parents about the academic and financial workings of the district. He has watched the district change over the past 10 years and has pushed hard for the district to align its curriculum with state standards.
"All school districts say that what they are doing is the right thing, but the key is that parents and the public must look past these public relations tactics and look at the hard facts. The achievement gap has not closed as promised. The fact of the matter is that any progress that was reported was artificial, and a lot of this had to do with the opt-out rate," says Mr. Lee, explaining that over the past few years, the number of students actually taking the test has fluctuated because the number of those opting not to take it has increased. He recounts stories told to him by teachers and parents in which teachers claimed that their principals encouraged them to have parents of low-performing students keep their children at home during testing dates.
"Historically, the district's opt-out rate has been below five percent, which is normal. In 1998, the district's opt-out rate was 6.6 percent, which is significantly higher than what is traditional. The next year, in 1999, the opt-out rate was seven percent, and in 2000, it was 7.3 percent. In fact, the district skimmed off the bottom to try and raise the district's overall performance. Now in 2001, the opt-out rate has plummeted back to a normal rate of 3.3 percent," says Mr. Lee, explaining that the reason for this change has to do with the Academic Performance Index, a state measurement of a school's performance. In order to be eligible for certain monetary state incentives, 95 percent of a school's students must take these exams. "This is a fairly new phenomenon, and as the API started to kick in, we expected to see more schools wanting to have their students taking these exams."
This push to have more students take the STAR tests would confirm and explain the district's claim that, "Almost 5,000 more San Diego students were tested in 2001, including many who are not yet fluent in English." For those who wonder how this influx of students affects the comparison between the 2000 and 2001 numbers, Mr. Lucero explains that the percentage comparisons are valid because you are still comparing "percentages with percentages" and not the number of students with the number of students, as the district has done on its website.
But the students are not the only ones being tested with these scores. As Mr. Lee points out, an aptitude test is also very effective for evaluating the efficacy of the method of instruction itself. "Test scores are not the only measure of a student's performance, but they are an important indicator. What is unfortunate about this is that a proper use of test scores is to use them to evaluate whether or not you are doing the right thing. If test scores indicate that you are not, change course and change programs. This year's test scores clearly say we are doing the wrong thing, but the district says we are doing the right thing. The facts simply do not support the district's conclusion."
Critics of the Blueprint contend that much of the problem lies in the fact that the Blueprint for Student Success curriculum does not actually match up with the state standards for education. This disconnect between what goes on in the classroom and what appears on the achievement tests leaves students at a serious disadvantage.
According to Mr. Smollar, the district's days of not incorporating state standards into the curriculum are in the past. "First of all, we have no choice but to begin teaching this material. Second of all, Mr. Alvarado has said over and over and over again that he wants kids' progress to be based on state standards," says Mr. Smollar. "If they do well based on the standards tests, that will actually mean something."