If you’ve been reading this site for more than a few months, you’ve heard about the world-famous Cupertino Union School District, and how foreigners with suitcases full of cash will pay anything to buy a house with Cupertino Schools. You’ve also heard about Palo Alto schools and the Mission San Jose area of Fremont.
The state API test results are in, and Cupertino has two of the highest-scoring schools in the state. Yet Cupertino, along with Palo Alto and Fremont, has also been marked for Program Improvement, which means the districts failed to make their required federal targets.
By Sharon Noguchi, San Jose Mercury News
Posted: 08/31/2011 12:00:17 PM PDT, Updated: 09/01/2011 03:45:45 PM PDT
(Photo, left) Second-graders work on their reading skills in their class, taught by Lisa Gregoire at Cesar Chavez Elementary School, in San Jose, on Wednesday, Aug. 31, 2011. The K-5th grade school has vaulted 63 points this year on API — the state’s measurement of academic achievement — after climbing 64 points last year. (KAREN T. BORCHERS)
In contrasting scenes of celebration and chagrin on Wednesday, South Bay schools again topped the state in annual test scores, while more of them than ever before are being labeled failures by the federal government.
Two schools in the Cupertino Union School District, Faria and Murdock-Portal, tied for first in the state, with 998 on the Academic Performance Index, among elementary schools. Yet simultaneously the district fell into the feds’ failing category. It’s among plenty of surprising, and surprised, company. Santa Clara Unified’s Millikin placed second with 997, Fremont Unified’s Mission San Jose placed third with 996 and Palo Alto’s Hoover placed fourth with 995 among elementary schools on API. Yet all three school districts landed in "program improvement," the federal equivalent of a report card "F."
Meanwhile several districts with students that traditionally have struggled — Alum Rock, Gilroy and Sunnyvale elementary — posted strong gains. They’re still on the federal watch list, but teachers were elated to see their progress.
So with rising scores at the top and bottom schools, and in a valley known for stellar public education, how is it that 19 of Santa Clara County’s 31 school districts, plus the County Office of Education, appear poised to suffer federal sanctions and embarrassment?
How can this be? 19 out of 31 districts in the county synonymous with Silicon Valley not meeting federal goals?
This graph (right) from the Merc helps tell the story.
No Child Left Behind requires all subgroups, not just the school district as a whole, to hit ever-increasing proficiency targets, and for 2011, every school and every subgroup within it needs to be 67% proficient on all state tests. The reasoning behind it made sense: help all students, not just the ones with money and college-educated parents.
The graph shows California schools were improving their test scores. They just didn’t improve them as fast as the goals were going up.
By 2014, all students are supposed to be proficient in every school district everywhere. How is such a goal going to be met? When some students have parents who work two or three jobs and aren’t home to read to them or help with homework, why is this the schools’ fault? When some students don’t get fed regularly, or live in the middle of a gang turf war, or don’t actually have a regular place to live, are they somehow magically going to score “proficient” on a state test?
That’s a wonderful goal, but expecting schools to make all children proficient without putting programs in place to support the students who most need it is insane. In fact, programs shown to improve results for students from poor families, such as Head Start, have been cut.
This is the equivalent of demanding all students be proficient at track events, but not providing track facilities to schools that didn’t have them, or excusing students from track meets who have to work to help their family pay the monthly rent check.
Now, where do the Cupertino, Palo Alto, and Fremont school districts come in? All three had excellent API scores, but some of their subgroups didn’t hit that 67% proficient mark. Now, if you know anything about statistics, you know that the smaller your data sample, the more scatter you see. What do you think will happen if you start measuring small subgroups of a school district’s population and demand that every single one of those smaller samples, many of them chock-full of the kinds of students who don’t test well, hit the overall goal? These subgroups include the economically disadvantaged, English language learners, and students with disabilities. And yes, every one of those groups, despite their academic, financial, and societal challenges, is expected to score as well as the overall school district. To put it more bluntly: their target is the same as the target for the groups with all the advantages.
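That small-sample scatter is easy to quantify. Here’s a rough sketch, where the 70% true proficiency rate and the sample sizes are illustrative assumptions, not real district data: the standard error of an observed proficiency rate shrinks with the square root of the number of students tested, so a 30-student subgroup bounces around far more than a 1,000-student district.

```python
import math

def stderr_of_rate(p: float, n: int) -> float:
    """Standard error of an observed proficiency rate when the true
    rate is p and n students take the test (binomial sampling)."""
    return math.sqrt(p * (1 - p) / n)

TRUE_RATE = 0.70  # assumed true proficiency, just above the 67% target

# A whole district might test ~1,000 students; a small subgroup, ~30.
district_se = stderr_of_rate(TRUE_RATE, 1000)  # about 1.4 points
subgroup_se = stderr_of_rate(TRUE_RATE, 30)    # about 8.4 points

print(f"district: +/-{district_se:.1%}, subgroup: +/-{subgroup_se:.1%}")
```

So a subgroup whose true proficiency sits comfortably above 67% can still fall under the target in any given year through sampling noise alone, several times more easily than the district as a whole.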
Now what will happen when the goal moves up 11 percentage points a year? How many schools are capable of moving every group up at that rate? How realistic is it to demand that all English-language learners score 100% proficient in 2014, or all students with learning disabilities, or all students who qualify for reduced-price lunches?
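The arithmetic of that ramp is simple. A quick sketch of the target schedule implied above, from 67% in 2011 to 100% in 2014:

```python
# NCLB proficiency targets implied by the article: 67% in 2011,
# climbing 11 percentage points per year to reach 100% in 2014.
targets = {year: 67 + 11 * (year - 2011) for year in range(2011, 2015)}
print(targets)  # {2011: 67, 2012: 78, 2013: 89, 2014: 100}
```

Every subgroup, however disadvantaged, has to climb those same 11 points every single year just to stay even with the bar.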
As the graph shows, it’s going to be more and more difficult for schools to “pass” the NCLB standards over the next three years, since if any subgroup “fails,” so does the entire district. Unless the standards are changed within the next year or two, any school district large enough to have disadvantaged subgroups will end up labeled an NCLB Program Improvement district.
Getting back to Cupertino, two of the 25 schools in the district had subgroups missing the targets. That’s right: even if a school district has a 92% success rate in meeting these difficult targets, the whole district is a failure. One of the two schools, Nimitz, has the most transitional population in the district. The other, McAuliffe, is an alternative school whose philosophy embraces an integrated curriculum built around small group projects, and many of its parents refuse to let their children take the state tests. Amazingly, Cupertino didn’t get dinged for insufficient participation overall (95% of the district, and of every subgroup, must take the tests). The district met 33 of 37 criteria.
In Palo Alto Unified, the only school failing to make the “grade” was Escondido, and again, the problem was not API scores. They met 25 of 34 AYP criteria. In particular, not enough students with disabilities participated in testing.
And in Fremont, also finding itself in Program Improvement for the first year, a whopping 19 out of 33 schools failed to meet the requirements. Fremont Unified met 37 of 46 criteria.
Here’s the list of school districts in Santa Clara County that aren’t in Program Improvement:
- Lakeside Joint (1 elementary)
- Loma Prieta Joint (1 elementary, 1 middle)
- Los Altos Elementary (7 elementary, 2 middle)
- Los Gatos Union (4 elementary, 1 middle)
- Los Gatos-Saratoga Joint Union (2 high schools)
- Luther Burbank (1 elementary)
- Orchard Elementary (1 elementary)
- Saratoga Union (3 elementary, 1 middle)
- Union Elementary (6 elementary, 2 middle). Since both of this district’s middle schools are marked as not meeting all requirements, I suspect it’s a mistake that this district is listed as NOT in PI.
What do these school districts have in common? They’re SMALL. The more schools in a district, the higher the odds one of them is going to miss a requirement somewhere, pulling down the entire district. That’s a guaranteed recipe for failure, and seems to be exactly what some people wanted.
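That size effect is just compound probability. A minimal sketch, where the 5% per-target miss rate is an illustrative assumption: if a district is graded on n independent targets and each has even a small chance of being missed, the odds that at least one fails, taking the whole district down with it, grow quickly with n.

```python
def prob_district_flagged(p_miss: float, n_targets: int) -> float:
    """Chance that at least one of n independent targets is missed,
    when each target is missed with probability p_miss."""
    return 1 - (1 - p_miss) ** n_targets

P = 0.05  # assumed 5% chance of missing any single target

small = prob_district_flagged(P, 5)    # tiny district: ~23%
large = prob_district_flagged(P, 46)   # Fremont-sized, 46 criteria: ~91%
print(f"small district: {small:.0%}, large district: {large:.0%}")
```

Under those assumptions, a one-school district skates by most years while a Fremont-sized district is nearly certain to get flagged, exactly the pattern in the list above.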
This year, over 4000 California schools have “failed.” What will happen when almost every school district is considered “failing” by these insane standards, statistically guaranteed to make almost everyone a loser? And who were the idiots who agreed to them?