ChildsWork News, Feb. 7, 2012: Spotlight on Testing Special Needs Children and the Use of Scores

As I combed RSS feeds this morning, I was struck immediately by a prevailing theme that will certainly resonate with CW readers: testing. As we continually discuss on this blog, the use of high-stakes tests is not so cut and dried as legislators would have us believe. Test scores are more than simply numbers, and the lengths to which individuals, schools, and states will go to prove their worth through them have disturbing consequences.

First, on the heels of last week’s report of Claremont McKenna College’s SAT scandal, our first article reveals that finance magazine Kiplinger has chosen to remove the school from its rankings of “Best Values in Private Colleges.” As US News and other ranking agencies decide what to do with Claremont’s future, the question remains as to whether using test scores to judge anything – a college, a teacher, a school district – is the best way to go, which leads directly into article #2…

Here, California is still in the hot seat according to The Sacramento Bee, which details another budding problem with test scores in that state. K-12 schools in California have been found to be “misusing” the state’s California Modified Assessment (CMA) test, designed for special needs students, in an effort to inflate their district and school scores. The excerpt below is part of a larger article that evaluates the efficacy of capping test numbers for special needs students and the connection between test scores and self-esteem, not just funding and prestige.

Finally, special needs blogger Nirvi Shah addresses the same issue we see in California through the lens of another special needs student in South Carolina. When Anthony Herrera’s mother refused to subject her son (who has both Asperger’s and Type I diabetes) to state-mandated testing, the runaround she got made her decide to homeschool. The reasons for her troubles: No Child Left Behind and its mandated reports of Adequate Yearly Progress (AYP). Though the Herreras have found peace, the debate is far from over.

Kiplinger Removes Claremont McKenna from Rankings

By Daniel E. Slotnik, writing for The New York Times

The personal finance magazine Kiplinger removed Claremont McKenna College from its 2011-12 “Best Values in Private Colleges” list on Friday, making it the first publication to change its ratings since it was revealed that the college had submitted inflated SAT data for six years.

“Claremont McKenna College unfairly earned its place as 18th-ranked private liberal arts college in our college rankings by reporting inflated SAT scores,” a statement atop Kiplinger’s “Best Values” list said. “We have dropped the college from our 2011-12 rankings of best values in liberal arts colleges and moved schools 19-100 up one slot.”  Bennington College moved into the top 100 as a result.

The news was first reported by the Claremont Port Side, a campus news magazine.

Kevin McCormally, Kiplinger’s editorial director, said he did not hesitate to remove Claremont McKenna from the rankings when he learned of the college’s incorrect numbers.

“That was the only fair way to avoid misleading our readers,” Mr. McCormally said. “It was a really visceral reaction. If they’re admitting their data was wrong, what else can we do?”

He said that academic competitiveness was worth 25 percent of a ranking in Kiplinger’s private college rating criteria, just below the most important category, cost and financial aid. Test scores play an important role in the measurement, though Mr. McCormally could not specify the exact percentage.

Though a spokesman for Claremont McKenna said the college had no comment on Friday, Mr. McCormally said he had spoken to representatives from the school.

“They asked if they were banned from our rankings, and I said absolutely not,” Mr. McCormally said. “Going forward they will be asked for data, and if they provide accurate data, which I’m sure they will, they will be ranked with every other school.”

He added that he had not yet decided whether he would reinstate Claremont McKenna to Kiplinger’s rankings this year.

Mr. McCormally thought that whether a college would be removed from the rankings in the future would have to be decided on “a case by case basis.”

He said that the data Kiplinger uses to determine its rankings comes from Peterson’s Undergraduate Database, a third-party college data collector, so he did not believe that Kiplinger could do more to police incoming data. “We believe that most people are honest,” he said.

Bob Morse, whose blog Morse Code explains U.S. News & World Report’s rankings, wrote that “U.S. News will maintain its long-standing policy of not revising previously published rankings” in a blog post on Tuesday.

“Claremont McKenna College has promised U.S. News it will in the near future supply us with the average SAT scores for Critical Reading and Math that were used in the rankings so we can accurately answer the question what, if any, the impact of this misreporting had on Claremont McKenna College’s latest ranking,” Mr. Morse wrote. “After U.S. News receives the new data, we will estimate its actual impact on Claremont McKenna College’s rankings and publish that information in this blog.”

Mr. Morse said in an interview Friday that U.S. News & World Report was awaiting the results of an investigation being conducted by the law firm O’Melveny & Myers before deciding how to proceed. He said it would have been virtually impossible for ratings agencies to catch the fake numbers in this situation, even though they cross-check data against one another.

“Cross checking wouldn’t have caught this data,” Mr. Morse said.  “If somebody is sort of engineering falsification at this level we won’t catch it.”  He added that cases like that at Claremont McKenna make up “a very small percent of all the information that we publish.”

Mr. Morse said he was not perturbed by Kiplinger’s removal of Claremont McKenna.  “They made their own choice about how to handle it,” he said.

Mr. McCormally did not think pulling the ranking was a punitive measure, but rather the only viable way to ensure that Kiplinger’s readers trusted the magazine.

“Pretty much anyone we find out lies to us in an effort to get a higher ranking will not get the ranking, they will be pulled out until we get accurate data,” he said. “We can’t let people admittedly compromise the system, that would sabotage the whole thing.”

Growing Use of Simplified Test Inflates Some California Schools’ Scores (Excerpt)

By Phillip Reese and Melanie Gutierrez, writing for The Sacramento Bee

Threatened for the last decade with bad publicity and sanctions following poor test results, school districts have been known to grab at loopholes in state testing policy and rip them wide open.

Critics say that’s happening again with a new test for special education students called the California Modified Assessment.

Introduced in 2007, the CMA is a simpler version of the state’s regular STAR student achievement test. It’s tailored to special education students in grades three through 11 whom teachers and parents deem to have no chance at passing the regular test.

The federal government issued guidelines to the state saying the new test should be given to no more than 2 percent of students in those grades – about 100,000 children.

Four years later, almost 200,000 students are taking the test – a number that will likely grow as the CMA gains momentum.

The trend has consequences beyond special education.

CMA scores are tallied separately from scores on the regular test, the STAR California Standards Test. By removing failing students from the pool of kids taking the regular test, districts end up with a greater proportion of high-scoring students.

The CMA has inflated gains on the regular STAR test by about 25 percent statewide since 2007, according to a Bee analysis.

“It’s the old business of if you want your test scores to go up, don’t test the lower-scoring students,” said Doug McRae, a retired testing consultant and Monterey resident who helped the state design the STAR test.

McRae has gone public with his objections to the CMA, recently telling the State Board of Education – his former employer – that districts are abusing it. He said the CMA has some value but many students should instead take the regular STAR test.

“This is an easier test,” he said of the CMA. “If we don’t push these kids as far as they can be pushed, they might not be functional in society.”

Many special education advocates and school leaders disagree, saying the CMA gives students who fail the regular test a chance to do better.

The new test also lets schools better measure student progress, they say. Arbitrarily capping the number of students who can take the test demoralizes teachers and their students.

“At the end of the day, I think (special education students) feel a bit more successful with this,” said Kristin Wright, who has a special needs child and chairs the state board’s special education advisory committee. “It’s like a sigh of relief for the kids that it is not overwhelming.”

To qualify as a special education student in California, a student must be identified as having a physical disability such as deafness or autism, a learning disability such as dyslexia, or a serious emotional or behavioral problem. The number of special education students statewide has held steady for several years.

Generally, only special education students who performed at the “below basic” or “far below basic” proficiency levels on the previous year’s regular STAR test are eligible to take the CMA. It covers the same concepts as the regular test. But it’s different: Reading blurbs are shorter; fewer options are given on multiple-choice questions; pictures are used more often.

Testing? No, No Testing No Matter What

By Nirvi Shah, writing for Education Week’s On Special Education

Last year, South Carolina mother Gretchen Herrera’s son Anthony, who has Asperger syndrome and Type I diabetes, was kicked out of his online charter school.

Ms. Herrera had tried to have Anthony, 12, exempted from South Carolina’s annual tests in reading, math, and other subjects when he was in 6th grade last school year. But no reason would do—not even a medical note that explained Anthony’s blood sugar could spike because of his Asperger-related anxiety, which is just what happened on the first day of testing. Anthony, who did well on the exam, stayed home during other state tests.

The federal Office for Civil Rights decided late last year that Anthony wasn’t the victim of discrimination when he was kicked out of the school. The online charter school dismissed about two dozen other students, too, because they didn’t take state tests—not all of them had disabilities. The scores from those tests determine whether a school makes adequate yearly progress under the No Child Left Behind law.

Still, when testing time returns later this year, Ms. Herrera isn’t worried about what she will do or how Anthony will handle the stress of the tests.

Anthony is now homeschooled.

“I have been told I need to provide him for their testing, and I have told them I don’t have an AYP to care about, so no,” she said.

Anthony, who needs to socialize and spend time with other children, can do that through gatherings of other students who are homeschooled, like a sleepover this week at the South Carolina State Museum.

After nearly a year of battle with schools and the state department of education, Ms. Herrera has found peace for Anthony, now a 7th grader, even if she may face additional challenges down the line.

“I can’t be happier,” she said.

About AD Midd

AD is a college writing teacher whose work experience includes everything from coordinating YMCA after-school programs for at-risk youth to tutoring developmental writing students to general classroom instruction. In addition to writing professionally, AD currently teaches a range of adult community college students in both online and physical classroom settings. At home, she keeps in shape by running after her two young daughters. Follow her on Twitter @ADMidd and on Facebook (www.facebook.com/ADMidd1)