Sunday, September 6, 2009

MCAS and Slippery Statistics


Scot Lehigh had an interesting editorial recently in the Boston Globe defending the MCAS (Massachusetts Comprehensive Assessment System, the standardized test required for graduation). He cites some interesting statistics comparing how well current students (class of 2011) are doing on the MCAS relative to the class of 2003. He, like almost everyone, compares the scaled scores of the MCAS, which have cutoffs for “Advanced”, “Proficient”, “Needs Improvement” (but passing), and “Failing”.


The problem here is that the MCAS is a moving target: to be proficient this year requires a raw (unscaled) score of only 47/72 on the English portion, but in 2003 you needed a 52/72. The passing cutoff has been lowered from 2003’s 38/72 to today’s 31/72 (that’s 43% to pass!). By moving these cutoffs around and changing the content of the questions on the test, the MCAS can be made to show public schools looking great or lousy at will.
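For the curious, here is a minimal sketch of the arithmetic behind those cutoffs. It uses only the four raw scores quoted above; everything else is just illustration.

    # Raw-score cutoffs (out of 72) quoted in this post
    cutoffs = {
        "2003 proficient": 52,
        "2011 proficient": 47,
        "2003 passing":    38,
        "2011 passing":    31,
    }
    for label, raw in cutoffs.items():
        # e.g. "2011 passing: 31/72 = 43%"
        print(f"{label}: {raw}/72 = {raw / 72:.0%}")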


Here are some specific statements in his piece that I take issue with:

"Teachers, too, have stepped up...overall they've done a good job helping students master the required material." Good performance on a test is frequently simply a reflection of good prep for that particular test, not necessarily an indication of mastery of the subject material.


“When the MCAS actually started to count, the passing rates went up dramatically.” I personally have seen the MCAS given to students for whom the test does not count: they make pretty designs on the answer sheet, they purposely try to get every answer wrong (surprisingly hard to do), they answer all ‘C’, or they simply leave the entire test blank. No surprise to me that scores on any test taken by teenagers go up when “it counts.”

“Since we’ve implemented high standards here [meaning MCAS], Massachusetts students… have been turning in top-notch performances. That’s just more evidence of the value of strong curriculum frameworks and standards.” What’s the ‘more evidence’? If there were improved Massachusetts performance on national assessments post-MCAS, that would be something, but improving MCAS scores is just inside baseball.

Regarding the recent poor results on the Science MCAS: “To some, that may mean delaying or lowering the new standard. That, however, would be precisely the wrong reaction.” Lowering the standard is exactly what the state did (see my second paragraph) when it became a real possibility that thousands of otherwise eligible high school students would be denied graduation because of a poor English MCAS score.
