Wow. Our experiment is off to a great start—let's see if we can finish it off sooner than expected.
I swear, sometimes I just want to cry. Here's a story in the Wall Street Journal today:
Students Fall Flat in Vocabulary Test
U.S. students knew only about half of what they were expected to on a new vocabulary section of a national exam, in the latest evidence of severe shortcomings in the nation's reading education. Eighth-graders scored an average of 265 out of 500 in vocabulary on the 2011 National Assessment of Educational Progress, the results of which were made public Thursday. Fourth-graders averaged a score of 218 out of 500.
I'm not crying for the students. I'm crying for the reporter, who apparently believes that students "knew only about half of what they were expected to" because they scored in the vicinity of 250 out of 500. And since 250 is half of 500, that must mean students only knew half of what they should.
This is wrong on so many levels I don't even know where to start. First, these are scale scores, not percentages of correct answers. Second, they're normed scores. Third, this is the same way all the NAEP tests are scored, and they all produce scores in the same vicinity. The current eighth-grade math average is 284. The reading average is 265. The history average is 266. Etc. And since the scores are scaled so that ten points roughly equals a grade level, fourth-graders scored a little more than 40 points lower by definition.
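For what it's worth, here's a toy sketch of the only arithmetic these scale scores actually support. The rule of thumb is the one above (roughly ten points per grade level); the "percent known" calculation is the one the reporter did, shown only to illustrate why it's meaningless:

```python
# Toy illustration, not NAEP's actual methodology: NAEP scale scores are
# normed, so the raw number says nothing about what fraction of material
# students "know." The only easy interpretation is relative, using the
# rough rule of thumb of ten scale points per grade level.

POINTS_PER_GRADE = 10  # rule of thumb cited above

def grade_level_gap(score_a, score_b):
    """Approximate grade-level difference implied by two NAEP scale scores."""
    return (score_a - score_b) / POINTS_PER_GRADE

# 2011 vocabulary averages from the article: 8th grade 265, 4th grade 218
gap = grade_level_gap(265, 218)
print(f"8th-graders score about {gap:.1f} grade levels above 4th-graders")

# The reading the reporter used, which the scoring system doesn't support:
print(f"Naive (wrong) interpretation: students 'know' {265/500:.0%}")
```

The first number is roughly what you'd expect for students four grades apart; the second number is an artifact of the arbitrary 0–500 scale.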
In other words, these numbers in isolation don't tell us anything at all about whether the vocabulary skills of our children are weak or strong. It's like saying someone with an IQ of 100 must be a moron because IQ scores can run as high as 200. Unfortunately, the reporter was flatly ignorant of all this, so she simply hauled out standard hysterical template No. 4 and decided that the test results represented "severe shortcomings in the nation's reading education" even though they show no such thing.
This stuff just never ends. I wish our nation's reporters were required to take an NAEP test of their own every year. I think that's probably the only thing that might motivate them to figure out what these scores actually tell us.
Test results are here, if you want to know more.