
SAT Writing Section Under Fire

March 26, 2007

A recent story, “Fooling the College Board,” took the College Board to task, jumping behind MIT’s Les Perelman’s campaign against the new writing section of the SAT. In doing so, its authors commit the same instructional fallacy as Perelman.

A little background: Perelman, director of the WAC program at MIT, reportedly prepped a test taker (of consenting age, no doubt, to report the findings) to present what Perelman has identified as the specific rubric touchstones that would earn a high score, but in a factually facetious and logically inconsistent manner. In short, to present an essay that jumps through the hoops but makes little sense.

It seems he succeeded in proving his point but, ultimately, lost his argument.

Let’s start at the CCCCs. Perelman walks through his “exposure” of the faulty SAT method by showing how even a poorly written paper (see here) can pass the writing test. Perelman argues that with a specific structure (usually five paragraphs), specific word choice (“plethora” and the like), and a series of specific argumentative tactics (appeals to the arts and/or history, a personal encounter with the topic), a passable essay will emerge regardless of spelling, grammar, logic, or adherence to facts.

Good for Perelman. He has proven, very effectively, that if one writes to a specific writing situation, one can succeed. Which, incidentally, is one measure of writing success. That is, a good, effective writer will write to succeed: in college, in business, in love, etc. If I were writing in a science course at MIT, then certainly my writing techniques would adhere to the rubric (spoken or not) of that environment. The same goes for “passing” the SAT.

What should really be in question is not whether one can rig the writing portion of the SAT (one can, just as one can “prep” for the verbal section by memorizing a list of commonly tested words), but whether colleges should look to another form of measure in picking their incoming class. That is, when much power is given to the test, much abuse and gaming will follow.

How about some new methods of sifting the high school grads? Phone interviews? A mini academic boot camp where skills are assessed? Or how about some other measure by which a student’s potential can be gauged?

Good work, Les. Just don’t become the peril-man.


One Comment
  1. SourDad permalink
    March 27, 2007 1:59 am

    I can’t imagine Admissions Deans and their offices taking the time to read essays from hundreds or thousands of applicants. At UofM the essay is not heavily weighted. It’s weighted so lightly I wonder if they read them at all.

    And why should they? Most of the way through college, especially at the big schools, all students will do to demonstrate learning is fill in little dots.

    It’s all about time and effort.

    I can’t help but think that student writing wouldn’t be so bad if more of my colleagues asked students to write. God knows I don’t want to read student writing, but I do.

    Do I have to? No; I teach biology. I could easily get away with Scantron-graded tests, like many of my colleagues.

