Computers can find a good Greek restaurant, recognize a song on the radio and beat our best players at Jeopardy. But can they grade our kids' writing?
A wide-ranging new analysis suggests that the answer is "Yes."
The study of automated computer essay-scoring software finds that a handful of programs are "capable of producing scores similar to human scores" on thousands of sample essays.
The findings come at a crucial time for U.S. students, especially high schoolers. Since 2005, they've had to write essays to score well on the SAT college admissions test — humans grade these, at least for now. Under new academic standards, students must produce billions of words over the next few years. Whether teachers will embrace computer scoring to help grade all that writing is up for debate. The National Council of Teachers of English opposes "machine scored" assessments, putting its support behind "direct assessment by human readers."
But computer-scoring advocates, many of whom are also educators, say critics misunderstand the basic undertaking. Computers are poor at judging high-level writing achievements, such as whether a writer has a distinctive style or can present sophisticated ideas. But most auto-graders aren't even programmed to capture that, no matter what critics say.
"They really don't understand that most kids are having a hard time communicating at all," says Mark Shermis, dean of education at the University of Akron. He agrees that timely, individualized grading by a human reader would be great, but given the scale of writing that schools envision, it's unlikely. "If every kid in the country had that kind of individualized attention, we might not be having this conversation."
Source: USA Today
Read more: http://www.usatoday.com/news/education/story/2012-04-23/essay-scoring-computer-software/54493662/1