by Carl Straumsheim for Inside Higher Ed

High-enrollment courses often lead professors to assign multiple-choice quizzes, since more complex forms of assessment take far longer to grade. This fall, the University of Michigan at Ann Arbor will test whether automated text analysis can help professors integrate more writing into their courses without imposing significant new demands on their time.

The automated text-analysis tool is the latest addition to M-Write, a program run by the Gayle Morris Sweetland Center for Writing at Michigan. The program targets students in large introductory science courses, using writing as a strategy to improve student learning. Michigan has funded M-Write with a $1.8 million grant, aiming to bring the program to 10,000 students by 2021.

M-Write combines automation with human oversight to lead students through writing assignments in which they draft, receive peer feedback, revise and resubmit. In addition to the new text-analysis tool, the program already uses automation for tasks such as peer review -- a student’s essay is sent to three classmates for anonymous feedback -- but oversees the process with writing fellows, former students who excelled in the class.
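
The article does not describe how M-Write actually routes essays, but the arrangement it mentions, each essay going to three classmates and never back to its own author, is simple enough to sketch. The Python below is purely illustrative, with invented names, not the program's implementation:

    import random

    def assign_peer_reviews(student_ids, reviews_per_essay=3):
        """Assign each student's essay to `reviews_per_essay` classmates,
        never to its own author. Returns {author_id: [reviewer_ids]}."""
        roster = list(student_ids)
        random.shuffle(roster)
        n = len(roster)
        assignments = {}
        for i, author in enumerate(roster):
            # Take the next few students around the shuffled circle as reviewers,
            # so every essay gets three readers and every student reviews three essays.
            reviewers = [roster[(i + offset) % n]
                         for offset in range(1, reviews_per_essay + 1)]
            assignments[author] = reviewers
        return assignments

    # Example: a five-student class, three anonymous reviewers per essay.
    print(assign_peer_reviews(["s1", "s2", "s3", "s4", "s5"]))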

In interviews with Inside Higher Ed, members of the M-Write team said the addition of an automated text-analysis tool is an effort to create a “feedback loop” within the program, giving students and faculty members the kind of personalized insight they both would gain from a face-to-face conference.

“What you’d like to do is sit down and read a paper with the student in front of you, identify a misconception and have a conversation about it with them,” said Ginger Shultz, assistant professor of chemistry, who helped create M-Write. In a class of several hundred students where developing good writers isn’t the main objective, however, that sort of arrangement is virtually never feasible, she said.

At this stage of development, the automated text-analysis tool only works with pre-programmed prompts and is not intended to replace instructor grading. Yet Anne R. Gere, the Gertrude Buck Collegiate Professor of Education and professor of English language and literature who serves as director of the writing center, acknowledged that inserting the word “automated” into a conversation about writing instruction is controversial, and that there are “many, many conservative literary people who will indeed be appalled.”

Gere, the incoming president of the Modern Language Association, compared automated text analysis to radioactivity -- large blasts of it can be fatal, but targeted doses can cure disease, she said.

“Perhaps because I’m a humanist, I always think technology needs to have a human element as well,” Gere said. “This is the place where the humanities and sciences can come together to create better learning for students across the curriculum.”

As covered by EdSurge, the automated text-analysis tool will be tested in a statistics course this fall. For three semesters, students in that class have responded to the same writing prompts, producing hundreds of essays on the same topics. The M-Write team has pored over those papers, identifying the features of essays that met the assignment criteria and of those that missed the mark. The findings will be used to design an algorithm the text-analysis tool will use to look for those features.

In one of the prompts that will work with the automated text-analysis tool, students are asked to review an advertisement for a pizza company and write one for a rival business, using statistical evidence to build their case. To analyze the essays, the tool will look for specific words and topics, such as whether students build an argument from statistics showing that their business sells larger pies, Gere said.
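
Neither the word lists nor the matching logic has been published, but the kind of keyword-and-topic detection Gere describes can be sketched roughly. In the illustrative Python below, the feature names and vocabulary are invented stand-ins for the rubric criteria the M-Write team derived from past essays:

    import re

    # Hypothetical, prompt-specific checklist: each feature is a set of words or
    # phrases whose presence suggests the essay addresses that part of the rubric.
    PIZZA_PROMPT_FEATURES = {
        "uses_statistical_evidence": {"mean", "average", "percent", "sample"},
        "compares_pie_size":         {"larger", "bigger", "diameter", "square inches"},
        "addresses_rival_ad":        {"competitor", "rival", "their ad"},
    }

    def detect_features(essay_text, features=PIZZA_PROMPT_FEATURES):
        """Return which rubric features an essay appears to hit, by keyword match."""
        text = essay_text.lower()
        return {
            name: any(re.search(r"\b" + re.escape(term) + r"\b", text) for term in terms)
            for name, terms in features.items()
        }

    essay = "On average our pies are two inches larger in diameter than the rival brand's."
    print(detect_features(essay))
    # {'uses_statistical_evidence': True, 'compares_pie_size': True, 'addresses_rival_ad': True}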

The tool is not intended to automate grading decisions, however -- only the process of giving students feedback about their writing. The M-Write team plans to use ECoach, a support platform developed at the university, to send students personalized messages. For example, if the automated text-analysis tool determines (and writing fellows agree) that a group of students hasn't grasped how to incorporate peer feedback into a revised paper, the system will send them pointers on how to do so.
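
The article does not describe ECoach's interface, so the sketch below uses a placeholder send_message function and invented data; it only illustrates the gatekeeping the team describes, in which an automated flag plus a writing fellow's confirmation triggers a tailored tip:

    # Hypothetical routing logic: the tool flags students, a writing fellow confirms,
    # and only then is a tailored message queued for delivery.
    REVISION_TIP = (
        "Your revision kept most of the original draft. Try responding directly to "
        "at least two peer comments and noting what you changed and why."
    )

    def route_feedback(tool_flags, fellow_confirms, send_message):
        """tool_flags: {student_id: bool} from automated text analysis.
        fellow_confirms: {student_id: bool} from a writing fellow's spot check."""
        for student_id, flagged in tool_flags.items():
            if flagged and fellow_confirms.get(student_id, False):
                send_message(student_id, REVISION_TIP)

    route_feedback(
        tool_flags={"s1": True, "s2": False, "s3": True},
        fellow_confirms={"s1": True, "s3": False},
        send_message=lambda sid, msg: print(f"to {sid}: {msg}"),
    )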

“This is not a project about improving student writing per se,” Gere said. “It’s a project about helping students learn better, and writing is a very powerful form of student engagement and learning. We’re trying to harness that power.”

The tool is intended to give faculty members valuable feedback as well, Gere said. If the tool finds that many students struggle with an important course concept, faculty members would learn about it early in the semester and perhaps change an upcoming lecture to ensure the topic receives some extra attention.

“The way that we think about the automated text-analysis tool is that it’s not from a standpoint of trying to score or grade the writing,” Shultz said. “We really want to use the automated text-analysis tool in order to provide information to the faculty members to help them understand how students are learning.”
