As a teacher of academic and professional writing, I often worked with students who were struggling with grammar and word choice. Around 2014–15, students started asking whether it was worth using automated writing evaluation (AWE) tools such as Grammarly. To test the tool, I doctored some of my own writing and ran it through a Grammarly check. I wasn’t expecting perfection, but I was shocked at how many major errors it missed and how often it flagged things that were perfectly fine, like calling out a grammatically correct bulleted list as “sentence fragments.”
Still, students kept asking whether our institution would consider paying for a Grammarly site licence, as some other universities across Canada and the U.S. had done. So in 2015, our university bought a six-month licence that allowed a group of student peer tutors to access all of Grammarly’s features, including choices of genre and register. They tried it, and . . .
You can see they weren’t impressed. Grammarly not only tended to over-correct, but it couldn’t navigate the subtler rhetorical or genre-based features of documents well enough to give accurate feedback. And the tool was (and still is) quite pricey. While Grammarly has a free plug-in—I use it myself, and it’s great for typos—it costs a lot per year to get the bells and whistles.
Still, I found myself coming back to the Grammarly question. I read article after article showing how helpful AWE tools were for international students especially. Had AWE tools evolved since our study? Would we see greater accuracy, more rhetorical sophistication, and less Miss Grundyism? For a poster presentation in 2019, I compiled recent findings from other universities, teachers, and researchers about the efficacy of AWE.
There were certainly mixed reviews—in no small part because Grammarly’s educational site licences cost a hefty $1,200 per 20 users, and university budgets are what they are. But one Canadian university reported that international students appreciated the service. And while a U.S.-based writing tutor said, “I don’t think computers are even close” to processing a document’s rhetorical subtleties, he expressed high hopes for using Grammarly as a support tool: “I think the world will be all the richer in knowledge and the exchange of ideas as a result.”
In late 2018, I ran another doctored sample of my own writing through Grammarly. It correctly identified a subject-verb agreement error, a misspelling, a comma misuse, and a wrong word choice. It overlooked a missing period and apostrophe, yet called me out on correctly using commas around an appositional phrase (see picture), and it didn’t like my present-tense verbs.
Though AWE tools aren’t even close to 100% accurate, I’m more convinced than I used to be that they have a role to play in supporting writing by giving infrequent or developing writers a leg up. This is especially valuable when writers—from students to professionals—are working in isolation and can’t easily access human support on demand. As such tools become more reliable, they can help writers sort through basic issues before they bring in an editor or consultant.
I think editors, consultants, tutors, and teachers should be open to AWE’s possibilities. But I don’t see robots replacing us anytime soon.
Have you tried AWE tools? What do you think of their reliability?