Grammarly has introduced an “expert review” feature that generates writing feedback in the voices of real authors and academics, living and dead, without their knowledge or consent. The tool, part of a broader expansion of AI-powered writing features, lists named figures including Stephen King, Neil deGrasse Tyson, Carl Sagan, and William Zinsser as available reviewers, though none of them participate in the process.
A disclaimer buried in the product clarifies that references to these experts are “for informational purposes only and do not indicate any affiliation with Grammarly or endorsement by those individuals or entities.” That legal hedge does little to address the core concern: a commercial product is trading on the names and reputations of real people to lend authority to outputs generated by a large language model.
What the Feature Actually Does
Jen Dakin, senior communications manager at Superhuman (the parent brand under which Grammarly rebranded in October), described the tool as one that “leverages our underlying LLM to surface expert content” relevant to whatever document a user is working on. The suggested experts, she said, depend on the subject matter being evaluated. The system, in her framing, points users toward “influential voices whose scholarship they can then explore more deeply.”
That description sidesteps the central question of whether these AI agents are trained on the actual writings of the people they simulate, and whether doing so is legally permissible. Copyright litigation over exactly this kind of content-harvesting is already active across the industry, with no settled legal framework in place.
The Dead Get No Say Either
The inclusion of deceased individuals adds a distinct dimension to the controversy. Vanessa Heggie, an associate professor of the history of science and medicine at the University of Birmingham, flagged the issue publicly on LinkedIn after discovering that the platform offered AI-generated feedback modeled on David Abulafia, an English historian who died in January 2025. Heggie accused Superhuman of “creating little LLMs” from “scraped work” of both living and dead figures, calling the practice “obscene.”
An independent review of the feature confirmed the availability of the Abulafia model, alongside models built on living cognitive scientists Steven Pinker and Gary Marcus. Neither Pinker nor Marcus responded to a request for comment.
A Platform Rebranded, an Appetite Expanded
The expert review tool is one piece of a significantly expanded product suite. Grammarly now offers an AI chatbot for in-draft assistance, a paraphrasing tool, a “humanizer” that adjusts text to match a selected voice, an AI grader that estimates how academic writing would score as coursework, and tools designed to detect and revise text that reads as AI-generated. The last feature carries its own irony: the platform uses AI throughout, then offers tools to make the result sound like it wasn’t.
CEO Shishir Mehrotra announced the Superhuman rebrand in October, writing that “when technology works everywhere, it starts to feel ordinary.” The expert review feature suggests the company is betting that attaching recognizable human names to AI outputs makes them feel less ordinary, and more credible, regardless of whether those humans ever agreed to be involved.
For Zinsser, who wrote the 1976 handbook On Writing Well and died in 2015, there is no recourse. For those still alive, the options are limited and largely untested.
This article is a curated summary based on third-party sources.