Hello, it’s been a while, so I finally have a new article! Today, I’m exploring a language problem that is tricky to navigate: computer language assistants giving inaccurate advice. That is, when you have autocorrect or grammar assistant programs that flag things either unnecessarily or incorrectly.
I’ve compiled a short list of what I find to be the most common examples, to help you spot these problems and use language software suggestions more effectively.
What’s wrong with language assistant software?
Spellcheck and grammar assistants can be very handy, and I often use them for additional, final proof-reading checks – but I’m very careful about which corrections I accept. For example, this article is about 1,500 words long; I gave it one edit myself and ran it through Grammarly to receive 47 suggestions. Of these, 20 were flagged as actual “corrections”, while the rest were just language tips, which are of little interest to me. Of the 20 corrections, I found 5 useful, but only 1 was an objective error (a missing word). The remaining “corrections” were suggestions for commas or for verb or preposition changes. Considering I accepted only 4 of those 19, you can see the suggestions weren’t really “correct” for me.
This is about typical of what I find with my documents, though as a professional writer and editor I’d imagine others may see a higher rate of useful corrections. If you can tell which ones to accept.
Every extra error I can catch is invaluable, but to get those corrections it’s essential that I can recognise that around 75% of the suggestions aren’t appropriate. Mostly, the suggestions aren’t wrong, just unnecessary, and may slightly change my meaning or personal style. In the worst cases, they could actually introduce errors.
The problem is that language is an ever-changing and flexible thing, as we often discuss on here, whilst computer programs follow strict rules. It’s dangerous to rely on these too much, as they operate on the basis that all writing should be the same – that it should follow a particular, “correct” form. At its most extreme, this can strip writing of its intended effect or meaning.
Mostly, though, this is more of a sentence-level issue, where you have to consider whether or not the computer program’s tweaks are relevant – and consistent. To help keep you vigilant about the suggestions, here are 7 issues that I notice coming up most often.
1. Commas
By far the most common suggestion I get from computer language assistants is to add or remove commas. This is usually based on specific conjunction rules, pointing to situations where you should always or never have a comma alongside a conjunction. I disagree with using commas this way, as their particular use depends on how we can best present the information of any given sentence (as is true of all these points, actually).
For reference, I have a whole series of articles on commas, which you can find curated here. There is a running theme throughout that a comma’s main purpose is to aid with clarity. Software can’t necessarily decide that for you; sometimes, a reliance on specific structures will mean “correct” comma placement helps, but in most cases I would assess the accuracy of your commas based on your own intended meanings.
2. Prepositions
As with commas, computer software tends to try and fit preposition use into rigid rules, in this case based on expected word pairings. Anyone who has studied English will know, however, that the patterns for preposition use (and, by association, particles in phrasal verbs) are terribly complex and often very specific.
Mostly, this is seen with suggestions that you change a preposition to a more appropriate one, or remove it. I frequently get told to correct “look on” to “look at”. This is a minor shift, but it’s telling that the software assumes “at” is always the most appropriate preposition for “look”. The problem with a correction like this is that while “look at” may almost always be technically “correct”, it’s not necessarily conveying the exact connection I want.
I also find these programs like to simplify longer or combined prepositions, such as changing “onto” to “on” or “into” to “in”, again stripping some nuance. Words such as “onto” and “into” exist to provide specific detail, which the computer, concerned only with correctness, does not necessarily appreciate.
3. Isolated Verb Conjugation
Certain structures seem to confuse computer software’s understanding of intended verb uses, and you may see suggestions for changing tenses or subject conjugation that don’t make sense in the wider context. One that I often encounter is where the program expects a particular type of structure to express a general rule, and so suggests a present tense verb, no matter what tense the rest of the document is in. This is more likely to occur where you have shorter, isolated sentences.
On the other hand, correction software can also get confused about subject conjugation when dealing with lists or longer sentences that separate the subject from the verb. Be particularly careful where you have noun phrases for subjects, and especially where you have multiple noun phrases, as this can make it hard to keep track of plurals. This is an area that can be tricky when writing and editing your own work, and computer programs aren’t necessarily any better at it than us!
4. Emphatic and Stylised Language
In general, autocorrect software is looking to make your language simpler and more efficient, assuming that simpler is clearer. Whilst there is some merit in this as a starting point for editing, if followed too strictly it can restructure information you want presented in a specific way, and it may affect tone or emphasis.
Modern language use has moved towards an attitude that plain, simple language is best, but there is a time and place for departing from this. Computers cannot decide what those times and places are, so will simply tell you, for example, to remove extra adverbs or shorten long sentences.
You’ll see this in many different ways, but one of the most common is when language assistants suggest removing emphatic adverbs such as “still”, “just”, “really” and “very”. Sometimes these will be filler words that we can do without, and sometimes they can completely change the emphasis of a sentence. Be careful to decide for yourself whether they are needed.
5. Dialogue Tags & Punctuation Styles
Computer language assistants sometimes make completely incorrect suggestions for specific bits of punctuation, such as dialogue tags (i.e. when using quote marks). A common example I find is when dealing with interruptions in dialogue, or listed items using quotes (in both cases, not using the expected full sentences enclosed by quotation marks). This is a matter of style which should be consistent within one document but may not easily be called correct or incorrect in general. For example:
- “I might interrupt some dialogue” – using dashes – “like this.”
- “Or you could also show an interruption,” with commas, “like this.”
Computer programs won’t necessarily see what you’re going for here, and can make some very strange suggestions, either to end or combine sentences or use different styles (such as replacing commas with dashes or vice versa).
It’s important in these cases to be aware of your style choices and stick to them.
6. Sentence Fragments & Adaptive Rules
There are times in writing when we set aside the usual rules and use grammatically incomplete sentences, often precisely because they break the rules and draw attention to themselves. This is always going to depend on specific circumstances and personal nuance, and as such is always going to prove difficult for computers to master. I feel the software is actually getting better at appreciating this, and doesn’t always flag fragments, but for the most part, if you venture from the rules, you will get warnings.
With sentence fragments in particular, this could lead to a variety of suggested solutions, including punctuation, verb, noun or preposition tweaks – all of which may detract from or completely change the intended meaning. Beware!
7. Vocabulary
Lastly, but most obviously, computer programs sometimes try to offer “better” words to improve your writing. This is not to be confused with correcting words where you have a mistaken meaning; it is again a symptom of the rule-based system, where certain types of language are considered clearer or more appropriate than others. Where software offers you words with similar meaning, I’d think very carefully about exactly why it wants you to make a change. It may just be a case of the program trying to make language fit expected patterns. As with all the cases above, it can also mistake your intended meaning and try to change the sentence to fit its own ideas. Grammarly once told me to change “her behind” to “her hind legs”, which would’ve taken things in a very unusual direction.
The only good reason I can think of for automated synonym suggestions is where you might’ve used a particularly obscure or complex word that might be clearer if simplified, but this is a personal choice that depends on the context.
That’s my list for now, which I hope helps highlight areas for caution with computer language assistants. The software is useful and is definitely improving, but I do encourage you to keep questioning it, and I’m sure you’ve all encountered plenty of other problem areas. Feel free to share your own experiences in the comments!



