Don't feed me malarkey
Joseph Thibault takes out the AI trash, originally published in DIRTYWORD magazine issue #9
The Agreement
You and I, dear reader, have entered into an agreement, whether you know it or not. I’m trading you my time, effort, and brain power in the form of characters, words, sentences, and paragraphs. You’re trading back the time it takes to scan, read, digest, share, or maybe discuss the essays or information provided to you on the page.
Reading, like writing, is an art. It is something we practice, hone, and attempt to master our entire lives. The goals: enlightenment, entertainment, information, knowledge, transportation… The best text allows us to step into somebody else’s shoes, to learn something new, to find some of that real human experience which has been put down on paper or in zeros and ones.
In school, information literacy helps us learn whether a source is credible and whether its information can be trusted.
That encyclopedia entry: trustworthy. The meme your second cousin Mary shared on Facebook: maybe not so much. When a newspaper or federal agency publishes information, we assume credibility. At least, that’s how things used to work.
Bait & Switch
At heart, I’m a list maker and researcher. I seek out new information and news about topics I care about. Recently, while researching, I came across an article presenting some new edtech companies focused on authorship alongside familiar players and tools. When I looked into two of the new companies, I was surprised, then frustrated, to find they couldn’t be found in the first several pages of search results.
Spidey sense? Triggered.
The article? AI generated.
The companies? Hallucinations.
My time? Wasted.
Fast forward a few weeks:
A Chicago newspaper published a summer reading list. Of the 15 recommendations, only five were real.
A government commission’s report focused on “Making America Healthy Again” was published. Seven citations included were fabricated from whole cloth.
Malarkey
A few fake companies on a low-traffic site is a nuisance. Fake books peddled to thousands of readers, perhaps hilarious cocktail-party conversation. Fake citations in a government report on health and wellness, guiding policy for millions of people, well, that’s next-level.
AI is supposed to be helping us be more productive, creating more clarity, making information more accessible, and accelerating our understanding of the world. But when it’s unfiltered, unchecked, and shared like the examples above, it undermines trust in written text. Period.
AI isn’t to blame; humans are still driving the prompts, working with the outputs, and editing the articles, newspapers, and reports. Information literacy remains a critical skill in the age of AI: outputs must be fact-checked, and articles on any site may increasingly include hallucinations passed along to readers. The next time you use an LLM to plan or to put words on the page, remember that you’re the first reader. Don’t pass the buck and leave the fact-checking to others. The expert is still in the computer chair, not in the cloud. The next time you’re working with AI as a co-author (or asking someone else to), remember that agreement: your time and effort for mine.
AI can reduce the amount of time that it takes to write. But if you really want me to read something deeply, to consider it, to trust it, and hopefully to share or engage with it, for the love of Claude: don’t feed me a bunch of malarkey.1
1. Originally published in DIRTYWORD the E-Learning Magazine, Issue 9: https://issuu.com/dirtyword/docs/dirtyword_the_e-learning_magazine_issue_9/s/102279754