Learned Volume 5, Issue 52
This week: stochastic parrots and the end of the written word. Or, you know, just another Monday in our modern neutopia. Let's discuss.
Learned is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.
ChatGPT and other large language models have been the talk of the virtual town for a while now. And while some are hailing these tools as the future of work, others are reminding us that they are just that - tools. It's both concerning and complicated, if for no other reason than that the technology appears to be evolving faster than we have the language to discuss it. But we'll come back to that in a moment.
First, when I started Volume 5 of Learned, back in April 2022, I decided I'd use the phrase "Say It Again But Slowly" as both a subtitle and a theme. The idea was that I would dedicate each week's issue to a word that was both new to me and that I had discovered through the news. And so, much as you might ask a friend to repeat an unfamiliar word, my task would be to explore the etymological roots of the word and to share what I had Learned.
51 issues in, I've mostly done that. And it hasn't always been fun. It turns out, consuming enough news to reliably find words or usages that I wasn't already familiar with is, in no particular order, exhausting, depressing, and time-consuming. After a while, I found myself avoiding the news and substituting in a word I had found in a book, on social media, or somewhere generally not stuffed with stress and tension. This led to words like zarf and azimuth, which were a lot of fun to write about. But it also allowed me to skip over some words that I just couldn't bring myself to write about.
Like stochastic. It's an interesting word, and one that is another example of how my lack of mathematical ability has failed me, but it's also one I initially encountered paired with another word. One that starts with terror and ends with ~ism. The resultant phrase is one that is often used to refer to the rash of mass shootings that has been taking place in the U.S. over the past few years, and I tried to write about it. I really did. Everything I wrote sounded trite, out-of-place, or just, simply, insufficient to the horrors realized by real people. So I set it aside.
Imagine my surprise, then, when stochastic popped up again, this time paired with the much friendlier word, parrot. Specifically, Dr. Emily M. Bender called ChatGPT a stochastic parrot, explaining that large language models like ChatGPT are tools
for haphazardly stitching together sequences of linguistic forms … according to probabilistic information about how they combine, but without any reference to meaning
Cool. Um. Could you say that again, but more slowly? 'Cause I don't think I get it. That quote comes from an interview with Dr. Bender in New York Magazine's Intelligencer column. The author of the interview, Elizabeth Weil, precedes it with this definition of stochastic:
Stochastic means (1) random and (2) determined by random, probabilistic distribution.
Merriam-Webster concurs, adding examples for each sense of the word:
1: RANDOM specifically : involving a random variable
a stochastic process
2: involving chance or probability : PROBABILISTIC
a stochastic model of radiation-induced mutation
Stochastic is a relatively new word in English, first appearing in 1934 and relating, mainly, to statistical processes. In fact, since the 1960s, textbooks explaining stochastic processes as a branch of applied mathematics, as well as their use as models in physics, biology, and even linguistics, have become relatively commonplace.
So, what's it got to do with parrots? Essentially, Dr. Bender is saying that large language models have no intrinsic understanding of what they are saying. Their responses are generated at random according to the predictive algorithms that make up their code. I think. What's easier to understand is that Dr. Bender is drawing a line in the sand, saying that we humans are not mere parrots, randomly spitting out pieces of language.
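For the curious, here's a toy sketch (mine, not Dr. Bender's, and vastly simpler than a real large language model) of what "stitching together sequences of linguistic forms according to probabilistic information" can look like. The vocabulary and probabilities are invented for illustration:

```python
import random

# A toy "stochastic parrot": it knows which words tend to follow which,
# but attaches no meaning to any of them. Real LLMs use huge neural
# networks rather than a lookup table like this; the table just makes
# the "probabilistic" part visible.
bigram_probs = {
    "the":     {"parrot": 0.6, "word": 0.4},
    "parrot":  {"repeats": 0.7, "squawks": 0.3},
    "repeats": {"the": 1.0},
    "squawks": {"the": 1.0},
    "word":    {"returns": 1.0},
    "returns": {"the": 1.0},
}

def next_word(word):
    """Pick the next word at random, weighted by how often it follows."""
    choices = bigram_probs.get(word)
    if not choices:
        return None
    words = list(choices)
    weights = [choices[w] for w in words]
    return random.choices(words, weights=weights)[0]

def parrot(start, length=8):
    """Generate a 'sentence' by repeatedly sampling the next word."""
    out = [start]
    for _ in range(length - 1):
        nxt = next_word(out[-1])
        if nxt is None:
            break
        out.append(nxt)
    return " ".join(out)

print(parrot("the"))
```

Run it twice and you'll likely get two different "sentences" - that's the stochastic part. Every word pair it produces is statistically plausible, but at no point does the program know, or care, what a parrot is.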
After all, words, by themselves, have no meaning. They really don't. They're just sounds and/or marks to represent those sounds. Everything else that happens when we use a word happens inside our brains as we perceive and interpret the world around us. A word, by itself, in a vacuum, does nothing.
It's an absurdly fascinating article and one I recommend to anyone with an interest in this new technology and how it is going to affect us all as we move along our one-way path into the future. I’m going to link it more clearly below along with a couple of other articles for further reading. But, for now, let me close by saying thank you.
It's been another interesting and difficult year in a string of them, and writing this newsletter is one way I manage the stress and chaos of daily life. I hope you find it equally beneficial - or maybe just distracting in a useful way - in your lives. Learned will be carrying on into Volume Six, with a new look and a new theme. I hope you'll come along for the ride; it's great to have you here.
Articles for Further Reading:
Down the Rabbit Hole
This week is less of a rabbit hole and more shameless self-promotion. I've re-started my photo newsletter, 91 Days, here on Substack and will be publishing the third week's photos in a couple of days. The weather here in Japan is trending towards Spring and it's about to get real pretty, real fast. I'll be here, camera out and keyboard ready, documenting it as best I can. If you're interested, and I hope you are, please come read along for free at 91 Days.
From the Archives
Since this is the end of Volume Five, let's go back to where we started by taking a look at Learned, Volume Five, Issue 1, Peripatetic Mix-Up. Enjoy!
“Neutopia is a form of speculative fiction that does not neatly fit into categories of utopia or dystopia. Neutopia often involves a state that is both good and bad or neither.” - Stack Exchange User 46119
Not counting this one.
And not the good kind like we talked about last week.
For the curious, this is the article I had originally intended to write about
I'm not making light of anything here; breaking the word into pieces is a way to avoid this essay becoming a result if people google the full phrase.
This is according to Google Book Search; also, stochastic is used in a sub-branch of linguistics known as "Natural Language Processing," a field I'm fascinated but utterly mystified by.