English, AI and the thermostatic principle

Update, September 2024: an extra Fortnightly issue expanded on, deepened and, I think, improved on this post, so I’d recommend heading there, below.

English teaching, AI and the thermostatic principle by Julian Girdham

A one-off issue between Fortnightlies 172 and 173: the latter will be along this time next week, followed by an Occasional for paid subscribers in two weeks.


Nobody knows anything, as the screenwriter William Goldman famously said about Hollywood and films:

Not one person in the entire motion picture field knows for a certainty what’s going to work. Every time out it’s a guess—and, if you’re lucky, an educated one.

Alan Jacobs refers to ‘the seductions of prediction’, and these seductions have been very much on show since the release of ChatGPT to the general public in November 2022, followed by many new iterations from several tech companies. A lot of people have made completely confident pronouncements about what AI will change in the world of education. But widely-accessible Generative AI is just a year and a half old! It is guaranteed that most of those pronouncements will turn out to be wrong. There is also an inbuilt bias in the tech world towards positivity, certainty and boosterism, partly for commercial reasons, and a blindness to unintended consequences (I’m looking at you, social media).

That all goes for me, too. This post is just a scattering of thoughts on the teaching and learning of English as a secondary-school subject in the infant years of Generative AI, and I’m fully prepared for many of these to turn out to be misguided in the years ahead.

Thomas Newkirk wrote one of the best books I have read about teaching, The Art of Slow Reading (2012). In it, he quotes Neil Postman in Teaching as a Conserving Activity:

Schools, Postman argues, should act on a thermostatic principle: a thermostat acts to cool when a room is too hot, to heat when too cool. Schools should act to check (and not imitate) some tendencies in the wider information environment: ‘The major role of education in the years immediately ahead is to help conserve that which is necessary to a humane survival and threatened by a furious and exhausting culture.’ 

Postman wrote that in 1979.


So, some jottings, and some useful recent references.

I write as a technophile who enjoys using new tools (soft and hard alike). But tech boosters always over-claim for their worth in education. As Daisy Christodoulou shows in her book Teachers vs Tech, there is a long history of blustering overpromise and subsequent disappointment. The pattern has been reasserting itself since the arrival of Generative AI. Among the regular suspects: that personalised education will change everything (a cousin of the discredited idea of learning styles), and that we can grab knowledge and understanding from the online world (‘you can just Google it nowadays’). The tech world regularly conflates information and knowledge, assuming that possessing the former amounts to the latter.


This recent talk by Dan Meyer, delivered in the lions’ den of a tech conference, on ‘The Difference between Great AI and Great Teaching’, follows up on that: how the tech industry always assumes it is transformative in a positive way, and how little understanding it has of the principles of teaching and learning. Worth 18 minutes of your time: the YouTube video is also embedded at the bottom of this post.


Right now, there is an opportunity cost to spending too much time absorbed by this topic: Rebecca Birch -

‘Talking about AI is a fun way of neglecting real professional growth. We can feel so progressive and productive’.


A crucial point from John Warner:

Writing is thinking. Writing is simultaneously the expression and the exploration of an idea. As we write, we are trying to capture an idea on the page, but in the act of that attempted capture, it’s likely (and even desirable) that the idea will change. The leapfrogging of AI is going to miss that.

And another from Warner:

The fact that writing can be hard is one of the things that makes it meaningful. Removing this difficulty removes that meaning.


Poetry: this week I speak to our school on the provocative idea that the most ‘important’ subject to study, indeed the one of most practical use in our lives, is poetry. That is based on this essay. Reading, studying and writing poetry are antidotes to the dehumanisation of Generative AI.

On which topic: you should read English teacher Carol Atherton’s recent book Reading Lessons: the books we read at school, the conversations they spark and why they matter, which shows powerfully how beautiful works of literary art engage our humanity in the classroom.


What we need when we teach writing, and encourage reading, is friction: writing should be effortful and tricky. But AI’s impulse is to remove friction. For pupils - all people - the idea of bypassing productive struggle is another seduction. Vocabulary is an example of this: it is built up over time as reading becomes more sophisticated, and its effectiveness depends on this slow and rooted development in each individual.

Technology tends towards speed. Good writing comes out of slowness over time, and drafting, and messiness. Then its foundations are solid.


Since well before the invention of the internet, English teachers have been aware of the problem of CNS, Coles Notes Syndrome. ‘Reading’ a book via summaries is now old hat compared to what AI can do. We need to show our pupils how disastrous this will be to their understanding.

It is also a fundamental misunderstanding of a literary text to think it can be ‘summarised’. The meaning of a text lies precisely in the parts a summary filters out: the sentence-level texture, the rhythm of the narrative, the incidental pleasures. Marc Watkins in No One is Talking About AI's Impact on Reading -

When we teach reading as a skill, we’re asking students to practise more than analytical thinking through close examination of a text—we’re inviting them into a conversation with the reader about their ideas. We want maturing readers to engage with the text, not a summary of it, because we know that doing so means those ideas can help shape and mould a student’s thinking, challenging their assumptions, and making them argue within themselves over deeply held beliefs they may have about our world. A bespoke generative summary just doesn’t do that. If students adopt it as a replacement for close reading, it may rob them of the opportunity to fully engage with an author’s ideas.

and

Unchecked, the lure of frictionless reading could produce a profoundly atrophied culture of surface-level ideas rather than exploring them in depth. In such a world, I shudder to think how blind obedience to authoritative-seeming AI outputs could allow misinformation and weaponized narratives to proliferate unabated.


Maryanne Wolf’s important book Reader, Come Home: the reading brain in a digital world explores the importance of what she calls ‘cognitive patience’, precisely what Generative AI leapfrogs.

The digital chain that leads from the proliferation of information to the gruel-thin, eye-byte servings consumed daily by many of us will need more than societal vigilance, lest the quality of our attention and memory, the perception of beauty and recognition of truth, and the complex decision-making capacities based on all of these atrophy along the way.


Actually, one and a half years in, it now surprises me how relatively little AI is being used by my pupils. Some of this is down to circumstance: all serious summative assessment here (including the public exam of the Leaving Certificate) is done in person without the use of technology, and AI doesn’t appear at all during class time. Also, and cheeringly, older pupils seem to realise more maturely than I had expected that reliance on such a tool atrophies muscles which need to be developed. Below, Tony Wan uses the same ‘muscles’ analogy. I would be more worried about younger children, and the children of the future, if they come to rely on shortcuts as the norm in writing, thinking and reading. I am of course only thinking about English here, and I know things are different at third level.


Tony Wan:

For many people writing is the most brutal exercise in thinking. It reflects and tests our assumptions, pushing us to refine our ideas and uncover new ones. It leads us down rabbit holes that we have to crawl back from. It requires us to connect the dots and think about what makes sense or doesn’t, to transition between ideas and evidence, and to consider what makes the cut and what doesn’t. When AI is used as a shortcut, we lose some of these muscles, as painful as they are to build. For developing young writers, this can be a major setback.


AI for ‘brainstorming’ ideas for an extended piece of writing: the worst time for a pupil to use AI is early on, for gathering ideas or producing an early draft, and that is just the time they are most likely to use it. It is not an effective tool for novices, who are the most likely people to be seduced by it: they don’t know enough to know what they don’t know (not to mention all the ‘hallucinations’). Let’s mention Dunning-Kruger. Experts in a subject, by contrast, can rapidly assess its output. I was trying to find ‘new’ questions to ask about King Lear: AI produced some fairly standard ones that didn’t help a great deal, but I could tell that instantly from a deep knowledge of the play.

Marc Watkins wrote on First Drafts in the AI Era:

I don’t like the idea of students going to AI and prompting a first draft. I know some have argued that this could be a helpful method to fight the blank-page anxiety most writers feel. Others view this as helping maturing writers by giving them a template or outline to help them organize and scaffold their ideas. I think there may be some value in those approaches, especially in terms of helping struggling students who might otherwise balk at writing, but all of these approaches assume a maturing writer will then use their budding rhetorical knowledge, content knowledge, and contextual knowledge to complete the draft. Those of us who’ve taught first-year writing likely raised a questioning eyebrow at that idea.


It’s entirely possible that Generative AI writing will get worse rather than better, as it eats its own tail, feeding on its own flabby flesh.


Alan Bennett has recently turned 90! This is from his play The History Boys: 

The best moments in reading are when you come across something - a thought, a feeling, a way of looking at things - that you'd thought special, particular to you. And here it is, set down by someone else, a person you've never met, maybe even someone long dead. And it's as if a hand has come out, and taken yours.  

Only a human being can do that, in real life, or in a book.


Richard Hughes Gibson:

When writing meets no impediments, we can easily become links in a chain through which misinformation spreads. Yet my appeal for friction writing goes to something even more basic: When you encounter (and pay heed to) resistance in your writing, you have the chance to change not only your words but also your mind—and even to consider whether you need to be writing something at all, or at least at this moment.


Jane Rosenzweig:

we need to be able to recognize when removing the friction from the process might mean losing something important. We may love having written for different reasons, but the friction contributes to that feeling of satisfaction. I love having written when I am able to give structure to my thoughts or discover something in the process of writing that is satisfying or even profound—when I find an answer or solve a problem or arrange words in a way that makes me see something more clearly.


Stephen Noonoo:

It’s not much of a take to say that writing well is difficult. So is thinking critically. But, crucially, both endeavors help me better understand topics when I have to explain them for others. In other words, doing things on my own is the helpful part.

He also quotes high school teacher Liz Schulman writing in the Boston Globe:

ChatGPT eliminates the symbiotic relationship between thinking and writing and takes away the developmental stage when students learn to be that most coveted of qualities: original.

Isn’t originality the key to innovation, and isn’t innovation the engine for the 21st century economy, the world our students are preparing to enter?

You can sense the aliveness in the classroom when students use their imagination and generate their own ideas. Their eyes become warm. They’re not afraid to make mistakes, to shape and reshape their ideas. The energy shifts. I’ve seen it happen in their discussions and with stages of the writing process, from brainstorming to drafts to silly stories to final essays. They’re more invested in arguing their points because they’ve thought of them themselves.


A useful summarising post on ‘friction’ by Leon Furze:

We need to find ways to convince students and early writers that the struggle – the friction – is worthwhile. With technologies that increasingly remove all of the barriers from first draft to final edit, we need to re-evaluate how and why we teach and assess writing.


Conor Murphy, English teacher:

The English classroom is about sharing the writings of other human beings, of sharing thoughts and experiences from our tangible, and intangible, realities. There's enough writing out there for me to source poems, short stories, articles, novels, comics, films, plays etc etc without having to ask AI to help out. 

Writing and meaning are intertwined. How we write reflects how we think. I want my students to see that, to see that this person used this technique for this reason (or reasons, even if that reason is to be opaque). I want my students to then be inspired to create, cultivate, their own voices, their own way of expressing themselves. 


Benjamin Riley in a coherent long post in Cognitive Dissonance:

My claim is that AI in the form of large-language models is a tool of cognitive automation – and that’s all it is. All it does, all it can do right now, is make statistical guesses about what text to produce in response to text that it’s been given (and often it guesses wrong) … Using a tool that automates student cognition will lead to less effortful student thinking, which will in turn lead to less student learning


US teacher Chanea Bond caused a stir by stating that any of her students using AI would receive a grade of zero. Her coherent, robust and unapologetic reasoning:

They’re not using AI to enhance their work. They are using AI instead of using the skills they’re supposed to be practicing. So, I decided we’re not going to use it in the classroom. 

and

Her policy isn’t about asking students to bury their heads in the proverbial sand. She’s more concerned with what her students are learning—or more often, not learning—by leaning on AI to help them formulate and write their assignments. Bond believes that allowing students to outsource their ideas and rough-draft thinking to AI doesn’t help them and in fact devalues vital literacy skills like originality, creativity, analysis, and synthesis. “The original ideas are the most important component in a student’s writing,” Bond told me. “You can polish everything else. But how are you going to polish an idea that you didn’t originally have, that you didn’t originally think of, and that you don’t really have any investment in?”

and

There are a lot of things we don’t teach kids how to do that they end up using in their careers. That’s not my job. My job is to help kids develop foundational skills. Using AI at this point in time is not a foundational skill. If they need it, they will learn it on the job, in a job-specific way—just like we are doing right now. 

Full interview with Andrew Boryga.

It’s time to turn down the thermostat.