
In this short series of blog posts, I want to look back at approaches and activities from twenty-five years ago, see if they stand up to the test of time and how they might be adapted to today’s classrooms.
A little bit of background
More or less 25 years ago, I was teaching groups of young adults in Canterbury, in the UK. The students were studying on an intensive course, taking exam preparation classes in the morning, and optional special interest classes in the afternoon. Each teacher got to choose a focus for the afternoon classes. I chose creative writing.
I was interested in exploring the role of word-processing in the process of writing. We had the luxury of a suite of PCs in a dedicated computer room, so I booked this room for our classes. I also wanted to experiment with using creative writing tasks to boost the students’ confidence and fluency in writing, helping them find their own voice before they started focusing on exam-type writing tasks in the second term.
I had a class of around twenty students, with levels ranging from A2 to C1. A lot of them were quite quiet or reluctant to speak and had chosen the writing course as an alternative to the drama group, which inevitably attracted the more extroverted students. They weren’t necessarily there because they were interested in creative writing!
The first class
For the first class I chose a simple task based on first impressions. The end goal was to write a short text describing the students’ first impressions of their host town, Canterbury. I chose a simple creative writing training strategy to scaffold the task: brainstorming first impressions associated with the senses of sight, smell, hearing, taste and touch. This gave the students a chance to make a note of key vocabulary they might need in their descriptions. The cathedral and the cathedral grounds were popular choices for sight. Freshly cut grass was a common choice for smell – and resonated strongly for me. Malt vinegar and chips was another smell that was greeted with choruses of agreement.
The students then worked in small groups to discuss how they first arrived in Canterbury. They answered questions about how they travelled, what time of day they arrived, what time of year, what the weather was like, what they did first, any people they saw or met. After the discussion they returned to their vocabulary lists and added any new items (streetlights in the rain, the feeling of a cold wind on your face).
Now it was time for the writing task. I planned each lesson so that the students would be able to draft, redraft and share a completed text in each two-hour session. The instructions were simple: to write a short description of their arrival in Canterbury and their first impressions including whatever details they wanted to from their vocabulary lists and their discussions. The students each had a PC, and I was able to monitor and give feedback as they wrote, supporting them during the process whenever they were stuck. This meant there was a lot of one-on-one discussion and redrafting going on as the students wrote. It also meant that each student could write at their own level, express their own impressions, and create their own text.
I had set a time limit and when it ran out, the students worked in pairs and threes to read each other’s descriptions, compare their different impressions and comment or ask questions. They then went back to their texts, made any changes they wanted to make and shared their final draft with me by email.
I collected the texts and, using my newly acquired HTML mark-up skills (the English department had kindly paid for me to do an evening class) and a section on the department’s homepage, I published all the texts on a web page dedicated to our creative writing class and shared the link with my students. It was a proto-blog. And we were all very pleased with it! (A few months later, in 1999, Blogger was launched and my HTML skills were already obsolete!)
So, does the activity stand the test of time?
I would say that the basic lesson plan and writing task stand the test of time. They worked in the context of a group of students who were all studying in a town that was new to them, and I think they would still work in the same context. The basic stages could stay the same, but of course, the technology has moved on.
Today I would ask my students to use Google Docs, or a similar shared online space, to write and share their texts. The students could use much more sophisticated grammar and spellcheck tools as they were writing, including AI-powered tools such as Grammarly, or they could ask ChatGPT or another AI bot to check their work and give them feedback. We could share the texts, accompanied by copyright-free photos, on a virtual noticeboard like Padlet, or share links using QR codes or image triggers on a multimedia poster.
And is the task still relevant in the era of AI chatbots?
Which brings me to the main question that interests me here: how does this lesson stand up to the use of AI text-generators? In other words, can my students simply ask AI to generate the text for them and bypass the whole writing process?
And of course, the answer is yes, AI bots can generate texts about the first impressions of Canterbury very easily, like the one below:
Arriving in Canterbury for the first time feels like stepping into a storybook. The cobbled streets wind through rows of charming, timber-framed houses, their history practically whispering from the walls. The majestic cathedral dominates the skyline, its spires reaching proudly into the sky, drawing the eye and the imagination. As the bells toll softly, the air seems alive with an ancient tranquility, yet the streets bustle with the energy of modern life. Quaint shops and cozy cafes invite exploration, while the gentle flow of the River Stour adds a peaceful rhythm. It’s a town where history and the present meet seamlessly, leaving a sense of wonder at every turn.
(ChatGPT generated at 16.24 on 31/12/2024 Spanish time)
But does this text chime in with the students’ impressions? Does it say what the students want to say? Is it true to their experiences? Does it sound like one of the students’ voices? Well, I guess those would be good discussion questions for the class. Maybe they could write a prompt for the text, and discuss the results after they’ve already discussed their own first impressions. That way they are less likely to be influenced by the machine-generated ideas.
Of course, they could prompt AI with more information about their first impressions, feeding it some information from the brainstorming stage and the group discussion. And we might get something like this:
I arrived in Canterbury on a late summer evening, and the first thing I noticed was the smell of freshly cut grass in the air. A cool summer breeze brushed against my face, making the evening feel even more refreshing. As I walked through the streets, I spotted the cathedral, its spires glowing faintly in the evening light. In the grounds around it, a group of people were sitting on the grass, chatting and laughing. Their voices carried through the quiet, giving the place a warm, lively feeling. It was such a lovely way to be introduced to the town.
(ChatGPT generated at 16.31 on 31/12/2024 Spanish time)
Again, the students could discuss the text, compare it to their personal impressions, decide whether to: continue prompting for a better text; take the text as it is and make changes to it to make it their own; use the text as a starting point for something original written in their own voice.
I would argue strongly, from an environmental point of view, that we should limit the number of times we, or our students, prompt AI for a new text, as each prompt has a hefty environmental footprint in terms of both carbon emissions and water usage. I would also argue the same point from a pedagogical point of view. Yes, AI can support and scaffold the students’ writing, but at the end of the day, these students are going to need to be able to write their own texts in an exam (while written assessments remain a thing), and good writing needs practice. But good writing also needs models. And reading, analysing and discussing model texts created by AI bots is a great way to encourage intensive reading and text analysis.
Conclusions?
I think the lesson stands the test of time in a similar context. It gives students an opportunity to express their experiences in writing with an emphasis on developing their own voice. I also think that AI text generation could usefully complement and supplement the lesson. It could give students a chance to experiment with prompt writing in English, to critically analyse AI output, and to discuss possible uses for AI in the writing process.
I haven’t had a chance to teach this lesson with a group of students since the advent of ChatGPT et al., but I think if I were to teach it, I would want to do the following things:
1 Set up the task with brainstorming and discussion first in order to foreground the students’ experiences and their words.
2 Impose a limit on the number of AI outputs, e.g. 2 or 3 max, and ask the students to write their own version using the AI texts in whatever way they want.
3 Ask the students to first discuss and then write a short report on how they used AI for the task, how it helped them and any challenges they faced.
What about you?
Have you been looking back and taking stock? Have you been thinking about how approaches and activities from the past can be adapted, supplemented and complemented by AI? If so, please leave a comment below, or on social media, or write your own blog post!
And to everyone reading this in the dying embers of 2024 or the first few days of 2025 – I wish you the best of luck for the new year!