Delayed, but it’s done – this will be the final post of commentary on the research Mark Whalley and I conducted last year and had published in SSR In Depth this summer. It’s open-access so please do have a look and let me/us know what you think. We’re also presenting on the data in January at the ASE Annual Conference.
In the earlier posts of this sequence I discussed the context and the data we collected, as well as the recommendations I’ve built on what we found. Please note that Mark and I collaborated on the paper but these posts are my opinions, some of them going significantly further than the data supports in isolation.
Why we thought it was worth asking the questions about job satisfaction and retention
This post will have a slightly different focus; I’m going to describe the challenges I had with data collection, in the hopes it will help others avoid the traps I fell into, and think out loud about further research that might be fruitful and/or interesting. I am open to collaboration either as part of my day job or in my own time, so please do get in touch if this is something you’d like to discuss.
Data collection
Teachers are really bad at completing surveys.
Longer surveys get fewer responses.
The busier a teacher is, the less likely they are to complete a survey.
There aren’t many physics teachers in England in the first place.
Taken together, these facts mean that it’s hard to get enough responses to treat the results with confidence. The solution we came up with was an incentive, and after some discussions I secured the budget to provide a voucher to every person completing the survey. This obviously cost more than a prize draw, but avoided the complications of ensuring randomisation etc. To provide a choice, we gave the option of donating the money instead (to Education Support). So we set up the questions on one anonymous survey, linked the completion page to a separate one to gather contact info, and shared the link on social media.
Roughly an hour later we had over 200 responses, the vast majority of which were filled with random garbage and a procedurally generated email address in a transparent attempt to get the voucher. Because of my inbuilt cynicism – I mean, I have three kids and spent ten years in a classroom with other people’s teenagers – the voucher process was not automated. We identified the fraudulent attempts, started a fresh survey and tried again.
This time it took a couple of hours to be discovered, but we got about a thousand submissions. There was Latin in the free text boxes and random numbers in the data fields. A handful looked genuine but most were clearly bot-generated. A later version which was only emailed still got corrupted, suggesting the link had been shared online somewhere the spammers had access to it.
We considered asking participants to use a school-based email address for the voucher, but decided this would reduce people’s confidence in the anonymity. (Although I should emphasise the two surveys were separated by design, so it was impossible to link the answers to the participant, even for me.) The platform we used, strangely, didn’t have a built-in Captcha option as a question you could include. We did look at whether we could use an external service for that, or a trap question, to gate the voucher request, but it was too complicated.
In the end, there was only one solution – putting my pattern recognition up against the scammers. I ended up figuring out boundaries for several of the questions and having Excel grade the answers as green (possible), amber (dubious) or red (clearly a scam). For example, a participant who claimed to be at a school with 64 students but 20 science teachers was unlikely to be genuine. Although it was frustrating, it was also really interesting to dig into the data, especially as several columns showed a smooth bell curve rather than the clusters we’d expect to see in the real world. Other red flags included:
very short time taken for survey completion
inconsistent participant qualifications (no GCSEs, no A-levels, but doctorates in physics and engineering? really?)
unrealistic school numbers, student and staff
incompatible reported FTE vs timetable load
random or repeated phrases in the free text fields
In the end, I filtered around 5000 responses down to just under 100; those, with any possible identifying information such as time of completion removed, were shared with Mark for the data analysis.
My advice: when creating an online survey, set up at least some questions that allow impossible answers, and plan how to identify them quickly. Free text fields will be auto-filled by some bots, which can provide an obvious clue. Decide in advance what your yes/maybe/no ranges will be – something like the sketch below.
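To make that concrete, here’s a minimal sketch of the kind of plausibility screening I ended up doing by hand in Excel. This isn’t the actual process we used; the column names and thresholds are hypothetical, and you’d tune them to your own questions. The principle is the same, though: decide the rules, then count how many sanity checks each response fails.

```python
import pandas as pd

# Hypothetical plausibility bounds, tuned to your own questions.
RULES = {
    "minutes_taken":   (3, 60),      # finished suspiciously fast, or left open?
    "school_students": (100, 3000),  # believable secondary school size
    "science_staff":   (2, 30),      # believable science department size
}

def grade(row):
    """Grade one response: green (possible), amber (dubious) or red (scam)."""
    breaches = sum(not (lo <= row[col] <= hi) for col, (lo, hi) in RULES.items())
    # Cross-field check: 20 science teachers in a 64-student school isn't real.
    if row["science_staff"] > row["school_students"] / 40:
        breaches += 1
    if breaches == 0:
        return "green"
    return "amber" if breaches == 1 else "red"

responses = pd.read_csv("survey_export.csv")   # hypothetical export file
responses["flag"] = responses.apply(grade, axis=1)
print(responses["flag"].value_counts())
```

The point isn’t the code (conditional formatting in a spreadsheet does the same job); it’s deciding the rules before you open the data, so your judgment isn’t anchored by whatever the bots happen to submit.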
Next steps
I’m still looking at the dataset, for example to see whether there’s a correlation between responsibility roles and job satisfaction – in particular, whether colleagues are less grumpy about SLT if they’re already a middle leader. I think it would be really interesting to see whether any of the same issues show up for subject specialists who have to teach out of their exact specialism but within the same department. How do French teachers feel about a Spanish-heavy timetable, for example? I’d like to know how other factors, specifically age and gender, affect both job satisfaction and attrition. Recent work looking at the exodus of women in their thirties from teaching suggests to me that alongside the recommendations in both the published paper and my commentary, you can make a big difference for those teachers by offering flexible working and matched timetables. We didn’t ask those questions because it counts as sensitive data and triggers a whole new world of GDPR and ethics trauma.
If I was doing this again, I’d simplify the questions and ask colleagues to prioritise the possible changes. If they could only have three or five changes from a shortlist, which would make the biggest difference? The benefit of this is that we could separate the possible changes based on the tiers I used in my last post, so HoDs find out which of the things they have control over are worth fighting for.
One of the things we looked at was deprivation score, and it didn’t make as much difference as we expected. (There were noticeably more physics specialists in less deprived areas though.) I’d really like to see the teaching unions investigate this angle to see if there’s a relationship between the deprivation index of a teacher’s home compared to their workplace. How does this vary by subject and seniority? For example, for some years I travelled from a relatively high deprivation area to teach physics in a leafy suburb. I’m prepared to bet that although it’s noisy, there’s a similar signal nationally.
I’d like to repeat my suggestion from the last post; we need a national programme of exit interviews for all departing physics teachers. Why do they go and where are they going? What might change their mind? Link it to an ongoing job satisfaction survey and we can see how much difference complaining about workload in years 2 and 3 makes to the chances of them leaving after year 5.
Final thoughts
I really hope someone has been reading these posts – maybe there will be an influx as term restarts. Or maybe I should give up on blogging the old way and create a substack, but that just feels weird. If you’d like to discuss any of these ideas, please add a comment here, email me or find me on social media; I’m currently experimenting with BlueSky.
Previous posts have introduced the research that we did, summarised the negative and attrition factors the teachers considered most significant and linked this to what they reported might encourage retention. Understandably, we were most interested in the factors considered important by those who were seriously considering leaving! For at-risk colleagues, the changes they said would help most were:
financial incentives
increased planning time
reduced marking load
teaching physics only
lesson resources
department resources
One identified challenge is that different factors are within the control of different stakeholders in the education sector; I suggested that thinking about this in terms of fuzzy boundaries is helpful, using the layers below:
Methods – all colleagues
All colleagues need to acknowledge the challenges, both generally and how it works in your local area/school/MAT. Teachers need to be enabled – which is more than simply encouraging them to do it! – to seek physics-specific support as well as more general teaching advice. I used to tell trainees that “there’s no such thing as a good teacher, just teachers who are good at X.” Find the local experts on specific areas of the curriculum, go to D&T to see how they deal with practical work and maths to see why kids struggle with your wording about ‘directly proportional’. Join the IOP’s professional community to get physics-specific resources.
Use those physics-specific resources for your own support and development, such as the free-to-access videos on IOP Spark. Wherever possible, invest your time in shared resources and transferable approaches, like a question bank you can use and reuse every lesson, iterating rather than creating from scratch. Where a school has these in place already, use them. If you can’t find them, ask. Quite apart from anything else, the benefit of your students knowing the ‘house style’ means you’re able to lean on department and school procedures rather than establishing your own!
For colleagues working in the classroom, especially early in their career, it’s worth remembering how many different skills you’re trying to polish. One analogy is that it’s like trying to learn everything about driving a car simultaneously:
general maintenance
using the controls
hazard perception
highway code
adjusting for changing conditions
following directions
planning a journey
What actually happens – or at least it did back in the mists of time when I learned to drive – is that you consolidate one set of skills and move on to others. My parents got me sitting in the passenger seat, thinking about the conditions and other road users, before I got my provisional licence. I built on my experience of being a cyclist and all of those specific hazards. I practised using the controls, including a gear shift and a choke – I did say I was old – in an empty car park on a Sunday, so no other cars to think about. And so on.
A department scheme of work and borrowed resources are the equivalent of someone else doing the journey planning so you can focus on making progress, while not going over the speed limit. Shared planning means you can improve consistency of style and approach in a department – we call it a staff team for a reason. And if you’re wondering why you’re seeing new colleagues struggling, remember the curse of knowledge and lend a hand.
Tactics – Heads of Department, CPD and MAT Science Leads
As suggested in the final section of our paper, matched timetables should be the first priority at the department level; this will involve advocating upwards, possibly in writing with a link to our research. A colleague with year 7, 8 and 9 science classes has three courses to teach. Each course includes resources, practicals, risks to manage, misconceptions, assessments… and two-thirds of those will usually be in their non-specialist area. It is easy for their last experience of those ideas to be from ten years and two degrees ago. So instead, could they have three classes of the same year group? Three opportunities to teach the same content, with far less time needed to review/choose/write resources. It also means it’s worth them investing time in a more streamlined feedback and marking approach. When the tests come around, marking three sets of the same paper is easier than three different ones.
Wherever possible, give them more physics. Unless you’re over-supplied with physics specialists – statistically unlikely – then their colleagues will also be pleased. All of the hinterland we talk about, all of the extra confidence and increased familiarity, means less time spent planning. They can use that time to dig deeper into the pedagogy, to improve their understanding of the school approach to assessments – or simply have a cuppa in the prep room. Over time they’ll still get to teach more of the curriculum, but nobody would expect every French teacher to be as good at teaching Spanish simply because they’re in the MFL department.
Every Head of Department wants to support their colleagues with shared resources and feedback and marking approaches which balance the needs of the students with the sanity of the teachers. This is about making sure everyone knows the materials are there, to promote a consistent house style and reduce duplicated effort. The danger here is that if you’re short of physics specialists, anything you ask them to create in-house means more work per person. If you’ve got one specialist building things for everyone to use, being the point of contact for issues and troubleshooting, then congratulations – you’ve got a lead practitioner! Are they getting paid as such? If you work across a MAT then you’ve got the added challenge of providing support that’s helpful in terms of workload but still allows flexibility for different schools.
To make this easier, there’s a whole load of resources and projects that can support you at a department or MAT level to increase physics confidence and competence across the team.
the Stimulating Physics Network (SPN, now administered by STEM Learning) works with the department to boost physics skills
the Ogden Trust works with schools and individuals; the SKPT project in particular is a great way for individual colleagues to be upskilled, but availability may vary depending on area and they will need dedicated time.
the CPD videos on IOP Spark are topic-based support including real classroom approaches as well as physics explanations, and the ECPL set are a good way to structure mentoring conversations with those new to teaching physics. The Early Career Framework is time-consuming enough without having to write your own physics modules.
Remember that making sure there are development opportunities provides many benefits; the point of including them here is that by reducing the overall workload in the team, you’re addressing an attrition factor. The downside is that it’s easy for biology and chemistry colleagues to feel they’re missing out on refreshing and developing their own specialist areas.
I’m looking at ways HoDs could use a department audit, based on the questions used in our research, to identify local priorities and match them to specific, evidence-informed tactics. If you’re already monitoring this, please give me a shout!
Strategy – Headteachers and MAT leadership
My colleague and friend Mark is keen to use the model of ‘dealing with the closest crocodile’. Apart from what this reveals about the wildlife in Cheshire, it’s a great way to remind us that there’s almost always a crisis demanding the attention of education professionals at every level. Whether you use an Eisenhower matrix, GTD or some other way of prioritising, there will always be more problems than time or money for solutions. So why should the boss of a school with a hundred staff and two thousand students be spending time on physics teacher retention?
The title of this post is the reason why. Physics teachers are special, rare and hard to retain. But they’re not, despite the understandable hyperbole, actually unicorns. Instead, a good way to think about them is – to use a more recent biology concept – as an indicator species. Physics specialists have more options outside education, and those options are better paid on average (see Fig 5 in this from the NFER). The grass gets greener as they see their pay starting to plateau. Especially since the pandemic, they’re contrasting their situation – in the workplace Monday to Friday, with associated time and money implications – with their fellow graduates working in PJs from home, with decent coffee and no commute. What this means is that they act when their non-physics teaching counterparts, under the same strains and stresses, can only grumble. This is particularly true for early career colleagues who don’t have a mortgage yet and can make the most of the freedom that entails. They might be cheaper to hire, but the higher attrition rate may well cancel out that saving. If a school can’t retain physics teachers, it’s a warning sign that other colleagues would be leaving if they felt they could.
It may feel unfair to pay them more money, but if your Head of Science tells you that your sole physics specialist is doing all the jobs that are shared between three biologists, it’s easier to see how that’s unfair too. If you lost all your Spanish teachers, you would expect the Head of MFL to make choices about which languages were offered to your new Year 7 students. That can’t be done with Science, so show you respect their specialism by supporting matched timetables instead.
If you’re responsible for workload and retention across a MAT, you can use economies of scale to show how investing in resources pays off when it comes to staving off teacher exhaustion. Of course, these problems are not unique to physics teachers. It’s just that we notice the effect of the attrition factors for them first. If when you model the workload effects of new initiatives – and I’m assuming you do – some teachers are hit harder than others, what’s your plan to reduce that impact? How are resources like IsaacPhysics supported across the schools in the trust to reduce the barriers to uptake? When you choose new platforms for retrieval practice or worked examples, do your staff know that they’ll be available for long enough that it’s worth investing in them?
Alongside Ofsted surveys, monitoring staff job satisfaction, wellbeing and concerns is going to pick up a lot of noise and occasionally an important signal. You can’t keep everyone happy all the time, but aiming to keep most colleagues grumbling a little rather than a few constantly crying in the staffroom is more realistic. I’m not qualified to tell you how to manage your staff – I’m just flagging up why the physics specialists might be a more urgent concern than their numbers may suggest.
Policy and Law – government, exam boards and publishers
I’m grouping these together because the overlaps are so hard to untangle – and, frankly, I’d be amazed if any of them read what I’m typing. That’s a long way above my pay grade, but the reality is that if we don’t acknowledge who can make changes, we’ll get blamed for the ones we don’t make ourselves. Being a middle manager and being blamed for the things your bosses do sucks, so drawing a line and admitting what’s beyond your control is important.
Defining the curriculum (government), how it’s assessed (exam boards) and how it’s taught (professional associations and publishers) are not small jobs. They’re also dependent on each other, which is why education reform is always challenging. Back when I was teaching – and in over a decade there was only one year when I wasn’t teaching a new spec to at least one class – I was told it’s like trying to convert a diesel train to electric without stopping the journey. I’ve now decided it’s like trying to convert every train to a different type without cancelling journeys or reducing the timetable. So what would I recommend to the government, based on this research, to address physics teacher retention?
Firstly, throw money at the problem – but aim carefully. I’m personally really pleased that the government is acknowledging the concerns raised by the STRB and doing something about this for the profession, and even more so that it’s at least partially funded. They’re looking at how pay could be varied by subject and need, which is something we’ve already seen in physics. I’d argue that a long-term plan for this is needed, and the benefit needs to be spread out rather than just being front-loaded into bursaries. A physics teacher needs to know – as I did, ages back – that they can rely on the extra cash for long enough to get a mortgage. Overall, this needs to be considered carefully, not re-invented annually, and be built on top of broader sector funding reforms that address old buildings and a crumbling SEND system – not to mention the growing problem of the inflexibility of teaching as a career, post-pandemic.
As part of that, we need to be asking better questions. Focussing on physics teachers: it’s almost unbelievable that the DfE can’t say how many physics specialists are teaching in state schools. ‘Specialist’ is so poorly defined that deciphering the stats is nearly impossible. Teaching electricity to Year 10 is physics, right, so it needs a physics teacher to count as specialist?
Turns out it’s not that simple. If it’s a Physics GCSE class, then yes. (Let’s not get into whether a D&T teacher who’s a qualified electronics engineer counts, versus a chemistry graduate who did a ten day SKE to access the Physics PGCE programme.) But if it’s a class doing what I still think of as ‘Double Science’, then any science teacher counts as a specialist. We – and by that I mean the DfE – just don’t collect enough information. (If I’m wrong on this, please point me at something more useful.) Teaching within specialism – whether from graduation or acquired by longer-term development – means better outcomes for students and less workload for the teacher. But at the moment we just don’t know how big a problem it is nationally.
Adding to this, I’d like to know why physics teachers leave by asking the ones who are actually leaving. And I mean ALL of them. Give me some money and I’ll run a nationwide anonymous exit survey for every physics specialist leaving a state school. I’ll find out where they’re going and why. I’d hope schools are doing this now, but why on earth isn’t there a standardised set of questions for every departing teacher part of the Ofsted requirement for a school? Don’t add it to the league table, but anonymised to a regional level this would be a valuable tool. Add a threshold for X% of teachers ticking the same box for a particular school which should be a warning sign for SLT and Ofsted. (Heads Round Table, call me.)
Exam board specifications and teaching resources are two sides of the same coin – and they’re both determined by government decisions, interpreted broadly in some cases. That starts with thinking very carefully about what we want the curriculum to do, and most importantly what there is not time for. Gove shoved a load of stuff into the science specification that seemed like a good idea to him and his advisors, and since then the idea of a knowledge-rich curriculum has been taken for granted by many. And it’s not that it’s necessarily bad physics – it’s just that there’s an awful lot of it. A few years back I wrote a Physics Teacher Guide for Hodder and felt the need to acknowledge in the accompanying SOW how cramped it would be. We need to be honest about what we fit into the science curriculum and what will need to be left out. (Obviously this is true across all subjects, and that’s before we get into all the other things the media seem to think schools should be responsible for.)
Creating good resources is harder than it looks, as every early-career teacher has discovered to their cost. The principles behind projects like Oak Academy are noble ones, but as all the publishers would tell you, making good materials takes expertise and time. Neither is free. Exam boards and publishers being entangled – and yes, Pearson, I’m looking at you – means that it’s very easy for specification-matched resources to be produced that then need regular updating and improving, all at a cost. This is an example of the general movement towards rental/subscription rather than ownership – which has benefits, especially for shared digital resources, but it’s not without challenges. I’m listening to Spotify at the moment, but I had to put a playlist together myself because the album I wanted has been ‘updated to a DELUXE Edition’ by the band and I want the original, damnit! (August and Everything After by Counting Crows, if you care.)
Ranting over
I think I’m probably done for now, but the above ideas show that the issues can be addressed, if not solved, in different ways. I’d love to hear suggestions about which are realistic and which are mistaken. My next post – probably at the weekend – will be digging into the survey process itself and what I learned from what didn’t go right.
First things first; the wonderful people at the ASE have now made the published article open-access. You can read the paper via SSR in Depth without logging in, share it with colleagues (including HoDs or SLT) and generally check my working. I would emphasise that I’m a big fan of the ASE, have been a member for some years and encourage colleagues to engage with them. And no, I’m not being paid for that. I’m writing now as an engaged professional, separate from my ASE membership or IOP employment.
Recap
In the first two posts of this mini-series, I explained the context of the research and summarised some of the things we found. In particular, I discussed what can be described as
negative factors – what colleagues said reduced their job satisfaction
attrition factors – what they said made it more likely they’d choose to move on.
That second one is particularly relevant in physics teaching because we lose so many colleagues from the English state sector. It’s important to note that, because of the way the data is collected, teachers moving into independent (fee-paying) schools and/or international teaching is something that happens but is hard to track. I’m interested in the patterns of this migration, so if anyone has data please give me a shout!
This post will examine a third list, the things which colleagues said might be a factor in encouraging them to stay. I’m describing these as retention factors but before I start discussing the stats, it’s important to include a caveat.
Every teacher is different, and every school is different. There are many reasons why teachers leave the profession, and because we tracked intentions rather than actions it will be an imperfect report. Mark is in the process of finishing a separate piece of research where he interviewed those who had made the leap, and that will be a valuable complement to this work. It’s not as if a Head of Department can work down the list of retention factors and add them all to the school policies. And which factors matter most in individual cases may not reflect the order we have here. As I said to a colleague on Twitter, this research – like many other trends we can see in large numbers, such as the reasons girls often choose not to do Physics or the differing perceptions of careers held by parents – gives us the questions to ask, not the definitive answers. (Thanks to Paul Hunt for prompting this response, which I’ve polished slightly here.)
“Happy physics teachers are all alike; every unhappy physics teacher is unhappy in their own way.”
(apologies to Tolstoy)
Part of the reason we limited our dataset to colleagues in the first five years was that when we ask similar questions of the most experienced colleagues, their answers are very different for reasons that make perfect sense. In unpublished data from previous work, a large percentage said that teaching out of specialism wasn’t a problem for them. It turned out that this was because most of the respondents fit into one of two categories; many had significant experience and so had gained the skills and knowledge needed for teaching biology and chemistry topics with confidence. Many of the others were in settings where they only taught physics; they could honestly say it wasn’t off-putting for them because they didn’t need to do it! Neither of those things are necessarily applicable to early career colleagues teaching across the curriculum.
Retention factors
Just as with the earlier questions, the teachers were asked how important possible changes would be in encouraging them to stay in teaching. These factors were defined before we started the survey, were based on previous studies, and unsurprisingly were often possible solutions to the factors proposed in the negative and attrition factors questions.
As before, there was a fairly close match between the factors identified by all respondents and those categorised as ‘at risk’; in fact the order was identical:
financial incentives
increased planning time
reduced marking load
teaching physics only
improved behaviour policy
Beyond the scope of the published article, I spent some time looking at the smaller group who could be described as ‘high risk’. The difference was not huge, but it was interesting; for these respondents, other factors become more important, suggesting they’re particularly affected by the workload aspects. I’ve compared the percentages saying it would be a big or medium factor in encouraging retention with the equivalent for all respondents.
financial incentives (93% rather than 92%)
increased planning time (93% rather than 85%)
teaching physics only (89% rather than 75%)
lesson resources (86% rather than 70%)
well-resourced department (82% rather than 70%)
The difference is interesting rather than ground-breaking, but to me it suggests that any solution which does not address their teaching-specific workload is doomed to failure. What I found particularly interesting is that even though they’re at much higher risk, the importance they attached to more money wasn’t massively higher. It’s a problem, yes – but it’s not the answer to everything. And although flexible working featured in other lists, by this point they’re beyond worrying about it. Definitely think about how you can offer it, but arguably that’s often a sector issue.
(I’ll also point out there was no correlation between the Index of Multiple Deprivation for the school and whether teachers reported concerns about department resourcing.)
How do we solve a problem like retention?
If it was easy, anyone could do it. And it’s amazing how many people, whether they’re celebrated opinion writers in the media or committed teachers on social media, think they can indeed do just that “with this one weird trick…” It turns out that it’s a bit more complicated than that, but this does not mean we shouldn’t try to address it.
I’m going to ignore behaviour improvements, not because I don’t think it’s important but because it’s not a physics teacher problem. We/they are particularly vulnerable to it, and SLT need to understand the challenges of working with hazardous practical tasks in large numbers when students are unable to follow instructions. But let’s be honest; not many schools will have a behaviour issue that only shows up in physics lessons. So what does that leave?
financial incentives
increased planning time
reduced marking load
teaching physics only
lesson resources
department resources
Who can solve a problem like retention?
One problem in organisations of any size is that when things go wrong there are a lot of people to blame. This is made worse when, for completely sensible reasons, those higher up in the organisation may not be able to share all the reasons for decisions that affect us. So the Head of Department gets the blame for the choice of the exam board, even when it’s a decision made at a school or MAT level, because they can’t or won’t share that explanation with their teams.
Something I have up by my desk is a reminder of the different layers in managing UK education and how they inter-relate. It’s an absolute mess and one specific to my day job, so I won’t share it here. Instead, a massively over-simplified version with layers that have decidedly fuzzy edges:
Methods – all colleagues
Tactics – Heads of Department, CPD and MAT Science Leads
Strategy – Headteachers and MAT leadership
Policy and Law – government, exam boards and publishers
So the question now becomes, what changes could we make in the system to improve retention, and critically who can control or influence that? There’s no point in asking the Secretary of State for Education to share their physics teaching resources with an ECT or expecting every new teacher to successfully demand a higher salary because physicists are in short supply. So who makes which decision?
Although the bullet points above look like a wide range, they’re actually closely related. More money could be used to address practically every concern, but there are many reasons why that may not be the first solution we can apply. Separate to that, most of the ideas below are concrete suggestions that reduce the workload of physics teachers without simply delegating this work to someone else. Many departments will already use some or all of these – others will perhaps not have SLT who realise how much they would help. (I wrote ages ago about SLT needing to know what actual teacher experience is like, and this is never more true than for physics colleagues teaching out of specialism because ‘it’s all science anyway.’) And addressing some of these will have a knock-on effect – buying in resources for the department will effectively increase available planning time, for example.
In my next post I’m going to work through these approaches and others, but in terms of the layers of influence/control. I’ll be starting with the foundation – the colleagues who work with students.
As I said in yesterday’s post, these musings are my responses to, and some behind-the-scenes explanations of, the article Mark Whalley and I wrote for SSR In Depth. This was based on a survey undertaken as part of my work with the IOP, but was peer-reviewed by colleagues through the ASE. The ideas here are mine rather than being IOP-approved policy, and I hope readers will see that they’re directly based on the data rather than taking a top-down approach.
Context
It’s very easy for teachers – and I’ve been there! – to feel like all schools are like their school. I suspect social media has reduced this somewhat, but it’s done that by encouraging polarisation and assumptions that all schools are variations on a small number of themes, based on those who are the loudest advocates. Getting actual numbers of active specialist teachers is surprisingly difficult, but a good estimate is that of the 30k science teachers in English state schools we’ve got between 4k and 7k physics specialists, rather than the 10k which would be a ‘fair share’. They’re not equally distributed, either – lots of discussion about this at the Royal Society of Chemistry.
Recruitment varies but 600 per year is a reasonable benchmark, and half of those leave (the English state sector) within five years. We’re not in a good place, even compared to the general concerns about teacher supply (such as this from the NFER).
Questions and Answers
Details about the methods are in the paper, and I’ll talk about the challenges in the last post of this series. Put simply, we wrote a survey with a bunch of questions, put it through ethics review and then asked teachers of physics to complete it. Because previous data obtained through my day job had been heavily weighted towards experienced colleagues, we chose to focus on those who were in the first five years of teaching. We further specified England to reduce the confounding factors of different educational systems. More than ninety sets of valid responses were collected – choosing which were valid was interesting but again, you’ll have to wait for the final post – and Mark then did all the hard work of data analysis, looking for patterns and correlations.
We asked respondents about their setting and their career to date. We did not identify schools, going to the extent of asking them to check the deprivation index from an online calculator rather than collecting the postcode ourselves. We chose not to collect any sensitive data which means, for example, we could not analyse any possible effect of age, gender or ethnicity on job satisfaction or attrition.
The survey asked whether colleagues were planning to remain in teaching. It’s really important to acknowledge that this is about intention, and because of when many of the respondents completed it – at the end of the summer term – they may have been at a low point. (As an aside, I’d love to see responses to this question on a monthly basis through the school year to recognise a cycle of peaks and troughs.) Mark has been working on more in-depth questions with former physics teachers, but this approach naturally suffers from the problem that those who respond tend to have the strongest feelings! The headline result: 32% of physics teachers surveyed were seriously considering leaving or actively planning to leave. There was no strong correlation with school characteristics such as size or deprivation score. Many factors were important both for those generally dissatisfied and those planning to leave, but the order varied – more on this in a moment.
“32% of physics teachers surveyed were seriously considering leaving or actively planning to leave.”
In the survey, respondents were asked to rate different factors in terms of their effect on:
Job satisfaction
Choice to enter teaching
Dissatisfaction
Intention if any to leave teaching
Probability of encouraging them to stay in teaching
What made physics teachers unhappy and what made them consider leaving were, unsurprisingly, overlapping lists. None of these will be a shock, but those of us who remember the ’24 tasks’ list (and a more recent iteration) will recognise that teachers are much more likely to accept what they see as necessary professional tasks than imposed administrivia.
Factors causing dissatisfaction, in order (all respondents)
poor student behaviour/relationships
salary
planning workload
marking workload
administrative tasks
lack of flexible working
school leadership
These are not unique to physics teachers, of course. What was particularly interesting is that some of these factors correlated more strongly with intention to leave than others. Being dissatisfied with planning workload, lack of flexible working and school leadership are clear warning signs that someone might be seriously looking at their options, not just recognising the challenges of the job.
Factors linked to attrition
It’s reasonable to expect that the factors causing colleagues to consider leaving will match the list above. It’s also predictable that when we analyse the importance of these factors, the percentages will go up when we look at those who described themselves as more likely to leave. They’re the high-risk group for attrition, so of course they’re more unhappy! What was really interesting to me is that some are much more significant, and the order changes too.
All respondents: salary (70%), student behaviour (60%), planning workload (57%), marking workload (57%), school leadership (51%), lack of flexible working (51%)
At risk: salary (74%), student behaviour (68%), planning workload (63%), marking workload (59%), school leadership (53%), lack of flexible working (51%)
High risk: planning workload (82%), salary (74%), marking workload (68%), student behaviour (68%), lack of flexible working (68%), school leadership (61%)
The takeaway from comparing the ‘general dissatisfaction’ list with this is a simple one. Teachers grumble about student behaviour, salary and other factors. As a profession we definitely want to address these, but if we want to focus on retaining physics teachers it’s effectively background noise. If we can’t fix everything, what do we choose? The signals we need to look for – the specific issues that seem to be driving people out of the classroom – are the things which matter more to them than to the ‘general population’. Planning workload is at the top of that list, followed by lack of flexible working, marking workload and school leadership.
Coming soon (ish)
Next time I get to write, I’ll be looking at what the responses suggested about addressing – not solving – the issues, particularly for those who were higher risk of attrition. In particular I’ll be sharing what I think about who might be able to make some of these changes, rather than putting all the responsibility in one place.
“Why bother researching something we all know about?” I got asked when we announced a physics teacher retention survey. I guess I could see their point. Everybody knows there aren’t enough physics teachers working in the UK state sector. Everybody knows they leave sooner than we’d like, and everybody knows the reasons for it. So why bother asking the questions when everybody knows the answers we’re going to get?
The obvious response was that it was (and is) my job. It’s on my business cards and everything. Without expert, passionate teachers of physics we don’t get enough expert, passionate physicists. We need a whole load more of those than we currently have. And whether someone’s job title says they’re a physicist or not – many more jobs and courses rely on physics than is obvious to those outside the profession – that knowledge starts with a teacher. So part of my day job is about measuring, supporting and advocating for teacher retention.
The second, more philosophical response is that we don’t stop asking questions because we think we know the answer. That’s not how science works. Either we’ll find something that contradicts what we thought we knew, or we find subtleties and patterns in the data that we didn’t know before. Both of those are good outcomes, and data with a bigger sample is always going to be better than relying on the anecdotes of those physics teachers we’ve happened to meet.
The final, more pragmatic reason is that what “everybody knows” isn’t always the truth. It turns out that although we are definitely understaffed nationally, the reasons for this aren’t quite as clear-cut as many might think. We wanted to know which factors were most important. Were some teachers more susceptible than others, and how do experience and setting matter? Most importantly, what can we do to address the issues before people leave? Who can affect the different factors?
In our next thrilling episode…
So I’ve set the scene for the research we did. If you want to read ahead, be my guest – the study was published over the weekend in the ASE’s SSR In Depth issue 391. The title I’ve used here is one we considered for publication, because the data shows physics teachers are a warning sign of general sector issues rather than being unique. More posts – and yes, I’m going old-school with my blog rather than doing a podcast or a substack – will hopefully all come out this week:
I’ll give some key points about what we found out, particularly the difference between negative factors and attrition factors
I’ll discuss what possible actions or retention factors the data suggest might be useful, and who is responsible (a hint – we can’t keep blaming Michael Gove)
I’ll confess to the stumbling blocks we had during the survey process
I’ll emphasise here what I’ve said previously on the blog and have pinned on my twitter profile; these are my personal views, not those of my workplace. When I say “I think…” I’m speaking individually, not expressing the policy of the organisation I work for. I’ve also not run them by my co-author, former colleague and friend Mark! Consider them prep room discussions over a coffee rather than carefully thought-out policy recommendations shared while wearing a suit.
Whalley, M. and Horsewell, I. (2024) Should I stay or should I go? Exploring the experiences of physics teachers in their first five years. SSR in Depth, 391 (July).
Following up yesterday’s reflective post, here are my typed-up bullet points from the afternoon sessions. As before, my thanks to the organisers and presenters, and a promise that I’ll update these posts with links to the actual presentations in a week or so.
Do They Really Get It session by Niki Kaiser (@chemDrK)
Students gave correct answers by imitation, not based on deep understanding, as shown by discussions of ions in a solution vs electrons in a wire; I wonder if the demo showing movement of coloured ions during ‘slow’ electrolysis would help?
Threshold concepts guide the teacher when choosing what to highlight, what to emphasise in lessons. There should be no going back from the lightbulb moment. If so, why do we need to constantly return to these misconceptions where students rely on folk physics despite explicit refutation work with us?
It is worth making explicit to students that these are challenging (and often abstract) concepts, and so time to understand them is both normal and expected. In Physics we make this clear with quantum work but perhaps it should be a broader principle.
#rEDRugby my thoughts on @chemDrK‘s “these concepts are hard”: we need to share that with kids so they don’t think it’s them. They’re using brains evolved for arguing about fruit for a totally different task.
Teachers will do a lot of this already, but we need to be more deliberate in our practice, both for our students and for our own reflection. This is how we improve, and is particularly important for us as experts to put ourselves in the position of novices. This is part of what we refer to as PCK.
“Retrace the journey back to innocence…” a quote from Glynis Cousins in a 2006 paper (this one?) which is about better understanding where our students are coming from. I would use the word ‘ignorance’, but like ‘naive’ there are many value judgments associated with it!
It’s not properly learned unless students can still do it when they weren’t expecting to need it.
From @chemDrK: “not learned unless they know it when not waiting for it to be tested”
Me to @DSGhataura : “*Nobody* expects the Chemical Inquisition!”#rEDRugby
The bar-model is an algebraic way of thinking about a situation, without using algebra explicitly. This means it is compatible with better/quicker approaches, rather than being a way around them like the formula triangle.
Paraphrasing @BenRogersEdu: a formula triangle is an alternative to algebra, the bar model is a step towards it. Compatible with development ie learning. #rEDRugby
Uses principles from CLT; less working memory is needed for the maths so more is available for the physics.
Suggests (emphasising this is speculative) that visual rather than verbal information is a way to expand working memory. This is also an example of dual coding, and presumably one of its strengths.
Compare approaches by using different methods with two halves of a class. The easiest way is to rank students using data, then have ‘odd number positions’ use one approach to contrast with ‘even number positions’ for the other. Even if the value of the measurement used for the ranking is debatable, this should give two groups each with a good spread of ability/achievement (a quick sketch of this split follows these notes).
Useful approach for accumulated change and conservation questions; could be difficulties for those questions where the maths makes it look like a specific relationship, such as V = E/Q, as this reinforces a unit approach rather than ratio.
A Sankey diagram, although a pain to draw, effectively uses the bar method. The width of each arrow is the length of the bar, and they are conserved.
Some questions are harder than others and the links may not be obvious to students, even if they are to us. Be explicit about modelling new ‘types’ (and discussing similarity to established methods). This sounds like a use, deliberate or otherwise, of the GRR model from Fisher and Frey.
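Not from the talk itself, but since I’ll want to reuse the odd/even idea: a minimal sketch of that split, assuming you already have some ranking measure (the names and scores below are made up).

```python
def alternate_split(students, key):
    """Split a ranked class into two comparable halves by alternating positions."""
    ranked = sorted(students, key=key, reverse=True)
    return ranked[0::2], ranked[1::2]  # 1st, 3rd, 5th... vs 2nd, 4th, 6th...

# Example: rank on a recent test score (hypothetical data).
scores = {"Asha": 71, "Ben": 64, "Cei": 58, "Dot": 55, "Eli": 49, "Fay": 42}
method_one, method_two = alternate_split(scores, key=scores.get)
print(method_one, method_two)  # ['Asha', 'Cei', 'Eli'] ['Ben', 'Dot', 'Fay']
```

Both halves end up with a similar spread of attainment, which is the point of the alternating pattern.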
Reconstructing meaning is how we build understanding. Although this process is by necessity individual, it can be more or less efficient.
The old idea of remembering seven things at once is looking shaky; four is a much better guideline. If one of those things or ‘elements’ is a group, however, it represents a larger number of things. Think of this as nested information, which is available if relevant.
We need to design our lessons and materials to reduce unproductive use of the limited capacity of the brain.
Two approaches are the Prototype (Rosch) and Sets (Aristotle). Suspicion that different disciplines lean more towards different ends of this spectrum. Type specimens in science are an interesting example. My standard example is of different Makaton signs for ‘bird’ and ‘duck’ and the confusion that follows. Links to discussion on twitter recently with @chemdrK about how we need to encourage students to see the difference between descriptions and definitions (tags and categories) when, for example, talking about particles.
Facts can be arranged in different ways including random (disorganised), list, network (connections) and hierarchical. By providing at least some of this structure, from an expert POV, we save students time and effort so recall (and fluency) is much more efficient. Statistic of 20% vs 70% recall quoted. Need to find the source of this and look into creating a demonstration using science vocab for workshops.
The periodic table is organised data, and so the structure is meaningful as well as the elements themselves. Alphabetical order, or the infamous song, are much less useful.
Learning as a Generative Activity, 2015 is recommended but expensive at ~£70.
Boundary conditions are a really important idea; not what works in education, but what works better, for which students, in which subjects, under X conditions. This should be a natural fit for science teachers who are (or should be) used to explaining the limitations of a particular model. This is where evidence from larger scale studies can inform teacher judgment about the ‘best’ approach in their setting and context.
Bottom-up and top-down approaches then become two ends of a spectrum, with the appropriate technique chosen to suit a particular situation and subject. To helpfully use the good features of a constructionist approach we must set clear boundaries and outcomes; my thought is that for a=F/m we give students the method and then ask them to collect and analyse data, which is very different to expecting them to discover Newton’s Laws unassisted. It might, of course, not feel different to them – they have the motivation of curiosity, which can be harnessed, but it would be irresponsible to give them free rein. From a climber’s perspective, we are spotting and belaying, not hoisting them up the cliff.
Missed Opportunities And My Jobs List
As you might expect, there were several sessions I would have loved to attend. In my fairly limited experience this is a problem with most conferences. In particular I was very disappointed not to have the chance to hear the SLOP talk from @rosalindphys, but the queue was out of the door. The presentation is already online but I haven’t read it yet, because then I knew I’d never get my own debrief done. This applies to several other sessions too, but it was only sensible to aim for sessions which could affect my own practice, which is as a teacher-educator/supporter these days rather than a ‘real’ teacher.
After some tweeted comments, I’m reproducing my jobs list. This has already been extracted from my session notes and added to my diary for the coming weeks, but apparently it may be of interest. Either way, my customary appeal for feedback stands: please let me know what, if any, of this was useful for you, and how it compares with your own take-away ideas from the sessions. And if I didn’t catch up with you during the day, hopefully that will happen another time.
Talk to Dom about CPhys course accreditation
use references list to audit blended learning prototype module
add KS3 circuits example showing intrinsic/germane/extraneous load to workshop
review SOLO approach and make notes on links to facts/structured facts part of CLT
check with Pritesh if subject associations have been (or could be) involved with booklet development
read Kristy’s piece for RSC about doing your first piece of ed research
check references for advice on coding conversations/feedback for MRes project
search literature for similar approach (difficulty points scores) for physics equation solving
share idea re reports: a gap in comments may itself be an implicit comment
check an alert is set with EEF for science-specific results
use Robin’s presentation links to review roles for a research-informed school – might be faster to use Niki’s Research Lead presentation
build retrieval practice exercise for a physics topic that is staged, and gives bonus points for recall of ‘earlier’ concepts
TILE livestream from Dundee Uni; sign-up form?
follow Damian Benny
share ionic movement prac with Niki
add Cousin, 2006 to reading list
write examples of singapore bar model approach for physics contexts – forces?
pre-order Understanding How We Learn
use Oliver’s links as a way to describe periodic table organisation – blog post?
find correct reference from Oliver’s talk, AGHE et al. 1969, about self-generated vs imposed schema changing recall percentages
You’ll have to check in with me in a month to see how many of these have actually been done…
Going to a conference isn’t good CPD unless you reflect on the new information and apply it to your own practice. (This isn’t an original thought, of course; @informed_edu probably put it best a while back.) So although I found the day in Rugby really interesting – and all due congratulations to @judehunton and the team for a great day – if I want to make it worthwhile I need to think about it a little more. Just as feedback should be more work for the student than the teacher, reflection should be more intense for the participant than speaking was for the colleague leading the CPD workshop or talk.
photo of a notebook page from ResearchED Rugby
The notes I take during a talk are quite straightforward; I use a modified Cornell notes structure, adding key terms on the left before I leave to sum up, and tasks at the bottom I can tick off when completed. The bullet points for each session are from my notes, with italics marking out my thoughts and responses. Many of the speakers will be blogging or sharing their presentations, but I’ll update this in a week rather than waiting.
It’s not listed below, but one of the most valuable things for me about the day was talking to colleagues about their responses to the talks, how they planned to use the ideas and how I might get them involved in my projects. I was particularly touched by several colleagues, who I’ve ‘known’ through Twitter but not met before, who made a point of saying how they appreciated particular things I’ve done over the past few years. Always nice to be appreciated!
Emphasised that CLT (from John Sweller) is a really useful model but is disputed by some.
Load = intrinsic (which will vary depending on the student and their starting point) + germane (which builds capacity) + extraneous (distractions or ambiguities which we as experts know to ignore but students worry about)
Being concise with instructions reduces extraneous load so they can focus on what is intrinsic/germane. This might involve training them for routines early on.
Curiosity drives attention so ration it through the lesson!
Explicitly providing subject-specific structures to pupils means they organise knowledge into an effective schema. The process of making those links itself adds to the cognitive load, which is something to be aware of but not avoid.
This feels a bit SOLO to me; meaningful connections themselves are a form of knowledge, but one which is harder to test.
Acknowledged that his setting (Michaela) gets a lot of attention from media/Twitter and tends to polarise debate.
Spending time as a team on building a shared curriculum means more efficient use of that time; this is supported by school routines eg shared detentions.
Starting with the big ideas, break down content to a very small scale and then sequence. Bear in mind the nature of each facet; procedural vs declarative, threshold concepts, cultural capital, exam spec. One of my thoughts was that this must include knowledge about the subject, such as the issues described by @angeladsaini in her book _Inferior_.
Sequencing is a challenge when the logical order from the big ideas is contradicted by the exam spec order, which is supported by resources from the exam boards.
Booklets used which are effectively chapters of a self-written textbook. Really interesting approach, I’d love to see how students use these (write-on? annotate?) and the sources of explanations, links to learned societies etc.
Feedback to students may consist simply of the correct answers. I disagree with this, because which wrong answer they chose may be diagnostic, and sharing the process with them may help them recognise their own 'wrong tracks'. Also consider @chemDrK's post on students giving the right answer by rote without understanding.
Some really interesting ideas, but my concern is that this is only possible if the whole school follows a very clear line. That is much harder to ensure in an existing school than in one built from scratch, so it may not be scalable.
Researcher/Teacher role session by Kristy Turner (@doc_kristy)
0.6 Uni lecturer, 0.4 school teacher (plus freelance)
Teachers in school were slow to adopt evidence-informed practice, so an attempt was made to do some research looking at historical data (therefore no ethical issues).
Coding phrases from reports was a challenge. Codes were based on ideas from the A-Level Mindset book. I need to adapt this approach to analyse the reflective comments on workshops etc that will form the basis of my own MRes project.
Results showed that physics teachers, rather than science teachers as a whole, were the outlier (along with Music and Russian) in how often innate characteristics were praised.
Lots of the comments were vague, and this will itself inform report-writing. Many could be interpreted in different ways, which is worth remembering when writing for parents. My immediate thought is that some parents will be able to decode the comments much better than others (a social issue?), and we as teachers may recognise that the absence of a comment can itself reflect a judgment, eg if there's no comment about working hard, the student may be lazy.
An ongoing study is looking at student answers to ten chemical equation Qs, scored for difficulty by teachers based on values of coefficients, number of elements etc, comparing them before and after summer break. Some evidence that older students do better (‘year 9 into 10’ vs ‘year 8 into 9’) even without explicit balancing equations work in that year – is this because of increasing maturity, drip-feeding chemical equations over the year or something else?
I need to look for an equivalent test (or write one) for physics equations, with the equations assessed for difficulty in the same way.
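Thinking out loud: that scoring approach lends itself to a few lines of code. Below is a minimal sketch of the idea, assuming difficulty rises with the size of the balanced coefficients and the number of distinct elements; the weightings, example equations and names are all my own invention for illustration, not the rubric from the actual study.

```python
# A rough sketch: difficulty rises with the size of the balanced
# coefficients and the number of distinct elements. Weightings invented.

from dataclasses import dataclass

@dataclass
class Equation:
    text: str
    coefficients: list  # stoichiometric coefficients once balanced
    n_elements: int     # distinct elements appearing in the equation

def difficulty(eq):
    """Bigger coefficients and more elements -> a harder question."""
    return max(eq.coefficients) + sum(eq.coefficients) + 2 * eq.n_elements

questions = [
    Equation("2H2 + O2 -> 2H2O", [2, 1, 2], 2),
    Equation("C3H8 + 5O2 -> 3CO2 + 4H2O", [1, 5, 3, 4], 3),
    Equation("Fe + S -> FeS", [1, 1, 1], 2),
]

for q in sorted(questions, key=difficulty):
    print(f"{q.text}: difficulty {difficulty(q)}")
```

The same shape would work for a physics version – swap the inputs for, say, the number of terms in the equation and whether rearrangement is needed.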
Research-Informed Schools by Robin Macpherson (@robin_macp)
We need to start with a model of teacher competency which is reflective, not deficit-based. Research-informed practice is often time-effective, but the ‘informed’ matters because it is always adjusted/filtered by our own approach and setting. Professional judgment is key!
Paraphrasing @robin_macp: as all contexts vary, asking your own questions might be better than looking for someone else’s answers. #rEDRugby
The gap between research and practice is where weird ideas get in, and these are what cause us problems. I remember comments, years back, that some knowledge of ed-research is a vaccine against BrainGym and similar.
Building in ideas from, for example, Dunlosky can be as simple as making sure there are bonus points on tests for questions relating to earlier topics. We're making explicit that we appreciate and reward recall going back further than last week.
Not all ideas turn out to be useful. Differences in mindset seem to be real, but there’s growing evidence that these differences are slowly accumulated and not something we can change by displays or interventions.
My take on @robin_macp's comments on growth mindset: it's a measurable difference, and it helps, but it can't simply be provided. Cf. the pattern of exercise vs pills for blood pressure. #rEDRugby
A Research Lead will have many jobs to do, including but not limited to curation, distillation, signposting and distribution. (These words are my paraphrasing.) Making a school research-informed is a slow process, 5-10 years, not an instant fix. One link shared was TILE for good practice examples.
I'm flagging from lack of coffee and so will post the afternoon's sessions tomorrow. Or maybe the day after!
I have a full-time job, although ironically I'm not managing to blog nearly as much as when I was a classroom teacher, which was noticeably more than full-time. I'm fielding a lot of queries about physics teaching on Twitter, which is fine, but I thought it might save me a lot of hassle if I put the same links here. Over a third of those teaching physics topics, according to data reported on p2 of this report from Wellcome, are not physics specialists. This matches the data I've seen through my day job at the Institute of Physics.
But before I say too much, let’s start with a disclaimer: what’s on my blog and on twitter from me is not official IOP policy or approved content. The IOP doesn’t care about the music I listen to, the political views I share, the arguments I have about gun control, mental health support or how to spell sulphur. (Well, maybe that last bit.) When I blog and tweet, I speak for myself. I’ll do my best to explain the IOP approach, for example with energy stores and pathways or the best way to support gender balance, but my bosses will only care about what I send from my work email account on work time. They’ll defend me on that – or not, as the case may be – but my off-duty self is not their problem.
Teacher support via the IOP
Whether you’re new to teaching Physics or have been heading your department for decades, the IOP has supporting material for you via the For Teachers page. Among other suggestions, this links to the TalkPhysics forum (free to join), which I recommend for queries that include more detail than the average tweet. There are several projects running to support schools, including the Stimulating Physics Network and Future Physics Leaders; these run alongside the locally-based Physics Network Co-ordinators. If you want your department to receive a little more support, you can join the schools and colleges affiliation scheme which gets you the journal Physics Education among other perks.
Detailed and in-depth discussion of pedagogy is broken down into 5-11, 11-14 and 14-16 topics on the Supporting Physics Teaching site. If you’re after something specific you may want to drop me a line on Twitter, but the content is evidence-informed and referenced. Great material for when you have a little time to think and plan.
The Improving Gender Balance project grew out of the Girls in Physics report. Lots of resources are available and my colleagues are always happy to talk to schools interested in applying these ideas. The last set of data showed that in around half of UK state schools not a single girl carries on to A-level physics; the imbalance in some subjects is even worse.
For hands-on advice the IOP supports the Practical Physics site. This grew out of the Getting Practical materials and is well worth exploring, with guides to pass on to technicians. You may also find the Teaching Advanced Physics (TAP) site useful, not least because some of the concepts are now covered in the GCSE curriculum as well as A-level.
If you're an established physics teacher, the chances are that you do some informal coaching of colleagues even if you don't have an official role. This is what my day job is all about, so please give me a shout so I can ~~steal your ideas~~ discuss the sharing of good practice. You may also be interested in Membership and applying to be recognised as a Chartered Physicist, and I have supporting materials that could help.
Other sources
I may be biased but I think the IOP materials are a good start. There are, of course, other places to look! I’ve been involved with a couple of these but others I know from using them with students or colleagues.
There are simulations available at PhET and the Physics Classroom. Understandably they take an American approach at times, but they’re well worth checking out. Double check suitability before setting for homework, as some will need Java installing or updating so may not play well on mobile devices. Both include pedagogy discussions for teachers as well as simulations for students.
STEM Learning – what I still think of as the eLibrary, and linked to the physical library at York – has loads of great resources, including versions of some of those linked above. Two collections in particular may be of interest, which organise the resources according to a curriculum: 14-16 science resource packages and A-level science resource packages. Bizarrely, the topics within each subject are alphabetical rather than logical, but that’s pretty much my only criticism. A free sign-in is required.
I do some freelance work with Hodder Education. The textbooks are obviously worth a look, but I'm not here to advertise. One project you can get for free is the Physics Teacher Guide. This is matched to the student textbook and online (subscription) resources, but may be useful even if you don't have the budget to buy those for your workplace.
As an ASE member, I get the journal and magazine regularly. You shouldn’t need a login to access the Physics resources, which are an eclectic collection. I highly recommend the free downloads from the Language of Maths in Science project. Heads of Department might find membership worthwhile simply to access the Science Leaders’ Hub.
For Students
You may already pass these on to students – or have opinions about why that is a bad idea – but I think SchoolPhysics (from the author of the Resourceful Physics Teacher), HyperPhysics (concept maps linking physics ideas, probably best for A-level) and Physics and Maths Tutor (for past papers) are worth a look. Several of the above links may, of course, be useful for them too.
A-level students can get a free e-newsletter, Qubit from the IOP. Hodder also publish Physics Review for A-level students, which is a good way to extend their learning beyond the curriculum.
EDIT: I was prompted about IsaacPhysics, which of course is a great site and one I recommend to colleagues. Questions are organised by linked topics for the spaced retrieval practice we all know is so important. Thanks to @MrCochain for the reminder. They also have funded places on a residential bootcamp this summer for students in England between years 12 and 13 who meet one or more criteria, eg being in the first generation to go to university.
Please share any broadly useful resources via the comments; I’ve deliberately not started listing teacher blogs because I’d be here for ages. Maybe that can be a later post? But I have several others on my list, including materials to support the learning of equations and a review of an old science textbook. There’s never enough time…
I recently got involved in a twitter conversation about getting back into the classroom, and what to do between jobs to make yourself more attractive to schools. This post is based on the email I put together, and I'm going to start with the same warning I gave my correspondent.
I should point out I'm no expert on recruitment; I've never held a promoted post in school, so this is based on conversations I've had with colleagues, often in prep rooms, because of my day job.
Money
It's got to start with supply (and cover supervising). This is always going to be a pain, but the good news is that you get to check out the school in advance. Different schools will have different rules about supply, but linking up with the HoD will help; there are ways to make that link – more in a moment. And exam invigilation, although there's less of it now with fewer modules and AS exams, would still be a possibility.
You could plan to do some tutoring for now. The money isn't great but the time commitment is fairly low. It's best through word of mouth, but getting started with a few notices on community noticeboards and in coffee shops near colleges and sixth forms where students hang out can be effective.
The other choice is some kind of freelance publishing, possibly starting with TES or similar. If you have time, this is a good way to brush up on your pedagogy and stay familiar with specifications. Producing some generic resources on HSW (How Science Works) or similar will be a useful thing to take with you when on supply, as it shows you're a competent specialist. Other publishing work comes up online from time to time, but the hourly rate is fairly low.
Admin/Applying
Now’s the time to bring your CV up to scratch and work on phrases that will go into a cover letter. Review the CPD you’ve done and summarise responsibilities, so all the dates are to hand. Scan your certificates then put them all in one (electronic) place. Make sure you have up to date referee details, hopefully with a couple of spares.
As well as TES, make sure you’ve registered with local teacher agencies and the council recruitment page. Bookmark possible schools and their current vacancies pages. If there’s a standard LA application form – less common these days, but still possible – you might like to save a personalised copy with your information already added.
Brief digression: why the hell is there not a standard, national teaching professional profile form? All the information schools want is the same – it's just that every form puts it in a different, badly formatted order. Create one form, insist every school uses it, and add a one-page 'local supplement' which teachers can fill out. More time to spend on the cover letter…
It might be worth looking for the kind of post you’re after nationally, just to get a look at the kind of things that show up in job descriptions and person specifications. Then you can think up examples of times when you’ve done the kind of thing that matches up. This is how you can show that although you might cost more than an NQT, you’re much better value for money. (NQTs: this is where you look for non-teaching examples showing similar responsibilities and experience.)
Development
Try to see the time as an opportunity; a sabbatical, if you like! Explore subject associations and membership options. If there's a local group, check out teachmeets and similar. If the CV check showed up gaps in your skills, address them. Have you considered things like Chartered status? Even if you don't go through the process, looking at the requirements might help inspire your next steps. And if travelling for conferences is possible, they are a great way to build your skills and knowledge. The Association for Science Education (ASE) is the obvious first choice, being teaching-specific, but don't forget the IOP/RSC/RSB either.
Quite a few universities and organisations offer free online courses – STEM Learning in particular. You can add these to your CV, of course! TalkPhysics is an example of a forum for teaching discussion where you can swap ideas, if you’d like something less structured. Or borrow some science pedagogy books, read and reflect. A nice talking point at interview…
You might like to contribute reviews of the books, or posts with developed resources, on a blog or similar. UKedchat welcomes guest posts, for example. These will start arguments and get discussions going; you might even get lucky and score some free review copies!
A different way to keep your skills up to date would be volunteering. Secondary schools sometimes want reading volunteers, but I'd also suggest looking at local primary schools. How about offering to run a primary science club for a half-term? I did this in my local primary, using the RI ExpeRimental activities, and found it really interesting – I came away with a whole new respect for primary colleagues too! The IOP's Marvin and Milo cartoons would also be a good starting point for accessible yet interesting activities. And if you're not already a youth leader, that's another possibility: fancy running the Scientist badge for local Cub or Brownie groups?
It’s not something you want to do in September, but if you’re still looking in a few months then doing some development work gives an opportunity to get into school science departments. Choose a topic where teacher opinions would be useful or interesting, eg what resources would they use, or a survey of how they use animations in lessons. Do your research ahead of time. And then write a letter to the HoD, asking if you can visit and talk to the department to collect some anonymous data. The article will be interesting – you could even try submitting it to Education in Science or similar – and you get to talk to colleagues, sound out the school, and leave your contact details for when flu season hits…
As I said at the start, I’ve never been in a position of power when it comes to hiring, so I’d really appreciate corrections, additions and suggestions from those who have. What can Teacher X do?
Following a conversation on twitter about the phonics screening test administered in primary school, I have a few thoughts about how it’s relevant to secondary science. First, a little context – especially for colleagues who have only the vaguest idea of what I’m talking about. I should point out that all I know about synthetic phonics comes from glancing at materials online and helping my own kids with reading.
Synthetic Phonics and the Screening Check
This is an approach to teaching reading which relies on breaking words down into parts. These parts, and how they are pronounced, follow rules; admittedly English is probably less regular than many other languages! But the rules are useful enough to be a good stepping stone. So far, so good – that's true of so many models I'm familiar with from the secondary science classroom.
The phonics screen is intended, on the face of it, to check if individual students are able to correctly follow these rules with a sequence of words. To ensure they are relying on the process, not their recall of familiar words, nonsense words are included. There are arguments that some students may try to ‘correct’ those to approximate something they recognise – the same way as I automatically read ‘int eh’ as ‘in the’ because I know it’s one of my characteristic typing mistakes. I’m staying away from those discussions – out of my area of competence! I’m more interested in the results.
Unusual Results
We'd expect most attributes to follow a predictable pattern over a population. Think about height in humans, or hair colour: there are many possibilities, but some are more common than others. If the distribution isn't smooth – and I'm sure there are more scientific ways to describe it, but I'm using student language because of familiarity – then any sudden steps are interesting by definition. They tell us that something interesting is happening there.
The most exciting phrase to hear in science, the one that heralds new discoveries, is not "Eureka!" but "That's funny …" (usually attributed to Isaac Asimov)
It turns out that with the phonics screen, there is indeed a threshold. And that threshold just so happens to be at the nominal ‘pass mark’. Funny coincidence, huh?
The esteemed Dorothy Bishop, better known to me and many others as @deevybee, has written about this several times. A very useful post from 2012 sums up the issue. I recommend you read that properly – and the follow-up in 2013, which showed the issue continued to be of concern – but I’ve summarised my own opinion below.
[Graph: D Bishop, used with permission.]
More kids were being given a score of 32 – just passing – than should have been. We can speculate on the reasons for this, but a few leading candidates are fairly obvious (there's a toy simulation of the effect after this list):
teachers don’t want pupils who they ‘know’ are generally good with phonics to fail by one mark on a bad day.
teachers ‘pre-test’ students and give extra support to those pupils who are just below the threshold – like C/D revision clubs at GCSE.
teachers know that the class results may have an impact on them or the school.
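To make the 'bump' concrete, here's a toy simulation of that last mechanism. I'm assuming the underlying scores (out of 40, with the pass mark at 32) would be roughly bell-shaped, and that a good fraction of children landing a mark or two below the threshold get nudged up to it; every parameter is invented for illustration, not taken from the real data.

```python
# Toy simulation of the threshold bump. 'True' scores out of 40 are drawn
# from a binomial, giving a smooth bell shape; most children landing just
# below the pass mark get bumped up to it. All parameters are invented.

import random
from collections import Counter

random.seed(1)
PASS_MARK = 32

# Smooth underlying distribution of scores out of 40.
true_scores = [sum(random.random() < 0.78 for _ in range(40))
               for _ in range(10_000)]

# The nudge: a score of 30 or 31 often gets reported as a just-passing 32.
reported = [PASS_MARK if PASS_MARK - 2 <= s < PASS_MARK and random.random() < 0.7
            else s
            for s in true_scores]

counts = Counter(reported)
for score in range(24, 41):
    print(f"{score:2d} {'#' * (counts[score] // 40)}")
```

Print the histogram and you get the shape from Bishop's charts: a dip just below the pass mark and a spike sitting right on it, where a smooth curve should be.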
That last reason is the issue I want to focus on. If the class or school results are used in any kind of judgment or comparison, inside or outside the school, then it is only sensible to recognise that human nature should be considered. And the pass rate is important: it might be a factor when it comes time for internal roles, and it might be relevant to performance management discussions and/or pay progression. (All 1% of it.)
“The teaching of phonics (letters and the sounds they make) has improved since the last inspection and, as a result, pupils’ achievement in the end of Year 1 phonics screening check has gradually risen.”
From an Ofsted report
Would the inspector in that case have been confident that the teaching of phonics had improved if the scores had not risen?
Assessment vs Accountability
The conclusion here is obvious, I think. Most of the assessment we do in school is intended to be used in two ways: formatively or summatively. We want to know what kids know so we can provide the right support for them to take the next step, and we want to know where that kid is compared to some external standard or their peers.
Both of those have their place, of course. Effectively, we can think of these as tools for diagnosis – in some cases, literally. I had a student whose written work varied greatly depending on where he sat; his writing was good, but words were spelt phonetically (or fonetically) if he sat anywhere other than the first two rows. It turned out he needed glasses for short-sightedness. The phonics screen is, or was, intended to flag up those students who might need extra support; further testing would then, I assume, identify the reason for their difficulty and suggest routes for improvement.
If the scores are also being used as an accountability measure, then there is pressure on teachers to minimise failure among their students. (This is not just seen in teaching; an example I'm familiar with is ambulance response times, which I first read about in Dilnot and Blastland's _The Tiger That Isn't_, and the issues have continued – eg this from the Independent.) Ideally, this pressure would mean ensuring a high level of teaching and so high scores. But if a child has an unrecognised problem, it might not matter how well we teach them; they're still going to struggle. It is only the results telling us that – and in some cases, telling parents reluctant to believe it – that let us help them find individual tactics which work.
And so teachers, reacting in a human way, sabotage the diagnosis of their students so as not to risk problems with accountability. Every time a HoD put on revision classes, every time students were put in for resits because they were below a boundary, every time an ISA graph was handed back to a student with a post-it suggesting a 'change', every time a PSA mysteriously changed from an okay 4 to a full-marks 6, we did this. We may also have wanted the best for 'our' kids, even if they didn't believe it! But think back to when the league tables changed so BTECs weren't counted any more. Did the kids keep doing them, or did it all change overnight?
And was that change for the kids?
Any testing which is high-stakes invites participants to try to influence results. It’s worth remembering that GCSE results are not just high-stakes for the students; they make a big difference to us as teachers, too! We are not neutral in this. We sometimes need to remember that.
With thanks to @oldandrewuk, @deevybee and @tom_hartley for the twitter discussion which informed and inspired this post. All arguments are mine, not theirs.