The Canary in the Coalmine 4/4

Delayed, but it’s done – this will be the final post of commentary on the research Mark Whalley and I conducted last year and had published in SSR In Depth this summer. It’s open-access so please do have a look and let me/us know what you think. We’re also presenting on the data in January at the ASE Annual Conference.

In the earlier posts of this sequence I discussed the context and the data we collected, as well as the recommendations I’ve built on what we found. Please note that Mark and I collaborated on the paper but these posts are my opinions, some of them going significantly further than the data supports in isolation.

This post will have a slightly different focus; I’m going to describe the challenges I had with data collection, in the hopes it will help others avoid the traps I fell into, and think out loud about further research that might be fruitful and/or interesting. I am open to collaboration either as part of my day job or in my own time, so please do get in touch if this is something you’d like to discuss.

Data collection

  1. Teachers are really bad at completing surveys.
  2. Longer surveys get fewer responses.
  3. The busier a teacher is, the less likely they are to complete a survey.
  4. There aren’t many physics teachers in England in the first place.

Taken together, these facts mean that it’s hard to get enough responses to treat the results with confidence. The solution we came up with was an incentive, and after some discussions I secured the budget to provide a voucher to every person completing the survey. This obviously cost more than a prize draw, but avoided the complications of ensuring randomisation etc. To provide a choice, we gave the option of donating the money instead (to Education Support). So we set up the questions on one anonymous survey, linked the completion page to a separate one to gather contact info, and shared the link on social media.

Roughly an hour later we had over 200 responses, the vast majority of which were filled with random garbage and a procedurally generated email address in a transparent attempt to get the voucher. Because of my inbuilt cynicism – I mean, I have three kids and spent ten years in a classroom with other people’s teenagers – the voucher process was not automated. We identified the fraudulent attempts, started a fresh survey and tried again.

This time it took a couple of hours to be discovered, but we got about a thousand submissions. There was Latin in the free text boxes and random numbers in the data fields. A handful looked genuine but most were clearly bot-generated. A later version which was only emailed still got corrupted, suggesting the link had been shared online somewhere the spammers had access to it.

We considered asking participants to use a school-based email address for the voucher, but decided this would reduce people’s confidence in the anonymity. (Although I should emphasise the two surveys were separated by design, so it was impossible to link the answers to the participant, even for me.) The platform we used, strangely, didn’t have a built-in Captcha option as a question you could include. We did look at whether we could use an external service for that to enable the voucher request, or a trap question, but it was too complicated.

In the end, there was only one solution – putting my pattern recognition up against the scammers. I ended up figuring out plausible boundaries for several of the questions and having Excel grade the answers as green (possible), amber (dubious) or red (clearly a scam). For example, a participant who claimed to be at a school with 64 students but 20 science teachers was unlikely to be genuine. Although it was frustrating, it was also really interesting once I got into the data, especially as several columns showed a smooth bell curve rather than the clusters we’d expect to see in the real world. The clearest warning signs were:

  • very short time taken for survey completion
  • inconsistent participant qualifications (no GCSEs, no A-levels, but doctorates in physics and engineering? really?)
  • unrealistic school numbers, student and staff
  • incompatible reported FTE vs timetable load
  • random or repeated phrases in the free text fields

In the end, I filtered around 5000 responses down to just under 100; those, with any possible identifying information such as time of completion removed, were shared with Mark for the data analysis.

My advice: when creating an online survey, set up at least some questions to allow impossible answers and plan on how to identify them quickly. Free text fields will be auto-filled by some bots and this can provide an obvious clue. Decide in advance what your yes/maybe/no ranges will be.
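To make the approach concrete, here is a sketch of that kind of rule-based screening, translated from Excel into Python. Every field name and threshold below is an illustrative assumption of mine, not the actual boundaries we used; the point is the structure – decide your green/amber/red rules in advance, then grade each row mechanically.

```python
# Rule-based screening of survey responses, graded green/amber/red.
# All field names and thresholds here are illustrative assumptions.

def grade_response(resp, repeated_texts=frozenset()):
    """Grade one survey row as 'green' (possible), 'amber' (dubious)
    or 'red' (clearly a scam)."""
    red_flags = 0
    amber_flags = 0

    # Very short completion time is a strong bot signal
    if resp["seconds_taken"] < 120:
        red_flags += 1
    elif resp["seconds_taken"] < 300:
        amber_flags += 1

    # Implausible pupil-to-science-teacher ratio
    # (e.g. the 64-students-with-20-science-teachers case)
    ratio = resp["num_students"] / max(resp["science_teachers"], 1)
    if not 20 <= ratio <= 400:
        red_flags += 1

    # Inconsistent qualifications: a doctorate but no A-levels is suspect
    if resp["has_doctorate"] and not resp["has_a_levels"]:
        red_flags += 1

    # Reported timetable load incompatible with reported FTE
    # (assumes ~25 teaching hours for a full timetable, plus 10% slack)
    if resp["teaching_hours"] > resp["fte"] * 25 * 1.1:
        amber_flags += 1

    # Free text repeated verbatim across submissions suggests auto-fill
    if resp["free_text"].strip().lower() in repeated_texts:
        amber_flags += 1

    if red_flags:
        return "red"
    if amber_flags:
        return "amber"
    return "green"


bot = {"seconds_taken": 45, "num_students": 64, "science_teachers": 20,
       "has_doctorate": True, "has_a_levels": False,
       "teaching_hours": 22, "fte": 1.0, "free_text": "lorem ipsum"}

plausible = {"seconds_taken": 600, "num_students": 950, "science_teachers": 11,
             "has_doctorate": False, "has_a_levels": True,
             "teaching_hours": 20, "fte": 1.0,
             "free_text": "workload is the biggest issue"}
```

Graded this way, `bot` comes out red and `plausible` green. In practice the interesting work is in tuning the boundaries until the procedurally generated submissions separate cleanly from the handful of genuine ones.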

Next steps

I’m still looking at the dataset, for example to see whether there’s a correlation between responsibility roles and job satisfaction – in particular whether colleagues are less grumpy about SLT if they’re already a middle leader. I think it would be really interesting to see whether any of the same issues show up for subject specialists who have to teach outside their exact specialism but within the same department. How do French teachers feel about a Spanish-heavy timetable, for example? I’d like to know how other factors, specifically age and gender, affect both job satisfaction and attrition factors. Recent work looking at the exodus of women in their thirties from teaching suggests to me that alongside the recommendations in both the published paper and my commentary, you can make a big difference for those teachers by offering flexible working and matched timetables. We didn’t ask those questions because they count as sensitive data and trigger a whole new world of GDPR and ethics trauma.

If I were doing this again, I’d simplify the questions and ask colleagues to prioritise the possible changes. If they could only have three or five changes from a shortlist, which would make the biggest difference? The benefit of this is that we could separate the possible changes based on the tiers I used in my last post, so HoDs find out which of the things they have control over are worth fighting for.

One of the things we looked at was deprivation score, and it didn’t make as much difference as we expected. (There were noticeably more physics specialists in less deprived areas though.) I’d really like to see the teaching unions investigate this angle to see if there’s a relationship between the deprivation index of a teacher’s home compared to their workplace. How does this vary by subject and seniority? For example, for some years I travelled from a relatively high deprivation area to teach physics in a leafy suburb. I’m prepared to bet that although it’s noisy, there’s a similar signal nationally.

I’d like to repeat my suggestion from the last post: we need a national programme of exit interviews for all departing physics teachers. Why do they go and where are they going? What might change their mind? Link it to an ongoing job satisfaction survey and we can see how much difference complaining about workload in years 2 and 3 makes to the chances of them leaving after year 5.

Give me some money and I’ll run a nationwide anonymous exit survey for every physics specialist leaving a state school. I’ll find out where they’re going and why. I’d hope schools are doing this now, but why on earth isn’t a standardised set of questions for every departing teacher part of the Ofsted requirements for a school? Don’t add it to the league table, but anonymised to a regional level this would be a valuable tool. Add a threshold – X% of teachers ticking the same box at a particular school – which should be a warning sign for SLT and Ofsted. (Heads Round Table, call me.)

Final thoughts

I really hope someone has been reading these posts – maybe there will be an influx as term restarts. Or maybe I should give up on blogging the old way and create a Substack, but that just feels weird. If you’d like to discuss any of these ideas, please add a comment here, email me or find me on social media; I’m currently experimenting with BlueSky.

The Canary in the Coalmine 3b/4

Recap

Previous posts have introduced the research that we did, summarised the negative and attrition factors the teachers considered most significant, and linked this to what they reported might encourage retention. Understandably, we were most interested in the factors considered important by those who were seriously considering leaving! For at-risk colleagues, the changes they said would help most were:

  • financial incentives
  • increased planning time
  • reduced marking load
  • teaching physics only
  • lesson resources
  • department resources

One identified challenge is that different factors are within the control of different stakeholders within the education sector; I suggested that thinking about this in terms of tiers with fuzzy boundaries is helpful, and the sections below are organised accordingly.

Methods – all colleagues

All colleagues need to acknowledge the challenges, both generally and how it works in your local area/school/MAT. Teachers need to be enabled – which is more than simply encouraging them to do it! – to seek physics-specific support as well as more general teaching advice. I used to tell trainees that “there’s no such thing as a good teacher, just teachers who are good at X.” Find the local experts on specific areas of the curriculum, go to D&T to see how they deal with practical work and maths to see why kids struggle with your wording about ‘directly proportional’. Join the IOP’s professional community to get physics-specific resources.

Use those physics-specific resources for your own support and development, such as the free-to-access videos on IOP Spark. Wherever possible, invest your time in shared resources and transferable approaches, like a question bank you can use and reuse every lesson, iterating rather than creating from scratch. Where a school has these in place already, use them. If you can’t find them, ask. Quite apart from anything else, the benefit of your students knowing the ‘house style’ means you’re able to lean on department and school procedures rather than establishing your own!

For colleagues working in the classroom, especially early in their career, it’s worth remembering how many different skills you’re trying to polish. One analogy is that it’s like trying to learn everything about driving a car simultaneously:

  • general maintenance
  • using the controls
  • hazard perception
  • highway code
  • adjusting for changing conditions
  • following directions
  • planning a journey

What actually happens – or at least it did back in the mists of time when I learned to drive – is that you consolidate one set of skills and move on to others. My parents got me sitting in the passenger seat, thinking about the conditions and other road users, before I got my provisional licence. I built on my experience of being a cyclist and all of those specific hazards. I practised using the controls, including a gear shift and a choke – I did say I was old – in an empty car park on a Sunday, so no other cars to think about. And so on.

A department scheme of work and borrowed resources are the equivalent of someone else doing the journey planning so you can focus on making progress, while not going over the speed limit. Shared planning means you can improve consistency of style and approach in a department – we call it a staff team for a reason. And if you’re wondering why you’re seeing new colleagues struggling, remember the curse of knowledge and lend a hand.

Tactics – Heads of Department, CPD and MAT Science Leads

As suggested in the final section of our paper, matched timetables should be the first priority at the department level; this will involve advocating upwards, possibly in writing with a link to our research. A colleague with year 7, 8 and 9 science classes has three courses to teach. Each course includes resources, practicals, risks to manage, misconceptions, assessments… and two-thirds of those will usually be in their non-specialist area. It is easy for their last experience of those ideas to be from ten years and two degrees ago. So instead, could they have three classes of the same year group? Three opportunities to teach the same content, with far less time needed to review/choose/write resources. It also means it’s worth them investing time in a more streamlined feedback and marking approach. When the tests come around, marking three sets of the same paper is easier than three different ones.

Wherever possible, give them more physics. Unless you’re over-supplied with physics specialists – statistically unlikely – then their colleagues will also be pleased. All of the hinterland we talk about, all of the extra confidence and increased familiarity, means less time spent planning. They can use that time to dig deeper into the pedagogy, to improve their understanding of the school approach to assessments – or simply have a cuppa in the prep room. Over time they’ll still get to teach more of the curriculum, but nobody would expect every French teacher to be as good at teaching Spanish simply because they’re in the MFL department.

Every Head of Department wants to support their colleagues with shared resources, feedback and marking approaches which balance the needs of the students with the sanity of the teachers. This is about making sure everyone knows the materials are there to promote a consistent house style and reduce duplicated effort. The danger here is that if you’re short of physics specialists, anything you ask them to create in-house means extra work for them, per-person. If you’ve got one specialist building things for everyone to use, being the point of contact for issues and troubleshooting, then congratulations – you’ve got a lead practitioner! Are they getting paid as such? If you work across a MAT then you’ve got the added challenge of providing support that’s helpful in terms of workload but still allows flexibility for different schools.

To make this easier, there’s a whole load of resources and projects that can support you at a department or MAT level to increase physics confidence and competence across the team.

  • the Stimulating Physics Network (SPN, now administered by STEM Learning) works with the department to boost physics skills
  • the Ogden Trust works with schools and individuals; the SKPT project in particular is a great way for individual colleagues to be upskilled, but availability may vary depending on area and they will need dedicated time.
  • the CPD videos on IOP Spark are topic-based support including real classroom approaches as well as physics explanations, and the ECPL set are a good way to structure mentoring conversations with those new to teaching physics. The Early Career Framework is time-consuming enough without having to write your own physics modules.

Remember that making sure there are development opportunities provides many benefits; the point of including them here is that by reducing the overall workload in the team, you’re addressing an attrition factor. The downside of this is that it’s easy for biology and chemistry colleagues to feel that they’re missing out on refreshing and developing their own specialist areas.

I’m looking at ways HoDs could use a department audit, based on the questions used in our research, to identify local priorities and match them to specific, evidence-informed tactics. If you’re already monitoring this, please give me a shout!

Strategy – Headteachers and MAT leadership

My colleague and friend Mark is keen to use the model of ‘dealing with the closest crocodile’. Apart from what this reveals about the wildlife in Cheshire, it’s a great way to remind us that there’s almost always a crisis demanding the attention of education professionals at every level. Whether you use an Eisenhower matrix, GTD or some other way of prioritising, there will always be more problems than time or money for solutions. So why should the boss of a school with a hundred staff and two thousand students be spending time on physics teacher retention?

The title of this post is the reason why. Physics teachers are special, rare and hard to retain. But they’re not, despite the understandable hyperbole, actually unicorns. Instead, a good way to think about them is – to use a more recent biology concept – as an indicator species. Physics specialists have more options outside education, and those options are better paid on average (see Fig 5 in this from the NFER). The grass gets greener as they see their pay starting to plateau. Especially since the pandemic, they’re contrasting their situation – in the workplace Monday to Friday, with associated time and money implications – with their fellow graduates working in PJs from home, with decent coffee and no commute. What this means is that they have more options, so they act when their non-physics teaching counterparts, under the same strains and stresses, can only grumble. This is particularly true for early career colleagues who don’t have a mortgage yet and can make the most of the freedom that entails. They might be cheaper to hire, but the higher attrition rate may cancel out that saving. If a school can’t retain physics teachers, it’s a warning sign that other colleagues would be leaving if they felt they could.

It may feel unfair to pay them more money, but if your Head of Science tells you that your sole physics specialist is doing all the jobs that are shared between three biologists, it’s easier to see how that’s unfair too. If you lost all your Spanish teachers, you would expect the Head of MFL to make choices about which languages were offered to your new Year 7 students. That can’t be done with Science, so show you respect their specialism by supporting matched timetables instead.

If you’re responsible for workload and retention across a MAT, you can use economies of scale to show how investing in resources pays off when it comes to staving off teacher exhaustion. Of course, these problems are not unique to physics teachers. It’s just that we notice the effect of the attrition factors for them first. If when you model the workload effects of new initiatives – and I’m assuming you do – some teachers are hit harder than others, what’s your plan to reduce that impact? How are resources like IsaacPhysics supported across the schools in the trust to reduce the barriers to uptake? When you choose new platforms for retrieval practice or worked examples, do your staff know that they’ll be available for long enough that it’s worth investing in them?

Alongside Ofsted, surveys monitoring staff job satisfaction, wellbeing and concerns are going to pick up a lot of noise and occasionally an important signal. You can’t keep everyone happy all the time, but keeping most colleagues grumbling a little rather than a few constantly crying in the staffroom is a more realistic aim. I’m not qualified to tell you how to manage your staff – I’m just flagging up why the physics specialists might be a more urgent concern than their numbers may suggest.

Policy and Law – government, exam boards and publishers

I’m grouping these together because the overlaps are so hard to disentangle – and, frankly, I’d be amazed if any of them read what I’m typing. That’s a long way above my pay grade, but the reality is that if we don’t acknowledge who can make changes, we’ll get blamed for the ones we don’t make ourselves. Being a middle manager and being blamed for the things your bosses do sucks, so drawing a line and admitting what’s beyond your control is important.

Defining the curriculum (government), how it’s assessed (exam boards) and how it’s taught (professional associations and publishers) are not small jobs. They’re also dependent on each other, which is why education reform is always challenging. Back when I was teaching – and in over a decade there was only one year when I wasn’t teaching a new spec to at least one class – I was told it’s like trying to convert a diesel train to electric without stopping the journey. I’ve now decided it’s like trying to convert every train to a different type without cancelling journeys or reducing the timetable. So what would I recommend to the government, based on this research, to address physics teacher retention?

Firstly, throw money at the problem – but aim carefully. I’m personally really pleased that the government is acknowledging the concerns raised by the STRB and doing something about this for the profession, and even more so that it’s at least partially funded. They’re looking at how pay could be varied by subject and need, which is something we’ve already seen in physics. I’d argue that a long-term plan for this is needed, and the benefit needs to be spread out rather than just being front-loaded into bursaries. A physics teacher needs to know – as I did, ages back – that they can rely on the extra cash for long enough to get a mortgage. Overall, this needs to be considered carefully, not re-invented annually, and be built on top of broader sector funding reforms that address old buildings and a crumbling SEND component – not to mention the growing problem of the inflexibility of teaching as a career, post-pandemic.

As part of that, we need to be asking better questions. Focusing on physics teachers: it’s almost unbelievable that the DfE can’t say how many physics specialists are teaching in state schools. ‘Specialist’ is so poorly defined that deciphering the stats is nearly impossible. Teaching electricity to Year 10 is physics, right, so it needs a physics teacher to count as specialist?

Turns out it’s not that simple. If it’s a Physics GCSE class, then yes. (Let’s not get into whether a D&T teacher who’s a qualified electronics engineer counts, versus a chemistry graduate who did a ten day SKE to access the Physics PGCE programme.) But if it’s a class doing what I still think of as ‘Double Science’, then any science teacher counts as a specialist. We – and by that I mean the DfE – just don’t collect enough information. (If I’m wrong on this, please point me at something more useful.) Teaching within specialism – whether from graduation or acquired by longer-term development – means better outcomes for students and less workload for the teacher. But at the moment we just don’t know how big a problem it is nationally.

Adding to this, I’d like to know why physics teachers leave by asking the ones who are actually leaving. And I mean ALL of them. Give me some money and I’ll run a nationwide anonymous exit survey for every physics specialist leaving a state school. I’ll find out where they’re going and why. I’d hope schools are doing this now, but why on earth isn’t a standardised set of questions for every departing teacher part of the Ofsted requirements for a school? Don’t add it to the league table, but anonymised to a regional level this would be a valuable tool. Add a threshold – X% of teachers ticking the same box at a particular school – which should be a warning sign for SLT and Ofsted. (Heads Round Table, call me.)

Exam board specifications and teaching resources are two sides of the same coin – and they’re both determined by government decisions, interpreted broadly in some cases. That starts with thinking very carefully about what we want the curriculum to do, and most importantly what there is not time for. Gove shoved a load of stuff into the science specification that seemed like a good idea to him and his advisors, and since then the idea of a knowledge-rich curriculum has been taken for granted by many. And it’s not that it’s necessarily bad physics – it’s just that there’s an awful lot. A few years back I wrote a Physics Teacher Guide for Hodder and felt the need to acknowledge in the accompanying SOW how cramped it would be. We need to be honest about what we fit into the science curriculum and what will need to be left out. (Obviously this is true across all subjects, and that’s before we get into all the other things the media seem to think that schools should be responsible for.)

Creating good resources is harder than it looks, as every early-career teacher has discovered to their cost. The principles behind projects like Oak Academy are noble ones, but as all the publishers would tell you, making good materials takes expertise and time. Neither is free. Exam boards and publishers being entangled – and yes, Pearson, I’m looking at you – means that it’s very easy for specification-matched resources to be produced that then need regular updating and improving, all at a cost. This is an example of the general movement towards rental/subscription rather than ownership – which has benefits, especially for shared digital resources, but it’s not without challenges. I’m listening to Spotify at the moment, but I had to put a playlist together myself because the album I wanted has been ‘updated to a DELUXE Edition’ by the band and I want the original, damnit! (August and Everything After by Counting Crows, if you care.)

Ranting over

I think I’m probably done for now, but the above ideas show that the issues can be addressed, if not solved, in different ways. I’d love to hear suggestions about which are realistic and which are mistaken. My next post – probably at the weekend – will be digging into the survey process itself and what I learned from what didn’t go right.

The Canary in the Coalmine 3a/4

First things first; the wonderful people at the ASE have now made the published article open-access. You can read the paper via SSR in Depth without logging in, share it with colleagues (including HoDs or SLT) and generally check my working. I would emphasise that I’m a big fan of the ASE, have been a member for some years and encourage colleagues to engage with them. And no, I’m not being paid for that. I’m writing now as an engaged professional, separate from my ASE membership or IOP employment.

Recap

In the first two posts of this mini-series, I explained the context of the research and summarised some of the things we found. In particular, I discussed what can be described as

  • negative factors – what colleagues said reduced their job satisfaction
  • attrition factors – what they said made it more likely they’d choose to move on

That second one is particularly relevant in physics teaching because we lose so many colleagues from the English state sector. It’s important to note that, because of the way data is collected, moves into teaching in independent (fee-paying) schools and/or internationally happen but are hard to track. I’m interested in the patterns of this migration, so if anyone has data please give me a shout!

This post will examine a third list, the things which colleagues said might be a factor in encouraging them to stay. I’m describing these as retention factors but before I start discussing the stats, it’s important to include a caveat.

Every teacher is different, and every school is different. There are many reasons why teachers leave the profession, and because we tracked intentions rather than actions it will be an imperfect report. Mark is in the process of finishing a separate piece of research where he interviewed those who had made the leap, and that will be a valuable complement to this work. It’s not as if a Head of Department can work down the list of retention factors and add them all to the school policies. And which factors matter most in individual cases may not reflect the order we have here. As I said to a colleague on Twitter, this research – like many other trends we can see in large numbers, such as the reasons girls often choose not to do Physics or the differing perceptions of careers held by parents – gives us the questions to ask, not the definitive answers. (Thanks to Paul Hunt for prompting this response, which I’ve polished slightly here.)

“Happy physics teachers are all alike; every unhappy physics teacher is unhappy in their own way.”

(apologies to Tolstoy)

Part of the reason we limited our dataset to colleagues in the first five years was that when we ask similar questions of the most experienced colleagues, their answers are very different for reasons that make perfect sense. In unpublished data from previous work, a large percentage said that teaching out of specialism wasn’t a problem for them. It turned out that this was because most of the respondents fitted into one of two categories; many had significant experience and so had gained the skills and knowledge needed for teaching biology and chemistry topics with confidence. Many of the others were in settings where they only taught physics; they could honestly say it wasn’t off-putting for them because they didn’t need to do it! Neither of those situations is necessarily applicable to early career colleagues teaching across the curriculum.

Retention factors

Just as with the earlier questions, the teachers were asked how important possible changes would be in encouraging them to stay in teaching. These factors were defined before we started the survey, but were based on previous studies and, unsurprisingly, were often possible solutions to the factors proposed in the negative and attrition factor questions.

As before, there was a fairly close match between the factors identified by all respondents and those categorised as ‘at risk’; in fact the order was identical:

  1. financial incentives
  2. increased planning time
  3. reduced marking load
  4. teaching physics only
  5. improved behaviour policy

Beyond the scope of the published article, I spent some time looking at the smaller group who could be described as ‘high risk’. The difference was not huge, but it was interesting; for these respondents, other factors became more important, suggesting they’re particularly affected by the workload aspects. I’ve compared the percentages saying it would be a big or medium factor in encouraging retention with the equivalent for all respondents.

  1. financial incentives (93% rather than 92%)
  2. increased planning time (93% rather than 85%)
  3. teaching physics only (89% rather than 75%)
  4. lesson resources (86% rather than 70%)
  5. well-resourced department (82% rather than 70%)

The difference is interesting rather than ground-breaking, but to me it suggests that any solution which does not address their teaching-specific workload is doomed to failure. What I found particularly interesting is that even though they’re at much higher risk, the weight they gave to more money wasn’t massively higher. It’s a problem, yes – but it’s not the answer to everything. And although flexible working featured in other lists, by this point they’re beyond worrying about it. Definitely think about how you can offer it, but arguably that’s often a sector issue.

(I’ll also point out there was no correlation between the Index of Multiple Deprivation for the school and whether teachers reported concerns about department resourcing.)

How do we solve a problem like retention?

If it was easy, anyone could do it. And it’s amazing how many people, whether they’re celebrated opinion writers in the media or committed teachers on social media, think they can indeed do just that “with this one weird trick…” It turns out that it’s a bit more complicated than that, but this does not mean we shouldn’t try to address it.

I’m going to ignore behaviour improvements, not because I don’t think it’s important but because it’s not a physics teacher problem. We/they are particularly vulnerable to it, and SLT need to understand the challenges of working with hazardous practical tasks in large numbers when students are unable to follow instructions. But let’s be honest; not many schools will have a behaviour issue that only shows up in physics lessons. So what does that leave?

  • financial incentives
  • increased planning time
  • reduced marking load
  • teaching physics only
  • lesson resources
  • department resources

Who can solve a problem like retention?

One problem in organisations of any size is that when things go wrong there are a lot of people to blame. This is made worse when, for completely sensible reasons, those higher up in the organisation may not be able to share all the reasons for decisions that affect us. So the Head of Department gets the blame for the choice of the exam board, even when it’s a decision made at a school or MAT level, because they can’t or won’t share that explanation with their teams.

Something I have up by my desk is a reminder of the different layers in managing UK education and how they inter-relate. It’s an absolute mess and one specific to my day job, so I won’t share it here. Instead, a massively over-simplified version with layers that have decidedly fuzzy edges is below:

So the question now becomes, what changes could we make in the system to improve retention, and critically who can control or influence that? There’s no point in asking the Secretary of State for Education to share their physics teaching resources with an ECT or expecting every new teacher to successfully demand a higher salary because physicists are in short supply. So who makes which decision?

Although the bullet points above look like a wide range, they’re actually closely related. More money could be used to address practically every concern, but there are many reasons why that may not be the first solution we can apply. Separately, most of them are concrete suggestions that reduce the workload of physics teachers without simply delegating this work to someone else. Many departments will already use some or all of these – others will perhaps not have SLT who realise how much they would help. (I wrote ages ago about SLT needing to know what the actual teacher experience is like, and this is never more true than for physics colleagues teaching out of specialism because ‘it’s all science anyway.’) And addressing some of these will have a knock-on effect – buying in resources for the department will effectively increase available planning time, for example.

In my next post I’m going to work through these approaches and others, but in terms of the layers of influence/control. I’ll be starting with the foundation – the colleagues who work with students.

The Canary in the Coalmine 2/4

Recap

As I said in yesterday’s post, these musings are my responses to, and some behind-the-scenes explanations of, the article Mark Whalley and I wrote for SSR In Depth. This was based on a survey undertaken as part of my work with the IOP, but was peer-reviewed by colleagues through the ASE. The ideas here are mine rather than being IOP-approved policy, and I hope readers will see that they’re directly based on the data rather than taking a top-down approach.

Context

It’s very easy for teachers – and I’ve been there! – to feel like all schools are like their school. I suspect social media has reduced this somewhat, but it’s done that by encouraging polarisation and assumptions that all schools are variations on a small number of themes, based on those who are the loudest advocates. Getting actual numbers of active specialist teachers is surprisingly difficult, but a good estimate is that of the 30k science teachers in English state schools we’ve got between 4k and 7k physics specialists, rather than the 10k which would be a ‘fair share’. They’re not equally distributed, either – lots of discussion about this at the Royal Society of Chemistry.

Recruitment varies but 600 per year is a reasonable benchmark, and half of those leave (the English state sector) within five years. We’re not in a good place, even compared to the general concerns about teacher supply (such as this from the NFER).

Questions and Answers

Details about the methods are in the paper, and I’ll talk about the challenges in the last post of this series. Put simply, we wrote a survey with a bunch of questions, put it through ethics review and then asked teachers of physics to complete it. Because previous data obtained through my day job had been heavily weighted towards experienced colleagues, we chose to focus on those who were in the first five years of teaching. We further specified England to reduce the confounding factors of different educational systems. More than ninety sets of valid responses were collected – choosing which were valid was interesting but again, you’ll have to wait for the final post – and Mark then did all the hard work of data analysis, looking for patterns and correlations.

We asked respondents about their setting and their career to date. We did not identify schools, going to the extent of asking them to check the deprivation index from an online calculator rather than collecting the postcode ourselves. We chose not to collect any sensitive data which means, for example, we could not analyse any possible effect of age, gender or ethnicity on job satisfaction or attrition.

The survey asked whether colleagues were planning to remain in teaching. It’s really important to acknowledge that this is about intention, and because of when many of the respondents completed it – at the end of the summer term – they may have been at a low point. (As an aside – I’d love to see responses to this question on a monthly basis through the school year to recognise a cycle of peaks and troughs.) Mark has been working on more in-depth questions with former physics teachers, but this approach naturally suffers from the problem that those who respond tend to have the strongest feelings! The headline result: 32% of physics teachers surveyed were seriously considering leaving or actively planning to leave. There was no strong correlation with school characteristics such as size or deprivation score. Many factors were important both for those generally dissatisfied and those planning to leave, but the order varied – more on this in a moment.

“32% of physics teachers surveyed were seriously considering leaving or actively planning to leave.”

In the survey, respondents were asked to rate different factors in terms of their effect on:

  • Job satisfaction
  • Choice to enter teaching
  • Dissatisfaction
  • Intention if any to leave teaching
  • Probability of encouraging them to stay in teaching

What made physics teachers unhappy and what made them consider leaving were, unsurprisingly, overlapping lists. None of these will be a shock, but those of us who remember the ’24 tasks’ list (and a more recent iteration) will recognise that teachers are much more likely to accept what they see as necessary professional tasks than imposed administrivia.

Factors causing dissatisfaction, in order (all respondents)

  1. poor student behaviour/relationships
  2. salary
  3. planning workload
  4. marking workload
  5. administrative tasks
  6. lack of flexible working
  7. school leadership

These are not unique to physics teachers, of course. What was particularly interesting is that some of these factors correlated more strongly with intention to leave than others. Dissatisfaction with planning workload, lack of flexible working or school leadership is a clear warning sign that someone might be seriously looking at their options, not just recognising the challenges of the job.

Factors linked to attrition

It’s reasonable to expect that the factors causing colleagues to consider leaving will match the list above. It’s also predictable that when we analyse the importance of these factors, the percentages will go up when we look at those who described themselves as more likely to leave. They’re the high-risk group for attrition, so of course they’re more unhappy! What was really interesting to me is that some are much more significant, and the order changes too.

All respondents | At risk | High risk
salary (70%) | salary (74%) | planning workload (82%)
student behaviour (60%) | student behaviour (68%) | salary (74%)
planning workload (57%) | planning workload (63%) | marking workload (68%)
marking workload (57%) | marking workload (59%) | student behaviour (68%)
school leadership (51%) | school leadership (53%) | lack of flexible working (68%)
lack of flexible working (51%) | lack of flexible working (51%) | school leadership (61%)

The takeaway from comparing the ‘general dissatisfaction’ list with this is a simple one. Teachers grumble about student behaviour, salary and other factors. As a profession we definitely want to address these, but if we want to focus on retaining physics teachers they’re effectively background noise. If we can’t fix everything, what do we choose? The signal we need to look for – the specific issues that seem to be driving people out of the classroom – is the set of factors which matter more to this group than to the ‘general population’. Planning workload is at the top of that list, followed by lack of flexible working, marking workload and school leadership.

Coming soon (ish)

Next time I get to write, I’ll be looking at what the responses suggested about addressing – not solving – the issues, particularly for those at higher risk of attrition. In particular I’ll be sharing what I think about who might be able to make some of these changes, rather than putting all the responsibility in one place.

Feed Me Seymour!

I’m a bad person.

This is not because I tell off students (although I do). It’s not because I told my son that revolving doors are powered by mice (and when he doubted me, pointed out that they squeak). It’s not even because I’ve been known to write really depressing poetry (for therapeutic reasons, and usually unshared).

It’s because two days after it was made available, I haven’t managed to watch Demo: The Movie.

I will, really I will. But at work I still don’t have speakers, making watching anything on my desktop an exercise in frustration and lip-reading. And at home I’ve been busy cooking, washing up, child-rearing and taking a mostly-dead mouse back outside, much to the disgust of the cat. So it is as yet unwatched, despite my certainty that it will be interesting, funny and well-produced. With luck I’ll watch it over the next couple of days, so I can (a) blog about it over the weekend and (b) contribute to the forthcoming #asechat. So watch this space. I’m sorry, Alom.

In this context, it seems a little cheeky that I’m the one asking for feedback. But I wanted to post about my latest hare-brained scheme, as suggested in my previous item. I’ve set up a Google form, but this time it’s not for me. Instead, I’d invite any and all readers of my blog – and, I suppose, twitter feed – to take a moment to record what I’ve done that has helped them.


I’ll be including a link in every (non-political) post from now on. My hope is that instead of paying me, you’d be happy to document an ongoing portfolio of my impact outside my own classroom and school. A crowd-sourced testimonial, if you will. You don’t have to leave your name, just a few words about how what I did made a difference. If you’ve blogged about it, I’d love for you to include a link. Tweets are transient, comments on the posts are hard to collect together, but this would really help.

Blog Feedback via Google Form

Of course, if this post inspires you to add your own evidence-gathering Google Form to your site, and you link back here, the internet will quite possibly explode in a frenzy of recursion. So be careful.

T&L Ideas 2

Second in what will hopefully be a series of ‘echoed’ posts, based on the weekly emails I’ve been asked to produce in my setting. These are still my own ideas, rather than being based on suggestions from colleagues, so regular readers will probably recognise the ideas and links.

Three quick links about effective revision this morning; it seems appropriate given what many of our students are up to.

Five out of Three/Teach, Do, Review from David Fawcett: a useful framework for structuring a revision lesson, so students don’t spend an hour flicking through textbooks and chatting about Eastenders.

Some similar ideas, explained rather more briefly, are available through Student Toolkit. Some are printable so can be given to students as they walk in the door, and are intended to be used individually.

If you’re using computers, the free site bubbl.us lets students generate mind maps without too much of a learning curve. I find it useful to ask them to organise clear information from another source, eg Bitesize or S-Cool, in a graphical format. This way they can focus on links rather than making excuses for forgetting an odd fact. It’s easy for them to test themselves, just by covering up a section and challenging each other to fill in the ‘gaps’.

We’d be really interested in feedback or suggestions about these or any other classroom resources…

What should I share with colleagues? What would be your recommendations, of themes or individual ideas/links, that are most likely to increase involvement?

(Sounds like a teacher choosing lesson activities for an able but unmotivated class, doesn’t it…)

#rED2013 First thoughts

This is going to be a very quick post, and even when I’ve had a chance to process the day properly I’m sure it will be nowhere near as analytical as my colleagues’, some of whom also beat me to the keyboard. But it seems like a good idea to get this up on my site as soon as practical anyway.

What a great day.

If you made it, I probably didn’t speak to you – and I’m sorry. If you didn’t, then I’m afraid you missed a great day. But the videos will be up soon, loads of posts will no doubt be blogged and twitter won’t easily give up the #rED2013 hashtag. Which probably means we owe Taylor Swift fans an apology, but so it goes.

I shared my thoughts through the day, linking to the raw notes I was producing with Evernote. I tweeted the links as much as spotty WiFi and dying mobile batteries allowed. I’m linking the same notes – no added thoughts or reflection, no editing, no URLs – below. My plan is to post every day or so with tidied up, referenced and considered views on each of the sessions I was able to attend.

  1. Intro by Ben Goldacre
  2. Redesigning Schooling (two Toms)
  3. Dr Kevin Stannard: problems with ed research
  4. CUREE/Philippa Cordingley
  5. Chris Waugh – ed research from a class teacher POV
  6. Dr Jonathan Sharples
  7. Effect size debate
  8. Tom’s closing words

I had a great day, not only because of the excellent speakers (there were easily three times as many sessions I wanted to attend but couldn’t) but because of the audience. Even in passing it was great to meet colleagues enthusiastic about developing our practice, and to put names to the avatars with whom I converse on twitter. Although I’m surprised I was the only person I saw who thought to put an avatar picture on their conference badge…

I have a few thoughts for the future and any possible ResearchED2014. These are not criticisms, just things I wondered about.

  • How about a ‘speed-dating’ exercise, or simply a large room where teachers and academics can show up to meet? Perhaps have individual whiteboards by each desk, and let us write what we’re looking for or what we have to offer. “KS3 English classes, want to investigate SOLO for text analysis” or whatever.
  • Host/start an electronic list where we can sign up with those same kinds of interests to find a mentor/partner.
  • FAQ board – list questions at the start of the day, tagged for teachers/researchers, and anyone who wants can give their answers/thoughts
  • Enough time for coffee! Admittedly I chose to forgo lunch in the interests of more sessions.
  • A poster session where we can share successful projects with colleagues.

Last of all, it was clear during the day that some really big questions were being considered. I’ve long thought that CPD often has very different levels of application. I think it might be worth flagging sessions according to their interest for:

  1. Classroom teachers wanting to investigate methods to use directly with students eg Bloom’s, seating plans, group work.
  2. Senior management, heads of department, local authority advisors (while we still have them) who want to make sure policies and whole-school tactics are informed by the best possible evidence eg uniform, length of lesson/school day, sets/mixed ability.
  3. Professional associations, government decision-makers, curriculum developers who need to set national, large scale strategies which can support us all in a broad way.

So more posts will be arriving, sooner or later. In the meantime, sorry for any typos, haste or lack of clarity in the notes linked above. Comments are, of course, as welcome as ever.

TeachMeetMidlands 1/2

Last night – a warm summer evening – I finished work and then travelled into Derby rather than away, so I could attend a TeachMeet. If you’ve not been to one, I strongly recommend the experience; classroom teachers sharing an idea which should be usable more or less immediately. Quick talks (max 7 minutes in theory) and lots of chances to ask questions and share ideas. There’s usually coffee.

I’ll post soonish about the ideas I’ve taken away, although if you’re in a hurry you can see the quick notes I made via my CPD tracker – these are not yet proofed and will be gaining details and links when I get a chance to reflect. This post is my chance to share the resources I talked about there, and the presentation I didn’t end up doing.

Review Templates

I’m not bothering to embed the presentation, although you can have a look if you’re interested. Basically, I like to get students using the ideas to improve understanding, as a stage distinct from revision (although these are good for that too). I’ve spent a bit of time today tidying them up and you can now download a total of eight A4 pages in two sections. (They were a mixture of Word and Publisher originally – anyone know an easy way to stitch two pdfs into one file?)

Cornell Notes, Prior Planning, Fours as a pdf

These Are The Answers, PBODME, Blooms, 5Cs, Quarters as a pdf

Comments, thoughts and feedback welcome as always. The only one that’s not really self-explanatory, Cornell Notes, has its own post on this blog.

CPD Tracker

As the link above shows, I’m trying to better track (and reflect on) my CPD using a Google Form. This has lots of advantages (mobile as well as platform independent) and could potentially be used for accreditation or sharing within a group or department. In fact, I’m hoping it will get looked at as part of my #CSciTeach accreditation, which I will be blogging about soon.

My original post is probably still the most useful to explain, but you may also find the presentation helpful. This is what I would have delivered with more time, but this way I can reach those who care and avoid boring those who don’t!

(If for whatever reason the embedded version isn’t working for you, the presentation can be accessed directly.)

Please let me know what ideas, if any, are useful for you – nice to be able to show impact!

From the Classroom Up

So we had a Journal Club.

Getting on for 200 tweets from a small (but dedicated) group of Science teachers, with some tentative conclusions as Storified elsewhere. Although participants commented on the weak results from the case study – unavoidable with small groups on a single site – it certainly seemed interesting.

Could we show improved understanding, and hence achievement, by moving away from HSW skills integrated with content, and instead start KS3 by teaching these skills discretely? Enquiring minds want to know. If only there was a way to expand an interesting case study to get more reliable and/or generally applicable results. If only there was a general move towards gathering more evidence at a classroom level that could be widely shared in the profession…

“Hang on, fellas. I’ve got an idea.”


Where We Are

An interesting case study has found a benefit from one approach (discrete teaching of Sc1 skills at the start of KS3) over another (gradually introduced over the year). A small sample was involved at one school.

What We Could Do Next

As several people pointed out, we need more data before proceeding to a full trial. The next step would be collecting information about schools which use these two approaches and how well they work. How do schools assess students’ understanding of the language and methods? A Google form or similar would be an easy way to acquire the data without a high cost at this stage.

Trial Design

I should possibly leave this to the experts, but the whole point of this teacher-led approach is to get us involved. (Alternatively, the DfE could press release a huge study but not tell us what they’re actually investigating.) As I understand it, we’d need to

  1. Get an education researcher to co-ordinate design/timetables/data analysis.
  2. Produce standard resources to be used either all together (discrete unit) or spread through the year (integrated learning) – this could be based on CASE or similar approaches.
  3. Design outcome measure, ideally something cheap and non-intrusive.
  4. Recruit participant schools.
  5. Visit schools during trial (in both arms) to observe delivery, consider deviation from ‘ideal script’, and also raise profile of organisation/idea.
  6. This provides good ‘teacher/researcher’ links and could be used as a way to observe CSciTeach candidates perhaps, or at least accredit ‘teacher-researchers’.
  7. Collect data on outcomes for both groups. Tests need to be blinded, ideally marked externally or by computer. Workload!
  8. Data analysis – which approach gives the best results? Is this correlated with some characteristic of the schools?
  9. Share results widely, provide materials and best practice guidance based on evidence.
  10. Plan the next RCT, perhaps looking at the materials used.
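The allocation step itself is the easy part of the design. A toy Python sketch of randomised assignment of recruited schools to the two arms (the school names and the arm labels are made up for illustration; a real trial would stratify by school characteristics, which this doesn’t):

```python
import random

def assign_arms(schools, seed=None):
    """Randomly split participating schools between the two trial arms."""
    rng = random.Random(seed)  # fixed seed makes the allocation auditable
    shuffled = schools[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"discrete_unit": shuffled[:half],
            "integrated": shuffled[half:]}

arms = assign_arms(["School A", "School B", "School C", "School D"], seed=1)
```

Recording the seed means the allocation can be independently reproduced later – a small thing, but it’s the kind of transparency that separates a trial from an anecdote.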

Funding and Support

I’ve a few ideas, but they’re probably way off. I don’t know how much it would cost, either in terms of money or time. The EEF is focused on attainment of particular groups, so I don’t know how relevant it would be to their aims. (But their funding round closes in October.) The ASE, I suspect, would have the organisational skills but not the money. Might the Science Learning Centres have a part to play, if we consider this from the point of view of teachers developing themselves professionally while conducting research? It would also nicely complement some of the aims of YorkScience. And we shouldn’t forget the original author, Andrew Grime, although I don’t think he’s on Twitter. (We probably should have tried harder to get in touch with him before the Journal Club session, come to think of it…)

I’m sure there are many other questions that could be answered in UK Science classrooms. But the question should be, which one shall we try to answer first? Instead of complaining from the sidelines, teachers should, ideally through coordinated projects and their professional associations, get involved. This seems like an ideal chance to make the most of the Evidence-Based Teaching Bandwagon and could perhaps be launched/discussed at ResearchED2013. If we want to make something of it.

Do we?

 

An apologetic postscript: sorry to followers of the blog who got spammy emails about a post which wasn’t there. This was because I hadn’t read the fine print on Storify about not being able to embed the material on a WordPress.com blog.  It’s the same Storify I link to above, now happily live at the SciTeachJC site.

The Evidence-Based-Teaching Bandwagon

Evidence-based practice in education is getting more and more attention recently. Projects like #SciTeachJC have been part of this, but I think there’s a general movement towards wanting to base what we do on facts rather than wishful thinking. The problem is that it’s actually quite hard, for several reasons, to be an evidence-based-practitioner.

That doesn’t mean we shouldn’t.

I Want To… But I’m Lazy

There’s a lot of evidence to keep up with. A lot of teachers are still being told that learning styles are useful despite a lack of supporting data, and a recent Guardian article shows this also applies to the infamous Myers-Briggs ‘test’. This means that we as teachers aren’t accessing old research, let alone new material. This is hardly surprising when you consider the cost: joining the British Educational Research Association costs £89, for which you get six issues of the British Educational Research Journal (BERJ) each year, and four issues a year of the Curriculum Journal will set you back £135. There’s also the lack of time teachers have when constantly rewriting schemes of work to suit the latest national qualification change, of course!

I do my best to keep up, but I’ve only so much time and money. I pay for my own membership of the ASE. I buy my own books. I spend my own time developing what I know and what I can do. I make it to TeachMeets when I can, join in with #asechat and #SciTeachJC, read and try out in school and reflect afterwards. But the situation we’re in makes it difficult.

Of course, what makes it even more frustrating is when individual teachers know the research and want to make decisions about teaching based on evidence, but aren’t allowed to. It’s important to recognise that schools may have perfectly valid reasons for not following suggestions from research, and cost is obviously often high on the list! But we need to accept that sometimes we are not getting it right on an institutional level, and this needs to change. If it doesn’t change from the bottom up, it will inevitably – and probably slowly and painfully – happen from the top down.

What’s Already Available And Where From

Every school should have well-thumbed copies of Petty’s book Evidence-Based Teaching and Hattie’s Visible Learning.  In my opinion – as a classroom teacher, not a manager – schools could do a lot worse than spending half of every inset day applying just one of the ‘best-value’ concepts in every relevant department. The constantly updated research by Marzano in the States examines a wide variety of teaching methods in terms of their success against measured criteria. The database is freely available and there are materials to explain effect size.
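For anyone who hasn’t met effect size before, the most common measure (Cohen’s d, which underpins lists like Hattie’s) is just the difference between group means divided by the pooled standard deviation. A minimal Python sketch with invented test scores, not real data:

```python
import statistics

def cohens_d(treatment, control):
    """Effect size: difference of means in units of pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1 = statistics.stdev(treatment)
    s2 = statistics.stdev(control)
    # Pooled SD weights each group's variance by its degrees of freedom.
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled

# Made-up class test scores, purely for illustration:
d = cohens_d([68, 72, 75, 71, 80], [62, 70, 66, 69, 73])
print(round(d, 2))
```

A rule of thumb often quoted is that interventions below about d = 0.4 are unremarkable – which is exactly the kind of claim the effect size debate at researchED picked apart.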

The British Educational Research Association (BERA) and the Specialist Schools and Academies Trust (SSAT) spend time and money looking into the effectiveness of education policies and methods; the latter works primarily with schools. The Evidence for Policy and Practice Information (EPPI) unit based at the Institute of Education looks interesting, but not exactly accessible for those of us in the classroom. The GTC produced some research summaries with the title Research for Teachers (RfT) but I don’t know how well they were accessed; the group behind the summaries, the Centre for the Use of Research and Evidence in Education (CUREE), is still active. There’s the National Education Trust. And of course the Times Ed now has a weekly article bridging the gap between research and classroom practice, but I can’t find it online. There’s lots around, some free and some not so much. Some is purely academic while other groups attempt to translate it for classroom use.

The Education Endowment Foundation looks particularly at techniques to support those from disadvantaged backgrounds, but their EEF Toolkit is generally useful, ranking interventions in terms of ‘value for money’. The difficulty with this approach is that it ignores the cost in terms of time and pressure on teachers, something I am sure they are aware of. It is the limited time of individual teachers which means centralised research is so necessary. As of 22nd March they have a vacancy for a Senior Analyst, if you’re interested…

There are some smaller groups in the UK; the Evidence-Based Teachers’ Network grew out of training sessions and has some useful summaries. There are many practitioners active online, for example @teachitso on Twitter (Dr Mark Evans IRL) who has some useful summaries on his site. There’s also several (competing?) groups such as the Guild of Teaching and the Teacher Development Trust with a small impact so far.

What We Need To Do Better

Much – but not all – of the current evidence is based on action research. This means a practitioner decides to try an intervention, does so and records any measurable change in results. This could be exam scores, recruitment rates for post-16 courses (I did that) or something else. It tends to be small samples and a snapshot in time. Think of them as case studies. Useful because they’re a step up from staffroom anecdotes, but more a starting point than gold-standard data.

Ben Goldacre, following the paper he wrote on RCTs for social issues that we discussed in #SciTeachJC, was asked to consider the use of RCTs specifically in education. The report has now been published and has stirred up a lot of debate. He wrote an article about it for the Guardian, and it’s noticeable how conscientiously he’s engaged with those commenting. I’d recommend reading the paper itself, of course – unlike some of those commenting. I like the idea of getting more teachers involved in research, obviously, but many seem sceptical. From a teacher’s point of view, the main issue is getting hold of the information afterwards. But it’s okay, the government has a cunning plan…

From this announcement, the EEF will be one of six centres, alongside NICE, tasked with gathering and disseminating evidence on social issues. It deliberately follows the NICE model, where the evidence is analysed independently of government, which would then (hopefully) consider the results and implications. A big issue I see here, of course, is that we seem to be moving away from a centralised education system where new knowledge would result in new systems for all. But we’ll see how it works.

What I Would Like

I’ve said before – like many others on Twitter, and I’m sure I wasn’t the first – that we need a National Institute of Education Excellence: an organisation committed to performing more meta-analyses of research, like the Cochrane Collaboration, and then making sure everyone else knows. For this to work effectively, there are several things the system needs.

Information needs to be effectively free at the point of use. Schools won’t pay for what they think they can get for free elsewhere (even if they’re wrong) and if we say all teachers need the information, it seems odd to expect them to pay for it when they’re cutting our pay in real terms.

The research cannot be politically driven. Some of the answers will go against current government policy. Some of the research will show MPs or Ministers to be wrong. That’s how evidence works and they’re going to have to be prepared to accept the consequences. But we can’t expect Gove to follow the evidence if we don’t do what we can to (a) collect it and (b) use it as soon as we know.

Interventions will have different relevance to different people and institutions. I tend to think of strategic choices at national level (such as exam specifications), tactical choices at a school level (such as behaviour and homework policies, setting and ICT provision) and choices of technique in a classroom (such as how to make group work most effective). I’m sure it’s more complicated than that, but you get the idea. We need to get the right information to the right people.

We need a wish-list, as Ben puts it, of questions we want answered. Set up a Google form and let any of us suggest something to investigate; shortlist and vote every six months. Personally, I’d love to see a comparison between students taught to use Bloom’s and those who are exposed to SOLO. Is there a difference? Does it depend on the students? If so, which method should we teach to which kids?

Teachers should have the opportunity to build up their skills as researchers. If they are needed to do more than send a copy of the results their class got following intervention A/B/C (delete as applicable) then the chance to get involved in data analysis will make it more likely they put the results into practice.

Get current researchers involved in designing the interventions. Of course this might be difficult if they feel the Secretary of State for Education is dismissive of their views or their motivations. We need better links between academics and full-time practitioners (or more people who do both, like the wonderful @MaryUYSEG). Maybe BERA could offer discounted memberships to the data-collectors?

Share the results widely in a format that means it can be used immediately. Imagine a magazine format, published electronically every month in three sections; strategic, tactical and techniques. The summaries link to journal articles, which are made open-access for the month so we can all see how well the synopsis matches the evidence. And each month three case studies show how the evidence from six months was put into practice at all three levels.

Next

There are lots of groups talking about doing the same thing – linking research to practice. And despite having been in post for nine years, with a strong interest in science and evidence, I found half of the links in this post today for the first time.

Surely we can do better than this?

 

EDIT/UPDATE: It looks like something is happening rather quicker than we might have expected, thanks to the efforts of Tom Bennett. Check out the new blog for this September’s suddenly planned conference, ResearchED2013.