The Uncomfortable Truth: Why Developer Training Is a Waste of Time
There’s an entire industry built around “improving” software developers. Conferences, workshops, bootcamps, online courses, books, certifications—billions of dollars spent annually on the promise that if we just train developers better, we’ll get better software. It’s time to say what many of us have privately suspected: it’s all just theatre.
Here’s why investing in developer training is increasingly pointless, and why organisations would be better served directing those resources elsewhere:
- Nobody’s actually interested in improvement
- Developers don’t control what actually matters
- GenAI has fundamentally changed the equation
Let’s examine each of these uncomfortable truths.
1. Nobody’s Actually Interested in Improvement
Walk into any development team and ask who wants to improve their craft. Hands will shoot up enthusiastically. Now watch what happens over the next six months. The conference budget goes unused. The book club fizzles after two meetings. The internal tech talks attract the same three people every time. The expensive training portal shows a login rate of less than 15%. Personal note: I have seen this myself time and again in client organisations.
The uncomfortable reality is that most developers have found their comfort zone and have little to no genuine interest in moving beyond it. They’ve learned enough to be productive in their current role, and that’s sufficient. The annual performance review might require them to list “professional development goals”, but these are box-checking exercises, not genuine aspirations. When developers do seek training, it’s often credential-seeking behaviour: resume-building for the next job search (a.k.a. mortgage-driven development), not actual skill development for their current role.
This isn’t unique to software development. In most professions, once practitioners reach competence, the motivation for continued improvement evaporates. The difference is that in software, we’ve created an elaborate fiction that continuous learning is happening when it definitely isn’t. The developers who genuinely seek improvement are self-motivated outliers who would pursue it regardless of organisational investment. They don’t need your training programs; they’re already reading papers, experimenting with new technologies, and pushing boundaries on their own time.
2. Developers Have No Control Over What Actually Matters
Even if a developer emerges from training enlightened about better practices, they return to an environment that makes applying those practices simply impossible. They’ve learned about continuous deployment, but the organisation requires a three-week approval process for production releases. They’ve studied domain-driven design, but the database schema was locked in five years ago by an architecture committee. They’ve embraced test-driven development, but deadlines leave no time for writing tests, and technical debt is an accepted way of life.
The factors that most impact software quality—architecture decisions, technology choices, team structure, deadline pressures, hiring practices, organisational culture, the social dynamic—are entirely outside individual developers’ control. These are set by management, architecture boards, or historical accident. Having developers trained in excellent practices but embedded in a dysfunctional system is like teaching someone Olympic swimming techniques and then asking them to compete while chained to a cinder block. (See also Deming’s Red Bead Experiment.)
Moreover, the incentive structures in organisations reward maximising bosses’ well-being, not, say, writing maintainable code. Developers quickly learn that the skills that matter for career advancement are political navigation, project visibility, stakeholder management and sucking up—not technical excellence. Training developers in better coding practices while maintaining perverse incentives is simply theatre: it lets organisations feel good about “investing in people” while changing absolutely nothing that matters.
3. GenAI Has Fundamentally Changed the Equation
The emergence of generative AI has rendered much of traditional developer training obsolete before it’s even delivered. When Claude or GPT can generate boilerplate code, explain complex algorithms, refactor legacy systems, and even architect solutions, what exactly are we training developers to do? (Maybe AI has a more productive role to play in helping developers maximise their bosses’ well-being.)
The skills we’ve traditionally taught—memorising syntax, understanding framework details, knowing design patterns, debugging techniques—are precisely the skills that AI handles increasingly well. We’re training developers for skills that are being automated even as we conduct the training. The half-life of technical knowledge has always been short in software, but AI has accelerated this to the point of absurdity. By the time a developer completes a course on a particular framework or methodology, AI tools have already internalised that knowledge and can apply it faster and more consistently than any human (usual AI caveats apply).
The argument that developers need to “understand the fundamentals” to effectively use AI is wishful thinking from an industry trying to justify its existence. Junior developers are already shipping production code by describing requirements to AI and validating outputs. The bottleneck isn’t their understanding—it’s organisational factors like the social dynamic, relationships, requirements clarity and system architecture. Training developers in minutiae that AI handles better is like training mathematicians to use slide rules in the calculator age.
The Hard Truth
The developer training industry persists not because it works, but because it serves organisational needs that have nothing to do with actual improvement. It provides HR with checkboxes for professional development requirements. It gives managers a feel-good initiative to tout in interviews and quarterly reviews. It offers developers a sanctioned way to take a break from the grind. Everyone benefits except the balance sheet.
If organisations genuinely wanted better software, they’d stop pouring money into training programs and start fixing the systems that prevent good work: rigid processes, unrealistic deadlines, toxic relationships, flawed shared assumptions and beliefs, and misaligned incentives. They’d hire fewer developers at higher salaries, giving them the time and autonomy to do quality work. They’d measure success by folks’ needs met rather than by velocity and feature count. But that would require admitting that the problem isn’t the developers—it’s everything else. And that’s a far more uncomfortable conversation than simply booking another training workshop.
