Monthly Archives: March 2026

Evidence that GenAI is widening the software engineering skills gap

According to the FT, demand for software engineers is rising again, and in relative terms is outperforming the wider jobs market. That’s the headline most people will take away.

But the more important detail is that the growth is concentrated in more experienced roles, while entry-level hiring remains weak.

That broadly lines up with a concern I wrote about recently for Ada National College for Digital Skills: GenAI is amplifying the already existing skills gap in software engineering.

Genuinely strong, experienced engineers are already a relatively small pool. If the FT data is right, demand is increasingly being concentrated into that already constrained part of the market.

That means a supply squeeze: salary inflation, harder hiring, slower execution, and more organisations unable to deliver on the strategy and goals they have set.

Further, as we saw in the pandemic hiring boom, shortage pressure creates title inflation and level distortion, such as mid-level people taking senior or lead roles at new companies. The result is organisations paying more for talent that is, on average, less experienced than the role implies, while carrying more execution risk and less capability depth than the org chart suggests.

If companies keep optimising away that layer, it is not just bad for the industry overall; it is economically short-sighted for the individual organisation. It increases future dependence on an already constrained and expensive part of the market, while missing the real value of junior engineers: not just what they contribute now, but the capability they become.

Parkinson’s Law is really about organisational bloat

Parkinson’s Law is mainly interpreted as “work expands to fill the time available”. That line was only ever a hook. The original paper is really about how organisations create extra work and headcount for themselves, regardless of whether there is more genuinely useful work to do.

As organisations grow, work often grows around the work itself – layers, handoffs, approvals, reporting, internal dependency. More work about work.

That’s why adding more people, more process, more tooling (more AI?) so often fails to produce the productivity gains people expect.

Adding people can be beneficial to a point, but often far less than people might expect – and beyond a point, things usually get slower and more expensive, not faster. Doug Putnam’s analysis of hundreds of software projects found small teams were generally best, with 3-5 often the economic sweet spot, and larger teams quickly suffering diseconomies of scale.
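One common back-of-envelope way to see why this happens (a toy illustration, not Putnam’s actual model) is to count the pairwise communication channels in a team: with n people there are n(n−1)/2 possible channels, so coordination overhead compounds far faster than headcount.

```python
def channels(n: int) -> int:
    """Number of distinct pairwise communication channels in a team of n people.

    Toy illustration only: channels grow as n*(n-1)/2, quadratically,
    while headcount grows linearly -- a simple sketch of why larger
    teams can suffer diseconomies of scale.
    """
    return n * (n - 1) // 2

for size in (3, 5, 9, 15):
    print(f"team of {size:2d}: {channels(size):3d} channels")
# A team of 5 has 10 channels; a team of 15 has 105.
```

Tripling the team from 5 to 15 multiplies the possible coordination paths by more than ten, which is one (simplified) reason the sweet spot sits at small team sizes.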

The countermeasures all aim at keeping things as small, clear and contained as possible:

  • clear direction and priorities
  • clear accountability, ownership and decision boundaries
  • ruthless prioritisation
  • less work in flight – a culture of finishing

This is why I keep coming back to the same principle:

Fewer, better people doing less, better will usually get more done.

Not because small is always magically better. But because complexity compounds, and every extra person, process, dependency and priority adds drag.

Footnote: Probably the grossest misinterpretation of Parkinson’s Law is the idea that giving individuals less time or tighter deadlines will somehow make work happen faster. In practice, that often just compresses time without removing the underlying drag.