The Garand Thumb of IT
1. A Different Era of Engineering
In the early 20th century, firearms design entered a golden age. Engineers such as John Moses Browning did not iterate on opinions. They built systems that had to function under stress, in mud, in cold, in chaos. Their work did not tolerate approximation.
John C. Garand belonged to that same lineage.
At Springfield Armory, he designed the M1 Garand, adopted in 1936: a semi-automatic rifle that changed the dynamics of infantry combat through speed, precision, and reliability. General Patton would later call it “the greatest battle implement ever devised.”
It was not just a weapon.
It was a system.
And like all serious systems, it demanded understanding.
2. The Lesson of the Garand Thumb
There is a well-known phenomenon associated with the M1 Garand: the “Garand thumb.” It occurs during loading, when the operator fails to control the bolt and the mechanism snaps forward with enough force to crush the thumb against the receiver.
The rifle is not malfunctioning. It behaves exactly as designed. The injury results from misuse, misunderstanding, or lack of discipline.
In that world, the feedback loop is immediate. The system does not adapt to the operator’s mistake. The operator adapts to the system, or pays the price.
This is how standards emerge. Not from guidelines, but from consequence.
You learn once. You do not forget.
3. IT and the Rise of Organised Ignorance
This is where the contrast with modern IT becomes uncomfortable.
In IT, the problem is no longer isolated ignorance. It is organised ignorance.
Systems are routinely shaped by layers of stakeholders who do not understand how those systems behave under real conditions. Product defines without constraints. Leadership commits without validation. Governance expands without clarity.
Instead of correcting misunderstanding, organisations absorb it. They add process, increase reporting, and multiply coordination layers. The system becomes heavier, yet no more grounded in reality.
In IT, the system does not punish misunderstanding.
It protects it.
4. The IT Garand Thumbs
We have our own equivalents of the Garand thumb. The difference is not the presence of mistakes. It is the absence of consequence.
Consider how often roadmaps are committed at executive level based on ambition rather than validated demand. Engineering delivers with discipline, milestones are met, releases go out on time. Months later, adoption does not follow. The narrative does not question the initial assumptions. It shifts toward execution gaps, team velocity, or “market timing.”
This pattern is not anecdotal. Google Wave was not a failure of engineering capability. It was a failure of alignment with real user need. The same applies to Meta’s early metaverse push, where massive investment preceded clear product-market fit. IBM’s repeated repositioning over decades reflects not a lack of technical ability, but a recurring disconnect between strategy and grounded execution.
Closer to home, large-scale banking transformations and public sector programmes, including initiatives such as the NHS IT modernisation efforts, have demonstrated the same dynamic. Systems get built. Budgets get consumed. Progress gets reported. Outcomes remain elusive.
At a smaller scale, the pattern repeats daily. Platform teams become centralised bottlenecks, buried under ticket queues. Instead of questioning the model, organisations add layers of intake, prioritisation, and governance. The system slows down, but appears more controlled.
Metrics follow the same trajectory. Dashboards show green indicators while onboarding fails, conversion drops, and support volume increases. The numbers are accurate. They are simply disconnected from reality.
These are not isolated failures. They are systemic behaviours.
And yet nothing forces correction. The system absorbs the mistake, redistributes the pressure, and continues operating.
5. When Misuse Becomes the Norm
In established engineering disciplines, misuse leads to failure, failure leads to learning, and learning leads to standards. That loop defines professionalism.
In IT, the loop weakens, and often disappears.
Language begins to shift to protect the system from reality. A failed initiative becomes a learning. A delay becomes a reprioritisation. A structural flaw becomes a communication issue. Each reframing distances the organisation from the actual problem.
At the same time, a deeper confusion sets in. The industry claims to embrace failure as a path to learning, yet often behaves as if failure carries no memory. The distinction matters. Learning from failure requires retention, adjustment, and constraint. What emerges instead is a form of organisational amnesia, where the same mistakes reappear under new names, stripped of their history and consequences.
Over time, misuse no longer triggers correction. It generates noise, and that noise gets absorbed. Dysfunction becomes normalised, then rebranded as best practice.
Minimum Viable Product becomes unfinished work. Ownership dissolves into committees. Delivery continues, but increasingly as performance rather than value creation.
The system does not collapse.
It drifts.
And in that drift, the industry loses its reference for what correct operation looks like.
6. The Invisible Damage
The most significant damage is not immediate failure. It is the erosion of the system’s ability to represent reality.
Trust degrades quietly. Stakeholders stop believing timelines, yet continue attending governance rituals. Engineers disengage, not from lack of skill, but from lack of meaning. Leadership relies on dashboards that reflect internal motion rather than external outcomes.
The organisation remains active. Meetings continue. Reports circulate. Delivery pipelines move. But value becomes incidental rather than intentional.
Unlike a physical injury, this damage accumulates without visible signal. By the time it becomes undeniable, correction is no longer incremental. It is structural.
7. Restoring Consequence
Restoring professionalism in IT does not require another framework. It requires removing the protective layer that shields bad decisions from consequence and reconnecting systems to reality.
Metrics must reflect outcomes rather than activity.
Ownership must remain explicit and non-transferable. Friction should be exposed instead of managed away through additional process, and failure needs to become visible early, while it remains correctable.
Professionalism emerges when misunderstanding carries a cost. Not a symbolic cost, but a real one that forces adjustment.
8. What To Remember
The M1 Garand did not change. People learned to use it properly.
IT chose a different path. Instead of learning, it redefined. Instead of correcting, it absorbed. Instead of improving, it renamed.
This pattern extends beyond engineering. Modern systems, much like postmodernist movements, tend to prioritise interpretation over constraint and narrative over structure. Meaning becomes flexible, definitions shift, and accountability dissolves into perspective. It is a textbook case of postmodernism in practice: changing the meaning of words instead of confronting reality.
In that world, failure no longer anchors learning. It becomes another version of the story.
Renaming problems does not solve them. It makes them harder to see.
And that is the real injury.