Across government, a pattern repeats. Supply-chain risks are identified early. Reports are written. Committees convene. Yet action is deferred – often until disruption is already underway.
The Covid-19 Inquiry documented how warnings about PPE supply vulnerabilities circulated for years before the pandemic. The 2017–18 energy price crisis followed months of signals about gas storage capacity. In each case, the information was available. The challenge was converting it into a decision.
Why warnings lose urgency
The conventional explanation is that departments lacked the right data or frameworks. But this rarely holds up. Risk registers existed. Escalation routes were defined. The difficulty lies elsewhere: in the psychology of acting on risks that have not yet materialised.
Behavioural economics offers a useful lens. Present bias – the tendency to overweight immediate costs against future benefits – makes it feel rational for individuals to defer spending today on problems that might not arrive until next year. Ambiguity aversion compounds this: when probabilities are uncertain, decision-makers prefer inaction to committing resources on the basis of incomplete information.
These biases are not irrational. They reflect how humans navigate uncertainty. But they interact badly with supply-chain dynamics, where lead times are long and options narrow quickly once disruption begins.
A supplier that takes six months to qualify cannot be replaced in six weeks. A stockpile that requires twelve months to build offers no protection if the decision to build it comes three months before a crisis. By the time a risk becomes undeniable, the window for low-cost intervention has often closed.
Organisational structures amplify the problem
Individual bias would be manageable if organisations were designed to counteract it. Often, they do the opposite.
Responsibility in government is deliberately structured into domains – what might be called organisational "wells". Procurement teams see supplier behaviour. Policy teams see strategic intent. Delivery teams see operational friction. Each perspective is locally coherent. Taken together, they would tell a different story – but the system provides few incentives and limited permission to assemble them.
The accountability structure reinforces this. Acting early on cross-boundary signals is personally risky: you are overstepping your domain without a formal mandate, and if the judgement proves wrong, exposure follows. Waiting until the crisis is undeniable carries no such risk – the institutional narrative absorbs the failure as unavoidable. Early synthesis is therefore discouraged; late synthesis is excused. The system does not punish delay. It punishes premature action.
Governance processes compound this by requiring explicit triggers: formal escalation, agreed language, and recognised thresholds. Until those conditions are met, action remains difficult to justify. Observations stay local. Implications stay implicit. Decisions wait for permission – by which point alternatives may already be narrowing.
As I argued previously in these pages, initiative overload and fragmented accountability make it hard to pause, simplify, or reset in time. Supply-chain warnings face the same dynamic: they compete for attention against more immediate pressures, and lose.
The gap between knowing and doing
The result is a familiar dynamic. Risks become visible only in hindsight – not because signals were absent, but because no part of the system was designed to convert dispersed observations into timely decisions.
This is not primarily a forecasting problem. In many areas of supply-chain management, the relevant time horizons, dependencies and lead times are well understood. The gap is between knowing and doing – between information that exists somewhere in the system and decisions that require someone to act on it.
What departments can do differently
Addressing this does not require new frameworks or wholesale reform. It requires counteracting predictable biases through governance design.
Make key dependencies visible across domains. If procurement, policy and delivery teams can see how their observations relate to each other, partial signals are more likely to be connected before they harden into constraints. Simple dependency maps – regularly updated – reduce reliance on informal escalation.
Define explicit decision rules for accumulated warnings. Rather than waiting for a single definitive trigger, departments can specify when accumulated signals – supplier concentration, lead-time extension, price volatility – should prompt cross-boundary discussion. This shifts the burden from proving a crisis is imminent to acknowledging that risk has crossed a threshold.
Treat inaction as a decision with costs. Present bias makes delay feel neutral. Governance can counteract this by requiring explicit sign-off on continued inaction when warnings persist. If a risk has been flagged for two consecutive review cycles without action, the decision to wait should be documented and owned.
Build slack for early intervention. As with initiative overload, the discipline to hold capacity in reserve – rather than committing every resource to current priorities – creates the space to act before options narrow.
A leadership discipline, not a technical fix
Supply-chain failures are often described as sudden shocks. In reality, they are more often the product of slow-building blind spots – not because warnings were ignored, but because the system made it easier to wait than to act.
The question for leaders is not whether they have the right risk registers, but whether their governance makes early action feel permissible. In a world where supply chains can shift faster than procurement cycles, the cost of waiting is often higher than the cost of acting on incomplete information.
Vsevolod Shabad researches and advises on cognitive bias, governance failures, and decision-making in complex organisations. His work draws on executive experience in critical infrastructure and technology across eight countries.