Beyond Prices: Why the Market Can’t Think in the 21st Century
I. The Illusion of Epistemic Competence in a Commodity Price
Consider a seemingly trivial episode in contemporary technology markets: within a week, the retail price of a 128GB DDR5 memory kit rose from roughly £350 to nearly £900. Nothing in the underlying material conditions of semiconductor fabrication changed. No fabs were flooded; no supply-chain chokepoints suddenly appeared; no cartel announced a coordinated restriction of output. What shifted was the psychology of a handful of highly leveraged funds reacting to a rumour that AI demand might tighten global inventories.
If we accept orthodox economic doctrine, this near-tripling of the price should be read as a revelation of underlying scarcity or heightened marginal utility. Yet such an interpretation collapses under even cursory scrutiny. We are not witnessing scarcity; we are witnessing amplification. Not a discovery of need, but a displacement of priorities. A signal that tells us more about liquidity and leverage than about productive demand.
To call this an instance of “efficient allocation” is to hollow out the concept of efficiency itself.
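The amplification mechanism can be made concrete with a deliberately simplified sketch. All parameters below are invented for illustration: a fixed supply of memory kits, a conventional downward-sloping demand curve calibrated to clear at £350, and a price-insensitive speculative overlay funded by leverage rather than use-value. Nothing about supply or productive demand changes, yet the clearing price multiplies.

```python
def clearing_price(supply, calibrated_demand, elasticity, speculative_bid=0.0):
    """Solve for the price at which total demand equals fixed supply.

    Ordinary demand is modelled as D(p) = calibrated_demand * p**(-elasticity);
    the speculative component bids for units regardless of price.
    """
    # Bisection search: demand falls as price rises, so excess demand
    # at the midpoint means the clearing price lies above it.
    lo, hi = 0.01, 1e6
    for _ in range(100):
        mid = (lo + hi) / 2
        demand = calibrated_demand * mid ** (-elasticity) + speculative_bid
        if demand > supply:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

supply = 1000.0                          # units available: unchanged throughout
calibrated = 1000.0 * 350 ** 0.3         # chosen so baseline demand clears at £350

p0 = clearing_price(supply, calibrated, elasticity=0.3)
# A leveraged overlay bidding for 25% of supply, insensitive to price.
p1 = clearing_price(supply, calibrated, elasticity=0.3, speculative_bid=250.0)

print(f"baseline ≈ £{p0:.0f}, with speculative overlay ≈ £{p1:.0f}")
# → baseline ≈ £350, with speculative overlay ≈ £913
```

With inelastic ordinary demand, an overlay absorbing a quarter of supply is enough to multiply the price roughly 2.6-fold, which is the order of magnitude of the episode above. The model is a caricature, but the caricature is the point: the price jump encodes leverage, not scarcity.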
Twentieth-century economics relied on a quasi-mystical conviction that markets translate decentralised knowledge into coherent price signals. A rising price, in this worldview, reflects a collective assessment of scarcity, desirability, and opportunity cost. But this interpretive edifice rests upon a world in which informational opacity was the norm. As long as the world remained difficult to observe, prices appeared oracular.
The illusion held because society, institutions, and governments lacked the epistemic tools to contradict it.
We no longer live in that world. We inhabit a landscape in which informational scarcity has been radically overturned, yet the mythological status of price persists as though nothing has changed.
II. The Temporal Displacement of Hayek’s Information Thesis
Hayek’s The Use of Knowledge in Society (1945) remains the canonical articulation of why markets supposedly outperform any form of planned coordination. The argument is elegant: information is fragmented, tacit, and dispersed. No central authority, however well-intentioned, can acquire and process the relevant data required to allocate resources effectively. Markets, through prices, achieve what no planner could.
In 1945, this claim was empirically plausible. States lacked computational capacity, statistical depth, and real-time data. Large organisations were epistemically clumsy. The price mechanism could, in many domains, outperform bureaucratic judgement because the alternatives were primitive.
But this is a historical observation, not an eternal truth.
Hayek did not and could not anticipate the emergence of:
- global, real-time supply-chain observability through AIS, RFID, and satellite logistics;
- behavioural telemetry generated passively at planetary scale;
- ubiquitous mobile sensors embedded in consumer devices and industrial processes;
- networked databases able to integrate, reconcile, and interpret information in minutes rather than months;
- synthetic modelling architectures capable of generating counterfactual futures with high fidelity;
- intelligence agencies operating as de facto epistemic institutions with observational reach spanning continents;
- machine-learning systems able to distil patterns from data too voluminous or subtle for human cognition.
Hayek’s central assumption—that markets enjoy an irreplaceable epistemic advantage—has been eroded by the very informational infrastructure of modernity.
The 21st century does not vindicate the planner. But neither does it vindicate the market. What it does, instead, is dissolve the premise that markets are uniquely positioned to coordinate economic life.
III. Intelligence Agencies as the New Super-Observers
Intelligence agencies do not set prices or manage inventories. They do not orchestrate production schedules or allocate scarce inputs. But they operate with an observational granularity that undermines the foundational claims of laissez-faire ideology.
The analysts embedded in the so‑called “numbers club”—whether at GCHQ, MI5, MI6, or their cognate institutions—process data at volumes and speeds that were inconceivable even two decades ago. Their work fuses communications metadata, geospatial imaging, open-source intelligence, financial network analysis, supply-chain telemetry, and behavioural signals. No planner of the mid‑20th century could have imagined such epistemic capabilities.
The point is not that intelligence agencies should run the economy. It is that their very existence falsifies the assumption that centralised bodies are condemned to informational incompetence.
This leads to an unavoidable conclusion:
If the state can know more—and know earlier—than market actors about systemic conditions, then markets can no longer claim to be the sole or superior locus of economic coordination.
This is not an endorsement of technocracy. It is an empirical recognition that the informational landscape has changed, while our theoretical frameworks have not.
IV. Why Ability to Pay Has Become Decoupled from Social Usefulness
The RAM example is not an aberration. It is emblematic of a broader phenomenon: ability-to-pay is increasingly orthogonal to social utility.
A speculative trader who can mobilise £900 on short notice does not, by virtue of liquidity, possess a stronger claim to a scarce computational resource than a university researcher whose budget is capped. Yet market logic treats the former as more deserving because the market has no concept of desert—only demand, and no concept of demand—only purchasing power.
The market never asks:
- Who can generate the greatest positive spillovers from this resource?
- Who stands to contribute to long-run scientific or societal advancement?
- Who is engaging in speculation rather than production?
- Who will convert this resource into value that benefits more than themselves?
Instead, the market asks a single question: Who can pay, now?
This was once a tolerable proxy for usefulness when wealth distributions were narrower, speculation slower, and productive investment more dominant. In a financialised economy, it becomes a pathological filter.
Liquidity, not merit, becomes the arbiter of resource allocation. The result is systemic misallocation disguised as “efficient” price discovery.
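The filter can be stated in a few lines of code. The agents, budgets, and "spillover" scores below are invented for illustration; the point is only that an auction ranks claimants by liquidity, while a spillover criterion ranks them by something no price mechanism ever observes.

```python
# Hypothetical claimants competing for one scarce unit (e.g. a memory kit).
# 'budget' is what each can pay now; 'spillover' is an invented score for
# wider social benefit, invisible to the market.
agents = [
    {"name": "speculative fund",   "budget": 900, "spillover": 1},
    {"name": "university lab",     "budget": 400, "spillover": 9},
    {"name": "small manufacturer", "budget": 550, "spillover": 6},
]

# The market's single question: who can pay, now?
by_liquidity = max(agents, key=lambda a: a["budget"])

# An explicitly normative alternative: who generates the most spillover?
by_social_value = max(agents, key=lambda a: a["spillover"])

print(by_liquidity["name"])     # the fund wins the auction
print(by_social_value["name"])  # the lab wins under a spillover criterion
```

The second criterion is not objective; it must be argued for and defended. But that is precisely the essay's claim: the first criterion is not objective either. It merely hides its normative commitment inside the distribution of purchasing power.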
V. Misallocation as the Defining Failure of Contemporary Market Systems
The characteristic economic pathologies of our era share a common structure: they are misallocations, not true shortages.
We see this in multiple domains:
- Hydrogen supply constraints reflect misaligned investment incentives rather than physical scarcity.
- Housing scarcity is produced by planning bottlenecks, land-value hoarding, and regulatory fragmentation—not the lack of bricks.
- GPU scarcity results from hype cycles, hoarding, and speculative accumulation rather than insufficient silicon.
- Energy volatility stems more from geopolitical risk and market concentration than from the geological limits of hydrocarbons.
- Labour shortages stem from decades of wage suppression, skill misclassification, and institutional rigidities.
These failures share a unifying theme: goods exist, but systems fail to match them to socially valuable uses.
The market excels at facilitating exchange under conditions of dispersed ignorance. But it was never designed for optimisation. The 21st century demands optimisation: coordination across systems, not piecemeal transactions.
The market, left to its own devices, is structurally incapable of producing that.
VI. The Epistemic State: Observation, Not Command
The political question of the 20th century revolved around the scope of state intervention. The political question of the 21st century revolves around the resolution and timing of state insight.
The modern state’s most significant emerging capacity is epistemic: the ability to observe, diagnose, and anticipate systemic conditions. It does not require ownership of industry or command over production schedules. It requires situational awareness.
One can envision three epistemic modalities coexisting:
- Markets guess, using price as a crude information aggregator.
- States observe, using accumulated data, institutional memory, and analytic systems.
- AI predicts, using models capable of generalisation, pattern recognition, and counterfactual inference.
For most of history, the first modality dominated by default. Today, the second and third increasingly dominate by necessity.
A high-information state is not inherently authoritarian. Its legitimacy arises from constraint, transparency, and proportionality. Its interventions should be justified not by ideological confidence but by empirical evidence.
This is not a revival of central planning but a transition toward bounded correction: targeted interventions where market misallocation generates predictable harm.
VII. Regulation as an Epistemically Informed Allocative Instrument
To regulate is to allocate. This is not a novel insight, but it is too often ignored.
The following interventions are explicitly allocative:
- energy price caps,
- anti-hoarding restrictions,
- strategic buffering of critical commodities,
- anti-speculation protocols,
- land-use governance,
- capital controls,
- priority access rules for essential goods.
These regulatory tools do not oppose markets. They sustain them by damping volatility, constraining pathological incentives, and ensuring that resources flow to actors who can deploy them productively.
Regulation becomes, in a high-information era, a means of aligning distribution with empirically grounded social priorities.
The refusal to regulate is not “neutral.” It merely entrenches the allocative authority of liquidity.
VIII. Reconstructing a Coherent Theory of Social Value
If price fails as a proxy for value, then value must be defined and operationalised on different normative grounds. This requires a move from the pseudo-objectivity of price to an explicitly articulated account of social purpose.
A polity that wishes to endure must locate value in activities that contribute to long-run collective flourishing. At a minimum, these encompass:
- scientific and technological advancement, particularly where it expands productive capacity and problem-solving potential;
- adequate, secure and affordable housing, as the material substrate for any plausible account of human dignity;
- infrastructural robustness, from transport to digital networks, without which market exchange itself becomes fragile;
- public health systems capable of absorbing shocks and sustaining population-level wellbeing;
- a diversified productive base, resisting overconcentration in speculative financial sectors;
- education and human capital formation, especially where it equips citizens to engage critically with complex systems;
- energy security consistent with ecological limits and intergenerational obligations;
- civic and digital public goods, including open data, secure communications, and trustworthy information infrastructures;
- resilient logistics and supply chains, resistant to both political coercion and climate disruption;
- intergenerational equity, ensuring that today’s consumption does not pre-empt tomorrow’s survival.
None of these can be reliably inferred from willingness-to-pay. They are revealed instead through normative reasoning, empirical analysis, and public deliberation. They require institutions that can encode priorities over time, rather than passively reflecting the distribution of purchasing power at any given moment.
Twentieth-century economics attempted to evade this responsibility by treating value as an emergent property of markets. The result was a theory that abdicated moral judgement while covertly smuggling in normative commitments through the back door of “efficiency.”
Twenty-first-century economics cannot afford this evasion. In an information-saturated world, refusing to define social value is not an act of humility; it is an abdication that leaves allocative authority in the hands of those with the most money and the fastest access to leverage.
IX. Allocation After the Demolition of Ignorance
We now inhabit a paradoxical condition: the world is hyper-observable yet institutionally under-coordinated. Sensors, databases, and models have demolished the ignorance that once justified reliance on blind price signals, but our allocative institutions remain calibrated to a previous epistemic regime.
Information has become abundant; coherence has not.
The task ahead is not to abolish markets or to enthrone data as a new deity. It is to construct an economic architecture capable of reasoning with information rather than being overwhelmed by it. Such an architecture must integrate at least four dimensions:
- Empirical awareness – the capacity to detect real constraints, bottlenecks, and externalities, rather than mistaking price movements for ground truth.
- Moral judgement – the willingness to articulate and defend claims about what ought to be prioritised, whom the system is for, and which harms are intolerable.
- Uncertainty management – institutions that can act under incomplete information without collapsing into paralysis or hubris, including the use of scenario modelling, precautionary principles, and adaptive policies.
- Long-term stewardship – mechanisms that prevent present actors from exhausting resources, capacities, or ecological sinks on which future actors will depend.
Markets, by design, handle none of these explicitly. At best, they gesture toward them indirectly through prices distorted by power, expectation, and fear.
The invisible hand performed tolerably in an era of low-resolution knowledge, when no one could reasonably claim to see the system as a whole. As the resolution of our collective self-observation increases, the limits of this metaphor become stark.
The pertinent question, therefore, is no longer the stale binary of “state versus market,” but a deeper one:
How should an information-saturated society allocate its resources in a manner consistent with reason, justice, and ecological constraint?
We already possess tools—statistical, computational, institutional—that can support more reflective allocation. The deficit is not technical but political and moral: a lack of will to acknowledge that blind markets are inadequate and that judgement cannot be outsourced to price.
X. Beyond the Market’s Pretence of Thought
The market was a serviceable prosthesis for a civilisation that could not see itself. Prices acted as crude but often useful heuristics in a world of pervasive ignorance.
That world is gone.
When intelligence agencies can map global flows of goods, capital and information with high granularity; when AI systems can anticipate demand, identify bottlenecks, and simulate counterfactual trajectories; when firms and states alike can observe in near real time the consequences of shocks and the topology of interdependence—then continued veneration of the price mechanism as the sovereign allocator is no longer defensible as a matter of reason. It persists as ideology.
This does not mean that markets have no role. It does mean that their role must be reconceived.
In a post-Hayekian, information-rich environment, markets are one coordination device among several, not an unquestionable arbiter of value. Their judgements must be subject to contestation, correction, and constraint by institutions that are explicitly charged with reasoning about the common good.
The 21st century demands a political economy that is honest about what states can now know, what markets cannot know, and what both systematically overlook. It demands a recognition that misallocation is not an unfortunate side effect but a preventable harm. It demands that we cease confusing liquidity with merit and price with desert.
If the 20th century was the age in which markets claimed to think for us, the 21st must be the age in which we accept that they cannot—and that the responsibility for judgement, with all its risks and burdens, returns to us.
We are no longer entitled to pretend that the market is thinking. It never was. It was merely groping in the dark on our behalf.
We now have light.
What we lack is the courage to use it.