Politics can seem strange to historians. Governments make so many profound and elementary errors that one would think that the storehouse of previous policies would allow them to avoid some of the more egregious. But no: they blunder on regardless. A.J.P. Taylor once argued that the only thing statesmen learn from the past is how to make novel mistakes. But the situation actually looks worse than his joke implied: they learn, it seems, to make the same mistakes.
The present administration is set on a very large programme of public spending reductions to please 'the markets' – essentially, bondholders. Previous occasions on which this has been Whitehall's aim, for instance at the onset of the Great Depression in 1929–31, do not augur well for the policy. The post-war era of modest inflation and fast growth, on the other hand, which certainly did deliver rapid relief from Britain's debts, is ignored.
The Secretary of State for Health is pressing ahead with a very unpopular and politically risky 'decentralisation' of the National Health Service – one that will actually pile layers of bureaucracy, rules and costs onto general practitioners and hospitals. Previous reorganisations, for instance Keith Joseph's disastrous management-orientated reshaping of the Service in the 1970s, have left chaos and confusion in their wake. It's another lesson that's being ignored. University fees are being ratcheted up, with little thought for the popularity or credibility of British governance as a whole, or for the future of our (in general very successful) universities. By the early 1980s Keith Joseph, as Education Secretary, had learned from a few of his past 'big bang' mistakes, and retreated from just such a proposal before the Thatcher government was dragged into the quagmire. One could go on, but the list would become wearisome.
All of these so-called ‘reforms’ are dominated by one particular and orthodox view of economics, which posits that public spending crowds out private investment; that choice, especially within quasi-markets, must be better than the ‘bureaucracy’ and statism of collective provision; that these mechanisms will introduce competition and efficiency into the public sector; and that money will therefore be saved.
All of these presumptions are, of course, highly questionable, and deficit reduction, health care reorganisation and university marketisation are often going to cost the taxpayer more, not less, in the long run. More theoretically, the market is a human, contingent and above all rule-bound, law-encrusted creation, like any other way of life. It is as open to question, on that basis, as any other human construction. Certainly there is nothing 'natural' or necessarily efficient about its operation.
But if we leave that important caveat to one side, what's also notable about present views of public spending and choice is how ahistorical they are. As I argue in my new book, Governing Post-War Britain: The Paradoxes of Progress, what politicians really lack is a sense of time – or, rather, a critical sense of time passing and change occurring. This has been evident since at least the 'Year Zero' approach of New Labour, and probably since the supposedly permanent administrative revolution of the 1980s: both attempted to defy or deny the Conservatives' previous accommodations with Labour, or Labour's own prior history as a socialist or even social democratic grouping. But it has recently become even more acute under a government which seems to think that it must move swiftly if it is to make any headway at all. Ministers such as Michael Gove (at Education) and David Willetts (Minister for Universities and Science) paradoxically understand the very short span during which they will hold the political initiative, but not the much larger and overarching narratives of British governance – of which they are becoming a part.
What did post-war governance demonstrate to its practitioners? For one thing, it became absolutely clear that the past is critical when thinking about the future. When governments tried to import policy solutions – 'Scandinavian' national wage negotiations, 'French' industrial planning, 'American' plant-level efficiency – they usually failed utterly. Had they possessed more confidence in building on distinctively British examples – the soil in which new ideas were supposed to take root and flourish – they would have been more successful.
The next thing that became clear was the tyranny of unintended consequences. Step by step, many policy initiatives ran into the sand. Defence spending, which was supposed to make the Anglo-American alliance powerful, undermined both countries' currencies; consumer spending ended up buying continental Europeans' goods, leaving those countries holding many billions of dollars and pounds that allowed them to dictate terms within NATO and the EEC. Above all, policymakers learned the importance of the policy environment: not the simple inputs-in and outputs-out of our present managerialist and technocratic reformers, but the conduct of policy as a discretionary art that had to take account of the world around it – temporally as well as geographically.
All developed world governments might profit from a little of this realism at the moment. But there seems little chance of this changing in the short term. Many mainstream economists' models simply do not allow for time. There's a supply line and a demand line on their graphs, and where they meet – not when – determines the price and the quantity of the most 'efficient' goods and services. The fact that both producers and consumers are coming from somewhere – that they have stories and preferences – remains (in the jargon) 'exogenous', outside the models' ken.
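To put that point in miniature, consider the textbook model being described – a minimal sketch, with linear demand and supply chosen purely for illustration (the functional forms and parameters here are assumptions of mine, not drawn from any particular textbook):

\[ Q_d(p) = a - bp \quad \text{(demand)}, \qquad Q_s(p) = c + dp \quad \text{(supply)} \]

\[ Q_d(p^*) = Q_s(p^*) \;\Longrightarrow\; p^* = \frac{a - c}{b + d}, \qquad Q^* = a - b\,p^* \]

No time variable appears anywhere: everything producers and consumers bring with them – their stories, tastes and institutions – is compressed into the fixed parameters a, b, c and d, handed to the model from outside. That, precisely, is what 'exogenous' amounts to.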
The best studies of politicians' and officials' use of academic history reveal their engagement with the discipline to be spotty and inconsistent at best, and downright mendacious at worst. Virginia Berridge of the London School of Hygiene and Tropical Medicine has asked NHS policymakers about this, and what she found was rather unsurprising, if depressing. Academics who knew nothing about modern healthcare history were invited to seminars; PhD students from Oxbridge who sometimes worked as MPs' researchers were asked to help with speechwriting; top officials were aware of policy entrepreneurs' and popular writers' books, which drew on history but presented no new research findings. If we want a reason why politicians constantly reinvent the NHS, doing damage every time before practitioners can rebuild capacity, we could do worse than look to this lack of institutional memory.
But time and the role of inherited practices – those 'icebergs' that make up more than ninety per cent of policy at any one time – are in fact among the most important insights of the new economics of the late twentieth century, which has re-emphasised the importance of inheritances, institutions and language. From the 'butterfly economics' of Paul Ormerod to the post-modern emphasis on language and perception in the work of the economic historian Deirdre McCloskey, a reaction against timeless ways of thinking is in fact well underway.
This should be expected, of course. Even in the 1950s and 1960s, when what was known as the 'neoclassical synthesis' ruled economics – propounding its own particular form of bastardised or 'steam' Keynesianism that in practice meant politicians using government spending to regulate economic activity – an underground counter-revolution singing the praises of older, free-market thinking was well underway in think tanks and universities. It eventually emerged into the public sphere, and was politically successful, in the form of Thatcherism and Reaganomics, slashing state spending and bearing down on inflation through high interest rates.
So today, in a mirror-image of the intellectual revolution of the 1970s, long-standing intellectual trends obvious since the 1990s are only now beginning to make themselves felt at the level of actual policy. Economists such as the Indian development economist Amartya Sen and the liberal American Keynesian Paul Krugman have long argued that temporal and social context should matter far more to economics than it hitherto has. In the first instance, and as Sen has always maintained, that means that equality and inequality are key issues even for traditional measures of economic success such as growth and productivity. For Krugman, the increasing centralisation and concentration of financial firepower since the 1970s helps to explain the scale and sheer speed of our present financial crisis. Economists have, in short, been waking up to the importance of time and history for a good many years.
In so doing, all these authors have brought the past and the future – and an understanding of sequencing – back into social choice and economics. They have, in short, brought history (and History) back in. Marx argued that 'the dead grasp the living': that we cannot escape our pasts. But it would be just as apposite to retort, with D.H. Lawrence, that 'the dead don't die. They look on and help'. A sense of the deep past is a necessary corrective to the idea that our present establishment economics is – always and forever – the economics we must adhere to, and it should alert us to the importance of our own institutions' 'rootedness', the radical specificity of policy and – lastly – our own language. Economics is changing, as the number of Nobel Laureates opposing British deflation and Greece's expert-led descent into chaos demonstrates. What we need to ask ourselves now is: why aren't policy-makers listening?
Dr Glen O’Hara is Reader in the History of Public Policy at Oxford Brookes University. His latest book, Governing Post-War Britain: The Paradoxes of Progress, will be published by Palgrave Macmillan in April of this year. He blogs, in a personal capacity, at PublicPolicyPast.