
Saturday 27 April 2013

From Spreadsheets to Shirtwaists

As you might have heard, one of the key academic papers (Reinhart & Rogoff, 2010) on which the case for austerity stood has been widely debunked after the revelation that its main conclusion (debt to GDP ratios over 90% spell doom) was the product of a simple spreadsheet error. And when I say simple, I mean a failure to include all the rows in the sum total of a column, not a syntax error in a complex array formula. We're talking dipshit simple here. Imagine if the first book to be lodged in the new George W Bush Presidential Library was Pippa Middleton's party planning opus. That sort of simple. The FT made the astute point that if you want to blame software you should point the finger at PowerPoint rather than Excel, on the grounds that this was a classic case of a dubious assertion being seized upon and over-hyped by policy-makers looking to justify a prejudice. There was only one spreadsheet, but the dodgy claim will have been replicated in countless presentations.
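To make the mechanics concrete, here is a minimal sketch of the kind of error involved, using invented growth figures rather than the actual Reinhart-Rogoff data: a summary statistic calculated over a range that silently stops short of the full set of rows.

```python
# Illustration only: made-up numbers, not the Reinhart-Rogoff spreadsheet.
# The error amounts to a formula whose range omits some of the rows it should
# cover, so the headline figure quietly excludes part of the data.
growth_by_country = {
    "Australia": 1.9, "Austria": 1.2, "Belgium": 1.5, "Canada": 2.2,
    "Denmark": 1.8, "Greece": -0.5, "Italy": 0.4, "Japan": 0.7,
}

all_rows = list(growth_by_country.values())
short_range = all_rows[5:]  # the first five rows fall outside the formula's range

full_average = sum(all_rows) / len(all_rows)
short_average = sum(short_range) / len(short_range)

print(f"Average over every row:       {full_average:.2f}%")   # ~1.15%
print(f"Average over the short range: {short_average:.2f}%")  # ~0.20%
```

With these made-up numbers, dropping five rows turns a modest positive average into something close to stagnation, which is the shape of the mistake, if not the actual figures.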

Of course, this epic fail has not led to Austerian politicians the world over slapping their foreheads and saying "My God! How could we have been so stupid!" You might even have spotted a smirk playing around the corner of George Osborne's mouth at the mere thought of this. Since 2010, the increasingly evidence-free campaign against stimulus has simply been resistance to any backsliding on the neoliberal project, with government debt taking on the bogey role played by inflation in years gone by. The macroeconomic policy dichotomy boils down to the preservation and enhancement of existing concentrations of wealth versus the diffusion of wealth as a means to increase aggregate demand. Thus we have QE and bank bailouts, which support property and equity values, rather than "helicopter money" or cuts in VAT. Together with the indulgence of tax-dodging and rent-seeking, this has led to advanced countries starting to resemble the polities of "resource curse" nations. As Steve Randy Waldman puts it, "There's a sense in which we are all Nigerians now", or more fully:

The global economy is succumbing to a technologically-driven resource curse, coalescing into groups of insiders and outsiders and people fighting at the margins not to be left behind. Our governments are transforming themselves from mediators among widely dispersed and interdependent interests to organizations that maintain and police the boundaries between the civilized and the marginal, who put down the insurgencies and manage the pathologies of the latter so that they do not very much impinge upon the lives of the former.

As wealth becomes more concentrated it becomes more rigid. This rigidity is manifested in an increase in privilege, which means both the ability to use wealth to buy status and advantage and the tendency of that advantage to secure more wealth. Though the recent Thatchgasm saw some commentators harp on about her rise from kinda-humble origins, as if she were an example of social mobility, the post-premiership award of a hereditary baronetcy (now passed on to "Sir" Mark) and the improbably generous speaking fees (now passed on to Tony Blair) were a perfect example of privilege in both old and new clothing. But privilege isn't usually so overt or personalised. More insidious is the casual acceptance of prejudice, which is the normative basis of privilege: I deserve it because I'm better than you. While racism and homophobia are in decline, largely because there are increasing numbers of rich non-whites and gays who expect the privilege that their wealth entails, the new "classism" is rampant, from the popular chav trope to the generalised assumption that the disadvantaged have only themselves to blame.

Another instance of prejudice is the belief that developing nations are made up of money-driven drones only too happy to make our commodities and share in the rising tide of global prosperity. An example of this can be seen in some of the commentary around the Bangladesh factory building collapse that killed at least 350 people. Matt Yglesias on Slate.com argued that it was fine for Bangladesh to have weaker health and safety laws than the US as this represented a trade-off against better wages: "in a free society it's good that different people are able to make different choices on the risk–reward spectrum". His assumption that there was a "collective calculus" by Bangladeshis is absurd, and wholly undermined by the reality that poor health and safety is usually the result of a failure to apply the law, often due to corruption or negligence, not the absence of law. The evidence of prejudice in this case is the unwillingness to apply the same logic to the coincidental explosion at the Texas fertiliser plant that killed 14. Clearly, Texans did not make a "collective choice" to ignore risks in favour of rewards. Yglesias even fails to spot the obvious parallel with US history, namely New York's Triangle Shirtwaist Factory fire of 1911.

Other neoliberals have attempted to re-word Yglesias's faux-pas, emphasising that jobs in shitty factories are better than shitty jobs in the countryside, and thus peasants are making a rational choice when they head off to the city. Of course, this ignores a couple of facts that evaporate the figment of choice. The first is that increases in agricultural productivity inevitably lead to fewer agricultural jobs, obliging surplus rural workers to head for the towns. Productivity gains are partly more-from-the-same (higher yields) and partly the-same-from-less (fewer workers). As the amount of land (i.e. base productive capacity) cannot increase, agriculture always reduces employment in the long run. The second fact is historical coercion. The enclosure of vestigial common land prevents agricultural labourers from being self-supporting and therefore choosy. You either accept low rural wages or you move to the city. This process has been going on for centuries - it's a trickle not a flood - though there are peak periods (e.g. Britain in the early 19th century and developing nations over the last 20 years).

The common thread here is prejudice and self-delusion. Bangladeshi factory workers are assumed to be willing participants in globalisation because the alternative thought is unpalatable. Similarly, the revelation that high debt ratios do not necessarily impede growth will not cause a change in policy, because that policy was never about securing growth. It's about protecting privilege.

Tuesday 23 April 2013

Doing it in the Streets

Derek Cianfrance's latest film, The Place Beyond the Pines, is an interesting addition to the canon of American Catholic cinema. A triptych, it concerns the trinity of a father, a son, and the strange ways of the spirit. Ryan Gosling plays Luke, a fairground stunt motorcyclist with dodgy tattoos. He is an absent father and existential loner, whose eventual desire to provide for his son (the J boy) threatens the holy family (he beats up Joseph) and leads to his own death after one too many bank heists. Bradley Cooper plays an older Jesus, rather portentously named Avery Cross. He's a trained lawyer, the son of a judge, who decides to pursue his ministry on the streets as a rookie cop. He kills Luke during the last, botched getaway, perhaps unjustifiably. He is privately gnawed by guilt at leaving a boy fatherless, while publicly celebrated as a hero. He is twice tempted by the Devil (first corruption and then ambition) and succumbs the second time.


In part three, the two men's sons, Jason and AJ (Avery Junior), meet and form a combustible teen friendship centred on drugs and father issues. AJ cannot connect with his dad, who has sublimated his guilt beneath political ambition and keeps the boy at arm's length, while Jason initially knows nothing of his. Their anger and sense of loss lead (somewhat implausibly) to a mutual breakdown of trust and bloody assault. When Jason finally finds out that Avery Senior killed Luke, he threatens him with a gun. Cross shows remorse and asks for forgiveness. The burden of guilt lifted, we see AJ at his father's election victory rally having an epiphany of love for his dear old dad (either that or he's still high). In contrast, Jason buys a second-hand motorbike. The seller asks if he knows how to handle it. Jason says nothing but shows his competence by revving the engine, echoing a scene in part one where Luke has scared the baby Jesus, sorry Jason, by revving his bike. Jason rides off, perhaps to join the circus. The film title is a rough translation of the name of Schenectady, the upstate New York town where the action takes place, though the frequent use of long shots of the verdant surrounding hills, often bathed in late afternoon sunlight, suggests the place may be somewhere else.


Cianfrance is on record as a Catholic boy who lost his faith but remains drawn to holy stories. His previous film, the breakthrough Blue Valentine, starred Ryan Gosling and Michelle Williams as a couple who marry when they find she is pregnant, though both suspect that he isn't the father. The film then charts this holy family's gradual loss of faith. The mixing of the sacred and profane is a classic trope in American Catholic cinema, which is best thought of as films that explore Catholic themes, particularly forgiveness and grace, rather than films that feature nuns. In other words, Mean Streets rather than Sister Act. Coincidentally, Scorsese's classic was on the telly the other week, which perhaps tuned me up for Cianfrance's film. The core of the earlier work is the frustrated attempt of Charlie, played by Harvey Keitel, to steer between church and family while trying to save Robert De Niro's Johnny Boy from his own stupidity.


Another happy coincidence was seeing the current exhibition of the work of George Bellows, Modern American Life, at the Royal Academy. Bellows was a Methodist from Ohio, but he made his name as a New York artist and member of the Ashcan school of painting. His short career (he died at 42) lasted only a couple of decades spanning the First World War. He is most famous for his boxing scenes, notably A Stag at Sharkey's (above), though he was equally accomplished in gritty cityscapes, rhapsodic landscapes and sombre, anxious portraits that show the influence of Goya and Velasquez. Scorsese's Raging Bull shows the influence of his boxing pictures, but it was fascinating to see the extent to which Bellows also employed traditional Catholic motifs, perhaps influenced by the religious street theatre of the Lower East Side as much as old Spanish masters. A painting that owes nothing to the boxing ring is Fisherman's Family, an obvious Holy Family posed by the artist, his wife and child.

 
Bellows' external scenes are usually lit from above with a heavenly light, partly obscured by clouds or falling between intermittent storms, such as in An Island in the Sea. While this is conventional enough for the great outdoors, it's a more deliberate ploy in the bustle and grime of New York where it appears to alight like grace on fallen humanity. In one scene, Excavation at Night, centring on a vast pit being dug for Penn Station, the light from the workmen's brazier down below echoes the light of the streetlamps up above. This fondness for images of the fallen and of blessing is repeated explicitly in Benediction in Georgia, a prison scene where light coming through the barred window is echoed by the prisoners' hooped uniforms, but is also implicit in Out for the Count, where the referee raises two fingers to heaven with one hand while reaching to gently touch the head of the stricken boxer with the other.


Bellows' boxing scenes are intensely physical and meaty, the fighters looking like faceless carcasses in an abattoir, but they're also staged like devotional groups. Dempsey And Firpo (below) is clearly the descent from the cross, while the various ring scenes are centred by light falling on the dynamic turmoil of flesh, surrounded by a sea of obscure faces. His boxers are Christ-like figures. Together with the dramatic dynamism of his drawings, which were published in the magazine The Masses, there is a stylistic influence that can be traced to the later generation of commercial artists who would develop the superhero comic in the 1930s, such as Joe Shuster and Jerry Siegel, also raised in Ohio, the creators of Superman (absent father, world saviour etc). Adding yet more coincidence, the trailers ahead of The Place Beyond the Pines included one for Man of Steel, the latest Superman reboot (rather than a biopic of Stalin).


There is no particular moral here, just interest in the stylistic thread that runs from Bellows, through Shuster and Siegel, to Scorsese and Cianfrance: the Holy Family, the struggling Christ, the benediction of light. If there is a common political or philosophical thread, it is the belief that we are compelled to action in the world. As Charlie says in Mean Streets: "You don’t make up for your sins in church. You do it in the streets. You do it at home. The rest is bullshit and you know it".

Saturday 20 April 2013

A Simple Lack of Education

Michael Gove has been at it again. I don't just mean attacking teachers and claiming we're in a race with the Chinese - that goes without saying - but casually rewriting history. In a speech at a conference organised by The Spectator (whose education section on their website is exclusively - I think that's the right word - devoted to independent schools), the Education Secretary advocated shorter holidays and longer school days. His justification was partly the need to compete with the fiendish Chinese, but also the need to join the modern world: "The structure of the school term and the school day was designed at a time when we had an agricultural economy". Er, not quite.

Elementary education did not become compulsory till 1880, following the 1870 Education Act, and then only for children between 5 and 10. Secondary education only became compulsory, for children up to 14, in 1918. The introduction of compulsory education, and thus the standard school year, was the result of industrialisation and the need for a literate and numerate workforce. It was not designed to suit an agricultural economy. Hop-picking in Kent is a good example of this truth. The seasonal migration of London families to the hop fields was not an ancient tradition but the result of the development of the railways. In other words, this grew to prominence after the 1870 Act; and it's worth noting that itinerant Gypsy and Irish labour continued to provide the bulk of the hop-picking workforce, despite Cockney legend. During the first half of the twentieth century, the overlap of the hop-picking season in September and the start of the school year was a cause of considerable friction between parents and schools due to unauthorised absences. The idea that the school year was in tune with the agricultural year is plain wrong.

So where did the long summer holiday come from? The answer is that it simply followed the template already established by private schools and universities. To this extent, Gove has a point - there is a link with our agrarian past, but it isn't to do with holidays coinciding with sowing or harvesting (that would actually mean long breaks in April and September, not July and August). The real cause was the aristocratic routine of retiring to a country estate during the summer months, so avoiding the urban heat and smells that were thought to spread disease ("miasmas"). The habits of landowners determined the traditional political and social calendar, hence the long summer recess of Parliament, the court-centred social season, and The City hiatus ("sell in May and go away and return again on Leger day"). The likes of Eton and Oxford naturally followed this rhythm, and just as naturally the Victorian tendency for popular institutions to ape superior manners (e.g. public libraries modelled on college epitomes) meant that state schools did so too.

Gove added personal anecdote to his dodgy history: "I remember half-term in October when I was at school in Aberdeen was called the tattie holiday – the period when kids would go to the fields to pick potatoes. It was also at a time when the majority of mums stayed home. That world no longer exists and we can't afford to have an education system that was essentially set in the 19th century." Gove famously won a scholarship to the elite Robert Gordon's school in Aberdeen, where getting your fingernails dirty anywhere other than the rugby pitch would be frowned upon, so his half-term recollection presumably dates to his primary school years in the 70s.

If you've ever done "tattie howking", you'll know that it's no task for a small child, and common sense dictates that unaccompanied 7-year-olds are not an efficient workforce. Though older primary schoolkids were drafted in during the 40s and 50s, by the 70s it was mainly teenagers who would earn cash in the October holidays harvesting spuds and soft fruit. In reality, the nostalgic era of rosy-cheeked kids in wellies was relatively brief and a direct consequence of wartime labour shortages. Before the war, the Scottish potato harvest was predominantly lifted by itinerant Irish labourers - a seasonal migration that dated back to before the Famine (see Patrick MacGill's Children of the Dead End for an early twentieth century record). After the war, Irish labour headed off to the better pay and conditions of construction and road building, which buoyed up the seasonal demand for willing teenagers until greater mechanisation and cheap adult labour came to the fore after the 80s. This October, any remaining fieldwork is likely to be done by itinerant Poles and Lithuanians. The point is that the institution of the "tattie holiday" is a mid-twentieth century invention, not the product of the nineteenth century, let alone a legacy of our agrarian past.

So what's Gove up to? What he's proposing is the deregulation of school hours. In other words, there will be no national standard and individual schools can do as they please, which means undermining collective bargaining on teachers' terms of employment. Ultimately, the unions are being neutralised as opponents of privatised education. As the teaching unions pointed out, British kids already spend more days and hours in school than most other countries. It's also worth noting that independent schools typically have longer holidays than state schools (essentially to accommodate ski trips and get full value out of that second home), but this does not appear to have held back their pupils or led to calls for reform. Naturally, there is no shortage of newly-minted free schools implementing longer hours, though this Gradgrindian preference for quantity over quality is clearly a reflection of status anxiety and the normalisation of the belief among the work-rich and time-poor that schools are providers of value-added child-minding.

Amusingly, a "Whitehall source" supported Gove by claiming: "We can either start working as hard as the Chinese, or we'll all soon be working for the Chinese." The source is obviously Gove himself, but his shyness reflects an inconvenient fact: a Chinese 14-year old spends 793 hours over 175 days a year in school, compared to 925 hours over 190 days for the same age group in England and Wales. And while such xenophobia might play well with his natural constituency of Telegraph and Daily Mail readers struggling with (or aspiring to) private school fees, it obscures the fact that The City is already quite sanguine about working for the Chinese. Selling them elite services, from money-laundering to private education, appears to be our future. Floreat Etona.

Tuesday 16 April 2013

Democracy, Gangnam Style

The New Statesman is celebrating its centenary. Showing that it has lost none of its feel for modern manners, it decided to ask various worthies to vox-pop in short-form on the question: "Did the Left win the twentieth century?" No one actually tweeted an answer, but the results were predictably superficial, ranging from the partisan name-calling of Michael Gove to the fence-sitting of David Miliband (is he still here?). What was strange, in between the claims that the Nazis were leftwing and that the factory has been replaced as an organisational form by Facebook, was the almost complete absence of the word "democracy". The only one who mentioned it, and then only in passing, was the historian-cum-MP Tristram Hunt. Insofar as it is possible to agree what "left" and "right" mean, democracy is a triumph for the former (votes for all) rather than the latter (votes for the few).

The key political change of the 20th century was the introduction of universal adult suffrage. In the UK this only happened in 1928, when the vote was extended to all women over 21 (men got it in 1918). In most European countries universal male suffrage only came in after WW1, and in many it would go into a hiatus within 20 years that would last till the early 1990s. Looked at in historical terms, this fragility clearly reflects the degree of economic development, with the heartland of European democracy being the more advanced North West corner of the continent. As democracy expands, so too does plurality (the idea that multiple bases of power in society are a good thing) and tolerance (the acceptance of limits on majoritarianism). The expansion of democracy has followed industrialisation (not markets), often after decades of conservative resistance, notably to Southern Europe in the 70s and Eastern Europe in the 90s. The steps towards democracy in Turkey and the Maghreb are a continuation of this process.

Representative democracy was the product of the industrial revolution. The upheaval in society led to the concentration of labour (which facilitated collective action), growing demands for leisure and state protection from the vagaries of the market, plus a growing need for an educated workforce and thus buy-in to the social order. In all cases, electoral democracy was preceded by democratic practice among industrial workers. The social pressure for universal (male) suffrage came primarily from organised labour, not liberal reformers. If you've ever wondered why there is little democracy in the Gulf states, consider the absence of independent trades unions (this in turn reflects the deliberate apartheid of the oil industry, dependent on foreign workers and used by rentier conservative elites to maintain tribal society).

Fascism was a reactionary movement that attacked the "degeneracy" of democracy (i.e. giving votes to workers) in favour of an organic concept of nationhood (everyone knows their place and the nation has a destiny). It was doomed to failure because of its need for constant crisis as an organising principle and because industrialisation eventually dissolves borders. Not just in the sense of trade breaking down barriers, but in the pressure for labour and capital mobility. One of the many ironies (or internal contradictions) of Nazi rule was that it led to millions of "racially-inferior" foreign workers being forcibly imported to Germany, along with looted plant and equipment. The postwar European project is one of "constrained democracy", in which conservative elites, scarred by the experience of the 30s and 40s, sought to neutralise the threat of populism, and the economic volatility that could give rise to it, through constitutionalism, federalism and the social market. In the early 90s, the project was upgraded by neoliberals to manage democracy through a shadow state constructed through "post-democratic" and technocratic mechanisms such as the Euro/ECB and EU Commission, and by appeal to "market forces" and other hegemonic phenomena.

The 20th century can be seen as a series of conservative experiments to marry the volatile combination of industrialisation and anti-democracy, from Fascism through various shades of military-backed authoritarianism to the single-party, state-capitalism of East Asia. Only the last of these currently has legs. The post-1979 era in Europe can be interpreted in terms of the subversion and capture of democracy by anti-democratic interests, often inspired by the achievements of the LDP in Japan, Park Chung-hee in South Korea and Lee Kwan Yew in Singapore. The difference between the two is that the East Asian model has sought to control the impact of industrialisation, while the European model has sought to exploit deindustrialisation and the weakening of organised labour. In Eastern Europe, the bureaucratic elites quickly oligopolised both economic and political power, but with more of an accent on security and nationalism than managerial competence (understandably). In the West, globalisation and the neoliberal turn has gradually normalised the idea that much of economic and social life is beyond the scope of democracy, which in turn leads to low expectations of politicians and the lionisation of successful businessmen.

The deindustrialisation of the UK has been mirrored by the erosion of pluralism. As well as the disempowering of local government and the introduction of anti-union laws, the last 30 years have seen the public sector circumscribed to a set of commodities (hence the disrespect shown to teachers who express an opinion on teaching) and public corporations dismissed as "elitist" and "unrepresentative" by unrepresentative elites (e.g. the Murdochs re the BBC). Tolerance has been reduced to a fiscal preference: you can buy a lifestyle (gay marriage), but you can't expect it for free (benefits). Democratic control, eroded by privatisation and self-regulation, has increasingly given way to calls for transparency (the fetish of the camera-phone) and accountability, which can be satisfied by a televised auto-da-fé in front of a parliamentary select committee. The ritual humiliation of a banker is neither systemic reform nor democratic engagement, but it's an effective trick for channelling popular anger that the East Asian regimes have long been wise to.

In a week when we're told how awful everything is in North Korea (not that I doubt it), it's easy to forget that the shining example of South Korea is a state where democracy is recent and fragile, and where the inter-connected elites of big business and the military ultimately call the shots. We shouldn't take democracy so much for granted.

Friday 12 April 2013

The Dear Leader Bestrides the Globe

The bizarre conjunction of Margaret Thatcher and Kim Jong-un this week has been particularly pronounced in the ritualistic performance of foreign policy, as the TV footage segues from The Iron Lady directing a tank in pre-unification Germany to the chubby Dear Leader personally directing missile strikes against a defenceless hill. The Thatchfest has tended to focus on her domestic impact, which is understandable given that the 80s were nothing if not memorable round our way, so I thought it might be interesting to look at her dealings abroad in isolation. I have never knowingly passed up an open goal, so I'll start by noting that isolation was a recurrent motif. While many sincerely believe that "she put the Great back into Great Britain", it would be just as true to say that "she put the Little back into Little Englander".

The opening act in our drama is the Sturm und Drang of the Falklands War. Its long-term geopolitical significance was that it triggered the first step in the rejection of military rule and the restoration of democracy across South America. This was clearly not the chief goal of Thatcher, who famously remained supportive of General Pinochet and largely non-judgemental when it came to military dictatorship. In a British context, there is perhaps some basis to the claim that the war exorcised the demons of Suez, and there is no doubting the significance that the earlier event must have had for someone who entered Parliament in 1959, but the key facts to remember are that the war was, according to the Franks Committee, "unanticipated" and that the running down of military resources in the South Atlantic "may have served to cast doubt on British commitment to the Islands and their defence". That judgement, given in the afterglow of victory, was as generous as could be expected. A less friendly assessment would be that a lack of due care and attention encouraged the junta.

Defeat, or even a negotiated compromise, combined with her domestic unpopularity at the time, would probably have spelt electoral doom for Thatcher. Labour, led by the pro-intervention Michael Foot, would have had a field day criticising her failure to stand up to the fascist junta. In this light, her decision to go to war was based on the calculation that there was, to coin a phrase, no alternative. Real bravery would have been to negotiate a leaseback during the sabre-rattling phase ahead of hostilities, though we cannot know whether Galtieri & co would have settled for a bird-in-the-hand or regarded it as a sign of British weakness and been emboldened. The consequence is that we are saddled with a remote annoyance that can (and will) only be resolved through an eventual transfer of sovereignty. Ironically, the sense of a page being turned after Thatcher's funeral may improve the chance of discussions with Kirchner & co with a view to a leaseback deal. The further irony is that Thatcher herself attempted to negotiate an extension to the leaseback deal for Hong Kong, but failed to persuade the Chinese. She tended to be less successful at jaw-jaw than war-war.

Victory in the Falklands famously depended as much on covert support by the US (and Chile) as it did on British arms. Thatcher's attitude to the US was indulgent to the point of soppy, even after Reagan stiffed her over Grenada (I suspect her vocal opposition was largely face-saving and in deference to the Queen, as she otherwise showed little solidarity with the Caribbean). She was happy for the UK to become Airstrip One in time for 1984, housing Cruise and Pershing missiles, buying Trident, and facilitating the US takeover of Westland Helicopters. Though much was made of her relationship with Reagan, I suspect her view of the US was based more on Churchillian folk-memory and respect for Eisenhower, the president of her formative political years, the architect of cold war policy, and the "father figure" who intervened to stop the squabbling at Suez.

Typical of her background (by which I mean wife of the bigoted Denis as much as daughter of Alderman Roberts), she was instinctively sympathetic to the white remnants of empire, which led to her tacit support for apartheid-era South Africa and the incautious language about Britain being "swamped" by alien cultures. Her lack of interest in the non-white commonwealth, and their estrangement following her opposition to sanctions against South Africa, led to other African nations and India turning in time towards China and the US. While subsequent administrations have attempted to repair the damage, you sense that Thatcher may, for largely the wrong reasons, have got the direction of history right, and that the Commonwealth won't survive the Queen.

For some, Thatcher's chief geopolitical achievement was talent-spotting Gorbachev as a man "we can do business with". The truth is that she was in the right place at the right time and the initiative came from him. He knew that a direct approach to the US to broach disarmament was too risky, as it might have been triumphantly greeted as a sign of weakness, and he could not use Germany or France as interlocutors without raising US suspicions of a covert deal. The UK was the ideal go-between. Thatcher was pragmatic enough to grasp the opportunity, and fully milked the boost in status it provided, but her role did not extend beyond making the introductions. She was soon sidelined as Reagan and Gorbachev went head to head in Geneva and Reykjavik.

Thatcher's attitude to Europe is perhaps best understood as a throwback to 19th century diplomacy and the pursuit of a balance of powers. She actively encouraged fragmentation and subsidiarity. It is well known that she opposed German reunification, but perhaps less well remembered that she supported the independence of Croatia and Slovenia, and thus the breakup of Yugoslavia, or that she was worried about Solidarity destabilising the East-West standoff. Her opposition to "ever closer union" in the EU was consistent with this stance, but bereft of any coherent alternative strategy beyond "no". Ironically, her opposition to German reunification prompted closer Franco-German agreement on the subject, which in turn led to the French securing the Euro as a quid pro quo. Had she cut a deal with Kohl and marginalised Mitterrand ahead of the Maastricht conference in 1992, the design and timing of both the EU and the Euro might have been quite different. In the event, she exacerbated British isolation and bequeathed her party a toxic legacy.

Her stated reason for opposing reunification was that it "would lead to a change to postwar borders, and we cannot allow that because such a development would undermine the stability of the whole international situation and could endanger our security". She clearly struggled to think beyond the worldview of her own formative years in the 1940s and 1950s (she was twenty when WW2 ended). Though she was pro-Europe in the early 70s, this was very much in the mould of Churchill and Macmillan: free trade would bring prosperity and advance liberty, saving us from the Road to Serfdom. It is interesting to note that she called on Macmillan for advice during the early days of the Falklands War, even though he would become a critic of her social policy in the mid-80s. She always exhibited far more deference to the establishment in foreign dealings than in domestic.

She had no particular vision in respect of the UK's international role, beyond keeping things much as they had always been: cleave to the US, promote a Europe of small states and free burghers, and indulge a sentimental nostalgia for our white kith and kin in the old empire. She was a true conservative abroad, despite her support for neoliberal reengineering at home. That her seedy son should have made his money by trading on her name, and now flits from one expat enclave to another, has the flavour of a tale by Graham Greene or Evelyn Waugh. The mother, whose favourite book was apparently Frederick Forsyth's The Day of the Jackal, was more in the tradition of John Buchan and Rider Haggard. I have absolutely no idea what Kim Jong-un's literary preferences are, but I suspect he is more familiar with Godzilla than The Mouse That Roared.

Wednesday 10 April 2013

The Dear Leader

The death of Margaret Thatcher has given us a glimpse of that rarest of creatures, the Great Woman Theory of History. Both friend and foe seem bedazzled by the impact that a single person had on the very warp and weft of life, which perhaps reflects her psychic importance as the first woman prime minister, or perhaps just her longevity in office. Eulogies obviously big up personal impact. You don't get up at a funeral and say "Fred was a nonentity who just went with the flow". Equally, histories emphasise the tides and currents on which our boats bob about at the expense of personal agency (and no doubt to Michael Gove's chagrin). There are pivotal moments where an individual can influence direction, but these are just oscillations in a longer trend. By and large, "management" (of which politics is a particular species) means keeping things running and not screwing up (a la James Crosby). "Making a difference" is a common ambition but a rare achievement. The flip-side of this truth is the cult of personality - the tendency to attribute inventions and great deeds to the powerful. You could have been forgiven this week for thinking Margaret Thatcher was North Korean.

The desire to credit her with near magical powers of influence produced one particularly surreal meme. Yesterday morning, Ian McEwan reminded us that before her coming you had to wait 6 weeks to get a phone extension installed. Stephanie Flanders, who may have copied Ian's homework, rolled out the same emblematic factoid on BBC News last night. Of course, the Leaderene was no more responsible for technological advance than Canute was for the waves. The key change in telephony was the widespread adoption of cordless phones in the mid-90s, which largely did away with the need for extension wiring. Warming to the theme, Philip Hensher imagined a counter-factual world in which Thatcher never came to power: "Perhaps we would be waiting six months for a mobile telephone... Perhaps there would be three TV channels and the requirement for a licence before you could use the internet". North Korea again.

McEwan guilelessly recalled a late-80s literary conference (presumably just before the Wall came down) where Italian writers (presumably Marxists) expressed frustration at the Brits' obsession with the individual: "Take the larger view. Get over her! They had a point, but they had no idea how fascinating she was". I imagine the point that the Italians were making is that society is the product of long-term structural changes and literature reflects the accommodations we make with them. A focus on the individual heroine, bending history to her will, is just romantic wish-fulfilment. In other words, we need more Lampedusa and less Mills and Boon, which seems a reasonable conclusion at a literary conference in Italy, or elsewhere for that matter.

Even the more sober analysts of economic performance succumbed to the Dear Leader hysteria. Jeremy Warner in The Telegraph noted that GDP relative to France began to improve "almost from the moment she came to office, after more than three decades of decline". You can almost smell the magic, not to mention the unwillingness to credit any of this to the spadework of a prior administration. This hagiography came in the form of a critique of Paul Krugman, who asked (and admits he doesn't know the answer) whether the change in the UK's relative economic fortunes, which he sees as arriving in the mid-90s, is really attributable to policy changes 15 years earlier. In contrast, Allister Heath, editor of the City AM fanzine, claimed that blaming Thatcher for the 2008 debacle and its legacy is to "misunderstand history". Our current economic woes are all the work of Gordon Brown: "she can be held responsible neither for the state of today’s manufacturing sector, nor for the financial crisis".

He then promptly undermined his case by claiming: "Today’s ultra-efficient car industry, and its record exports, is a direct product of the Thatcherite revolution". So that's a 30-year time-lag between cause and effect, which ignores the role played from the 80s onward by technology (notably automation and logistics), the quality management revolution, and the opportunities presented by the EEC/EU. Only the last is (partly) attributable to policy, specifically Thatcher's championing of the single market. Contrary to the implied counter-factual, had Labour called and won an election in 1978, and continued in office through most of the 80s (not inconceivable as the SDP split might not have happened), the same changes (broadly) would have occurred in the UK car industry, just as they did in socialist France. There were bigger, global forces at work. The idea that we would have been stuck producing "notoriously poor" goods for the duration is, to coin a phrase, to misunderstand history.

Heath is also selective on financial services, insisting that the Big Bang was unavoidable: "The City's old partnerships didn't stand a chance; they would have been wiped away within a few years by meritocratic, hard-working global competitors with vast balance sheets. Thanks to Big Bang, the new players ended up being based in London, rather than elsewhere, contributing greatly to the Exchequer." This trades on the ridiculous notion, much beloved of Boris Johnson, that bankers have no reason to be based here beyond the love and appreciation of the locals. The plain fact is that deregulation allowed foreign banks and brokerages to buy up The City. They were always intent on moving in because of London's structural advantages in the money trade (notably Eurodollars and bonds). The abolition of the old restraints was not an enticement, like the introduction of a happy hour, so much as levering the pub door off its hinges. The Big Bang didn't defend UK interests, it merely allowed UK players to dip their snouts in a larger global trough. Working for Charles Schwab rather than Cazenove, or UBS rather than Hambros, was an incidental detail.

Like many others this week, Heath also tried to scare us with tales of how the 70s were scarred by industrial action: "the workforce used as pawns by militant union leaders who would call strikes at every opportunity" (the man himself was 2 years old when the decade ended). It's funny how the trope of the "undemocratic trades unions" is accompanied in my mind's eye by the image of workers in a car park voting on strike action (we only had 3 TV channels and it was wall-to-wall news). In other words, practising the democracy that was conspicuously absent within the workplace. It is difficult to convince people today that strikes were nowhere near as prevalent as the myth has it, and that days lost to strike action didn't fall to a historically low level till the 90s, largely as a result of the changed composition of the economy. It was globalisation and deindustrialisation that primarily undermined organised labour, not anti-union law or police truncheons. The pitched battles of Wapping (inevitable technological change) and Orgreave (simple revenge for 1974) were exceptions, not the rule.

What I find interesting about the "militant unions" trope is the determination to maintain the bogey to the point of absurdity. Trying to paint Frances O'Grady as Arthur Scargill in drag is just silly. The visceral hatred of organised labour remains a significant dividing line on the right, between the small capitalists and Hayekians on the one side and the more pragmatic big capitalists and neoliberals on the other. I should point out that the "right" in this sense includes much of Labour. Talking of which ...

Another popular meme is the claim, made by Mrs T herself, that her greatest achievement was Tony Blair and New Labour. In fact, both she and he were common products of a wider tendency, not one the product of the other, and there were crucial differences between them. Thatcher was instinctively a champion of small capital who also benefited finance capital, though she showed no sign that she fully understood what she unleashed with financial deregulation. During the 80s, big capital was ambivalent, with the dwindling of institutional government support for industry offset by lower corporation tax and increasing privatisation. The growing rift over the EU pushed big capital towards New Labour in the 90s. Finance capital, now dependent on global flows and the need for domestic regulatory shelter, found a willing ally in New Labour and adopted an ambivalent attitude in turn. Thus the Tories were forced into a small capital laager ("moving to the right") on issues such as Europe, business regulation and immigration. Cameron attempted to obscure this with a socially liberal makeover, which predictably failed. The legacy of Thatcher is the persistent strength of this small capital core, now driving the demonisation of welfare and Europe.

The fact of New Labour's continuity is taken as evidence that Thatcher "made the weather" and "shifted the centre ground". The judgement of history will surely be that most of the socio-economic changes that came to pass in the 80s would have occurred regardless of who was PM. This is not to say she made no difference, merely that the difference is over-stated. Whether Michael Foot or Michael Heseltine, we'd still have ended up with mobile phones, the Web, Acid House, the Premier League, Channel 4 and easy credit by the 90s. We would not have been able to opt out from deindustrialisation and globalisation, any more than we could have ignored advances in technology.

It is reasonable to ask what a different leader might have exacerbated or mitigated; what a different oscillation might have looked like. Perhaps we'd still have built council houses and so avoided a property bubble. Perhaps the transition for mining communities (which was inevitable and not unwelcome - most miners hated the job) would have been handled better. Perhaps a larger manufacturing base might have been maintained and retooled. I doubt The City would have developed in any other way than it did, though the absence of a property bubble might have limited the damage of 2008, and the historic shift to a service economy was inevitable to some degree. Perhaps income inequalities and regional imbalances would not have been so stark, and perhaps the SNP would never have gained traction in Scotland (surely no one else would have been mad enough to introduce the Poll Tax). Peace in Northern Ireland would probably have come earlier, but I doubt we'd have joined the Euro. The counter-factual nostalgia of many commentaries this week is that we'd have ended up in much the same place but with less grief, less hatred, and less ugliness in our society. A bit more German, in other words. That thought alone would probably convince the old bat she was right all along.

Sunday 7 April 2013

Contributing to the Problem

The current debate on "welfare" (the Mick Philpott Show) is being widely described as a dividing line between the parties ahead of what is likely to be a vicious general election campaign in 2015. For the Tories this is a return to the "are you thinking what we're thinking" snidery of 2005, while for Labour there is a determination to resurrect the contributory principle, if only to avoid the charge of being soft on the "something for nothing" culture (TM Tony Blair). Liam Byrne's interpretation of reciprocity owes little to socialism ("From each according to his abilities, to each according to his needs") and a lot to the liberalism of Lloyd George and Beveridge ("Benefit in return for contributions, rather than free allowances from the State, is what the people of Britain desire").

It would be easy (and futile) to point out that popular belief on benefits is wildly out of whack with the facts, or that the political debate is driven more by the unscrupulous deployment of myths than by evidence. What's more difficult to explain is why the popular mood seems to have turned against benefit recipients. Some of this is just misreporting to suit an agenda (the combination of a YouGov poll and a Tory-leaning newspaper tends to produce a predictable result). But though the death of solidarity is exaggerated, there does appear to have been an increase in suspicion around false claims and undeserving cases that cannot be wholly explained by media tub-thumping. I suspect some of this is actually a product of the growth in low-wage jobs. If you are working long hours for shit wages, you'll probably become more resentful of imagined skivers. Also, the visible increase in immigration around the millennium has probably fed the belief that "benefit tourists" are real, despite the evidence that immigrants are less likely to claim benefits than natives. Social change that we feel is not in our control leads to anxiety.

One of the more depressing changes in attitude concerns disability. Even well-meaning lefties have played a part in this, subscribing to the widely-held belief that the Thatcher administration deliberately shifted the long-term unemployed in deindustrialised regions onto incapacity benefit in order to shrink the unemployment figures. While there is truth in this, it masks an underlying secular trend that would have pushed up the numbers anyway. The number of people on incapacity benefits rose steadily during the 1980s and early 90s but did not show a correlative decline when unemployment dropped in the late 80s or an uptick during the recession of the early 90s. However, the rate of growth slowed during the late 90s and then started a gradual decline from 2000. How so?

While the acceleration in the 80s was probably amplified by government policy, the underlying trend appears to reflect two other factors. The first is the higher rate of work-induced illness during the postwar years. Industry wasn't any more dangerous than before, but the advances in medical science meant that we woke up to the scale of the damage, hence the introduction of Invalidity Benefit in 1971. I remember as a kid in the 70s watching programmes on TV about the growing understanding of pneumoconiosis and asbestosis, though chronic back pain was probably a bigger problem in terms of numbers affected. The second secular trend is an ageing population. This meant that the cohort in the 1990s who had worked in heavy industry, become unemployed in the 80s, and had now developed chronic symptoms, was disproportionately large due to the postwar baby boom. If you started work in a colliery or steel mill in the 1960s, and lost your job by 1990, there's a fair chance you were a candidate for a couple of decades of incapacity benefit ahead of your state pension.

It's hard for people today to imagine just how physically damaging work in heavy industry was. We forget (if we ever knew) just how easily your spine could be knackered by spending your working life in a 5 foot high coal seam or lifting heavy metal. One of the results of a greater focus on health and safety, and the shift to working in offices and retail, is that many fewer workers are now exposed to such dangerous environments. This has led to a corresponding decline in empathy as we assume that time off work due to back pain is just some lard-arse moaning about their chair upholstery.

The really long-term trend, encompassing the last 100 years, has been the evolution of benefits from a focus on frictional (i.e. temporary) unemployment assistance and short-term protection against "slumps", to a focus on long-term need, such as disability, child benefit and (more recently) in-work benefits. Structural unemployment has exacerbated this. Pensions have been there from the beginning, but have grown from a small consideration for the "lucky few" who didn't die before 70 to the largest item in the benefits bill. This is the truly chronic aspect of welfare. Though the total we spend on benefits (as a percentage of GDP) is gradually declining (if you exclude the current recession-induced bump), and the amount we spend is modest in comparison to other countries (contrary to the propaganda), there has been a significant shift in the composition of benefits since 1980 towards the elderly and the working poor. The unemployed and the disabled are a distraction from the real change. We can't do much about the ageing of society, and too little is being done about the persistence of low-paid jobs, so this will just get worse.

The shift towards chronic benefits means that Labour's attempt to resurrect the contributory principle, the idea of social security as a form of episodic insurance where you get out what you put in, is doomed to failure, and not just because of the popular misunderstanding of how insurance works. The point is that the world of Beveridge has gone and cannot be recaptured. Episodic unemployment and short retirements have been replaced by chronic need. The "crisis" is not that benefits are unsustainable or counterproductive, but that they have become a necessary prop for the modern economy. Imagine what would happen if all in-work benefits (working tax credits, child benefit, housing benefit etc) were abolished tomorrow. Society would implode. This is chiefly a function of low wages rather than too many OAPs.

The "something for nothing" meme is misleading as it assumes an unequal exchange. In reality, we rarely give money away (unclaimed benefits vastly outstrip fraudulent claims). Even allowing for HMRC's puzzling indulgence of tax dodgers, the state is generally not a soft touch. There is usually a return, so perhaps we should ask what the quid pro quo for benefits actually is. It isn't a return to work, or any other "make-work" contribution to society, like tidying up parks or over-painting graffiti, it is in fact the absence of a negative: the desperate poor indulging in antisocial activity such as crime, self-destruction and prostitution. The welfare state is the price we pay to avoid the levels of viciousness that debilitate societies that lack either public or private provision. The disruptive force of capitalism on private support (the informal networks of family and established community), and the success of neoliberalism in neutralising defence mechanisms like organised labour, means that advanced societies have become ever more reliant on the welfare state to prevent the Hobbesian war of all against all. The Jeremy Kyle Show, where Mick Philpott first came to fame, is just televisual Tamezepam.

Where benefits become a significant and regular part of workers' income, they provide a lever for government coercion independent of the employer. Crap wages and in-work benefits extend the power of the state, so it should be no surprise to hear the agents of the state start to make behavioural demands on the working poor. It's hard to resist the heaven-sent opportunity. The Philpott ménage was clearly exceptional, but in one respect it was actually quite typical. Both women worked, but both were dependent on in-work benefits as well as child benefit. If there is a root cause for the growth of a "dependency culture", it is low wages. A contributory principle for benefits does nothing to address this, while sanctions to force the unemployed into low-wage jobs just exacerbate it. Liam Byrne is part of the problem.

Friday 5 April 2013

Antisocial Media

John Cassidy in The New Yorker asks "What happened to the Internet productivity miracle?" This combines two worries of the modern age: techno-pessimism and declining productivity. In a nutshell, US productivity between 1973 and 1995 averaged 1.5%, between 1996 and 2000 it rose to 2.75%, between 2001 and 2004 it peaked at 3.5%, and then from 2005 to 2012 it dropped back to 1.5% (the UK followed a similar pattern). This last stretch was obviously affected by the post-2008 recession, which has pushed annual productivity growth below 1% in recent years, but the underlying trend is clear. Productivity rose quickly around the millennium and then slowed down. Cassidy notes that the inflexion point appears to have come shortly after the infamous O'Reilly Media Web 2.0 conference in 2004, but sees this as a paradox rather than a big fat clue.
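For a sense of what those averages imply in level terms, here is a rough sketch that compounds the quoted growth rates into a productivity index (the period boundaries and the 1973 = 100 base are my own simplifying assumptions, not official data).

```python
# Back-of-the-envelope: compound the average annual rates quoted above into a
# productivity level index (1973 = 100). Period boundaries are approximate.
periods = [
    (1973, 1995, 0.015),   # ~1.5% a year
    (1996, 2000, 0.0275),  # ~2.75% a year
    (2001, 2004, 0.035),   # ~3.5% a year
    (2005, 2012, 0.015),   # back to ~1.5% a year
]

index = 100.0
for start, end, rate in periods:
    years = end - start + 1
    index *= (1 + rate) ** years
    print(f"{start}-{end}: {rate:.2%} a year over {years} years -> index {index:.0f}")
```

The point of the exercise is simply that the level keeps rising throughout; what changed in the mid-00s was the slope.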

In trying to explain this slowdown, Cassidy eschews any exclusively economic or social explanation and points us in the direction of Robert Gordon's "headwinds" paper, with its conclusion that recent technological advances are not a patch on previous industrial revolutions in terms of productivity. He quotes Gordon's observation that innovations like the iPod "provided new opportunities for consumption on the job and in leisure hours rather than a continuation of the historical tradition of replacing human labor with machines". As a counterweight, Cassidy quotes Kevin Kelly, founding editor of Wired and author of the hive mind meme: "Gordon missed the impact from the real inventions of this revolution: big data, ubiquitous mobile, quantified self, cheap AI, and personal work robots. All of these were far more consequential than stand alone computation, and yet all of them were embryonic and visible when he wrote his paper. He was looking backwards instead of forward" (I can't help thinking this new world sounds like the Culture novels of the soon-to-be-missed Iain Banks).

Kelly is a well-known Silicon Valley booster who considers the new golden age to have started with "the dawn of the commercial Internet" (i.e. the Web, circa 1993). His negativity about IT before the early 90s is almost as withering as Gordon's assessment: "Standalone personal computers hardly changed our lives at all. They sped up typing, altered publishing, and changed spreadsheet modeling forever, but these were minor blips in the economy and well-being of most people. Big mainframe computers helped the largest corporations manage financial assets or logistics, but a number of studies have shown that they did not elevate much growth". Gordon thinks low productivity growth in recent years means that IT wasn't so transformative after all, while Kelly thinks the real transformation is just around the corner. Go, robots!

Gordon and Kelly both assume that innovation should lead to the increased automation of labour, which would result in higher per capita productivity. Kelly, like most utopians, is vague on the socio-political implications of his vision. He says: "The real revolution erupts when everyone has personal workbots... Everyone will have access to a personal robot, but simply owning one will not guarantee success". Indeed. He doesn't explain how we will all come by these robots, or whether there will be rationing. Will it be considered a human right to have one, or will we deny them to the unemployed? Robots are units of capital. They don't come free and they will not be owned by everyone. Most people will be subject to robots, rather than their masters, and I don't mean that in a scary sci-fi way. Robots are just a metaphor for the substitution of labour by technology. When you swipe your Oyster card on the Tube, or scan your groceries in the supermarket, you are interacting with a robot that displaced a worker.

Kelly is wrong to dismiss the pre-networked era of IT. While globalisation, and its concomitant de-industrialisation in the West, was crucially dependent on containerisation, it only really took off when improvements in logistics management (those mainframes) and corporate communication (pre-SMTP email) allowed for more sophisticated supply chains and just-in-time inventories. It's also worth remembering that the rise of financial services, and in particular global capital flows, was facilitated as much by Lotus 123 on PCs in the 1980s as by Monte Carlo simulations on mainframes. The impact of spreadsheet modelling on the growth of exotic financial instruments was no "minor blip". Productivity growth is obviously a lagging indicator of the investment and retooling that produces operational efficiency. The boost to productivity in the late 90s coincided with the dotcom boom, but it was actually the product of the maturing technologies of the 80s, such as LANs, CAD/CAM, 3D modelling, corporate Unix, email gateways, MS Office and (perhaps most importantly) the rapid growth in comms bandwidth that would in turn facilitate the explosion of the Internet.

Gordon and Cassidy are wrong to see the recent stagnation of productivity growth as evidence of the IT revolution running out of steam, as if there is only ever one active force at work. They fail to think about it in social and historical terms and consider the interplay of contending forces. The first industrial revolution (steam, cotton and railways), between 1750 and 1830 in Gordon's scheme, produced enormous social dislocation in the UK up until the late nineteenth century. The spontaneous social response, the "double movement" of Karl Polanyi's phrase, mitigated this piecemeal until by the end of the century the UK, and other advanced industrial nations, had a patchwork of supportive and regulatory institutions from factory acts through basic education to rudimentary social insurance. The second technological revolution (electricity and the internal combustion engine), from 1870 to 1900, produced the social pressures that were eventually mitigated by the growth of organised labour and the development of the welfare state. My theory (which is all mine) is that the third revolution (IT and comms) has stimulated another counter-movement, but one that took a privatised form rather than a socialised one, in keeping with the ideological tide from the late 70s onwards.

The chief feature of this has been the offsetting of whitecollar productivity gains through a combination of time-wasting and the creation of supernumerary roles. This is where Gordon's insight about triviality is sound. SMTP email and the Web simultaneously boosted and reduced productivity by blurring the boundary between the workplace and the wider world. Initially there was more gain than loss, so this did contribute to higher productivity growth in the late 90s and early 00s. Whereas most businesses gained something from email and the Web, few saw a return from social media (a perfect conspiracy of the idiot and the fraud), hence productivity turned down in the mid-00s. Though the Web 2.0 bubble has now flatulently deflated, most companies have become accustomed to a working environment where staff routinely interact with technologies that provide access to non-work activities. Wearing earphones at your desk or tweeting while in a meeting barely merits comment any more. Many people think our culture has become more selfish and atomised since the 1980s, but it might be more accurate to say it has become more self-absorbed and distracted.

We have been able to sabotage the incursion of robots into the modern office (a PC is a stationary robot) because capital has been able to secure profit growth by automating and offshoring bluecollar jobs. Globalisation and de-industrialisation have produced sufficient wealth to both enrich capitalists and provide a comfortable lifestyle for the key whitecollar electoral bloc, at least for a while. Of course, the inexorable logic of capitalism means that this fragile alliance will eventually crumble. The growing cohorts of educated youth in developing nations, and the continuing impact of technology in mitigating geography, mean that those high-tech and creative jobs that we spent the last twenty years cultivating will in turn start to be offshored. As global GDP grows, we will gradually lose our disproportionate share of high-paying jobs. The system will tend towards equilibrium.

The results are already visible, from increasing inequality to middle-class bleating about unemployed graduates. The problem is that the decision to opt for a privatised strategy of social change management in the 80s, rather than a socialised one, has left us ill-equipped to resist the next turn as well-paid jobs (and home ownership and further education) become a perquisite restricted to a social elite. Those worried by this change direct their anxiety outwards at the usual scapegoats, the visible underclass - i.e. the people they fear becoming. Instead of solidarity we prefer to beggar our neighbours. Office-based skivers, browsing Mail Online, demand that benefits be cut from all families with more than two kids, as if victimising children who are not responsible for their circumstances would be a just punishment for the imagined moral failings of adults. Meanwhile, Cameron and Osborne stoke the ugly mood by talking of welfare as "subsidising lifestyles", at a time when the top rate of tax is being cut by 5 pence.

It seems fitting that in a week scarred by triviality and fear, the BBC released their Great British Class Calculator, which allows us to see where we fit in the emerging socio-economic hierarchy. I suspect this has been responsible for a temporary 0.1% drop in office worker productivity. As its Great British Bollocks branding indicates, this is more a marketing device than a genuine sociological tool. The supposed class identifiers are mainly consumption preferences, though the chief factor in positioning you on the "scale" is plain old wealth and income. If you select nothing, the default result is "traditional working class" - i.e. poor and not a user of social media.

The good news for John Cassidy is that once this current recession has passed (which it will), productivity growth will probably move up towards 2 or 3% again. The bad news for most of us is that this will be the result of three forces, none of which bode well for employment: company churn as "zombies" are replaced by new entrants, the unwinding of labour hoarding, and a fresh round of automation that will predominantly affect middle-tier jobs (i.e. job polarisation). Furiously tweeting how much you hate benefit cheats from your office desk won't save you.

Monday 1 April 2013

Jobs for the Boys

The news that David Miliband has resigned his non-exec position at Sunderland FC, in protest at Paolo Di Canio's Fascist sympathies, proves that you can take the boy out of politics but you can't take politics out of the boy. Given last week's announcement that he was off to New York to pimp for the Tracy Gang, one assumes he was planning to desert the Black Cats anyway, a gig tied to his role as a local MP rather than any footy affiliation (he's actually an Arsenal fan and will now be able to hang out with Piers Morgan in an upscale NYC sports bar on match days - lucky fella). Citing the stroppy Italian looks like opportunism intended to deflect attention from what is essentially further collateral damage of his sulk. Unsurprisingly, the people of South Shields seem reluctant to have another "princeling" parachuted in.

It was amusing to read the panegyrics on the elder Miliband last week as they strained to identify some significance, some cause that has lost its leader, in his curtailed political career. With Bonnie Prince Charlie there was no mistaking the cause, either at the time or subsequently, but this latter day prince over the water represents little beyond the busted flush of Blairism, with David as the Young Pretender to Tony's Old Pretender. The carefully calibrated speeches given since he lost the leadership election are now held up as evidence of his agonising dilemma, unable to define a distinct position without causing unintended grief for himself or his brother. In fact they just revealed his intellectual bankruptcy, unable to progress from the "tough but compassionate" neoliberal bollocks of old. Like Charles Edward Stuart, history passed him by some time ago.

One of the most rueful reviews came from Martin Kettle, who was particularly bothered by the topsy-turvy world of modern politics in which talented youth are hothoused at Westminster before being cast upon the political scrapheap with promise unfulfilled: "In an ageing society, politicians start and finish younger than ever, with little experience other than politics in-between. This is crazy on several counts. It's all very well saying that they leave in time to fully embrace fulfilling new postpolitical lives. But there is something very wrong with a political system in which you learn your wisdom after being in charge of government rather than the other way around". As usual with Mr Kettle, this ignores the pertinent fact that young politicians with negligible experience are nothing new, and that the age profile of MPs has barely changed over time.

As the academic Philip Cowley has noted, "In 1964, the median age of Conservative MPs was 45. In 2010, it was 47. The median age of Labour MPs in 1964 was 52; in 2010 it was 52". Young MPs are not so rare - simple maths tells you that with plenty in their 60s and 70s and an average near 50, there must be a decent number in their 30s and 40s. In the past, many became MPs in "khaki elections", riding a tide of sympathy after war service, like Winston Churchill in 1900 (26 years old) and Oswald Mosley in 1918 (22), even though this meant not only narrow experience but experience particularly unsuited to the needs of peacetime. In the modern era, a dab of public service is less likely to mean a stint in the Colonial Office or the armed services (though examples like Rory Stewart show that it still happens), and more likely a short career alternating between time spent as a political adviser and a thinktank wallah.

The dominance of the career politician has been a regular trope in modern commentary, though you do suspect this owes as much to regularly watching The Thick of It as to engaging with reality. The true significance is not the rise of the SPADs but the decline of political participation and local party autonomy, leading to an increased reliance on party headquarters for both direction and candidates. In the case of Labour, the gradual loss of power by trades unionists in the constituencies since the 1980s, and the introduction of all-women shortlists (which address one problem but encourage another, i.e. parachuting-in), has exacerbated this. Among Tories, preferential treatment is overt and blatantly tied to money in many cases, but twas ever thus.

The SPAD meme has perversely burnished the reputation of any politician who managed to hold down a job between gap year and election. There was a good example of this last summer when John Harris had a small epiphany: "Watching Jeremy Hunt's day at the Leveson inquiry, one thought hit me like a hammer: that he looked like the perfect modern politician, and for all the wrong reasons. He seemed shaky, inexperienced and regularly out of his depth ... Hunt's backstory, involving time in Japan and a successful education business, might seem to set him apart, but he looks and sounds like a risen-without-trace politician straight from central casting".

If Harris had thoroughly checked the facts, or just run his copy through his own internal bullshit-detector, he might have been less puzzled. Hunt, an emblematic politician for our times, is the son of an admiral. After Charterhouse and Oxford PPE, he went straight into management consultancy (i.e. selling theory rather than experience), then did a TEFL stint in Japan (you need to be well-off to afford to do this). His main commercial achievement was becoming a partner (via a mate) in an IT PR firm in the 90s (not exactly a tough market to sell into), while his "education business" was Hotcourses, a search listing for training courses (a derivative of an online job board - I doubt he masterminded the technical design). This is the profile of a serial entrepreneur - i.e. someone without any particular talent but with good connections. In Russia, he would be a "biznessman".

The revelation that Hunt equated Rupert Murdoch's corporate interests with the national good, and seemed to respond to flattery like a particularly dim ingenue, should hardly come as a surprise given his career to date. His current job, tut-tutting over the frightful NHS, is clearly being approached as just another management consultancy assignment in which outsourcing is the answer, irrespective of the question. For David Miliband, the NGO berth allows him to believe that he will be doing something socially useful on the World stage, as he glad-hands his many friends at the US State Department and chides EU leaders to be more internationalist. No doubt he will also write the occasional article directed at the UK, subtly reminding us of his talents and possible future availability should we need a saviour. The self-regard of these people is staggering. Personally I'd rather have Martin O'Neill as Secretary of State for Health (he has an air of dour probity) and Piers Morgan doing penance by running International Rescue, preferably from Thunderbird 5.