Wednesday, 20 August 2014

Crime and Project Management

Project management and politics are once more in the news. This is partly due to it being August and a slow news week, hence the appearance of thinly-disguised listicles and boilerplate about why government projects tend to fail. Plus ça change. Ahead of the 2010 general election, and on the back of another list of government IT failures, David Cameron "signalled a move away from big IT projects, suggesting he will use technology to increase the transparency of government". In light of the NSA and GCHQ revelations, as much as the Universal Credit debacle, this is ironic.

IT projects regularly fail in both the private and public sectors. The supposed higher rate of failure in the latter is attributed to the extra complication of regime change (at both government and ministerial level) and the conflicting interests of politicians and civil servants (the Yes Minister trope). In fact, there is no good evidence that there is a higher rate of failure in the public sector. What is known is that the private sector is better at obscuring failure and redefining success. Conversely, public sector projects are (with a few exceptions) publicly audited and their success or failure is often a matter of political judgement, so there is usually an opposition willing to challenge the government's verdict. The Obamacare rollout, which for a while looked like the epitome of big government project failure, has dropped out of the headlines in the US and lost its political charge due to its gradual success, as could have been predicted.


The main reasons why IT projects fail are also well known, because they are common to most projects and have nothing to do with technology. It's mostly about people, and a little bit about process: a lack of genuine commitment to change, poor leadership, not involving the right people, poor supplier management, a failure to meaningfully define success, a failure to know when to stop or change course. The mitigations and contingencies for these weaknesses are also well known, and widely ignored. The root problem is that many projects are faith-based. The earliest records of project management we have are religious works. For example, the Old Testament is one project after another: build an ark, find the promised land, build a temple.

Project management is an attempt to provide a rational framework for goals that are too often inspired and driven by emotion. It is no accident that words such as "mission" and "vision" feature prominently in the language. The problem that public sector projects face is not that government is incompetent, but that these projects tend to be more emotionally charged, both in terms of their impacts and ambition. Replacing payroll system A with payroll system B may lead to tears and tantrums in one company, but this is trivial in comparison with the change to a national benefits system that will supposedly transform shirkers into strivers.

Many government IT projects are initiated for gestural reasons. This is not just about ministers being seen to be doing something, but about articulating a political vision, a practice as old as the pyramids. This truth is usually ignored by the "experts". For example, suggesting that UC could be better implemented using an incremental approach, avoiding a risky "big bang" and allowing for gradual adaptation, makes perfect sense technically but ignores the political importance of the scheme, which goes well beyond "making work pay". The commonly-understood goal is to cut the welfare bill (officially through greater efficiency and reduced fraud, unofficially through increased harrying of claimants). The gestural purpose is to introduce the idea of a citizens' basic income. It is the all-or-nothing approach, and IDS's associated zealotry, that constitute the symbolic message to the electorate: everyone must work, the poor must be subsidised (due to the incontestable needs of the market), and complexity leads to fraud. A low-key, gradual improvement in process efficiency is politically irrelevant.


This emotivism is a double-edged sword. Just as politicians will happily lead a project to failure so long as it continues to reflect the desired policy stance, so public opinion (or at least the media) may deem a project a failure because of unreasonable expectations. The e-Borders system has "failed" because it is popularly assumed to be a tool for "controlling immigration", which is not something that can be effectively done at passport control (and has arguably been a non-issue for some time now). As a policing tool - i.e. intercepting criminal suspects - it appeared to work fine. The downgrade to the "Go home or face arrest" vans last year was an equally pointless gesture aimed squarely at xenophobes attracted by UKIP and a demanding press, not at people who had overstayed their visas.

Government also has a role as an advocate of technology and (increasingly) the "power of information" to both business and the wider population, hence the e-government frenzy that started in the late 90s. This partly explains its reluctance to consider non- or low-tech solutions for public services (e.g. the insistence that job seekers must do online job searches, despite the well-known problem of fake jobs on job boards) and also partly explains the attractiveness of government as a client for large solution providers. But the higher risk of perceived failure (or at least the higher profile given to failures in the media as evidence of ministerial incompetence, big government ineptitude and supplier abuse) makes this a costly strategy.

Clearly then, the subjective rewards (or "soft benefits" in project management-speak) must be of considerable value to politicians. I think there are three worth noting. First is decisiveness. Launching a project (or any sort of initiative) is the end-game for many politicians who fear their ministerial tenure may be short. Unless you can arrive late and take credit for a successful implementation (e.g. Boris Johnson and the London Bike Scheme), few want or expect to be around at project end (IDS may have had a stay of execution till 2015, but he is likely to quit the scene long before UC exits the pilot stage, if it ever does).

Another attraction is that government IT projects encapsulate the neoliberal idea of managerialism: the belief that for every problem there is a solution that is amenable to generic management skills. The project management process itself encapsulates the tropes of measurement and monitoring, with reality easily giving way to the fiction of targets and progress (the "reset" of Universal Credit and IDS's insistence that it is "on target" are examples of the lunacy this gives rise to). Similarly, e-Borders might appear a colossal waste of money, but it serves to further normalise the idea that society should be constantly surveilled and that we should be able to accurately quantify classes of citizen or resident. Even the growing role of Parliamentary committees acting as project auditors, rather than just spending watchdogs, reinforces the neoliberal trope of constant inspection.


The third benefit is the normalisation of failure, which is central to capitalism. One of the mantras of Silicon Valley, which was adopted from agile software development, is the Beckettian "fail fast, fail better". In reality, this is often little more than cant in commercial organisations, where the empirical method remains alien, while most media evangelists turn out to be self-serving "serial entrepreneurs". Ironically, the history of "the entrepreneurial state" encourages the belief that failure is intrinsic to government projects, albeit in an ultimately beneficial way. You can hear this in the words of Tony Hall, the Director General of the BBC (lumped in with government when it comes to project management) when he talks of the corporation's role: "We must be the risk capital for the UK. We have got to be the people who have enough confidence to be able to say that we are going to back things that may not work".

Government is unlikely to wean itself off gestural politics so long as it depends on elections, so a certain amount of project failure is an unavoidable cost of democracy. The state's role as the risk-taker and experimenter of last resort also makes it unlikely that it will ever become (or be seen to be) a better project manager than a private sector with selective memory. Finally, managerialism is bigger than neoliberalism and central to all flavours of government. Even the "night-watchman" state of right-libertarian fantasy (which is by definition a police state) will waste money on projects equipping the military and building courthouses.

This intractable reality leads the critics of government project mismanagement down the blind alley of personal responsibility: "we should guarantee that the ministers, senior civil servants and corporate CEOs involved will all be publicly sacked if the project fails". This is close to Stalin's approach to project management (JFDI) and project failure (the gulags), but as we have seen with the banks, pinning blame in complex organisations for activities that spanned many years is difficult, and project failures rarely produce a smoking gun like an email requesting that LIBOR be rigged. The dirty truth is that big IT projects, in both the public and private sectors, carry an implicit indemnity for the participants, much as banking does for executives who stop short of actual criminality. If this weren't the case, no one of any real talent would take the job on. Despite the ample scope for fraud and abuse, big IT projects remain largely crime-free zones.

Monday, 4 August 2014

Lights Out for the Territory

The suggestion that we should all turn the lights out between 10 and 11 this evening, as a commemoration of the centenary of the start of World War One, is annoyingly sentimental and likely to be largely ignored. A cynic might suggest that it is merely an encouragement to gather round the glow of the TV, as the BBC broadcasts a candlelit service from Westminster Abbey, or to fire up Jeremy Deller's specially-commissioned app. I'm sure the latter will be worth watching (social commemoration is Deller's forte), but the former looks way too earnest (over two hours of Huw Edwards' downbeat tones as po-faced clerics snuff out candles) and likely to prove poor competition against Death in Paradise and Tulisa: The Price of Fame.


The lights out theme (which also made me think of Iain Sinclair and Mark Twain) is meant to evoke Edward Grey's famous words, spoken on the eve of war, that "The lamps are going out all over Europe, we shall not see them lit again in our life-time". This is one of those portentous sayings that has acquired resonance due to subsequent history (not least that he died in 1933). Had the war really all been over by Christmas, as many politicians thought, we'd probably never have heard of the phrase (Grey only published it, and then on the basis of a friend's memory, in 1925). It also hints at the coming of aerial warfare and the blackouts of WW2, not to mention the destructive impact on Europe of what would eventually amount to over three decades of conflict. Prescient stuff.

The centenary has had the effect of obscuring and confusing current conflicts, as we look for echoes of the past and worry about an incipient global crisis. The conflicts in the Middle East (Syria, Iraq and Palestine) are all refracted through vague memories of the collapse of the Ottoman Empire (in the commentariat market the Sykes-Picot Agreement is up, the Balfour Declaration down), while Ukraine is increasingly cast as a murderous Balkan affair that risks dragging in the Great Powers. Even victory in the World Cup is seen as evidence of the final rehabilitation of Germany (for Germans that meant 1954, for Brits it appears we've only just got used to the idea).

Grey's words, dripping elite nostalgia (you can picture the scene, as the Foreign Secretary in his frockcoat gazes from his office out over St James's Park), have naturally appealed to American conservatives fearful that the US is losing its dominant role in the face of rising powers abroad and perceived weakness at home. This is a continuation of the pro-empire polemic undertaken by neocons since 2001, where the assumed errors of British policy (insufficiently interventionist before 1914 and too entangled in Europe afterwards) and the consequent cycle of decline are held up as warnings to the current global hegemon. In essence, the US should whack its assumed enemies wherever and whenever they appear, regardless of territorial integrity and collateral damage, and should treat multilateralism with disdain ("We're an empire now, and when we act, we create our own reality"). The plot of pretty much every action movie since Vietnam.

Despite the caution of the Obama administration, which is little more than an oscillation back from the over-reach of Bush Jr, US foreign policy has not turned to peace and love, and there is no prospect of it doing so any time soon. Hillary Clinton's defence of Israel over Gaza is not about appeasing the "Jewish lobby" ahead of a Presidential campaign, but a simple articulation of State Department policy in which Israel functions as a proxy for the US. If you harm us, we will whack you. Disproportionately. The non-intervention in Syria is not appeasement, it is acceptance that regional interests are best served by a grinding conflict that largely takes that country out of the game. Similarly, the "loss" of Crimea is of little consequence compared to the advance of NATO to Russia's borders.

Such pragmatic calculation is the way of the world, but as Iraq and Afghanistan have shown, it is easy for delusion and prejudice to cause those calculations to backfire. The two fundamental errors the British made in 1914 were thinking that they could restrict their military contribution to naval domination and a small expeditionary force, while France and Russia provided the land armies (and the bulk of casualties), and that the cost of financing the war (including loans to their allies) would be manageable because it would be short. In effect, a repeat of the episodic conflicts of the Napoleonic era, with a similar byproduct of additional imperial possessions picked up on the cheap (I suspect the bicentenary of the start of the Congress of Vienna will not be marked).

What US foreign policy shares with that of its British analogue of one hundred years ago is the strongly defined division between here and there, between home and abroad. Not just in the quotidian sense that foreign is a different country, but in the belief that normal laws and norms of behaviour do not apply "there", nor do they apply to "them" when they are here. The US-UK cooperation over torture and intelligence-gathering since the millennium is a case in point, as is the tolerance of Israel "mowing the lawn" in Gaza. This is the wider truth: despite the best attempts of apologists like Niall Ferguson, empire corrupts both ruler and ruled, and the lesson from Britain today is that the stink hangs around for a very long time. The "unsivilized" territory beyond the horizon that entices Huck Finn is also a site of genocide, but that crime and many others can be traced back to offices overlooking Foggy Bottom and St James's Park. What we need is more light, not less.

Wednesday, 30 July 2014

The Opinion of Others

All media are driven by the opinion of others. Factual news is expensive to acquire, unreliable in its supply, and tends towards the dull but worthy. Opinion, on the other hand, is inexhaustible, cheap and reliably contentious. In a variation on the City speculators' mantra, there is always a "greater fool" who will respond to Richard Littlejohn or George Monbiot (I plead guilty on numerous counts, m'lud). We turn gratefully from pictures and testimony of suffering in Gaza to verbal bunfights between Israeli spokesmen and TV news anchors, where the likelihood of a mind being changed is precisely zero (and, unsurprisingly, not likely to be helped by Mia Farrow and other slebs).

The first newspapers, which were gazettes of court announcements, were of limited interest compared to illustrated chapbooks detailing Catholic atrocities or tracts promising the imminence of God's Kingdom. Though literate snark soon came to the fore (The Spectator was launched in 1711), most newspapers continued to rely on adverts as much as editorial to pique interest until nineteenth-century mass literacy led to the "human interest story" and a realisation of the power of mobilised opinion to sway elected legislatures. The long twentieth century, from the launch of Tit-Bits in 1881 to the Web 2.0 media-moment in 2004, was distinguished by gossip, the privileged opinions of the high and mighty, and messages from "our sponsors".

The Internet has supposedly democratised opinion, providing a variety of platforms for "everyone" to have their say, but this just means a vast increase in content (gossip, opinion, ads) and thus an even greater value accorded to aggregators and filters, which is what Tit-Bits was and arguably Addison and Steele's imaginary spectator was too. Plus ça change. Though some services claim that you, the consumer, are now able to act as your own aggregator (choosing who to follow, specifying your interests etc), the trope of "content overload" obviously serves the purposes of those who would "pre-curate" your content stream based on algorithms that analyse your history and relationships. Of course, such algorithmic precision is a myth, which even some services are happy to admit.


The churn in technologies and corporate providers obscures the persistence of opinion as the major driver of media. Thus incumbent providers, like the press, lift up their skirts and screech at the thought of Facebook manipulating a user's stream, while music nostalgists bemoan the death of the album (curation by Big Music) under the onslaught of streaming service playlists (dominated by sleb curation, which is the new face of Big Music). The success of Twitter, which is a pretty ropey piece of technology, is down to the simple fact that it provides raw, instant opinion, the slebbier the better (accept it: Rihanna is a better football pundit than you are).

Search engine results can be thought of as a type of playlist, dynamically curated by an algorithm that aggregates and orders relevant pages based on the "opinion" of other pages. The plea by Google that they should not be held responsible for the opinion of others (which has predictably found favour with the House of Lords) depends on a belief that the algorithm is an accurate reflection of that aggregated opinion (impossible to know) and not subject to any manipulation or systemic bias (clearly untrue). The myth of the Internet is that everyone has an opinion (because property is universal) and that the expression of that opinion should be free and unrestricted (because the state should not limit your rights in your property).
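For the technically curious, the idea is easy to sketch. Below is a toy, PageRank-style illustration of link-based ranking, not Google's actual algorithm: the three-page link graph, the damping factor and the resulting scores are invented purely for the example, but the principle is the one described above - a page's rank is the aggregated "opinion" of the pages linking to it.

```python
def rank(links, damping=0.85, iterations=50):
    """Score each page by the aggregated 'opinion' (inbound links) of other pages."""
    pages = list(links)
    scores = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_scores = {}
        for page in pages:
            # Each page that links here passes on an equal share of its own score.
            inbound = sum(scores[p] / len(links[p])
                          for p in pages if page in links[p])
            new_scores[page] = (1 - damping) / len(pages) + damping * inbound
        scores = new_scores
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# An invented three-page web, purely for illustration.
toy_web = {
    "site-a": ["site-b"],
    "site-b": ["site-a", "site-c"],
    "site-c": ["site-a"],
}
for page, score in rank(toy_web):
    print(f"{page}: {score:.3f}")
```

The point of the sketch is simply that the ordering is entirely a function of other pages' links, which is why "manipulation or systemic bias" in how those links are counted is not something the user can ever see.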

The key change that occurred a decade ago was that we moved from a culture in which being a passive consumer of the opinion of privileged others was deemed sufficient to one in which we must now all express an opinion as well, whether on Gaza or a friend's new cardigan. But that opinion does not need to be a reasoned analysis (we don't seriously seek to dethrone the aristocracy of pundits, merely to vote them up or down by the weight of our comments). A simple preference will do, either through a binary "like" or a top 10 ordering (the modern "listicle" is simply a way to teach us correct practice). Creating lists, and curating them on an ongoing basis, has become a social obligation: performative opinionising as a way of defining who we are. By their public playlists ye shall know them.

Thursday, 24 July 2014

L'Etat C'est Moi

Tony Blair's speech this week, twenty years to the day after he was elected Labour Party leader, prompted a number of retrospectives by his supporters in the press, which in turn prompted the usual venom beneath the fold about duplicitous war-mongering. Both sides largely ignored the speech itself, which was delivered in memory of focus group impresario Philip Gould and was eerily anachronistic in its promotion of a supra-ideological "third way", the false dichotomy of the individual and the collective, and the notion that the "zeitgeist" is some sort of constructed reality amenable to policy. It's as if Blair has barely been in the country of late.

Most of his defenders feel that his progressive record has been unfairly tarnished by the fallout from Iraq, which the former PM mentions precisely once in his speech, in a clause starting with "whatever". John Rentoul in The Independent was typical: "The country has changed, mostly for the better, in 20 years and much of it is because of Tony Blair. Unexpectedly, the change was best summed up by David Cameron in his words on entering No 10 four years ago, when he said that the country he inherited was 'more open at home and more compassionate abroad' than it had been". This is Great Man history, which marginalises the contribution of others (the snub to Gordon Brown is deliberate), wallows in nostalgia ("the sun shone more under the old king"), and equates the national mood with a personal style.

Janan Ganesh in The FT suggests that Blair was more pragmatic than he is given credit for: "He governed with the grain of history, nudging it along from time to time, but never upending a country that was functioning well enough". This is vapid insofar as almost all heads of government do exactly the same. It would serve just as well as a testament to John Major. It might appear paradoxical that a self-styled progressive like Rentoul would laud the impact of the individual on history, while a conservative like Ganesh emphasises structural forces, but the former is merely the neoliberal valorisation of "talent" as a proxy for class, while the latter believes that the exceptional Thatcher changed the course of history, tearing up the postwar settlement and embarking on a social and economic revolution that Blair merely continued (the reality was that she rode the wave of global structural change as much as she vigorously turned the tiller).

Much of Blair's achievement is simply down to longevity. If you cling to office long enough, you will get the credit for all sorts of secular changes and social shifts that simply coincided with your watch, while observers will marvel that you haven't stayed exactly the same ("from Bambi to Bliar"). In The Telegraph, Stephen Bush credits Blair with installing a security door on his childhood block of flats, much as good harvests were once attributed to the beneficence of the distant monarch. Similarly, Rentoul reckons Blair "achieved the near-impossible in Northern Ireland" with the signing of The Good Friday Agreement, ignoring the patient build-up since the Downing Street Declaration in 1993, not to mention the obvious readiness of the paramilitaries for a face-saving peace deal long before Blair's accession in 1997.

Ganesh claims that Blair "did not come from anywhere in particular", despite the public school and Oxford background, though his classlessness and cosmopolitan ease were more apparent to journalists than to the average voter. Cameron's belief that he could model himself on Blair sprang from a realisation that he was not a million miles away from the upper middle class lawyer in style and experience. Chris Dillow identifies Blair's "managerialist ideology" as his weakness, leading to over-confidence and poor decision-making. This ideology was part of a wider commitment to neoliberalism at home and abroad, which included the privileging of The City, his mugging by US neocons over Iraq, and his subsequent ascension to the global 0.01%. (In The Guardian, Michael White said: "Someone told me recently he'd brokered an oligarch's yacht sale". The point is not that this tale is true, but that it is credible).

Once they got beyond the noise about the horrors of the Saddam regime (i.e. "he had it coming"), Blair's supporters initially defended the decision to go to war in Iraq on the grounds that it was made in good faith, the absence of WMD notwithstanding. As the cost of failure mounted, that tactical error was subsumed beneath the massive strategic blowback, suggesting that either the plan all along was to trash the region or else the architects of intervention were drunk on their own power. Blair seems oblivious to the irony of his contemporary words: "Third way politics begins with an analysis of the world shaped by reality not ideology, not by delusionary thoughts based on how we want the world to be, but by hardheaded examination of the world as it actually is".

A feature of Blair's delusion, which is being eagerly advanced by his supporters in the press, is that all the structural failings that he ignored or even encouraged, such as the growth of in-work benefits and the indulgence of The City, were the fault of Gordon Brown. This is a monarchical defence, in which the good king is undermined by his bad ministers. For Blairists among the Tories, this allows Ed Miliband to be tainted by association with Brown, though they struggle to square the idea of him as Cardinal Richelieu's Eminence Grise with the Wallace meme. For Labour Blairists, it holds out the prospect that the king over the water (in Jerusalem, mostly) may one day return, like De Gaulle recalled from Colombey. What odds a Blair-Sarkozy entente by 2020?

Monday, 21 July 2014

Damnatio Memoriae

Each time a celebrity is jailed for sexual abuse, or is definitively found guilty by a posthumous review, you can guarantee a flurry of commentaries on how their memory will now be effaced, their image edited out of the Top of the Pops archive and embarrassing memorials removed from view, thereby enacting symbolic violence against their person. Though we may be uncomfortable with the overlap of this practice with Stalinist airbrushing and the "unperson", there is also a sense that removing the disgraced's representation from the public record is a punishment with an impeccable pedigree. As a political act, this goes back to the Romans' damnatio memoriae and the cartouche tippexing of the Ancient Egyptians. As a social act, this is merely a larger-scale example of the willed forgetting and photo-chopping that follows on from bitter divorces and family estrangements. Rewriting the historical record to excise painful memories is normal.


In the early days of this process, while the evil nature of the disgraced is being established beyond doubt, nothing is allowed that might contradict the reduction of the person to the essence of their evil acts. Once the reputation is sufficiently blackened, contrasting imagery begins to reappear. The bigger the evil, and thus the more secure it is from revision, the more of this hinterland can be accommodated over time. A good example of this is Adolf Hitler, whose documentary appearances in recent decades have featured more incidental scenes with children and animals, not to mention the now clichéd horsing around with Eva Braun at Berchtesgaden, rather than just stock footage of ranting or map-pondering. A Channel 4 documentary about Hitler's dogs is surely in development already.

Presenting a rounded and contextual picture of a person is good history, but it also serves to blur the boundary between the person and their environment, i.e. other people. One reason for the extreme editing of the early years is the fear of contamination: the embarrassment of the other DJ seen grinning as Jimmy Savile leers at a young girl on TOTP, or the variety show host introducing his "very good friend" Rolf Harris. Effacement is always a cover-up, obscuring the connivance and negligence of others as much as the memory of the disgraced. This extends to the common stock of social memory, with associates now claiming they "hardly knew" X or had "little to do with" Y.

This willed oblivion will be much harder in future due to the inexhaustible memory of the Internet. The aftermath of the EU Court of Justice ruling on Google and the misnamed "right to be forgotten" has seen a campaign by the US search company to convert the issue from one of data property to one of "press censorship", at the same time as the UK government has forced through the Data Retention and Investigatory Powers (Drip) Bill, which will provide Google et al with the legal cover to continue supplying GCHQ with data. The framing of the EU judgement as a threat to "free speech" and "fair comment" is in the interests of both Google, which uses "freedom" to enclose the commons, and a print media that is largely anti-Leveson, anti-EU and keen to adopt the moral high ground relative to the search giant.

In practice, the threat of censorship arising directly from the ECJ ruling is non-existent, not least because the obligations on Google and others have yet to be established in the national courts. The US company is deliberately and provocatively jumping the gun in order to trigger an anti-ECJ groundswell. It serves Google's interests for people to assume that it and the Internet are one, rather than it being simply a glorified index that only covers a fraction of the Web and is already biased by the needs of advertising and existing government regulation. Google's difficulty with the ECJ is the product of its ambition to construct a persona for all users, i.e. a single profile encompassing both the individual's behaviour in Google applications and the wider "file" of that individual fragmented across the Web. It cannot admit the trivial impact of the ruling without simultaneously robbing its brand of the mystique of omniscience. There is a sound commercial reason why Google does not even acknowledge the existence of its competitors in the search market.

In contrast, while Google gets agitated about something that its users (mostly) don't care about, Facebook finds itself criticised for treating its users like guinea pigs, even though its now notorious "experiment" was simply a standard A/B test of the sort routinely carried out by commercial websites: just substitute the words "interested" and "bored" for "happy" and "sad" (the infantilism of the experiment is noteworthy). The "emotion contagion" finding is hardly news, and the focus on manipulation by Facebook alone is odd when editing content to stimulate interest or reaction is what all media organisations do every day of the week (the negative reaction by old media may indicate a fear that new media is much more effective at shaping public opinion).
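To see just how bog-standard such a test is, here is a minimal sketch of a generic A/B test: deterministically bucket users into two variants, tweak one thing for one bucket, compare a metric. The user ids, the "engagement" numbers and the size of the lift are all invented for illustration; this is a generic sketch, not Facebook's code or methodology.

```python
import hashlib
import random
from statistics import mean

def bucket(user_id: str) -> str:
    """Deterministically assign a user to variant A or B by hashing their id."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Simulate a crude "engagement" metric per variant (numbers are made up).
random.seed(1)
engagement = {"A": [], "B": []}
for i in range(10_000):
    variant = bucket(f"user-{i}")
    lift = 0.02 if variant == "B" else 0.0  # variant B's feed is tweaked
    engagement[variant].append(random.random() + lift)

print(f"A: {mean(engagement['A']):.3f}  B: {mean(engagement['B']):.3f}")
```

Swap "happy" and "sad" content for the tweak and "posting behaviour" for the metric and you have the notorious experiment; the mechanics are identical to what every large commercial website runs daily.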

The basis of the ethical storm is that while users have willingly alienated their data, they did not believe they were agreeing to be experimented on when they signed the terms of service (which they didn't read anyway). The difference this highlights is that Facebook see the users as an extension of the data (which they ultimately own), while the users still believe they are independent, even if they have ceded rights over that data. Google has the same view as Facebook, hence the importance they give to treating the online persona as superior to the offline person. The ECJ ruling embodies a different philosophy, which sees the data as the inalienable property of the citizen. The dispute is thus being framed as a conflict between previously complementary liberal principles: free speech and private property. Of course, free speech, in the particular form of a "free press", is essentially a property right, so the whole affair (including Drip) boils down to a tussle over property.

Because personal data is increasingly the property of others, I suspect that the convention of damnatio memoriae may well be on its way out. The instinct and desire to obliterate will remain at the individual level, but that just means "unfriending" and deleting accounts so that the offensive memory is hidden from your own view. At the societal level, organisations like the BBC, with a perceived responsibility to curate the historical record to reflect public opinion, will find their actions increasingly irrelevant and difficult to defend. That epigraph is the property of Google, that cartouche the property of Facebook.