
Wednesday, 22 February 2017

Ignorance and Democracy

One thing we can already say in advance of this week's two by-elections is that the structural decline of democracy was not arrested by the jolt of last June's referendum. Popular sovereignty was quickly absorbed by the Tories as they jerry-built a soft plebiscitary dictatorship in which the unilateralism and secrecy of Brexit looks set to infect all areas of government. Despite Tony Blair's bizarre call for an uprising, the fragmented centre has committed to a self-indulgent martyrdom, ironically proving its members to be all heart and no head. Meanwhile, Corbyn's attempt to triangulate between the popular will and the interests of Labour's electoral base has been condemned by those for whom triangulation was once a supreme virtue. As I noted back in June, the response to the vote was heavily conditioned by a theory of democracy, dating back to Plato, that focuses on two failings: the people's lack of expertise and their inability to discern the good.

The latter has proved the weaker critique, reflecting both the subjectivity of liberal modernity (we're told we can each pursue our own good) and the impossibility of getting the mass of voters to recognise themselves in the caricature of an irresponsible and inconsiderate demos (an ironic product of the creation of a demonised underclass as the real "enemy within"). It is the critique of ignorance, and the concomitant defence of expertise, that has proved to have legs, not least because it has provided a theoretical framework for the modish concern with "fake news". But the real reason for this idea's success is that it has never gone out of fashion, whereas the belief that the people couldn't recognise the good was forced to take a back-seat with the advent of universal suffrage. Today's "alt-facts" are part of a long tradition in which democracy is corrupted by the base appetites of the lower orders fed by opportunistic new media and demagogues.

In the current cycle, the finger of blame has been pointed firmly at the Internet, despite the evidence that UK tabloids were more decisive in influencing the leave vote and that US cable news was more influential with Trump voters (and with Trump himself - see his recent comments on Sweden prompted by a Fox News report). The idea is to suggest a better status quo ante, and thus relative decline, though academic evidence suggests that levels of public knowledge in respect of public policy have been pretty consistent over the years - i.e. consistently low - while general levels of trust in experts remain high. This points to two great truths. First, most people take a limited interest in politics because the subject has limited relevance to their daily lives. This is why issues around health and education (and occasionally housing) have resonance when they arise. Second, social hierarchies are nowhere near as fluid as the myth of meritocracy would have it. The idea that society as a whole would suddenly lose faith in "experts", while still retaining respect for a monarchy and an unelected House of Lords, is absurd.

The claim that society has intellectually degraded because of the Internet is a commonplace among both conservatives and establishment liberals, though while liberals often emphasise malign forces (e.g. neo-Nazis gaming Google), conservatives tend to focus on the indolence of the people. Tom Nichols, writing in Foreign Affairs on "How America Lost Faith in Expertise" (a summary of his book on the subject), gives this old idea a modern, special snowflake spin: "Americans have reached a point where ignorance—at least regarding what is generally considered established knowledge in public policy—is seen as an actual virtue. To reject the advice of experts is to assert autonomy, a way for Americans to demonstrate their independence from nefarious elites—and insulate their increasingly fragile egos from ever being told they’re wrong". The culprit is clear: "Ask an expert about the death of expertise, and you will probably get a rant about the influence of the Internet. ... It has allowed people to mimic intellectual accomplishment by indulging in an illusion of expertise provided by a limitless supply of facts." At least he didn't blame postmodernism.


Nichols' disdain for the Internet reflects the professional anxiety of the credentialed academic: "I fear we are moving beyond a natural skepticism regarding expert claims to the death of the ideal of expertise itself: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laypeople, teachers and students, knowers and wonderers—in other words, between those with achievement in an area and those with none". Ouch. But as an academic he cannot avoid admitting the lack of novelty in all this: "Of course, this is no more and no less than an updated version of the basic paradox of the printing press ... Libraries, or at least their reference and academic sections, once served as a kind of first cut through the noise of the marketplace. The Internet, however, is less a library than a giant repository where anyone can dump anything. In practice, this means that a search for information will rely on algorithms usually developed by for-profit companies using opaque criteria."

He doesn't explain how this is different to the bias of traditional publishing houses or newspaper proprietors; he merely asserts that it is much worse: "The Internet is the printing press at the speed of fiber optics", which is as meaningless as his ahistoric use of "marketplace" in respect of knowledge and ideas. Nichols' trawl through history does at least identify his true focus, which is not expertise in general (despite weak asides about cognitive bias and anti-vaccine nutters) but politics: "Over a half century ago, the historian Richard Hofstadter wrote that 'the complexity of modern life has steadily whittled away the functions the ordinary citizen can intelligently and comprehendingly perform for himself ... In the original American populistic dream, the omnicompetence of the common man was fundamental and indispensable. It was believed that he could, without much special preparation, pursue the professions and run the government'". That mythical common man obviously didn't include slaves, native Americans or even white indentured labour.

Hofstadter was the author of The Paranoid Style in American Politics (published in the same year that Dr Strangelove was released), which discussed the instrumental use of conspiracy theories. Without irony, Nichols notes that "Conspiracy theories are attractive to people who have a hard time making sense of a complicated world and little patience for boring, detailed explanations", which might seem to reinforce Hofstadter's point about omnicompetence were it not for the indisputable fact that conspiracy theorists actually have huge patience for boring, detailed explanations, from fluoridation to email servers. This appetite for "theory" sits uneasily with the characterisation of the demos as ignorant and lazy, such as in Nichols' citing of a Washington Post poll on American intervention in Ukraine: "Only one in six could identify Ukraine on a map ... the respondents favored intervention in direct proportion to their ignorance. Put another way, the people who thought Ukraine was located in Latin America or Australia were the most enthusiastic about using military force there".


This is a classic party trick, like asking people to estimate the population or point due north. Most people get this wrong simply because the information isn't necessary to them in their daily lives. It doesn't mean that they are stupid or that their views should carry less weight. Nichols concludes by invoking another trope first deployed by Plato, the demos as children, and uses this both to justify technocracy and excuse it as the inevitable response to populism: "Americans (and many other Westerners) have become almost childlike in their refusal to learn enough to govern themselves or to guide the policies that affect their lives. ... In the absence of informed citizens, for example, more knowledgeable administrative and intellectual elites do in fact take over the daily direction of the state and society. ... Today, however, this situation exists by default rather than design. And populism actually reinforces this elitism. ... Faced with a public that has no idea how most things work, experts disengage, choosing to speak mostly to one another."

This argument seeks to reverse the causal relationship, suggesting that what Peter Mair, in Ruling the Void, called "the withdrawal of the elites" has been occasioned by a recent failure of the public to maintain sufficient knowledge of policy, rather than public disengagement (as measured in falling turnouts and party membership) being the result of the professionalisation of party politics. This idea of a secular decline in public competence competes in the marketplace of conservative ideas with the theory of structural disincentives: "the probability that our votes will make a difference is, for most of us in most major elections, vanishingly small. ... In short, the reason people are mostly ignorant and biased about politics is that the incentives are all wrong. Democracies make it so that no individual voters' votes (or political beliefs) make a difference. As a result, no individual is punished for being ignorant or irrational, and no individual is rewarded for becoming informed and rational. Democracies incentivizes us to be 'dumb'" (that there are no real-world political systems that incentivise everyone to be well-informed suggests an elite bias against popular knowledge).

Though this appears to condemn us all to ignorance, given that we're each subject to the same disincentives, the unspoken assumption is that a reduced, more "qualified" electorate would fix the problem. But qualified in what? While contemporary epistocrats talk about educational achievement, the more traditional Platonists advocate the return of property qualifications or votes proportionate to tax contribution. This reflects the fact that politics is not a natural science with observable laws but a social construct and therefore both contestable and malleable. The problem with the critique of ignorance is that what is considered consequential is politically determined. In other words, the people can be alienated from politics by defining it in terms that are preferential to elites. To that end, "expertise" can play a role in isolating politics rather than opening it up to general understanding. The obvious example is foreign affairs, the continuation of aristocratic governance by other means, though this often backfires when the public does take an interest - hence the "Do you even know where Ukraine is?" manoeuvre.


As a social and thus historically-situated construct, politics is also subject to structural change. Two recent examples are the impact of globalisation and neoliberal practice. The transfer of powers to Brussels may have been exaggerated by a Eurosceptic media, but it was none the less real, as was the role of privatisation in removing housing and much of economic management from public influence. The growth of "independent" regulation and technocratic management has substituted expert scrutiny for public oversight but at the cost of regulatory capture and groupthink. While experts can legitimately complain about assertive ignorance in the face of scientific evidence, e.g. in respect of climate change or vaccination, this is more difficult to do in the realm of politics when so much of policy has been deliberately steered into areas beyond public purview. The problem is not that people are rejecting the evidence of the experts, but that the evidence is increasingly unavailable to the public.

By-elections have long been promoted as opportunities for a "protest vote". In recent years this has started to morph into the idea of by-elections as exercises in attention-seeking in which electors lash out in frustration: a cry of pain rather than a specific demand. In other words, emotion has got the better of intellect. A hilarious example of this framing was a Guardian-sponsored focus group in Stoke, made up of 10 wavering Labour voters who were subjected to the infantilising exercise of "drawing the parties as cars", which produced this conclusion: "they all agreed that a Ukip win would have an impact on a national level as it would force people to listen to the area's concerns". The rhetorical inflation from "send a message" to "force people to listen" pays lip-service to the instrumental theory of by-elections (though it isn't clear how the election of Paul Nuttall would force anything), but it transfers the electoral outcome from the realm of reason to that of emotion. This is the flip-side of the "listening to people's concerns" cant: the concerns are never interrogated and thus properly engaged with, because the people are only capable of emotional spasms not reasoned argument.

There is a perceived tension at the heart of representative democracy between the need to emotionally reflect the people and intellectually constitute the state. This is the head/heart dichotomy beloved of the self-proclaimed pragmatists and it reflects Plato's original belief that the people must be guided by their betters because they lack self-control as much as expertise. In practice, many voters are frustrated by their representatives' intellectual timidity. The dissatisfaction with the Article 50 debate and Jeremy Corbyn's election(s) are two different examples of a real appetite for policy. Likewise, many are irritated when politicians indulge in the emotionalism of the state, such as last year's "project fear" or Theresa May holding Donald Trump's hand before wittering on about a special relationship. The people are no more emotional or ignorant today than they have ever been. If they are alienated from politics, then that is the fault of politicians, not the people. I have no idea how the two by-elections will go, but my fear is that turnout may be poor, and not just because of the weather.

Wednesday, 15 February 2017

Algopops

One of the defining characteristics of the debate on the role of software in modern society is the tendency towards anthropomorphism. Despite the stories about job-stealing robots, what we apparently fear most is not machines that look vaguely like humans, with their metal arms whirling over production lines, but malicious code that exists in a realm beyond the corporeal. We speculate about the questionable motives of algorithms and worry about them going "rogue". Such language reveals a fear of disobedience as much as malevolence, which should indicate its origin in the old rhetoric of class (much like the etymology of the word "robot"). In a similar vein, the trope of the hacked kettle recycles the language of outside agitators and suborned servants. In contrast, artificial intelligence is subject to theomorphism: the idea that in becoming superior to human intelligence it becomes god-like, an event that can occur well short of the technological singularity (as Arthur C. Clarke noted, "Any sufficiently advanced technology is indistinguishable from magic").

This distinction between algorithms and AI has echoes of the "great chain of being", the traditional theory of hierarchy that has enjoyed something of a revival among the neo-reactionary elements of the alt-right, but which can also be found buried deep within the ecological movement and the wider culture of Sci-Fi and fantasy. Given that mix, there should be no surprise that the idea of hierarchy has always been central to the Californian Ideology and its (non-ironic) interpretation of "freedom". If Marxism and anarchism treat class and order as historically contingent, and therefore mutable, the defining characteristic of the party of order - and one that reveals the fundamental affinity between conservatives and liberals - is the belief that hierarchy is natural. Inheritance and competition are just different methods used to sort the array, to employ a software term, and not necessarily incompatible.

Inevitably the cry goes up that we must regulate and control algorithms for the public good, and just as inevitably we characterise the bad that algorithms can do in terms of the threat to the individual, such as discrimination arising from bias. The proposed method for regulating algorithms and AI is impeccably liberal: an independent, third-party "watchdog" (a spot of zoomorphism for you there). Amusingly, this would even contain a hierarchy of expertise: "made up of law, social science, and philosophy experts, computer eggheads, natural scientists, mathematicians, engineers, industry, NGOs, and the public". This presents a number of scale problems. Software is unusual, compared to earlier general purpose technologies such as steam power or electricity, in that what needs regulation is not its fundamental properties but its specific applications. Regulating the water supply means ensuring the water is potable - it doesn't mean checking that it's as effective in cleaning dishes as in diluting lemon squash. When we talk about regulating algorithms we are proposing to review the purpose and operation of a distinct program, not the integrity of its programming language.


In popular use, the term "algorithm" is a synecdoche for the totality of software and data. An application is made up of multiple, inter-dependent algorithms and its consequential behaviour may be determined by data more than code. To isolate and examine the relevant logic a regulator would need an understanding of the program on a par with its programmers. If that sounds a big ask, consider how a regulator would deal with an AI system that "learns" new rules from its data, particularly if those rules are dynamic and thus evanescent. This is not to suggest that software might become "inscrutable", which is just another anthropomorphic trope on the way to the theomorphism of a distracted god, but that understanding its logic may be prohibitively expensive. Perhaps we could automate this to a degree, but that would present a fresh problem of domain knowledge. Software bias isn't about incorrect maths but encoded assumptions that reflect norms and values. This can only be properly judged by humans, but would a regulator have the broad range of expertise necessary to evaluate the logic of all applications?

Initially, a regulator would probably respond to individual complaints after the fact; however, history suggests that the regime will evolve towards up-front testing, at least within specific industries. The impetus for standards and regulation is typically a joint effort by the state, seeking to protect consumers, and capital, seeking to protect its investment. While the former is dominant to begin with, the latter becomes more dominant over time as the major firms seek to cement their incumbency through regulatory capture and as their investors push for certification and thus indemnities in advance. You'd need a very large regulator (or lots of them) to review all software up-front, and this is amplified by the need to regression test every subsequent software update to ensure new biases haven't crept in. While this isn't inconceivable (if the robots take all the routine jobs, being a software investigator may become a major career choice - a bit like Blade Runner but without the guns), it would represent the largest regulatory development in history.

An alternative approach would be to leverage software engineering itself. While not all software employs strict modularisation or test-driven development, these practices are prevalent enough to expect most programs to come with a comprehensive set of unit tests. If properly constructed (and this can be standardised), the tests should reveal enough about the assumptions encoded within the program logic (the what and why), while not exposing the programming itself (the how), to allow for meaningful review and even direct interrogation using heterogeneous data (i.e. other than the test data employed by the programmers). Unit tests are sufficiently black box-like to prevent reverse engineering and their architecture allows the test suite to be extended. What this means is that the role of regulation could be limited to ensuring that all applications publish standard unit tests within an open framework (i.e. one that could be interfaced with publicly) and perhaps ensuring that certain common tests (e.g. for race, gender or age discrimination) are included by default.


The responsibility for independently running tests, and for developing extended tests to address particular concerns, could then be crowdsourced. Given the complexity of modern software applications, let alone the prospect of full-blown artificial general intelligence systems, it might seem improbable that an "amateur" approach would be effective, but that is to ignore three salient points. First, the vast majority of software flaws are the product of poor development practice (i.e. inadequate testing), the indifference of manufacturers to preventing vulnerabilities (those hackable kettles), and sheer incompetence in systems management (e.g. TalkTalk). Passing these through the filter of moderately-talented teenagers would weed out most of them. Second, pressure groups with particular concerns could easily develop standard tests that could be applied across multiple applications - for example, checking for gender bias. Third, quality assurance in software development already (notoriously) depends on user testing in the form of feedback on bugs in public releases. Publication of unit tests allows that to be upgraded from a reactive to a proactive process.
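On the same assumptions, a pressure group's "standard test" could be written once and run against any application exposing a common decision interface. A hypothetical sketch (the vendor functions and the interface are invented for illustration; a real scheme would hinge on agreeing such an interface):

```python
# Hypothetical sketch: one crowdsourced test applied across multiple
# applications. The two "vendor" functions stand in for opaque
# shortlisting logic published behind a common decision interface.

def shortlist_a(candidate: dict) -> bool:
    """Vendor A: decides purely on experience (age-invariant)."""
    return candidate["experience"] >= 3

def shortlist_b(candidate: dict) -> bool:
    """Vendor B: quietly encodes an age cut-off."""
    return candidate["experience"] >= 3 and candidate["age"] < 40

def check_age_discrimination(decide, candidates, ages=(25, 45, 65)):
    """Return True if changing only age ever changes the decision."""
    for candidate in candidates:
        outcomes = {decide(dict(candidate, age=a)) for a in ages}
        if len(outcomes) > 1:
            return True
    return False

# The same panel of probe candidates is applied to every vendor.
panel = [{"experience": 5, "age": 30}, {"experience": 2, "age": 50}]

results = {name: check_age_discrimination(fn, panel)
           for name, fn in [("vendor_a", shortlist_a),
                            ("vendor_b", shortlist_b)]}
# results flags vendor_b and clears vendor_a.
```

The asymmetry is the attraction: the test is cheap to write and reusable, while each vendor bears the cost of conforming to the interface, which is roughly how open standards have always distributed the burden of scrutiny.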

Interestingly, the crowdsource approach is already being advocated for fact-checking. While traditional media make a fetish of militant truth and insist on their role as a supervisor of propriety (technically a censor, but you can understand why they avoid that term), some new media organisations are already down with the idea of active public invigilation rather than just passive applause for the gatekeeper. For example, Mr Wikipedia, Jimmy Wales, reckons "We need people from across the political spectrum to help identify bogus websites and point out fake news. New systems must be developed to empower individuals and communities – whether as volunteers, paid staff or both. To tap into this power, we need openness ... If there is any kryptonite to false information, it’s transparency. Technology platforms can choose to expose more information about the content people are seeing, and why they’re seeing it." In other words, power to the people. Of course, near-monopoly Internet platforms, like global media companies, have a vested interest in limiting transparency and avoiding responsibility: the problem of absentee ownership.

The idea that we would do better to rely on many eyes is hardly new, and nor is the belief that collaboration is unavoidable in the face of complexity. As the philosopher Daniel Dennett put it recently, "More and more, the unit of comprehension is going to be group comprehension, where you simply have to rely on a team of others because you can’t understand it all yourself. There was a time, oh, I would say as recently as, certainly as the 18th century, when really smart people could aspire to having a fairly good understanding of just about everything". Dennett's recycling of the myth of "the last man who knew everything" (a reflection of the narrowness of elite experience in the era of Diderot's Encyclopédie) hints at the underlying distaste for the diffusion of knowledge and power beyond "really smart people" that also informs the anxiety over fake news and the long-running caricature of postmodernism as an assault on truth. While this position is being eroded in the media under the pressure of events, it remains firmly embedded in the discourse around the social control of software. We don't need AI watchdogs, we need popular sovereignty over algorithms.

Friday, 10 February 2017

F is for Fake

Propaganda relies more on reinforcement than persuasion. It doesn't change minds so much as bolster them. It works with the grain, building on existing prejudices and common cognitive biases to provide reassurance in support of already-formed beliefs. Propaganda works where there is a predisposition to believe. For example, Nazi propaganda was effective after 1939 because Germany was at war and the population subconsciously feared retribution. It gradually lost its power after 1943 as defeat became inevitable. In contrast, the propaganda of the USSR in the 70s and 80s was ineffective because the economy was visibly stagnant. The dynamic of reinforcement is key to understanding the current flap over "fake news". The consumers of the product are true believers rather than credulous dupes, but their belief is of a particular sort: they have a unifying theory of everything. In Isaiah Berlin's famous typology they are hedgehogs rather than foxes. If you think the world is explained by a busy God, a conspiracy of seven foot tall lizards or the machinations of the Jews, then you will be more likely to believe news that supports your priors and dismiss anything that conflicts.

This might suggest that fake news is limited to an obsessive minority, but the attitude of "true belief" is found across the political spectrum and not just at the extremes. Centrists who insist that the answer to every policy problem is either "competition" or "education" are also in the grip of this monist delusion. Where the centre differs from the right and the left is in not needing proactive reassurance, though this is simply a reflection of the structural reassurance of hegemony, much as the followers of a state religion tend to be theological "don't knows". In other words, centrists don't seek out their fake news because it is pervasive. That said, while the bias of the mainstream media is real, it would be wrong to believe it is simply more insidious, or just more skilful, than that of the extremes. Consumers of fake news often know that what they're seeing or hearing is hyped but they enjoy it none the less because it validates their already formed beliefs, which in turn encourages the producers to push the limits of credibility further. That's why fake news is often ridiculous.

The post-2008 confusion of the political centre owes much to a crisis of confidence over policy, not least in respect of the efficacy of the panaceas of competition and education, but it has yet to undermine a worldview that holds moderation and triangulation as self-evidently good. This leads to the sight of liberals going through the electoral motions but without a programme to speak of, most obviously in Hillary Clinton's failed campaign in the US last year and currently in the boosting of Emmanuel Macron in France. Even Martin Kettle in The Guardian could not quite quell his doubts over this farcical re-run of the Blairite project: "Whether there would be a durable national embrace of what Macron stands for is far from certain. In large part that is because Macron has not said what he stands for". In a similar vein, the Liberal Democrats appear to have made unconditional EU love their sole policy, which is likely to condemn them to the political margins for a generation.


The policy void has placed a premium on attitude and behaviour, but it has also led to a sense of vulnerability in the face of external manipulation. This is evident not just in the imputed fragility of "generation snowflake" but in the assumed stupidity and credulity of the working class in the face of demagogues and the imagined power of online grooming. Everybody is about to be hoodwinked, hence the salience of fake news. This reductive attitude divides the world into those who are secure in their conventional beliefs and those who are empty vessels at risk of being filled with frothing madness. It's a pretty obvious transference of self-doubt by centrists, but it also over-states the power of manipulation. Like the hype around "filter bubbles" and unconscious bias ("check your privilege" etc.), this leads us to forget that believing bat-shit crazy nonsense and dismissing science out of hand are characteristic of a very small minority, not the majority. Just as most people do not believe the Earth is flat, so most people are not actually contemptuous of experts.

Lacking a positive vision, liberals are reduced to the conservative strategy of defending the status quo through project fear: constructing a deplorable enemy to rally support for the centre. But the consequence of the policy vacuum is that it draws the enemy centre-stage. Nigel Farage's prominence, the obsession with Donald Trump's idiot tweets and the attention garnered by a provocateur such as Milo Yiannopoulos are symptoms of the centre's malaise, not a sea-change in society. This is partly driven by structural change as traditional publishers try to adjust to new media, hence fake news is emblematic of poor quality control and so serves to reinforce the role of gatekeepers. But at heart it reflects the dependence of the media on a political centre that is failing to produce "non-fake news" of sufficient calibre. Once centrists deserted the arena of policy, it was inevitable that "squatters" would move in. Leave didn't win the EU referendum because voters believed £350m would be spent on the NHS but because voters wanted more spent on it and one side of the argument was prepared to agree. The point is not dishonesty but the inevitable attraction of the false promise when the establishment promises nothing.


The centrist choice of the right for the role of "deplorable" - the left being marginalised as irrelevant or incompetent (see the collected works of John Harris) - is clearly a projection of unspoken desires: toying with nationalism entails an illicit thrill greater than toying with nationalisation. This is a miscalculation of epic proportions, not because real Fascists will seize power (not even in France), but because it normalises racial and sectarian discrimination as a lesser evil through such formulations as "legitimate concerns". This ultimately coarsens political discourse and corrodes liberal norms. The problem with demanding a "debate about immigration" is that we've been talking about the issue for decades without any satisfactory conclusion. That's because there isn't one. Xenophobes will never be reconciled while "controls" will crumble in the face of economic imperatives. Centrist politicians who suggest there might be some happy medium are being dishonest. This will eventually lead to intellectual exhaustion and the acceptance of dysfunction, like the government's admission this week that it has no housing policy to speak of.

In the current environment - liberal self-doubt, centrist accommodation of right-wing agenda and a media fascinated by conservative outrage - it should come as no surprise that the alt-right have bubbled to the surface of public consciousness. They should not be mis-characterised as a political movement, any more than the victory of Trump should be seen as a popular uprising rather than what it is: a coup by a criminal gang that has ridden a populist wave. The alt-right is not the traditional Fascist right. They are dilettantes rather than social revolutionaries, amateur reactionaries rather than violent community activists. As Walter didn't quite say in The Big Lebowski, "I mean, say what you want about Nazis, Dude, but they always punch back". Just as fake news reflects the vacuity of the centre, the alt-right points to the essential fakery of the "nationalist revival" and its inherent instability, which can be seen in the evident tensions within the Front National between its Fascist core and its metrosexual marketing department.

Watching the Channel 4 News report on the UK alt-right on Wednesday evening I was struck by how middle-class this new generation of white supremacists and crypto-Fascists is. These people have always been around but used to be quarantined in the Young Conservatives or the more outrĂ© university clubs. The Internet has provided them with a forum independent of institutional restraints and crucially it has allowed them to organise without the need to join traditional right-wing groups like the BNP. They don't have to suffer the embarrassment of going on marches with skinheads and can indulge their fantasies about eugenics for the lower orders without the risk of encountering them "in real life". Their emphasis on IQ (The Bell Curve featured) and racial and cultural purity (as did Mein Kampf) was as sociologically telling as the insistence that they were actually libertarians (the equality of now) defending Western civilisation (the hierarchy of then). These bedroom Nazis are ridiculous. A twenty-something Paleo-conservative trying to hide their alt-right activities from liberal parents sounds like a pitch for a sitcom. The alt-right may well turn out to be the biggest fake news of the decade.

Saturday, 4 February 2017

Sicknote

The House of Commons was not at its best this week. John Mann's pursuit of Diane Abbott for apparently throwing a sickie to avoid the vote on triggering Article 50 serves to emphasise that the "momentous occasion" was about gesture rather than decision. Jacob Rees-Mogg's invocation of St Crispin's Day simply made the theatricality explicit. The government was always going to win, hence most MPs have been more concerned about posturing to satisfy their constituents or local party members than their consciences, despite claims to the contrary (many of the Labour "rebels" were among those who followed the whip and shamefully abstained on the government's Welfare Reform and Work Bill in 2015). Meanwhile the media have prioritised Labour dissension over the government's blithe disregard for a coherent Brexit strategy. The ensuing white paper, which the government has had seven months to prepare but apparently failed to proofread, was aptly described as "the political equivalent of a cat coughing up a hairball".

Had the vote on the 'European Union (Notification of Withdrawal) Bill', a bill little longer than a doctor's note, been along party lines - i.e. had all MPs obeyed their whip - then the government would have won. Had the vote sought to accurately represent constituency opinion, then the government would still have won, presumably on something close to a 52/48 split to reflect the popular division last June. Had Labour whipped its MPs to vote against the bill, whether as a matter of principle (the absence of a coherent plan) or as a tactical manoeuvre (to lay down negotiation red lines), then the government would still have won, notwithstanding Ken Clarke's rebellion and even assuming the improbability of Labour leavers like Gisela Stuart, Graham Stringer and Kate Hoey observing the whip. The only circumstances under which the government could have lost would have been a free vote, and that in turn assumes that MPs would have remained largely consistent with their preferences as expressed in the days leading up to the referendum, when a clear majority were in favour of staying in the EU.

Of course, a free vote might still have led to a government win if enough MPs had converted from remain to leave since last June. Some might have sincerely changed their minds because of the referendum outcome, perhaps having been won over by the leave campaign's persuasive arguments and incontrovertible facts, but any insisting that they were now obliged to vote against their own belief by a superior need to reflect that of their constituents would be abrogating parliamentary sovereignty, the very principle for which many leavers insisted that we must quit the EU and the same principle confirmed by the Supreme Court's Miller judgement. Had a free vote led to a defeat for the government, this would have been a clear reassertion of parliamentary sovereignty but also a clear rejection of the referendum; i.e. confirmation that not only was the popular vote last June advisory - in effect treating it as a second opinion - but that the advice wasn't decisive in Parliament's final consideration.

If nothing else, this sorry sequence - marked by the naivety of remainers as much as the hypocrisy of leavers - should make crystal clear that parliamentary sovereignty is a myth. A positive result of last year's referendum is that a plebiscitary dictatorship remains remote, not least because the danger of a popular vote backfiring will make future governments reluctant to take the risk. Remainers calling for a second referendum are wilfully ignoring this point. Should the negotiations with the EU lead to an obviously bad outcome that turns popular opinion, you'd hope the Commons would seize the initiative and "represent" this rather than offload the problem to another referendum. A less positive result is that executive dictatorship has been reinforced through the immediate erosion of parliamentary sovereignty and the ongoing weakening of scrutiny under cover of Brexit planning and negotiation. In retrospect, MPs were foolish in not understanding what David Cameron had staked in calling last year's referendum. This was certainly a vote on parliamentary sovereignty, but one in which the real threat was not the EU (as the white paper implicitly and ruefully admits) but the executive, both in its cavalier misjudgement and its lust for the covert.


Looked at in the context of the centuries-old struggle between Crown (the executive) and Parliament, it is the Crown that is winning, as it has been since the "great centralisation", started under Margaret Thatcher in the 1980s, began to erode the diffuse sovereignty of local government, unions and public corporations. New Labour's commitment to managerialist opacity and media manipulation ensured that Thatcher's legacy was consolidated, rather than challenged, with the Commons famously marginalised during the build-up to the Iraq War by "sofa government". While Cameron took a more "chillaxed" approach to public presentation than the famously anguished Blair, this obscured the further institutionalisation of executive power behind the scenes, not least in Theresa May's domain at the Home Office. The fear is that the effective exclusion of the Commons from proper scrutiny of the government over the next two years, with "commercial confidentiality" becoming as prevalent an excuse as "national security", will lead to a growing acceptance that the House should have only a weak power to interrogate or curb ministers, and one best achieved through narrowly-focused select committees (whose creation in 1979 now looks ever more obviously to have been an inadequate compensation for the subsequent weakening of civic society).

The failure to hold a free vote was entirely down to the decision of pro-remain Tory MPs, with May at their head, to pursue a hard Brexit for essentially opportunistic reasons. The moment the Prime Minister said "Brexit means Brexit" the pass was sold and this week's vote became little more than a formality. The Supreme Court judgement was notable for not presenting the government with any problems: it insisted on the pomp of a Commons vote but rode roughshod over the circumstance of devolution. The Tories have compromised parliamentary sovereignty for the sake of preserving executive power - first in Cameron's decision to allow a decisive referendum and then in the vote this week. Insofar as a strategy can be discerned, it is to pay lip-service to perceived public opinion in the areas of immigration and "foreign control" (as interpreted by the press); accept a degree of economic damage as the necessary cost of divorce (but look after the City); and then push through economic and social "reforms" hitherto impeded by the EU (which means weakening worker rights and consumer protection more than increased state support) with the justification that this will make us more competitive and thus defray the cost.

Leave won the referendum for two key reasons: most Tory voters opted to quit the EU, offsetting the majority of Labour and minor party voters who opted to remain; and the leave campaign mobilised a reactionary element that does not usually vote - i.e. they got the bulk of the increased turnout. It was the latter that was decisive. The Conservative Party appears to believe it can tempt this element into the polling booth more often, essentially by making all future elections centre on Brexit. To this end, a hard Brexit (and a focus on immigration and "control") makes electoral sense. It also explains why the government is reluctant to articulate its strategy, as it boils down to deliberate self-harm. The obvious lesson to draw from this is that the Tories remain the party of power for whom conscience is a luxury and collateral damage is simply somebody else's problem, while Labour remains the party of dissent for whom a plurality of opinion is inevitable. Criticising Corbyn for this is as otiose as criticising May for being unprincipled.

In the circumstances, I'm genuinely surprised that the government's laughably naff white paper on Brexit hasn't immediately snatched the title from Labour's 1983 manifesto as "the longest suicide note in history". At best it serves as a compelling if accidental diagnosis of some particularly morbid symptoms (the lack of facts around immigration, the havering around employment rights), but it is utterly inadequate as a prognosis, let alone a course of recommended treatment. This failure is ultimately not the fault of a government that appears simultaneously clueless and malign, but of a House of Commons that has been on life support for decades, failing in its responsibility to fairly represent the electorate and ever more cringing and subservient in its attitude towards the executive. Diane Abbott may well have bottled the vote on triggering Article 50, but it is short-sighted fools like John Mann who have done most to bring Parliament into disrepute.

Wednesday, 1 February 2017

A Prince Among Men

The transparent lobbying of Prince Charles to replace Gary Lineker as leader of the resistance (UK branch) is evidence that the liberal opposition to Donald Trump is going to be no more effective internationally than Michelle Obama's "When they go low, we go high" was domestically. The move also suggests that the heir to the throne has internalised the attitude to monarchy promoted by the media he affects to despise, which confuses legitimacy with personality. In absolute monarchy, the person (i.e. body) of the monarch was sacred and personality incidental, hence a bad or a mad king was still a king. A constitutional monarch is similarly just an embodiment, but now of the nation rather than divinity, and thus little more than a ceremonial civil servant, hence the famous inscrutability of the Queen. Charles suffers from the delusion that an accident of birth should privilege the opinions of one over the many and that a state visit would be the opportunity for a meeting of fine minds. The prince's views are not representative of anybody but himself and it is vanity to imagine that he above all can talk sense into Trump on climate change or "inter-faith understanding".

What was particularly amusing about Charles's intervention was the claim that "The prince has gone into the Middle East over recent years at the government’s request and has been the honest and neutral broker". This suggests both a semi-detachment of the Prince from government and a supranational realm in which he can mediate between different states. In other words, Charles has inflated the political neutrality of the monarchy in its domestic setting into the role of an international Solomon. This pretension goes some way to explain the royals' choice of global concerns to channel their activism, from the obsession with preserving the wildlife they previously shot to presenting environmentalism as noblesse oblige. These issues are chosen not because they avoid conflict with national concerns or alignment with partisan politics, but because they allow monarchy as a style of governance to be projected beyond the confines of the nation. This should remind us that monarchy - or at least the "top division" sort represented by the Windsors - is instinctively supranational and imperial.

As a representative of the state and therefore government policy, Charles cannot be neutral in his dealings with other states. To suggest that he can is to revive the old idea of loyalty as flowing to the person of the monarch, rather than the nation, and displays a woeful ignorance of British history, not least in the reigns of previous kings called Charles. Of course, what's actually behind this is the suggestion that Charles is a better interlocutor with Middle Eastern monarchs than career politicians because of the affinities of princes (there is an argument to be made that the oil shocks of the 1970s not only reverted the social relations of the Middle East to an essentially feudal form, thereby boosting Islamic fundamentalism, but that the preservation of regional monarchies helped extend the useful life of the British version by making it look less eccentric). The application of this princely reasoning to Trump is a tacit recognition of The Donald's own monarchical ambitions. What was originally seen by Republicans in the US primary as a bug is here recast as a feature.

The petition to deny Trump a state visit to the UK is bizarre. It distinguishes between Trump's day job as "head of the US Government" and his role as head of state (and therefore qualified to receive the honour of a state visit), though such a distinction is meaningless in the US where the head of the government is the head of state. It also misses the point that his objectionable behaviour - the banning of Muslims - was conducted as head of the government (which is why it was an executive order) rather than as head of state, so it is illogical to penalise the latter rather than the former. The petition's justification for denying Trump a state visit - to avoid "embarrassment to Her Majesty the Queen" - is equally illogical as it implies that the division of the two roles in the UK, between the head of government and the head of state, makes the monarch an innocent bystander whose feelings must be protected. The monarch is a well-paid extra and part of the deal is to eschew all feeling.


Rather than denying him a state visit, a better approach might be to embarrass Trump by augmenting the mass protests with more symbolic arrangements. Perhaps a spot of extreme vetting at Heathrow by a British Muslim in a headscarf or a Syrian-themed menu for the state banquet. Maybe William and Harry could pointedly ask why he ever thought he'd have stood a chance of going on a date with their mum. We'd probably struggle to find grounds to put him under house arrest in the manner of Pinochet, but we could make his time here uncomfortable, and I still believe that anything that threatens his personal financial interests will be more effective than public demonstrations that he will only see through the filter of Fox News, so investigative journalism of his UK affairs should take priority over outraged editorials. Considering that Trump has refused to properly step back from control of his business empire, and knowing that his historic dealings have been anything but ethical, you'd imagine there would be plenty of scope.

I don't hold out much hope of any of this happening and the reason is that realpolitik currently has the upper hand over ethics. The domestic political cost to the UK government of being pally with Trump is not high. Not only does the man have an extensive British media claque, but his anti-Muslim bias chimes with the many conservatives who backed the government's grudging support for Syrian refugees under David Cameron. As Prime Minister, Theresa May hopes a state visit will help secure a beneficial trade deal with the US, but the reliance on ceremony suggests that the UK will have a relatively weak hand in future negotiations. State visits to the UK rarely herald economic outcomes beyond arms deals or the foreign purchase of British assets. Their purpose is often to dignify the squalid. In this light, Trump's apparent "concession" not to abandon NATO forthwith should be seen as the over-inflation of policy compensations in advance of an asymmetric deal (I doubt the US has any real intention of leaving the organisation - they just want Germany to pay more). But if the realpolitik is strong (or desperate, in other words), the ethics are weak, and the involvement of the monarchy is the reason why.

The petition's emphasis on the Queen's feelings shows how propriety and decorum have compromised morality in the liberal opposition to Trump, which continues the strategic error of the Republicans during the primary and the Democrats during the election. In drawing a distinction between the man and the office it implicitly questions Trump's fitness for the Presidency and thereby his democratic legitimacy. There is nothing wrong with that - he did lose the popular vote, after all - but to do so by invoking the sensitivity of an unelected monarch is daft, particularly as we weren't overly fussed by the Queen's potential discomfort in hosting various dictators in the past. Trump may be a moral monster and his behaviour reprehensible, but he is nowhere near the worst US President in recent memory (though obviously that's because he hasn't been in the job long enough yet - give him time) and his potential to do terrible things is a feature of the office and the wider constitution rather than just the man.

The opposition to Trump will remain weak until it focuses on his behaviour in the material rather than the symbolic world: in other words, the social damage of his own business practices and the economic policies that he and a Republican-controlled Congress will now enact. His opening executive orders will have real-world consequences, but these will be relatively slight compared to the effect of large domestic tax cuts and the gutting of subsidised healthcare and public education. Internationally, our concern should be the instrumental weakening of the EU and NATO in order to advance American interests against those of Europe in the emerging multipolar order. Ironically, our "independence day" on the 23rd of June last year means we are closer to becoming the 51st state in all but name. In the circumstances, you'd think Prince Charles would be a little more circumspect, but perhaps he is deluded enough to believe that the Americans are coming around to the merits of monarchy once more.