Barcelona, city of protest – Friday

Gosh, it’s been ages since I updated this blog! A whole summer went by – I went to a music festival in the mountains of Girona, spent a few weeks back in England (which felt very odd by now), and started a new teaching job here in the Catalan capital. Some or all of which I’ll hopefully document at some point.

However, I’m writing now so I can cover the events of this week, which have been pretty dramatic – especially as yours truly ended up caught in the middle of them. While the UK has been tied up in knots over Brexit, the US concerned with the possible impeachment of its increasingly deranged President, and cities all over the world disrupted by Extinction Rebellion, Barcelona has barely noticed any of that. This city’s been preoccupied with issues of its own, as the months-long trial of the former Catalan leaders wound to a ghastly but inevitable conclusion. Continue reading “Barcelona, city of protest – Friday”

Welcome to the United Kingdom of Westeros

“One more such victory and I am ruined!” – Pyrrhus

Yes, I was going to catch up on my blogging about Preacher, now some two weeks overdue. But you’ll hopefully appreciate that these last couple of weeks have been a little tumultuous in my country, which overnight on Thursday went from being a peaceable United Kingdom to something more akin to the squabbling factions of Westeros. So I thought maybe I should write something about that. Continue reading “Welcome to the United Kingdom of Westeros”

The Road to Damascus

It’s been a few months now since I’ve written anything on this blog. And what a tumultuous few months they’ve been, during what’s usually the ‘silly season’, when governments go on their hols.

We’ve seen a (in my view) brave man expose the extent to which the US and UK governments are sliding towards full-on totalitarianism, all in the name of ‘security’; revelations for which he’s now on the run, ironically having to seek refuge in nations whose claim to totalitarianism is rarely in doubt.


Continue reading “The Road to Damascus”

How Mr Hammond learned to stop worrying and love Trident

“Our own independent nuclear deterrent… has helped to keep the peace for more than forty years.” – Margaret Thatcher, 1983

“Glory be to the Bomb, and to the Holy Fallout.” – Insane mutant, Beneath the Planet of the Apes


Philip Hammond and his spads inspect the missiles at Faslane

When I was a teenager, in the mid-1980s, it wasn’t a question of if the world would be destroyed in a nuclear holocaust – it was when.

After forty years of teeth-bared, nuclear-armed confrontation between the United States and the Soviet Union, it felt like a miracle that we were still walking the tightrope and hadn’t fallen off. When hardened Cold Warrior Ronald Reagan was elected US President in 1980, and immediately started referring to the Soviets as “the evil empire”, it felt like we were starting to wobble on that tightrope very alarmingly.

Popular culture reflected our anxieties, warping our expectations and filling us with apocalyptic paranoia. We might have imagined the post-nuclear wasteland as replete with adventure after movies like Damnation Alley or Mad Max 2, but we were soon disabused of that notion by the horrific realism (and even that was toned down somewhat) of TV movies such as The Day After and Threads, both of which gave the teenage me nightmares for weeks. Even Raymond Briggs, author/artist of cuddly Christmas favourite The Snowman, got in on the act with cartoon downer When the Wind Blows, which gave kids the opportunity to watch two loveable pensioners die a horrifically protracted death from radiation poisoning.

The music too reflected the sense of inevitable impending doom. When the Wind Blows boasted a doomy score by Pink Floyd arch-miserablist Roger Waters, whose 1983 Floyd album The Final Cut ended with a charming depiction of nuclear holocaust, Two Suns in the Sunset (“could be the human race is run”). Liverpool dance pop band Frankie Goes to Hollywood followed up gay sex celebration Relax with the doomy Two Tribes, which opened with a mock nuclear attack announcement and whose video featured lookalikes of the US and USSR Presidents fighting to the death in an arena.

Apocalyptic paranoia goes dance.

Our own Prime Minister, Margaret Thatcher, had swung back into power after a supreme burst of sabre-rattling in the Falkland Islands, in which she used the sledgehammer of the British military to crush the less than effective conscripts of stupidly aggressive Argentine dictator Leopoldo Galtieri. Following that, she became extremely chummy with Reagan, whose idea of humour was the unintentionally recorded gag, “we begin bombing in five minutes”, which put the Soviet army on a high alert status. We became paranoid that every misinterpreted radar shadow of a flock of geese would spark off a retaliatory ICBM strike. It was only a matter of time.

But by some miracle, it didn’t happen. Against all expectations, President Reagan sat down with new, moderate Soviet Premier Mikhail Gorbachev, and negotiated the climbdown from the Cold War that would culminate in the collapse of the totalitarian USSR in 1991. For ten years, we lived free from apocalyptic paranoia – until September 11 2001 brought the golden opportunity of a new threat, different in nature but similarly all-pervading. Guns blazing, George W Bush declared “war on terror”, ignoring the fact that, traditionally, wars are fought between two states, not one state and a mobile group of fanatics with no national allegiance.

Bush’s problem was that, like the First World War generals whose tactic was to charge machine gun emplacements with cavalry, he was trying to fight the war before the one he was actually fighting. Al Qaeda are not “the Reds”, with a conveniently available selection of cities to rain destruction on; they’re a group of fanatical opportunists most of whose weapons consist of vests with TNT sewn into them. Faced with this, “let’s bomb the bastards” makes absolutely no tactical sense, because they’re in the middle of large populations of otherwise innocent people in otherwise innocent states.

And the reason I bring all this up now is that, in the face of all sanity, military strategy and economic good sense, Conservative Defence Secretary Philip Hammond is currently making the same mistake. On Monday, before visiting nuclear submarine base Faslane in Scotland, and in direct contradiction of his party’s Coalition Agreement with the Lib Dems, he unilaterally announced the first steps towards purchasing a like-for-like replacement for Britain’s Cold War missile system, Trident.

Whoops

“We hold these truths to be self-evident / that all men may be cremated equal” – Vern Partlow, Old Man Atom

This issue has been a political hot potato for some time, and to their credit (whatever their other failings), the Lib Dems seem to be the only English political party who can see this for the massive waste of money and strategic nonsense that it is. Alex Salmond’s SNP, faced with the inconvenience and moral problems of hosting the submarines, has a similar viewpoint. Both make perfect sense – in today’s world, Trident is a sledgehammer to crack a nut. Let’s look at the destructive potential of the system. Here comes the maths bit…

Britain has four Vanguard class submarines, each capable of launching 16 Trident II D-5 missiles, each of which can be tipped with up to 8 W76 warheads, with an explosive yield of 100 kilotons (1 kiloton = 1,000 tons) each. That’s a total destructive force equivalent to 51,200 kilotons of conventional explosives. To put that into perspective, the bomb that annihilated the city of Hiroshima in 1945 had a yield of 16 kilotons. Just one of the multiple warheads carried by each Trident missile is more than six times as destructive as that. Altogether, Britain’s nuclear capability is equivalent to 3,200 Hiroshimas.

Now, it is fair to say that the 2010 Strategic Defence and Security Review has limited that substantially, halving the number of missiles each submarine will carry to eight, and limiting the number of warheads carried to 40 per boat. That has massively reduced the destructive potential available at any one time to a mere 250 Hiroshimas. But don’t get too relieved – we’re keeping a (reduced) total of 120 warheads actually available; that’s 750 Hiroshimas. And we could strap them onto the missiles and load those missiles at any time – I doubt we’d tell anyone.
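If you want to check my maths, the whole calculation fits in a few lines of Python (all figures as quoted above; the 16-kiloton Hiroshima yield is the commonly cited estimate):

```python
# Back-of-envelope check of the Trident yield figures quoted above.
SUBS = 4                  # Vanguard class submarines
MISSILES_PER_SUB = 16     # Trident II D-5 missiles per boat
WARHEADS_PER_MISSILE = 8  # maximum W76 warheads per missile
YIELD_KT = 100            # W76 yield in kilotons
HIROSHIMA_KT = 16         # approximate yield of the 1945 Hiroshima bomb

total_kt = SUBS * MISSILES_PER_SUB * WARHEADS_PER_MISSILE * YIELD_KT
print(total_kt)                 # 51200 kilotons
print(total_kt / HIROSHIMA_KT)  # 3200.0 Hiroshimas

# Post-2010 review figures: 40 warheads per boat on patrol,
# 120 operationally available in total.
print(40 * YIELD_KT / HIROSHIMA_KT)   # 250.0 Hiroshimas
print(120 * YIELD_KT / HIROSHIMA_KT)  # 750.0 Hiroshimas
```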

This massively excessive destructive potential sort of made sense as a ‘deterrent’ at the height of the Cold War, with two ideologically opposed blocs, armed to the teeth with nukes, growling at each other. The theory was that nobody would launch a first strike for fear of facing equal retaliation; you can’t win a war if the entirety of civilisation is destroyed (which ignored the probability that even if only one side launched its nukes it would effectively devastate the planet). This strategy was known as Mutually Assured Destruction, with the all too appropriate acronym MAD.

But, militarily speaking, what threats do we face now? Learning lessons from Germany, most international rivals now know that the way to best your rivals is not to conquer them but to buy them. Ignoring the small clutch of nations with a limited nuclear capability (North Korea, Israel, potentially Iran) that can’t hold a candle to the West’s nuclear arsenal, the only states currently posing a threat on the scale of the old Soviet Union are China and Russia. Both are too gripped by their own newfound love of capitalism to risk nuclear war; China in particular, by dint of holding the debts of most of the West, doesn’t even need to. All it needs to do is send round the repo men.

So we’re left with the threat that Western governments have built up, propaganda-wise, as the baddie since the demise of the USSR – terror. And more specifically, terrorists. Government press releases and hysterical news media bombard us daily with nightmare scenarios of suitcase bombs, suicide vests and the ever-looming shadow of the Twin Towers airliner attacks.

Against that, what on Earth is the point of launching a multiple warhead intercontinental ballistic missile? Even if the so-far-unproven spectre of small nations (like Iraq) developing “weapons of mass destruction” comes true, those weapons will be like peashooters against rockets compared to even conventional Western forces. A massive nuclear strike – against any of our current enemies and likely any we may face in future – makes precisely zero strategic sense.

“It’s the last thing they’ll be expecting – a daylight charge over the minefield.” – Arnold J Rimmer, Red Dwarf


And yet, at the “leaders’ debates” just before the 2010 General Election, both Gordon Brown and David Cameron emphatically insisted that Trident must be replaced with a similar/identical system to maintain Britain’s defences. Why? It made no sense then, and makes even less now, with the repeated mantra that “there’s no money left”. With the massive slashing in public spending on society’s sick and vulnerable, how on earth can anyone justify spending billions on a massive military white elephant?

Neither Brown then nor Cameron now made any sense, from a military perspective, in retaining such phenomenal destructive power. With Cameron, you can at least understand the perspective of trying desperately to shore up the illusion that Britain somehow retains the weight it once had as an international power – after all, the very nature of Conservatism is to cling to the past and try to reverse progress. Brown’s Labour Party had no such excuse, and neither does Miliband, who’s been conspicuously quiet on the subject.

However, I’d guess that neither wishes to upset the American defence industry, from whom Trident and any potential replacement would be bought and maintained. Estimates of the overall cost (including new submarines, new missiles, and new or refurbished warheads, plus ongoing maintenance) vary wildly from £25 billion (2006/7 government figures) to £97 billion (2009 Greenpeace estimate). Still, that’s a drop in the ocean compared to the US annual defence budget of $1.4 trillion, most of which I’m pretty sure is spent at home. Put simply, the US defence industry is not desperate for the billions we’d give them, whatever politicians might think. The people of the United Kingdom, on the other hand, are – certainly if George Osborne is to be believed.

To be fair to the pro- camp, all those billions would not be spent in one great lump, whatever the opposition might say or imply. The costs cover a thirty-year period; even so, that still works out at an estimated £1.5 billion to £2 billion per year. That’s a pretty massive sum to be wasting on a weapons system that, even if it made strategic sense as a deterrent, could never actually be used. Particularly when Osborne insists that £10 billion needs to be slashed from the benefits budget because the nation can’t afford it.

And to be fair to the anti- camp, not replacing Trident with an identical system is not the same as complete unilateral disarmament (as espoused in Michael Foot’s 1983 Labour manifesto aka “the longest suicide note in history”). Other nuclear weapons are available. Ideally, ones with slightly more precision than Trident, whose smallest possible effect is the destruction of an entire city. I’d argue that we probably do need nuclear weapons. Just not blunt instruments. Iran is not going to gain the nuclear capability of the USSR overnight; it took them decades to reach that level. If that seriously looks like a threat, we could reconsider. But arming ourselves to the teeth just in case is ridiculous.


“It is the nuclear missile Harrods would sell you. What more can I say?” – Sir Humphrey Appleby
“Only that it costs £15 billion and we don’t need it.” – Jim Hacker
“Well, you could say that about anything at Harrods.” – Sir Humphrey Appleby
Yes Prime Minister

In the end, spending billions of pounds on a weapons system that no longer makes strategic sense, at a time when, if its proponents are to be believed, we are so desperately short of money that austerity is the only possible solution, is utterly, completely bonkers. Why should other countries seeking to acquire nuclear capability listen to us taking the moral high ground when we can’t give up our own Cold War toys? And regardless of your party allegiance, can you honestly say that a very expensive way of waving your willy around to look important matters more than caring for the vulnerable in your society?

If, like me, you’re old enough to remember the all-pervading fear and certainty of destruction in those last days of the Cold War, it should be enough to cure you of any nostalgic tendencies about it. But the Conservatives love the past, and are intent on hurtling us back there, convinced that it was always a halcyon Golden Age better than the one we have now. Buying another dose of Mutually Assured Destruction may satisfy Philip Hammond’s nostalgic urges, but to the rest of us, it’s just MAD.

No representation, no rights–the plight of the invisible working class

The attack on the rights of Britain’s workers continues apace…

As the Conservative Party Conference rumbles on in Birmingham, I was surprised to hear a note of what felt almost like socialism from Chancellor and professional “posh boy” George Osborne in his speech on Monday. New businesses, he said, should hand out shares to all their workers, giving them a stake in the company’s success and motivating them to work harder. I could hardly believe my ears. A senior Tory – the one most responsible for the ever-widening gap between rich and poor in this country – espousing the idea that the fabled “wealth creators” should share some of their largesse with their underlings, by forming, in effect, co-operatives?

Ah, but there’s a snag, which came up next. In order for the workers to take advantage of this munificent offer, they would have to sign away some, perhaps all, of their employment rights. That sounds more like the Osborne I’ve come to know and loathe. The Osborne that wants to, in effect, bribe the British worker, already one of the least protected workers in Europe, to give away some of the few paltry rights he/she has left.

Meanwhile, a recent brainstorm by some new Tory high fliers has produced a particularly nasty little pamphlet called Britannia Unchained, in which they proclaim that British workers are the laziest in the world. And all the time, unemployment figures are kept low with fixed-term contracts and part-time positions, which now seem to be the default instead of full-time, permanent ones. Never mind that these workers have to claim state benefits in order to live. They’re not “unemployed”, which makes the figures look better. They’re the new working class – the “working poor”.

Remember when we actually had a “working class”? It used to be almost a badge of pride for some; the idea that you were getting an honest wage for an honest day’s graft, and you were thrifty enough with your meagre income to eke out a modest but unpretentious standard of living. Ah, it were grand in them days…


Somewhere down the line, that disappeared. Perhaps it was during the blindly aspirational 1980s, when we were told that we all had social mobility; perhaps it was when John Prescott declared that “we are all middle class now”. Somehow, the label “working class” took on a mantle of shame, as though if you hadn’t reached “middle class” status, you just weren’t trying. So everyone started calling themselves middle class, regardless of how redundant that made the term. The rot probably set in when we started using terms like “lower middle class” and “upper middle class” in place of the middle/working class division. But however you want to label yourself, the vast swathes of the population toiling away for meagre payments mean that the working class is still very much with us.

George Monbiot recently wrote an article recalling the July speech by Barack Obama, in which the US President proclaimed of businesses “you didn’t build that (by yourselves)”, the first part of which was ruthlessly appropriated out of context by the Republicans for their own agenda. Obama was referring to the fact that private enterprise always depends, to some extent, on spending by the state, financed out of taxes – roads, education, infrastructure and so on, while Monbiot focused on how many of these “self-made millionaires” had inherited the means to their success.

But they both ignored another vital aspect of “you didn’t build that by yourselves” – the workers who staff those businesses. The wealthy business owners like to twist the English language to portray themselves as benefactors by calling themselves “job creators”, a term predicated on the idea that their businesses give people jobs. And so they do. But it’s a two-way street. Yes, without those “entrepreneurs”, the businesses wouldn’t exist to “create the jobs”. But without the jobs then being done, by workers, the businesses themselves would crash and burn.


There’s been a lot of demonising of the very rich by the left, and the rich have taken exception to being described as “parasites” (even though it doesn’t seem to bother them when applied to the benefit-claiming poor). But the increasing agenda of hacking away at workers’ pay, rights and conditions makes the label all too appropriate. In a fair capitalist system, the relationship between a business owner and his/her workers should be a symbiotic one; the business owner provides jobs for the workers, the workers do the jobs that need to be done, for a fair wage, to keep the business running. Both have a stake in the business’ success, and both are motivated to ensure it. That way everybody wins.

But the agenda of the parties on the right, both here and in the US, is that progressively fewer rights and benefits should be conferred on those workers, while progressively more should flow the way of the already better off business owner. The Tories in the UK and the Republicans in the US clamour for lower taxes and less regulation for the rich, while hacking away at the pay, conditions and few remaining rights of their workers.

In the UK, the lobbyists for the rich urge us to get rid of the already paltry minimum wage, which, despite its recent increase from £6.08 per hour to a princely £6.19, is still falling in real terms and was never enough to live on. The shortfall is then made up by the UK taxpayer in the form of tax credits, which effectively subsidise the profits of big businesses by allowing them to get away with paying wages that aren’t enough to live on.
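To put a number on that “falling in real terms” claim, here’s a quick back-of-envelope sketch – the inflation figure is an assumption for illustration, not an official statistic:

```python
# Nominal rise in the UK minimum wage, October 2012.
old_wage, new_wage = 6.08, 6.19
rise = (new_wage / old_wage - 1) * 100
print(f"{rise:.1f}%")    # 1.8% nominal rise

# Assumed annual CPI inflation for illustration
# (it was running well above 2% at the time):
inflation = 2.5
print(rise < inflation)  # True – a real-terms pay cut
```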

The Tories’ pet businessman Adrian Beecroft and state-dismantling fanatic Mark Littlewood, of “thinktank” the Institute of Economic Affairs, hector us constantly that even that level of minimum wage is too restrictive for businesses to succeed, and that it should be done away with – together with the right to redundancy pay, or to appeal against unfair dismissal (the qualifying period for which has already been doubled). There are already moves to legislate that employment tribunals considering unfair dismissal should be paid for by the claimant regardless of their success. Coupled with the massive slashing of entitlement to legal aid, that will ensure this means of redress becomes less and less of a viable option for those sacked because they’re the wrong race, the wrong gender, the wrong sexuality, or simply because the boss doesn’t like their face.

Meanwhile, when the workers threaten to protest against legislation whittling away their pay and rights by threatening to withdraw their services, they’re portrayed as being selfish and uncaring by the political establishment (even Ed Miliband, who can’t tell us that strikes are wrong often enough). But if the rich threaten to withdraw their services, by moving abroad to a more desirable tax regime, politicians can’t kowtow to them quickly enough. A protesting worker gets demonised by the Daily Mail; a protesting business owner gets the tax laws changed in his/her favour. We live in a democracy (allegedly). Which of these groups could rightly be said to constitute a majority?

When we’ve reached a state of affairs where the business owner’s appreciation of his workers’ contribution to that business is non-existent, and the business owner wants to take more for less from the workers, that relationship is by definition no longer symbiotic. In a situation where one party takes from another while giving nothing back, you call it what it is – parasitism.

Now, to be sure, the working class of today bears little resemblance to that of yesteryear. The wholesale destruction of Britain’s industrial base, begun by the 1979 Tory government and carried on with such gusto by the ideologues of New Labour, left the country’s workers (those it had left) employed primarily in service industries. As much as possible was outsourced overseas, to wherever maximum profit could be gained by exploiting workers used to far less.

But not everything could be shipped overseas. Today’s working class is the vast army of people who serve you in shops, who serve you in restaurants, who answer the phones in the few call centres still left in the UK. And they have nobody to represent them at all. The Tories, of course, never did, despite the Alf Garnett-alikes who always voted for them. The Lib Dems, protest though they will, are so keen to be centrist they represent very few. And the leader of the Labour Party, started by the Trade Unions precisely to give the worker a voice in governing the country of whose population they were the majority, now bleats about the need to appeal to “the squeezed middle”, following in the Middle-England chasing footsteps of his supposedly discredited predecessor Tony Blair.


There was a time when workers had representation, of course. Arguably too much of it. The collective bargaining power of the Trade Unions, hitched to the political clout of the Labour Party, was responsible for attaining many of the workers’ rights we have today, the ones being sliced away at by the Coalition. Thanks to Trade Unions, we no longer have children working sixteen hour days in factories, and workers have the ability to challenge perceived unfair dismissal.

But like so many given a dose of power, the unions grew arrogant. The heady scent of power rose to their heads so that, by the 70s, their fits of pique over the most trivial of issues would regularly bring production to a standstill, while madly unreasonable demands for pay increases crippled Britain to such an extent that the tremulous Heath government, beset by power cuts and three-day weeks, was effectively toppled as a result.

It was inevitable that there’d be a reckoning with their old enemies when the Conservatives got back into power in 1979. So it proved, with the Thatcher government introducing legislation that crippled them while peddling a media narrative that they were all nest-feathering “loony lefties” on the take. Militant union leaders blindly played right into their hands, with the bitter conflict of the mid-80s NUM strike effectively destroying their reputation for good.

So like a seesaw, the balance of power had swung from bosses to workers and back to bosses again. And it continued to stay over on the bosses’ side with a “New Labour” party that sought to emulate its adversaries’ agenda. Tony Blair’s modification of the party’s defiantly socialist Clause IV allowed him to start selling off the state’s assets with the same fervour Thatcher and Major once did; under New Labour, we got the first public/private partnerships in the NHS, and the first academy schools.

Today, trade unions have so little sway in the private sector that only one in seven employees is a member. The public sector remains the only area in which unions are strong, which explains the Tories’ rabid desire to promote a private/public conflict. Demonise the “tax-sucking, gold-plated salaries” of the public sector, and you’ll have private sector workers crying out that public sector pay and conditions should be brought down to their own level.

This is the political equivalent of shooting yourself in the foot. Let that happen, and bosses will be free to suggest that pay and conditions should be degraded yet further, with no one left to provide a higher standard to compare with. Private sector workers shouldn’t be angrily demanding that their public sector equivalents be made worse off; they should be angrily asking why they themselves aren’t better off.

Sadly, the current crop of unions are still doing themselves no favours by wheeling out 70s-style caricatures like Bob Crow as their figureheads, and demanding full-on socialism. I may have some left-wing views, but I think total socialism has, to all intents and purposes, been proved unworkable. The thing is, so has total free-market capitalism.

What’s needed to redress that balance is a compromise between the two. And what’s needed to achieve that compromise is a voice for that now-voiceless mass, the working class. If Labour won’t do it, and the unions can’t do it, maybe it’s time for some new kind of organisation that will, before the few persuade the many to give up their last pennies in exchange for cheap trinkets.

A Nightmare in the Examination Hall

GCSEs! They’re terrible, aren’t they? Unfit for purpose? They must be, because the press told us so (ably assisted by various strategically placed press releases from the Education Secretary). Children the country over are suffering after unfair changes to grade boundaries left thousands with a D when previous benchmarks would have given them a C. Proof, if proof be needed, that the entire GCSE system (introduced in 1988 by the Conservative Party, of all people) is entirely corrupt and unfair, right?

What EXACTLY is the problem this year?


GCSEs are far from perfect, but as usual, the press (and the government) were (perhaps deliberately) telling a very simplistic and generalised version of what was going on.

According to my scouring of the TES forum on results day, and government regulator Ofqual’s official report, the issue of the changed grade boundaries affected two out of three GCSE English qualifications only. English Literature was unaffected, while English Language and English Language and Literature had problems. But only at the Foundation (lower) Tier, apparently (all those worried about introducing a “two-tier education system” might want to remember that GCSEs already do this). And, of three major exam boards across England, only from two of them (“primarily AQA and Edexcel”, says Ofqual).

To put that into perspective, that means that, out of dozens of subjects being examined, this problem affected only one. And that one has, in essence, eighteen separate qualifications (three English qualifications across three major boards, Foundation and Higher Tier for each), of which four were at fault. And each of those four was made up of three modules, not all of which had the grade boundaries dramatically shifted. Suddenly doesn’t look like the damning critique it appeared, does it?
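If you prefer the counting spelled out, here’s a toy enumeration in Python – the third major board is assumed to be OCR for illustration, and the filter simply encodes the breakdown described above, not official Ofqual data:

```python
from itertools import product

qualifications = ["English Literature", "English Language",
                  "English Language and Literature"]
boards = ["AQA", "Edexcel", "OCR"]  # OCR assumed as the third major board
tiers = ["Foundation", "Higher"]

# Every qualification/board/tier combination on offer.
all_quals = list(product(qualifications, boards, tiers))
print(len(all_quals))  # 18

# Only two of the three qualifications, from two boards,
# and only at Foundation tier, were affected.
affected = [(q, b, t) for q, b, t in all_quals
            if q != "English Literature"
            and b in ("AQA", "Edexcel")
            and t == "Foundation"]
print(len(affected))   # 4
```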

The other issue that all the papers fail to mention is that these particular GCSEs were being awarded for the first time this year, initially to a small group of candidates in January and then a much larger one in June. Under such circumstances, it’s fairly common for government regulator Ofqual (and its predecessor, the QCA) to send an observer to the awarding meetings where grade boundaries are decided, in order to monitor standards.

Ofqual’s initial assertion is that the standard was set wrongly in January. If so, that suggests Ofqual weren’t properly monitoring the awarding in that first series for the new qualifications, which would be unusual. Equally, the boards concerned must bear some culpability for setting the boundaries generously in the first place, but Ofqual’s monitoring is the final arbiter, and the very reason for its existence.

This rather gives the impression that they allowed the first awarding of a new qualification to either be monitored sloppily, or not monitored at all. Ofqual is a fairly new and untried regulator, rushed into existence with alarming haste by the incoming Coalition government in 2010. With this in mind, you start wondering whether it’s the exams that are the problem, or the purported guardian of their standards. Of course, that’s all a bit fiddly for a big, emotive press story about children being unfairly treated by the thousand, and doesn’t fit the political narrative.

What should we do, Mr Gove?


So, after a couple of weeks’ scandal, Mr Gove (Education Secretary and part-time Pob lookalike) has given us his verdict on What Should Be Done with GCSEs. And unsurprisingly, his judgement based on all the evidence is… to do what he always said he wanted to do anyway.

So, a new ‘English Baccalaureate’ (it must be good, it sounds classy), comprising the core subjects of English, Maths and the Sciences, each to be tested in one humongous three-hour exam at the age of sixteen, with no more coursework. And each subject to be administered by only one exam board, to combat the (apparent) problem of competition driving standards lower.

Longtime readers of this blog will know that I’m no fan of the current government (not that I have a lot of time for the Opposition either), but taking a step back from partisan politics, is any of this a Good Thing? And more pertinently, if it is, for whom is it Good?

Board to death


To take the latter point of Gove’s plans first – no more competition between exam boards? I actually think that’s rather a good idea. It’s a bit of a first for a Tory minister to acknowledge that the great god competition actually lowers standards in any situation; perhaps they could try extending that philosophy to the likes of water supplies, railways, bus services…

Still, I digress. It always seemed a nonsense for any real competition to exist when all of the competitors must, essentially, supply the same product meeting the same standards. The press narrative for a couple of years now has been that boards can only compete by offering “easier” exams, thereby giving schools a greater proportion of good results and a better place on the league table.

This is, generally speaking, bullshit. When the government’s standards regulator is doing its job properly, it must ensure that all qualifications in the same subject at the same level offer a parity of challenge. Put simply, if anyone’s caught offering an exam that’s “easier” than anyone else, they face potentially losing the ability to offer it at all. It’s quite common for disenchanted schools, facing a year of bad results, to take their business to another board – only to find next year’s results just as bad, if not worse.

So if all exams are the same, how can you have competition? It boils down to other areas; customer support, teacher training, learner resources and so on. The quest for each board to better the others here, with a finite budget, is what can lead to a stretching of resources and consequent problems the like of which we’ve already seen.

The elimination of competition should therefore be a Good Thing. And so it is, but only in part – boards will still have to compete to be the only one offering each subject at GCSE level. My preference, discussed in a previous post, would be for one board covering all subjects across the country, a system which works well in other countries such as Australia.

Still, competition every few years to offer a subject is better than competition all the bloody time, with each board mercilessly trying to grab a bigger slice of the market. The worry is going to be the initial scramble for licences, particularly with players like Edexcel, which has the financial might of its parent company, multinational publisher Pearson, behind it.

The process of settling who gets to offer which subject absolutely must be transparent and open to public scrutiny. Edexcel’s status as part of a profit-driven multinational gives it an unfair advantage over not-for-profit boards like OCR. And in other areas of the Coalition’s frenetic quest to outsource all things public, we’ve seen private companies like Pearson invoke the mantra of “commercial confidentiality” to cover all manner of sins in their negotiations. If this isn’t to be another case of ‘lobbying’ (read: ‘paying off the minister concerned with the promise of a juicy directorship on retirement’), every stage must be open to investigation.

OK, OK… but what about the exams?

When I was sixteen, I did O Levels – which worked in just the way Mr Gove is so keen on. I wasn’t convinced of their validity even then. A massive, nerve-wracking exam taken after weeks of frantic revision really only assesses what you’ve reminded yourself of recently and can remember on the day. GCSEs, while far from perfect, were designed to combat this with a process of continual assessment throughout the course, introducing the element of coursework to counter the criticism that plenty of intelligent people aren’t actually that good at exams.

Traditionalists have always had a bit of a problem with coursework; and in some ways they have a point. Mainly done without supervision, it was particularly open to plagiarism, a problem that’s intensified with the rise of the internet. The worry now is that entire coursework essays can be cribbed from Wikipedia; or even that certain, ahem, unscrupulous online companies actually offer to do it for you – for a fee, of course.

A halfway decent teacher, though, should be able to spot if work he/she is marking is written by someone other than the pupil they’ve been teaching for the past couple of years. If, that is, they’re not completely frazzled by their workload. Because for teachers, the problem is that coursework effectively means they’re marking students’ exams themselves, and that’s a lot of work – especially in larger schools, where the marking must be moderated by a more senior teacher and sometimes revisited if it’s not up to scratch.

The problem of plagiarism, at least, was supposed to be addressed by the introduction, in these new GCSEs, of ‘Controlled Assessment’ – basically doing coursework under supervised classroom conditions. Being a major change, it caused a lot of disquiet in the teaching professions, but it could have been a change for the better. Sadly, we’ll never know, as it was condemned for replacement after just one year due to the combination of press furore and political ambition. It may have a chance to prove itself in the next couple of years, as the ‘English Baccalaureate’ isn’t due to start until 2015, but its fate is already sealed.

So, assessment will go back to one externally marked, terminal exam for each subject. I’m sure teachers will be very happy at the reduction in their already massive workload that will result from removing internally assessed work. But as a former exam board employee, I can testify that there was already a huge problem recruiting examiners for the examined units that exist now. Remove internally assessed ‘coursework’, and whatever board or boards are left will need many, many more examiners.

Given the difficulty recruiting enough for the current level of externally marked work, I can see this being a logistical nightmare. Possibly the reduced workload caused by removing internal assessment will alleviate pressure on teachers, but I’m far from sure it will spur them on to become external examiners. And so Gove’s much-loved final exams may find themselves with a serious shortage of people to mark them. If you try to get exams marked without enough examiners, that’s when standards really suffer.

More generally, I’m not so sure about Gove’s emphasis on memorizing facts, figures and dates. Rote learning is important, of course – you can’t build an argument without facts to construct it from. But I worry that he’d rather have schoolchildren reciting the list of English monarchs without ever thinking about history.

The crux of it is that, while GCSEs could certainly have done with some fundamental reform, Gove’s changes simply push the system back to what he presumably fondly remembers from the 1950s. Hearkening back to a non-existent ‘Golden Age’ is certainly no basis for a programme of education – it’s been tried already, and the world has moved on.

So what should be done?

I think there’s a real need to have a proper debate about the fundamentals of assessment – what we’re trying to achieve/quantify and how – going down to the absolute basics rather than modifying existing systems or hearkening back nostalgically to earlier ones. We need to challenge received wisdom on this issue properly, and do it entirely separately from political ideology.

For a start, since it’s been mooted that all children should stay in education until the age of eighteen (keeping them off those pesky unemployment registers), do we need a terminal exam at sixteen at all? GCSEs, like O Levels before them, were meant to quantify achievement at the level when children might leave school and go to work. If they’re not doing that, is there any point having them? Other countries, whose children stay in full time education until eighteen, manage perfectly well with tests taken at that point.

Which then leads us on to the question of A Levels. Among other things, GCSEs are used as a measure of whether a student is apt enough to take an A Level in a particular subject. But students don’t take A Levels in every subject; if terminal tests are taken at eighteen, they would necessarily include subjects that might not otherwise have been taken. Not everyone does Maths or English at A Level, for instance, but if all testing happened at the age of eighteen, they would have to. So that would render A Levels redundant too.

Which then, logically, brings us to Higher Education. With A Levels gone, how will universities assess the ability of their applicants? There’s already a problem that universities have to judge on the basis of predicted grades rather than actual results, and for years the idea has been floated of issuing results earlier, to give a more concrete idea of prospective students’ abilities. In practice, it’s unworkable – marking periods are already crushingly short, and to issue results significantly earlier would mean taking the actual exams much earlier, leaving less time to teach the course.

But maybe we shouldn’t assess by testing at all. Maybe there should be some other process of continual assessment throughout children’s schooling, from primary school onwards. And while we’re about it, do we need schools to be divided into a primary and secondary model at all? Again, other countries do it differently, some with more grades of school, some with fewer.

Also, should tests (if we have them) be norm-referenced (based on percentages of each cohort getting certain grades) or criterion-referenced (based on how well you actually know the subject you’re being tested on)? And why have grades in their current form? It’s always seemed unfair that a difference of one mark can move students from that all-important C to the doom-laden D. Why not express results in percentages of marks gained, as some countries do?

These are all questions that need to be asked. And ideally they need to be asked by educational experts, and not politicians. Labour’s Shadow Education Secretary Stephen Twigg has pledged that, should Labour return to power in 2015 (which is looking increasingly likely), they won’t implement Gove’s proposed changes at all.

That might be good news for those who cleave to the current system (which may not be a good thing either), but it means that for several years the entire educational system will be in turmoil, exam boards frantically designing new qualifications and tendering for licences to deliver them, while the poor overworked teachers must yet again begin training to deliver a new style of course – for the second time in three years. As always, the first group of students to take the test will be terrified of that leap into the unknown. And all for naught, if Labour get in and Twigg keeps his word.

It’s the clearest illustration ever of why politicians should be kept out of education altogether. Apart from the fact that they tend to know nothing about the subject, the constant urge to imprint your political ideology onto the education system means that it changes every time the government does, often for the worse. Teachers never know when they’ll next need retraining, while students end up with incompatible results from completely different qualifications, which offer little comparability to prospective employers.

So if we really want reform, and we want it for the better, let’s keep political ideology out of it altogether and leave it to the experts – teachers, academics, you know, people who actually do the educating. Because Gove’s time trip to 1956 doesn’t strike me as much of an improvement.

It’s only words….

‘When I use a word,’ Humpty Dumpty said in rather a scornful tone, ‘it means just what I choose it to mean – neither more nor less.’ – Lewis Carroll

In recent weeks, we’ve been blessed with the political excitement of both the Democratic and Republican National Conventions in the US, and a much-derided Cabinet reshuffle here in the UK. As party conference season looms for us and politicians start flying unfeasible policy kites in preparation to appease their more insane members, I thought it might be interesting to have a look at how the politics of class is currently shaping – and being shaped by – its use of language.

The English language, with all of its ambiguities, multiple meanings, synonyms, antonyms and homonyms, has always been a bit of a gift for political rhetoric. There’s nothing so telling of the political climate of the times as seeing the prevalence of particular words and phrases, cunningly employed to drive home a political message in speeches, press releases and party-affiliated news stories.

Scenes from the class struggle with the English language

One of the most noticeable things at both the Democratic and Republican conventions was a relentless focus on the middle class. At a time of economic hardship, when hard-right policies seem designed specifically to funnel money even further towards an already massively wealthy clique, this is fairly understandable. “Ours is a fight to restore the values of the middle class,” declaimed Barack Obama, as his supporters waved banners proclaiming “middle class first”. Over in the homogenous dream world of the Republicans, ultra-reactionary VP candidate Paul Ryan set out his stall: “We have a plan for a stronger middle class, with the goal of generating 12 million new jobs over the next four years”.

So what’s missing, you might ask? Well, both parties were taken to task for neglecting to cover the “poor”. But what’s interesting is that the term “poor” seems to have supplanted the term “working class”. If you’ve a “middle class”, then you must have one above and below it, by definition. The one above it is fairly clear, both here and in the US – they’re the ones with all the money, bankrolling each country’s more rightwing party to run the government for their own advantage.

But where’s the one below it? Why is “working class” now the more pejorative “poor”? “Poor” seems to carry connotations of helplessness, dependence and inferiority. “Working class”, by contrast, has overtones of decent, hardworking nobility.

“Working class” now seems quaint and old-fashioned. In part, this is because of the aspirational culture of the last few decades. “We are all middle class now,” said John Prescott in 1997. That’s John Prescott of the Labour Party, the party founded by and for the working class. The same party whose current leader, nerdish school prefect lookalike Ed Miliband, says he wants to appeal to the “squeezed middle”. Being a “poor but honest” worker isn’t trendy any more. If you don’t have the mortgage, the two-year-old car and the annual foreign holiday, you probably aren’t “working” anyway.

So the lowest class is not now “working”. Instead they are “poor” or even more pejoratively, with an overtone of menace, the “underclass”. Sorry to get all Godwin’s, but it’s always worrying when politicians or political journalists use terms reminiscent of “untermenschen”.

With the rightwing holding sway politically in the UK after the riots of last summer, another word found itself attached to that one – “feral”. That’s even more disturbing. Now not only are the former “working class” the “underclass”, they’re actually animalistic and inhuman. You can see why this makes for a worrying narrative progression.

As if to emphasise that the “underclass” are no longer the “working class”, they’re now routinely conflated with the unemployed – conveniently ignoring all those full-time workers here in the UK whose wages are so low they have to rely on government benefits anyway. So the “poor” are demonised as “scroungers”, part of an “entitlement culture” whose “dependency” is on money taken unwillingly from virtuous, hardworking taxpayers. For added venom, the adjectives “idle” and “feckless” tend to be used in varying combinations in government speeches, press releases and the news stories that cover them. The result is an unhealthy climate where, if you’re not “middle class”, it’s your own fault for being “idle” and “dependent”. Never mind that the minimum wage is so low and the cost of living so high that often full-time employment won’t pay enough to live on.

 

Rebrand the rich

“For the last time, I am a job creator! You must, you will OBEY ME!!”

In tandem with the linguistic subjugation of the lower class from “working” to subhuman “scroungers” who steal from the virtuous middle class, the “upper class” have tried to twist the language describing them into more glowing, fulsome praise. The word “rich” has for many years (possibly since the French Revolution) carried snobbish, uncaring and materialistic overtones. How, then, should the rich present themselves as altruistic and beneficial to the society whose money they’re gradually accumulating?

The result, initially, was the insidious term “wealth creators”. I first heard this emanating from the Republican Party in the US, and I’ve wondered ever since if somebody was actually paid to think up this asinine term. It does sound like just the sort of thing that might be focus grouped and moulded by the sort of consultants who briefly tried to rename the Post Office “Consignia”.

“Wealth creators” implied that the rich’s accumulation of material assets was good for the wealth of the country as a whole. But people cottoned on to the fact that any wealth they “created” went straight to them and stayed there, often moored in offshore tax havens so it wasn’t subject to that inconvenient burden of taxation for the good of society – “wealth hoarders” would be a more accurate description. Plus, the phrase still contained the word “wealth”, as in “wealthy”, ie “rich”. And if the wealth you’re creating is your own, you’re hardly going to be seen as contributing to the society you’re funnelling it from.

So “wealth creators”, even though it’s still in common currency, morphed into “job creators”. You can imagine some smarmy image consultant somewhere sitting back and folding his arms in satisfaction at that one. Well, if the business you’re running has made you rich, you must have “created jobs”, right? And that can only make your contribution to society look more important than that of your employees, who pay a far greater proportion of their meagre incomes in tax than you do. Mitt Romney stated that he didn’t need to release any more tax returns; he’d definitely paid enough tax: a whole 13% of his $20.9 million income (2011).

But Mitt’s a “job creator”, so that’s OK. Even though most of the jobs he “created” while running Bain Capital were in India and China. Governments will find it far less acceptable to impose heavy taxes on “job creators” than they would on “the rich”. If “job creators” leave the country because tax rules aren’t favourable enough to them, who will “create the jobs”? You can see why that’s worse than “the rich” leaving the country, which by and large people don’t really care about. Ask Phil Collins.

 

Race to the bottom

With the upper class elevated to sainthood and the lower class reduced to the level of animals, you can see why, linguistically, “middle class” is the only uncontroversial one left. Particularly in the US. It’s been said that in the UK, the political struggle is always about class, whereas in the US, it’s always about race. That’s only half true; class does exist in the US, it’s based on money, and it often seems determined by race. Its prisons bulge at the seams with young African-Americans, many of whom turned to crime as the only refuge from a desperately poor background. Visit Southern California, and you’ll see the class divide even more starkly in racial terms. Whites have the good jobs and the nice cars; Latinos have the service jobs and the beat-up but respectable older vehicles; and blacks, if they have jobs at all, may well have to travel on the bus because they can’t afford cars.

Yes, it’s a sweeping generalisation, and far from true universally. But it’s true often enough, and here in the UK too, non-white ethnicities tend to be poorer and/or jobless at a level disproportionately higher than Caucasians. In the US, where Republican state governments are passing voter ID laws that explicitly target the poor, class and race overlap. The “poor” in a state like Florida is disproportionately made up of non-Caucasians. Perhaps coincidentally, a recent poll registered African-American support for Mitt Romney at a modest total of 0%. OK, Herman Cain and Marco Rubio will probably be voting Republican, but there’s always a margin of error. Nevertheless, that’s a poll figure that might make even the Lib Dems here in the UK feel slightly better.

 

Turn Left

Trying to reclaim the word “rich” from the “wealth creators”

Still, the right haven’t had the monopoly on shaping the political and class debate by distorting the English language. Since austerity (another political buzzword) bit, and income inequality (and there’s another one) became hot political topics, the left have found their own way to load words with unintended meaning. In the wake of the Occupy movement, the word “elite”, which always carried faintly nasty overtones of exclusion, took on a far more damning meaning when used to describe the tiny clique of hyper-rich people who seemed simultaneously responsible for and immune to the financial crisis engulfing the world.

In the UK, left-leaning politicos and journalists got their own back on the right by taking their pejorative adjective “feral” and applying it to that “elite”. For a while, the phrases “feral underclass” and “feral elite” were flung about with such frequency that they ceased to have much meaning; as a result, after a brief period in the linguistic limelight, they seem to have faded somewhat into obscurity. Significantly, the terms coined by the left that stuck are not linguistic but numerical – the “elite” are “the 1%”, and the rest of us, who pay a greater proportion of our income in tax, are “the 99%”. Put in those terms, the injustice is hard to argue with, however many “job creators” there are in that “1%”.

 

Language in a post-truth world

Politics and truth have always had a rather abusive relationship, as US journalists are finding as they struggle to adjust to the “post-truth” world in the wake of Paul Ryan’s epically inaccurate speech. The astute use of language can make an untruth seem less like an actual lie. It’s nothing new. When arch-Republican Chuck Norris claims that re-electing Barack Obama will usher in “a thousand years of darkness”, that’s hyperbole at its most extreme. Of course, Winston Churchill said something similar about Adolf Hitler, but it’s hard to equate Obama with Hitler (unless you’re Glenn Beck). Meanwhile, Fox News and other histrionic right wing news outlets pander to their sponsors by treating the words “liberal” and “progressive” as descriptions of something beneath contempt, which in turn passes into mainstream Republican discourse.

Taking poor, innocent English words and twisting them into political weapons is, of course, a longstanding practice in both the US and the UK. But in the modern era of spin doctors, image consultants, key demographics and focus groups, it’s hit an all-time high that’s often ridiculous – as Nick Clegg, with his repeated meaningless blather about “alarm clock Britain”, seems not to have noticed. The flexibility of the English language is both a blessing and a curse for political discourse, but it’s never less than interesting to watch. To help you out, here’s a little chart of phrases to look out for in the coming US Presidential election and UK party conference season. Have fun playing political bingo, or alternatively, use it for a drinking game. It should get you so drunk that you might stop despairing…

 

Austerity

Middle class

Feckless scroungers

Public sector waste

Illegal immigrant

Entitlement culture

Job creators

Gold-plated pensions

Socialist healthcare

Private healthcare

Underclass

Benefit fraud

Hardworking taxpayer

Big society (getting rare now, this one)

Alarm clock Britain (not rare enough)

Plan B

Terrorism

Liberal media

Conservative media

Bureaucratic excess

Deregulation

Reregulation

Small business

Big business

Lending

Family values

God

Innovation

“..and I’m not making this up.”

“…well here’s the truth.”

The venomous Cameron and the amphibious Clegg

There’s a well-known fable involving a scorpion and a frog. The scorpion, keen for nebulous reasons to be on the other side of a river, asks a nearby frog to help out by carrying him over. The frog is dubious. “How do I know you won’t sting me?”, he asks. The scorpion, reasonably, replies that if he did, they would both die. So the frog, naively, agrees to the plan, and inevitably, halfway across, the scorpion stings him. “Why did you do that?” gasps the dying frog. “You know that we’ll both die now!” The scorpion is phlegmatic: “It’s my nature. I can’t change it.”

This charming tale came to mind on Monday, when BBC News managed to find a space in between its wall-to-wall Olympic coverage for some actual news. Said news was a dejected-looking Nick Clegg, seeming finally to realise the nature of the predatory beast he’d harnessed himself and his party to. He’d called a press conference to announce that the last of the Lib Dems’ central policy planks, the reform of the House of Lords, was to be abandoned in the face of overwhelming opposition not just from Labour, but from the Lib Dems’ own coalition partners/masters, the venomous Conservative Party.

It took Clegg rather longer than that mythic frog to feel the repeated stings the barbed tail of his ‘allies’ was inflicting on his party’s policies. The Tories granted him his cherished referendum on reforming the electoral system, then, despite careful negotiations, defeated him at the polls with a hell-for-leather opposition campaign infinitely better funded and organised than that in support of the motion. They allowed him his increase in the income tax lower threshold – while at the same time slashing the top rate of tax for their hyper-rich cronies (sorry, ‘job creators’ – ha!) and rejecting his proposed ‘mansion tax’, a variation on another cherished Lib Dem policy, Land Value Tax. And all this after beginning their marriage of convenience by forcing the Lib Dems not only to abandon their policy of opposing rises to (and ultimately abolishing) university tuition fees, but to tacitly support having the fees trebled.

Yes, the Lib Dems have wrung some small concessions from the Tories (see my previous blog on this), to the extent that Cameron has had to tell the press that he could “govern like a true Tory” if only it weren’t for those pesky Lib Dems, in an attempt to placate his more barking rightwing backbenchers. But to the voting public, those concessions are small fry compared with the formerly compassionate-seeming Lib Dems’ complicity in slashing the Welfare State and laying the groundwork for further privatisation of the much-loved NHS.

And in any case, the public image of the Lib Dems rests primarily on those central planks they’ve been so vocal on in the past – abolition of tuition fees, the reform of the voting system, and the reform of the House of Lords. With the first two well and truly scuppered by their underhand partners, the forced abandonment of Lords reform was the final sting for Clegg, who miserably took to the air to (finally!) condemn the damage his supposed ‘allies’ were doing not just to him, but to their own party and the government of the country as a whole. Finally discovering some balls, he chose to use them in precisely the wrong way, with a childish tit-for-tat gesture that promised to scupper a cherished Tory policy – the electoral boundary review, which, on the face of it, would gain the Tories more seats in a reduced Commons to the detriment of both Labour and the Lib Dems.

The problem is that, in the minds of the voting public, the Lib Dems are most strongly associated with constitutional reform, and with good reason. As my Lib Dem friend Richard explains in his blog, if you’ve lost faith in the political system because you think it’s broken, nothing about it will work properly until it’s fixed. Trouble is, looked at without partisan goggles, the proposed boundary review does seem a fairer way of dividing votes for the British electorate. Clegg himself said in November 2010 that it would mean "correcting fundamental injustices in how people elect their MPs". And it’s less than certain that the Tories would like or benefit from it as much as he seems to think.

So retaliating against the Conservatives by blocking a policy he should by rights be in favour of, and which they may not be as keen on as he thinks, looks, well, a bit silly really. Unfortunately, this is the same kind of political naivete, derived from years spent in opposition, that produced his other great no-win scenario, tuition fees. Like that debacle, he’s damned if he does, and damned if he doesn’t.

If he blocks the boundary changes, he can be seen to be finally standing up to the Tories – but be seen as a hypocrite by many, not least in his own party. If he chooses to retain his integrity and support a constitutional change perfectly in line with his own party’s policies, he’s got nothing in the arsenal left to strike the Tories with, and looks like he’s bent over to let his party get shafted by them for the umpteenth time.

Hindsight is of course a wonderful thing, but I’d say he should have taken a firmer stand when the Tory leadership first broke their word and began aggressively campaigning for a ‘no’ vote in the AV referendum. There were any number of cherished Tory policies he could have seriously damaged by withdrawing support back then – the NHS reforms, the welfare reforms, the reduction in the top rate of tax – and he might have regained some serious respect and support for his party in the eyes of the electorate. But they’ve all been passed now, with the tacit or actual support of his party giving them a mandate, and all that’s left to torpedo is a policy that he should, logically, be in favour of.

 

OK, so what about the other Parliamentary numskulls?

To be fair, no party has emerged from this debacle covered in glory. David Cameron, in particular, now looks incredibly ineffectual as a leader who can’t even deliver his own party’s support for an agreed policy. The backbench rebellion of 91 MPs is one of the largest against the party whips in his party’s history, and the fact that they felt they could get away with it does Cameron’s leadership standing no favours at all. One tweeter commented, accurately, that “Lib Dems thought they were in Coalition with the whole Tory party not just David Cameron”. The backbenchers are making the Tory party look like a reverse ouroboros, a snake whose tail is somehow eating its own head.

Not to mention undermining the tenuous alliance they have with the until-now supine Lib Dems, which they seem to have forgotten is the only thing currently preventing them from struggling through as a minority government. Nor can they simply go to the polls: under the Fixed-term Parliaments Act (the one constitutional reform that has actually been enacted), an early election requires either a two-thirds Commons vote or a formal motion of no confidence. Not that they would be likely to win an outright majority this time, but it’s never worth underestimating the power of delusion in the rightwing Conservative ranks. In short, they’ve finally succeeded in alienating the party that is the only thing keeping them properly in power. But like that scorpion of old, that’s Tory nature.

 

But surely Labour are dealing with this with their usual sensitive maturity?

Labour too look pretty crap here, torpedoing a policy they’ve championed of old as a (supposed) party of the working class. Politics is about compromise, and while the Lords reform might not have been all they wanted, voting for no reform at all feels like cutting off your nose to spite your face. It could always have been built on later, in the increasingly likely result of the next government being Labour. Instead, next time the Tories can fling the accusation that they’ve already rejected it once. It’s the same as those ardent PR supporters who voted no to AV on the grounds that it wasn’t full PR – those opposed now have the ammunition that the British public have already rejected electoral reform.

The Lords reform legislation as drafted had its roots in a Labour-instituted study, too, and they voted for it at both first and second reading, dropping their ‘no’ bomb at the late stage of debate timetabling. They probably did have a point that the legislation was shoddy and needed reshaping, and that might well have required considerably more than 14 days’ worth of debate. But crucially, at that late stage, they offered no alternative suggestion – just an emphatic “no”. It looks like nothing so much as a childish fit of pique, designed to drive a wedge between the coalition ‘partners’ in the full knowledge of Cameron’s ineffectiveness at whipping his backbenchers into line.

And yet, Labour might be being cannier than they seem, especially with Clegg’s (perhaps?) unwitting connivance. For all the ambivalence of the Tories and the Lib Dems, Labour were the one party surest to lose out on seats due to the boundary review. Now, despite Cameron’s plan to persevere with it, if the Lib Dems hold to Clegg’s word and oppose it, it’s finished. Advantage: Labour.

The fact that this couldn’t have worked out better for them in that sense has led some to speculate that this was an arcane plan of Miliband and co’s all along – oppose Lords reform and Clegg will have to retaliate, and the boundary changes are the only significant thing he has left to strike at. And one Lib Dem I know has gone even further and suggested that perhaps the opposition to the boundary review is Clegg’s own olive branch held out to the previously intransigent Labour party, laying the groundwork for abandoning their treacherous Tory partners at last.


A match made in Hell?


I’m not sure I buy that, but there still seems to be some vain hope that Clegg, Cable and co will cross the floor and hook up with Labour. I don’t think that would do them any favours; they’d be seen as fickle and opportunistic, bending their ideology to whichever main party would offer them the most regardless of their principles. Far better to abandon coalition altogether while they can still (truthfully) assert that they have tried to make it work in the same mature fashion as European parties, only to be thwarted by the childish, materialistic behaviour of their supposed ‘partners’.

They could then stand some chance of regaining respect by supporting the Conservatives on a ‘confidence and supply’ basis, leaving them free to oppose any measures they genuinely didn’t support. The only problem there is that most of the measures a lot of Lib Dem MPs would oppose have already been passed in a frenetic haste by a Conservative party desperate to enact their ideology in case they turn out to be a one term government.

And of course, leaving the coalition would leave the Lib Dems with no positions in the Cabinet from which to moderate the Conservatives’ brutal, ideologically motivated policies. But in the eyes of most of the electorate, they’ve singularly failed at ‘moderating’ anyway, content with a few piecemeal breadcrumb policies thrown from the table while the Conservatives hacked away at everything liberals and the welfare state have achieved since 1948.

Again, hindsight is a wonderful thing, but I wonder how many Lib Dems now think their leaders should have walked away from coalition with either major party, retaining their integrity by saying, “we tried to make it work, but neither party would compromise maturely enough for us to find common ground”. Those who support the coalition may be saying that “enough common ground” was precisely what the Tories offered, but they’re just now finding out how much those promises were worth (rather later than many others, I think). It’s taken long enough, but the Lib Dems may finally be realising that Tories can no more change their nature than that scorpion, even if it means their own electoral destruction.

What if God was one of us?

Some musings on the current uneasy relationship between religion and secular society…


In recent weeks, there’s been a surge of news articles which detail religion coming into conflict with states that are, nominally at least, secular.

Religion is a thorny issue for secular liberals to get their heads around. A defining factor for liberals is our insistence on tolerance and inclusivity for all, and that usually includes religious freedom. The problem comes when the religions whose freedom we’re insisting on espouse beliefs that come into direct conflict with our own philosophy of tolerance – and while it may not be true of all who follow each faith, almost every major religion has one or more groups it is actively intolerant of. Women and homosexuals tend to come top of the list, with varying degrees of intolerance directed at them notably from the mainstream of all three Abrahamic faiths. But religious dogma has been used to discriminate against other groups throughout history – and that discrimination has most often been directed at followers of religions other than their own.

So what do we secular, inclusive liberals do when faced with the problem of tolerance for groups who tend towards intolerance? There’s a tendency towards contorted doublethink, but it’s a hard one to address without coming across as hypocritical. At this point, it’s worth noting that objections to a religious philosophy don’t (or shouldn’t) encompass all those who follow it. I know both Christians and Muslims, and not one has a problem with either my atheism or my homosexuality. Neither do I have a problem with them having beliefs that I don’t share.

No, our objections to religion (if we have them) should be directed at religious orthodoxy – those who come up with the mainstream positions of each faith on issues that might seem reactionary in a secular, inclusive country. Even here, this is far from a clear issue. Within each major faith are any number of factions, large or small, whose feelings on such issues vary wildly. Beyond the obvious division of Christians into Catholics and Protestants, there’s a variety of smaller subsets, while Islam’s notable division into Sunni and Shia also embraces a multitude of factions within each. Indeed, Islam is difficult to ascribe any overriding, definitive philosophy to, in the absence of a central governing body like the Church of England’s General Synod or the Vatican.

Compounding the problem is that the lines between religious faith, culture, politics and ethnicity are extremely blurred. And if there’s anything we liberals hate, it’s racial prejudice and bigoted stereotyping. But it’s not that simple. Judaism in particular is associated with a specific ethnicity, an association that ignores the wide variety of Jewish racial and cultural characteristics. Islam tends to be associated with Arabic peoples, due to its area of origin, but encompasses huge swathes of other races in the West, Asia and Africa. Nonetheless, criticism of these religions tends to be simplified into a debate which generalises any objectors to them as racists, in a way that tends not to happen with Christianity (stereotypically, and inaccurately, viewed as a faith dominated by Caucasians).

The gold standard for this is, of course, the Holocaust, which still casts such a long shadow over history that it’s the standard reductio ad absurdum response in any debate, particularly online. Adolf Hitler, ironically, made no distinction between the boundaries of faith, culture or race in his persecution of the Jews – if you had any trace of Judaism in you, whether it be genetic or cultural, off to the camps you would go. It’s still a massively emotive historical event, as evidenced by the slightly cynical manipulation inherent in the articles by Owen Jones and Jonathan Freedland which invite you to substitute “Jew” for “Muslim” in criticism of Islam “and be shocked”. As though Judaism should, somehow, be above criticism because of its long history of pogroms and persecution.

The irony is that in reducing all criticism of religion to the accusation of racism, those commentators who most strenuously oppose interference in religion tend to be guilty of the same kind of generalisation. The current wave of articles decrying the UK’s ‘Islamophobia’ is a perfect example. There undoubtedly is an excessive media fixation on Muslims in the UK (and the US, for that matter). It’s been argued (with some validity) that Islam seems more socially acceptable to criticise than other faiths. And it is utterly ridiculous that anyone writing about Islam should be required to state their positions on the faith’s more contentious philosophies in order to be taken seriously.

But to sweep all objections to Islam into the gross generalisation of ‘Islamophobia’ is similarly bigoted. There is, I think, a wide variety of people and motives in this slew of criticism. Some, like the EDL and a disturbing number of ‘neo-Nazi’ groups in Europe, genuinely do seem to be motivated by racism, or at the very least xenophobia – the irrational fear of ‘others’ that seems hardwired into the human psyche, which civilisation strives to overcome.

Then there are those who object to all religion on principle – these tend to be militant atheists of the Richard Dawkins school, who fail to notice the irony that they are constantly proselytising for their own belief system just as much as any religion does. In fact, this kind of atheism seems blind to its own illiberal prejudices, flinging insulting terms like “sky fairy”, “invisible friend” and “childlike nonsense” at believers. I tend towards atheism myself, but I realise that it’s a belief system as much as any religious philosophy, and that we atheists would find it unacceptable for devout believers to be as insulting as we often are.

However, any religion should be able to bear criticism (in much the same way as I’ve just criticised atheists), and it’s right and fair that Islam or Judaism should not be exempted from this in secular societies. Most nominally secular Western states evolved from overtly Christian ones, and liberal commentators certainly don’t shy away from pointing out Christianity’s failings.

Islam over the last few decades has been conspicuously resistant to criticism, which ironically has probably spurred more to fixate on its perceived failings than they otherwise might have. The September 11 attacks were obviously the work of a small group of fanatical extremists (which every religion has), but even before those we had the Iranian-issued fatwa on author Salman Rushdie for his perceived blasphemy in his novel The Satanic Verses. And more recently, the admittedly childish provocation of the cartoons depicting the Prophet Muhammad in the Danish newspaper Jyllands-Posten resulted in a hysterical outcry from some Muslims across the world, which encompassed death threats, violence and arson. This despite the fact that the proscription on depicting the Prophet is a comparatively recent ruling in Islam, and not a specific commandment in the Qur’an but an interpretation of the general antipathy towards icons in Islam.

Islam is also the only major religion to still rule over states as actual theocracies, and where it does, the leaders’ interpretation of their faith is massively intolerant in its treatment of those old bugbears, non-believers, women and homosexuals. Saudi Arabia has policies directed at its female population that would be considered repressive and totalitarian in secular states, while Iran’s treatment (frequently execution) of homosexuals would be considered barbaric in the West (except perhaps by the Westboro Baptist Church). In Pakistan, the draconian anti-blasphemy laws (ironically derived from colonial rules established by the British) make its religion almost totalitarian in nature.

But those are sovereign states with their own cultures, and despite Tony Blair’s fervent wishes, we don’t have a moral high ground to change their practices by force. All we can do is try to influence them by other means. We should, however, resist any pressure to exempt their beliefs from the rules of our own secular societies, and firmly reject any attempt to influence the law of the land in the name of those beliefs. That goes for fundamentalist Christians too, whose virtual hijacking of the Republican Party in America is abhorrent to the freedoms espoused in its Constitution. Anyone should be free to believe whatever they like, and to practise whatever rites their faith demands – up to and until the point where those practices have a negative impact on others.

So, I would defend to the death Cardinal O’Brien’s right to believe that I am an abomination and bound for Hell. It’s when he starts using that belief to try and influence the laws of the land that he becomes fair game for criticism. I am not ‘racist’ against Celtic Catholics for objecting. Neither am I being anti-Semitic if I object to the partially secular state of Israel’s treatment of Palestinians, nor rabidly Zionist when I assert Israel’s right to exist.

There’s a common consensus in most secular societies, one that religions have had to adapt to as their political power became less all-pervading. Christianity survived being told that it could no longer burn heretics, prohibit English translations of the Bible, or stone adulterers to death, and it will survive equal marriage. Islam as a philosophy seems to be adapting more slowly when in secular states, but it has adapted. There’s no reason to assume it won’t continue to do so (although trying to hurry it along can be tempting).

But what about that German ruling on infant circumcision? That’s an example of how none of this is clear cut or simple, as usual. Speaking from my own cultural perspective, it seems an act of irreversible bodily alteration carried out without consent (ie a negative impact), and should be resisted (though whether a state ban is the best way to resist it is a complex debate in itself). Muddying the waters is the fact that a great deal of infant circumcision has no religious motivation at all (notably in the US, where it’s more of a cultural norm, though this appears to be declining).

Defenders of the practice produce convincing scientific studies alluding to health and hygiene benefits, while opponents produce equally convincing studies arguing precisely the opposite. A wealth of data supporting both positions means that neither is conclusively convincing, and in the end it boils down to a question of cultural tradition. Tradition is a very hard thing to change, whether religious or not, and in the case of Judaism circumcision is so fundamentally bound up with Jewish identity that it’s virtually impossible. The statement on Abraham’s covenant with God, and its foreskin-removal requirement at the age of eight days, is pretty unequivocal.

Islamic circumcision is a more recent tradition, but still of very long standing. It does have the get-out clause of not being mentioned in the Qur’an itself, but the obligation is spelled out in the Sunnah and is unlikely to find much appetite for abandonment. Christians, of course, manage to sidestep the whole issue via Christ’s New Covenant, which renders a number of Old Testament conventions obsolete.

A slightly less draconian regulation than the proposed German ban was tried in Sweden in 2001, and has had little effect on the practice’s frequency. Like all of the subjects touched on here, this is by no means a straightforward issue – what may seem a negative impact to me may seem quite the opposite to those inside a religious community. Obviously if we had deranged mohelim going around trying to circumcise the secular, that would be unacceptable. But we don’t – it’s a rite which affects Jews, and many would say positively.

And yet, we have legislated against other practices which religious communities would like to carry out internally – for example, forced marriage or female genital mutilation. Secular states have been able to do this because of an overriding consensus that these are ‘negative impacts’ (to put it mildly), a state of affairs we’ve yet to reach with circumcision, which is less demonstrably harmful. Given all of that, I’d say we need to work towards making the tradition less generally acceptable via education rather than the blunt tool of a state ban.

This lies at the heart of the problem with our acceptance (or not) of religious rites and influence on general society. These are practices which have become so deeply entrenched because of centuries, sometimes millennia, of tradition that they are rarely questioned – and yet, were they to be introduced now, many would be unfathomable and unacceptable. Obviously, this will always be the viewpoint of those outside religious communities.

The more longstanding the tradition, the less it’s questioned, hence the numerous exemptions from social rules that the Abrahamic faiths in particular benefit from. The First Amendment of the US Constitution has the balance about right – “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof” – and yet the incoming President still swears the Inauguration oath with one hand on the Bible.  More recently established religions find less reverence from outsiders; Mormonism (founded in the 1820s) had to abandon its cherished practice of polygamy due to US law, and a Supreme Court ruling held that the right to “free exercise” of religion did not extend to religious practices that conflict with the law of the land.

And yet so many secular states (the UK may have an established church, but its government is nominally secular) still extend freedoms to longstanding religions that would seem distinctly peculiar if they were asked for in the present day. And in order to ensure fairness, these then have to be extended to any officially recognised religion, however bizarre it may seem (hello, Scientology).

I’d contend that in a secular society (as the US First Amendment states), we should be tolerant of religion but allow it no role in governing a populace, and further that a secular government should be able to criticise, and in some cases outlaw, traditional practices if they are judged (by majority consensus) to be unacceptable. In the UK, we should not have 26 seats in the House of Lords reserved for bishops. We should not allow religious organisations to practise outright discrimination because of their beliefs. We should not be giving state funding to faith schools whose primary raison d’être is to perpetuate the beliefs of some of the richest organisations on Earth. And for the same reason, religious property, institutions and personnel should not be exempt from taxation (estimated to be depriving the US Treasury alone of some $71 billion a year).

Obviously I’d prefer it if everyone shared my belief system (atheism), but believers of all other stripes must feel the same, and let’s face it, it isn’t going to happen. Religions may die out – some have, over the course of history – but none of the currently prominent ones are in any danger of that. But if we are to respect them, they must respect us – and I’m not restricting that to any but ALL of them. Yes, including the Jedi.

What if the rightwingers were right?


The feral underclass, tomorrow.

In the wake of David Cameron’s recent kite-flying statement about cutting off housing benefit for everyone under 25, the whole subject of welfare reform – and the demonisation of welfare recipients – has reared its head again. Displaying his usual stunning lack of empathy in his determination to ‘Con-Dem’ the ‘scroungers’ so berated by Iain Duncan Smith, our glorious leader chooses once again to conveniently ignore a few facts:

  • The vast majority of those claiming housing benefit are actually in work. The paltry minimum wage (£6.08 per hour, fact fans) isn’t enough to live on in many places even if you’re working full time.
  • A large proportion of the benefit bill is composed of working tax credits, claimed, again, by those who already have the jobs IDS is exhorting claimants to go out and get.
  • Removing housing benefit from young people would massively reduce their ability to move to other areas where they could get jobs, even while IDS echoes Norman Tebbit’s “get on your bike” refrain.
  • Forcing them to move back in with their parents would be difficult if their parents have been forced to ‘downsize’ their accommodation (as recommended by that self same government).
  • Artificially inflated (and politically motivated) house prices make it virtually impossible for even those on a decent wage to buy a house, while buy-to-let landlords, free of any kind of regulation, can raise already exorbitant rents as high as they like, knowing tenants have no other choice.
  • And finally, the £10bn Cameron’s seeking to save on the welfare bill is a drop in the ocean compared to the amount of unclaimed tax that wealthy individuals and corporations are allowed to get away with not paying. Vodafone alone could pay for more than half that saving if it paid what it actually owed, but has been allowed to ‘negotiate’ with HMRC to pay a fraction of what is due (under Labour, who are every bit as culpable). And that’s just one company.

Cameron and his cronies have this propaganda approach of picking the tiny minority of extreme cases of ‘welfare entitlement’ then somehow managing to tar the majority with the same brush. Anyone who’s been in the position of being unemployed is aware that there isn’t a vast army of ‘entitled scroungers’ who ‘don’t want to work’ and choose benefits as a ‘lifestyle choice’.

But let’s assume, for the sake of argument, that this view is correct, and there’s a huge ‘feral underclass’ thumbing its nose at the taxpayer while using their hard-earned cash to go on holidays and drive BMWs.

Even if this IS the case, is simply removing all their income the best way to deal with it? Assuming you favour the ‘stick’ approach of “they deserve all they get” or “they’ll have to get jobs then”, you’re ignoring the fact that, according to the ONS, there are only one sixth as many job vacancies as unemployed people.

So the actual effect of cutting off all income from these ‘scrounging parasites’ is to throw them out on the streets to starve. Now, some rightwingers are callous enough about their fellow humans as to say that’s no more than what they deserve. “Survival of the fittest”, etc.

But if you can’t appeal to a sense of caring for the vulnerable in society, it’s worth taking a pragmatic look at what effect this would have on them, those ‘hard-working taxpayers’ who’ve had enough of supporting the ‘idle’. Said ‘idle’ classes will now be starving on the streets, complete with that army of kids you say they’ve had just to get more benefits. Local authorities can’t afford to house them – shouldn’t, by your arguments. So what will this ragged army of former ‘parasites’ now living in boxes under railway arches do?

Well, those least equipped to cope may well simply die – of starvation, cold, exhaustion, all the things that currently ravage the homeless population all the time. Look forward to the possibility of discovering rotting corpses lying unburied hither and yon throughout city centres. At least it’ll create job opportunities in the funerary market, I suppose.

Some might turn to selling themselves sexually – look forward to a vast increase in those street corner prostitutes of both genders, many probably underage. I’m guessing those ‘hard-working taxpayers’ wouldn’t be too happy about that; most of those who espouse such views probably already hate the ‘immorality culture’.

But if the rightwingers are right about the type of ‘entitled’ person that makes up this ‘army of scroungers’, what most of them would do on finding themselves destitute would be to turn to crime. Burglary, mugging, car theft, drug dealing, public disorder – all would increase many times over from their current rates, as the only options left for these ‘nasty parasites’ would be CRIME or DEATH. City police forces, already starved of budgets and resources, would be unable to cope, and even if they could catch any significant amount of this new army of criminals, it would be impossible to squeeze them into an under-resourced, privatised prison system that’s already bursting at the seams for lack of investment.

So, even if you genuinely believe there’s an army of entitled parasites living the high life at the expense of the hard-working taxpayer (which is, of course, bollocks), be careful what you wish for. If these people are the feral, lawless troglodytes you believe them to be, your fervent desire to cut off their only income could only lead to city centres being besieged by armies of homeless, barbaric, criminal thugs, intent on robbing, raping and selling their bodies to the highest bidder. It would be, effectively, the kind of dystopia the Daily Mail seems to have a disturbing fetish about.


And while I hate to bring Godwin’s Law into this, I find it all too easy to imagine what the rightwingers would do if confronted with this dystopia of their own making. “Round them all up”, would be the cry. Followed, inevitably, by a lack of caring as to where they went, as long as it wasn’t ‘here’. With prisons already too full, swathes of land would have to be adapted into makeshift ‘camps’ like those already crammed full of asylum seekers and refugees. But you couldn’t deport your ‘feral underclass’ if they’re British citizens, and you couldn’t release such scum back into society. So what would you do with them? At this point, you begin to hear the sound of ovens being fired up…

Yes, I know this is all exaggerated dystopian hyperbole. More moderate Conservatives certainly don’t want it to come to this. But the particularly extreme, rabid rightwingers who insist on the complete dismantlement of the Welfare State and the abolition of all workers’ rights and protections clearly haven’t thought through the impact this would have on them.

Fortunately, the majority of benefit claimants aren’t a combination of extras from Shameless and Mad Max, but decent, law-abiding people, many of them actually in work, trying their best to honestly pay their way in a society where inflation and income inequality have made it impossible to do so without state help. Meanwhile, that same state is giving concession after concession to the real ‘entitled parasites’ – corporations and individuals so wealthy that they can afford to shirk their fair share of paying for the country they live in. Tax breaks, HMRC negotiations and non-dom status are of course condemned by Cameron and Osborne – but only in the case of certain people. Jimmy Carr might be “morally wrong”, but strangely the same judgement doesn’t apply to the Conservative front bench and their friends in the City.

Contrary to the fevered beliefs of many on the left, Conservatives aren’t actually ‘evil’. Nobody sees themselves in that way. They honestly believe they’re doing the right thing, and that by removing state intervention and allowing ‘the market’ to dictate terms they can sort out society’s problems. To some extent, they may even be right. The problem for me is that, even if it works, they’re ignoring the inevitable human suffering and carnage it will cause before it does.

Of course, Cameron’s draconian welfare proposals aren’t actually policy. At least, not yet. The obvious political motivation was to throw a few bones in the direction of his party’s more extreme back benchers, ahead of what’s liable to be a fraught debate about Lords reform this week. Reading the news articles about this, even the right wing press don’t seem to think it’s a good idea – the below-the-line comments even on the Telegraph and the Mail seem to be generally damning of it. But there are always a few free market, state-dismantling fanatics who advocate the complete abdication of any responsibility to society as a whole. To them I say – read the above (or perhaps HG Wells’ The Time Machine) – and be careful what you wish for. Because (HYPERBOLE ALERT) treat the poor badly enough and they will eat you.
