Your Christmas TV Highlights!


It’s the time of year again when normally sensible TV genre shows abandon all logic in favour of doing “the Christmas episode”. While Christmas episodes sit well in comedy shows, they always seem oddly forced in shows like, say, The X-Files, Supernatural or Grimm. Unfortunately it now seems de rigueur, especially for US shows – though we in the UK have the annual Doctor Who special, wildly fluctuating in quality, which this year seems to have outdone itself by actually including Santa.

It’s only a matter of time before even critically acclaimed dramas are obliged to produce a “Christmas episode” every year. What could that look like? Let’s find out…


It’s only words….

‘When I use a word,’ Humpty Dumpty said in rather a scornful tone, ‘it means just what I choose it to mean – neither more nor less.’ – Lewis Carroll

In recent weeks, we’ve been blessed with the political excitement of both the Democratic and Republican National Conventions in the US, and a much-derided Cabinet reshuffle here in the UK. As party conference season looms for us and politicians start flying unfeasible policy kites to appease their more insane members, I thought it might be interesting to have a look at how the politics of class is currently shaping – and being shaped by – its use of language.

The English language, with all of its ambiguities, multiple meanings, synonyms, antonyms and homonyms, has always been a bit of a gift for political rhetoric. There’s nothing so telling of the political climate of the times as seeing the prevalence of particular words and phrases, cunningly employed to drive home a political message in speeches, press releases and party-affiliated news stories.

Scenes from the class struggle with the English language


One of the most noticeable things at both the Democratic and Republican conventions was a relentless focus on the middle class. At a time of economic hardship, when hard-right policies seem designed specifically to funnel money even further towards an already massively wealthy clique, this is fairly understandable. “Ours is a fight to restore the values of the middle class,” declaimed Barack Obama, as his supporters waved banners proclaiming “middle class first”. Over in the homogeneous dream world of the Republicans, ultra-reactionary VP candidate Paul Ryan set out his stall: “We have a plan for a stronger middle class, with the goal of generating 12 million new jobs over the next four years”.

So what’s missing, you might ask? Well, both parties were taken to task for neglecting to cover the “poor”. But what’s interesting is that the term “poor” seems to have supplanted the term “working class”. If you’ve a “middle class”, then you must have one above and below it, by definition. The one above it is fairly clear, both here and in the US – they’re the ones with all the money, bankrolling each country’s more rightwing party to run the government for their own advantage.

But where’s the one below it? Why is “working class” now the more pejorative “poor”? “Poor” seems to carry connotations of helplessness, dependence, and inferiority. “Working class”, by contrast, has overtones of decent, hardworking nobility.

The term “working class” now seems quaint and old-fashioned. In part, this is because of the aspirational culture of the last few decades. “We are all middle class now,” said John Prescott in 1997. That’s John Prescott of the Labour Party, the party founded by and for the working class. The same party whose current leader, nerdish school prefect lookalike Ed Miliband, says he wants to appeal to the “squeezed middle”. Being a “poor but honest” worker isn’t trendy any more. If you don’t have the mortgage, the two-year-old car, and the annual foreign holiday, you probably aren’t “working” anyway.

So the lowest class is not now “working”. Instead they are “poor” or, even more pejoratively, with an overtone of menace, the “underclass”. Sorry to get all Godwin’s, but it’s always worrying when politicians or political journalists use terms reminiscent of “Untermenschen”.

With the rightwing holding sway politically in the UK after the riots of last summer, another word found itself attached to the “underclass” – “feral”. That’s even more disturbing. Now not only are the former “working class” an “underclass”, but they’re actually animalistic and inhuman. You can see why this makes for a worrying narrative progression.

As if to emphasise that the “underclass” are no longer the “working class”, they’re now routinely conflated with the unemployed – conveniently ignoring all those full-time workers here in the UK whose wages are so low they have to rely on government benefits anyway. So the “poor” are demonised as “scroungers”, part of an “entitlement culture” whose “dependency” is on money taken unwillingly from virtuous, hardworking taxpayers. For added venom, the adjectives “idle” and “feckless” tend to be used in varying combinations, in government speeches, press releases and the news stories that cover them. The result is an unhealthy climate where if you’re not “middle class”, it’s your own fault for being “idle” and “dependent”. Never mind that the minimum wage is so low and the cost of living so high that often full-time employment won’t pay enough to live on.


Rebrand the rich


“For the last time, I am a job creator! You must, you will OBEY ME!!”

In tandem with the linguistic subjugation of the lower class from “working” to subhuman “scroungers” who steal from the virtuous middle class, the “upper class” have tried to twist the language describing them into more glowing, fulsome praise. The word “rich” has for many years (possibly since the French Revolution) had snobbish, uncaring and materialistic overtones. How then should the rich present themselves as altruistic and beneficial to the society whose money they’re gradually accumulating?

The result, initially, was the insidious term “wealth creators”. I first heard this emanating from the Republican Party in the US, and I’ve wondered ever since if somebody was actually paid to think up this asinine term. It does sound like just the sort of thing that might be focus-grouped and moulded by the consultants who briefly tried to rename the Post Office “Consignia”.

“Wealth creators” implied that the rich’s accumulation of material assets was good for the wealth of the country as a whole. But people cottoned on to the fact that any wealth they “created” went straight to them and stayed there, often moored in offshore tax havens so it wasn’t subject to that inconvenient burden of taxation for the good of society – “wealth hoarders” would be a more accurate description. Plus, the phrase still contained the word “wealth”, as in “wealthy”, ie “rich”. And if the wealth you’re creating is your own, you’re hardly going to be seen as contributing to the society you’re funnelling it from.

So “wealth creators”, even though it’s still in common currency, morphed into “job creators”. You can imagine some smarmy image consultant somewhere sitting back and folding his arms in satisfaction at that one. Well, if the business you’re running has made you rich, you must have “created jobs”, right? And that can only make it look like your contribution to society is more important than that of your employees, who pay a far greater proportion of their meagre incomes in tax than you do. Mitt Romney stated that he didn’t need to release any more tax returns; he’d definitely paid enough tax – a whole 13% of his $20.9 million income in 2011.

But Mitt’s a “job creator”, so that’s OK. Even though most of the jobs he “created” while running Bain Capital were in India and China. Governments will find it far less acceptable to impose heavy taxes on “job creators” than they would on “the rich”. If “job creators” leave the country because tax rules aren’t favourable enough to them, who will “create the jobs”? You can see why that’s worse than “the rich” leaving the country, which by and large people don’t really care about. Ask Phil Collins.


Race to the bottom


With the upper class elevated to sainthood and the lower class reduced to the level of animals, you can see why, linguistically, “middle class” is the only uncontroversial one left. Particularly in the US. It’s been said that in the UK, the political struggle is always about class, whereas in the US, it’s always about race. That’s only half true; class does exist in the US, it’s based on money, and it often seems determined by race. America’s prisons bulge at the seams with young African-Americans, many of whom turned to crime as the only refuge from a desperately poor background. Visit Southern California, and you’ll see the class divide even more starkly in racial terms. Whites have the good jobs and the nice cars; Latinos have the service jobs and the beat-up but respectable older vehicles; and blacks, if they have jobs at all, may well have to travel on the bus because they can’t afford cars.

Yes, it’s a sweeping generalisation, and far from true universally. But it’s true often enough, and here in the UK too, non-white ethnicities tend to be poorer and/or jobless at a level disproportionately higher than Caucasians. In the US, where Republican state governments are passing voter ID laws that explicitly target the poor, class and race overlap. The “poor” in a state like Florida are disproportionately made up of non-Caucasians. Perhaps coincidentally, a recent poll registered African-American support for Mitt Romney at a modest total of 0%. OK, Herman Cain and Marco Rubio will probably be voting Republican, but there’s always a margin of error. Nevertheless, that’s a poll figure that might make even the Lib Dems here in the UK feel slightly better.


Turn Left


Trying to reclaim the word “rich” from the “wealth creators”

Still, the right haven’t had the monopoly on shaping the political and class debate by distorting the English language. Since austerity (another political buzzword) bit, and income inequality (and there’s another one) became hot political topics, the left have found their own way to load words with unintended meaning. In the wake of the Occupy movement, the word “elite”, which always carried faintly nasty overtones of exclusion, took on a far more damning meaning when used to describe the tiny clique of hyper-rich people who seemed simultaneously responsible for and immune to the financial crisis engulfing the world.

In the UK, left-leaning politicos and journalists got their own back on the right by taking their pejorative adjective “feral” and applying it to that “elite”. For a while, the phrases “feral underclass” and “feral elite” were flung at each other with such frequency that they ceased to have much meaning; as a result, after a brief period in the linguistic limelight, they seem to have faded somewhat into obscurity. Significantly, the terms coined by the left to describe the unfairness of the situation that stuck are not linguistic but numerical – the “elite” are “the 1%”, and the rest of us who pay a greater proportion of our income as tax are “the 99%”. Put in those terms, the injustice is hard to argue with, no matter how many “job creators” are in that “1%”.


Language in a post-truth world


Politics and truth have always had a rather abusive relationship, as US journalists are finding as they struggle to adjust to the “post-truth” world in the wake of Paul Ryan’s epically inaccurate speech. The astute use of language can make an untruth seem less like an actual lie. It’s nothing new. When arch-Republican Chuck Norris claims that re-electing Barack Obama will usher in “a thousand years of darkness”, that’s hyperbole at its most extreme. Of course, Winston Churchill said something similar about Adolf Hitler, but it’s hard to equate Obama with Hitler (unless you’re Glenn Beck). Meanwhile, Fox News and other histrionic rightwing news outlets pander to their sponsors by treating the words “liberal” and “progressive” as descriptions of something beneath contempt, which in turn passes into mainstream Republican discourse.

Taking poor, innocent English words and twisting them into political weapons is, of course, a longstanding practice in both the US and the UK. But in the modern era of spin doctors, image consultants, key demographics and focus groups, it’s hit an often ridiculous all-time high – as Nick Clegg, with his repeated meaningless blather about “alarm clock Britain”, seems not to have noticed. The flexibility of the English language is both a blessing and a curse for political discourse, but it’s never less than interesting to watch. To help you out, here’s a little chart of phrases to look out for in the coming US Presidential election and UK party conference season. Have fun playing political bingo, or alternatively, use it for a drinking game. It should get you so drunk that you might stop despairing…


Austerity

Middle class

Feckless scroungers

Public sector waste

Illegal immigrant

Entitlement culture

Job creators

Gold-plated pensions

Socialist healthcare

Private healthcare

Underclass

Benefit fraud

Hardworking taxpayer

Big society (getting rare now, this one)

Alarm clock Britain (not rare enough)

Plan B

Terrorism

Liberal media

Conservative media

Bureaucratic excess

Deregulation

Reregulation

Small business

Big business

Lending

Family values

God

Innovation

“…and I’m not making this up.”

“…well here’s the truth.”

Concernin’ Pirates.


Virgin Media’s ‘block’ display, shown when trying to access the Pirate Bay site.

Piracy! It’s the hot topic of the moment, with copyright owners worried about losing revenues, and unscrupulous governments seizing on the issue to pass draconian legislation over that uncontrollable global bugbear, the internet.

I’ve posted on this topic (tangentially) before, when the US government was salivating over controlling the internet via the SOPA and PIPA acts. But today it came up as a topic when a couple of friends on Facebook posted this blog piece from David Lowery, lecturer in music economics at the University of Georgia and founder of alt-rock band Camper Van Beethoven. Lowery wrote the piece in response to a post from 21-year-old National Public Radio intern Emily White, in which (as he saw it) she seemed to feel no guilt at having illegally obtained over 11,000 songs while only ever having purchased 15 CDs. It’s an interesting piece, with a considered viewpoint from a man who has very good reason to be expert in this field. Take a moment (or two – it’s quite long) to read it.

Back yet? Good. After reading that, I responded with a few brief comments to my friends on Facebook. Brief in the sense that Tolstoy novels are – you’ll be used to that if you’ve read this blog before. But I thought them worth posting here to reach a wider audience, so here goes…

Lowery’s piece has a few thought-provoking points, but he glosses over some of the murkier areas. As it’s written from a US perspective, the specific points about copyright laws and royalty payments ignore vastly differing rules in other territories. As with so many legal/financial issues, the global nature of the internet makes concrete arguments trickier to make.

Secondly, it’s worth reading the original post from NPR’s Emily White that Lowery is responding to – he completely mischaracterises it in his own piece. Far from ‘stealing’ the 11,000 songs in her music collection, she goes into great detail about how she obtained it. Much of it was recorded from promos at the radio station she worked at (‘home taping’ but NOT illegal downloading), and a lot of it was obtained via swapping mix CDs with her friends. Sound familiar? At her age (21) that’s how I obtained a lot of my music. That was in 1991, and it didn’t kill the industry.

Probably her most ‘heinous’ act is receiving a surprise gift from her prom date – he loaded lots of music onto her iPod. OK, that does rob artists of royalties. But who’s to blame there – her, or the boy who copied the music as a surprise gift?

If you read Emily’s post to its conclusion, she’s not asking for ‘free stuff’, as Lowery rather patronisingly puts it. She’s arguing for an online service that can be subscribed to, by which any music can be played at any time. Lowery’s right about Spotify’s miserly royalties to artists, but such a service, with fair payments, is the obvious way to properly monetise downloading/streaming of all media.

It’s already working very successfully for films and TV with Netflix and Hulu. Extrapolated for other media, that’s the obvious way to go. It could easily work for ebooks as well, rather like a library service – either stream the book to your device, or be allowed to download a file that has an expiry date. If you’re paying a fiver a month to such a service, why would you bother searching out illegal, dubious quality downloads and filling your hard drive with them?

I can see why Lowery is so exercised by what he perceives in Emily’s post – she has acquired a lot of music without paying for it (though not by the means he implies). But her circumstances are unusual, particularly the access to free promos from a radio station.

Having read both pieces, I’d say neither is spot on in a workable, totally ethical way (though ethics tend to be subjective). But to me it looks like Lowery, like many others, has failed to properly grasp the way the technology has affected – and will affect – the issues at hand. Emily White’s post is far nearer the mark.

It’s true that artists – of any stripe – should no more work for free than anyone else. Sadly there’ll always be some piracy, but the problem here is that the pirates have worked out how to exploit the technology first. What’s needed is to come up with a legitimate model of giving content creators their fair share. Lowery does make some good points, but nobody’s found a definitive answer to that one yet.

And it’s worth remembering that music as an art form has existed for millennia, but recording music is a relatively recent phenomenon (c. mid 19th century). Before that, music was all about the experience of a performer and a physical audience. At that point, musicians always got paid what they deserved – even if it was rotten tomatoes! The advent of recording caused a seismic shift in how the ‘industry’ was organised, and we’re currently living through another. Problem is, many people (including, I think, Lowery) are viewing the issues very much from a 20th century perspective.

Indeed, perhaps even the 21st century issues we’re so concerned with are becoming passé now, as the culture of experiencing recorded media continues to shift faster than businesses seem able to anticipate. Apparently, one of the arguments often raised by ‘the pirates’ is that, with the huge capacities of personal media devices, piracy/theft is the only way to fill up all that space if you’re not a millionaire.

That may not be a justification, but it’s interesting to note that the memory capacity of iPods is actually tending to get smaller. Partly this is because spinning hard drives are being abandoned in favour of flash memory, which doesn’t (yet) have the same capacity. But more significantly, it’s because people are tending more and more towards streaming their entertainment rather than keeping a permanent copy of it clogging up their device’s memory.

That seems to be the future; both media companies and hardware manufacturers are playing catch-up, but Apple seem to have figured it out already, with internet-intensive, low-memory devices like the iPad. The only stumbling block is access to the material when outside the range of Wi-Fi networks, but 4G (and the increasing number of free public Wi-Fi services) should help there.

But what about material that (for whatever reason) is deleted and no longer available? What about downloading or free streaming of material that you bought once and then lost in a theft? What about music bought secondhand from a charity shop, where no royalties go to the artist?

I’m by no means an innocent on such issues. I’m keenly aware of the problems caused by depriving content creators of revenue (though David Lowery’s attempt to tangentially link that to two fellow musicians’ suicides is a tasteless attempt at emotional manipulation). As I said earlier, ethics are a subjective thing, and the conscientious have to come up with some code of conduct. The trouble is, until legislators, businessmen, and even artists can come up with a consensus on how best to legitimise downloading and streaming, that code of conduct feels like it’s up for grabs.

The Spoiler Statute of Limitations

N.B. – Despite the subject matter of this piece, I’ve worked hard to ensure that it actually contains no spoilers!


Spoilers! Don’t you just hate ’em! Steven Moffat certainly does, as he repeatedly gets River Song to tell us in Doctor Who. It’s undoubtedly annoying, when you’re following a TV show, to be made prematurely aware of some vast, game-changing plot point that the creators had intended to come as a gobsmacking surprise. But recent developments in how we watch things have given rise to a new problem, and a new question – just how long should we wait before openly discussing (on the internet or in the pub or wherever) some major plot twist?

This came to my attention recently, when a frustrated Facebook friend in the US complained of his friends in the UK discussing openly on the site a major plot twist in that night’s Doctor Who. Now, given the fact that BBC America broadcasts the show in the US pretty quickly after the UK (not to mention the, ahem, naughty downloads), I did see his point in complaining that it wouldn’t be too much of a burden for his UK friends to refrain from discussing the plot for a little while at least.

But I can also understand that some people still think that, once a TV show has actually been broadcast, it should be fine to start talking about it. It’s an understandable assumption, particularly for those who grew up watching TV when it was a communal, even national thing; when you could be reasonably certain that your friends would have watched the same show at the same time as you. Back in 1980, for example, nobody worried about spoilering the eagerly anticipated question of ‘Who Shot JR?’ in Dallas. International communication was rare and expensive, and most people in each country who cared were watching the show at the same time.

However, ever since the advent of the video recorder, that’s not been guaranteed. And the problem has intensified; in these days of international chat on the internet, via forums and social networking sites, you have to take real care that you don’t, however unintentionally, reveal something that should have come as a surprise. But how long should you wait? What, in a nutshell, is the statute of limitations for spoilers?

The trouble is, there’s no hard and fast answer to that one. For filmmakers, it’s not a new problem at all, as films have never had the same simultaneous viewings for whole nations. Way back in 1960, Alfred Hitchcock popped up in the trailer for Psycho to plead “Don’t give away the ending, it’s the only one we have.” Fair enough, but is there really anyone left in 2012 who doesn’t know how that ends? And if there is, is it unfair of them to expect those who want to discuss it to keep silent, 52 years after the fact? It’s actually a very common problem with films – they get older and older, but there’ll always be somebody for whom they’re new. Will that person’s viewing experience be tainted by a spoiler that’s become cultural common knowledge?

There are plenty of well-known examples. The first time I saw Psycho, I already knew the ending; I still thought it was a pretty fine movie, but I wonder how much more I might have enjoyed it had the twist come as a surprise? And yet, it seems churlish of me to demand that the entirety of society should refrain from discussing a very old plot twist on the off chance that I might not have seen the film yet.

But what about more recent films? How soon is too soon? The original Planet of the Apes, for example, has often been released on video and DVD with a cover picture that actually gives away the twist ending before you’ve opened the box – again on the assumption that it is, by now, common knowledge. OK, that movie was made in 1968. How about 1980, a ‘mere’ 32 years ago? Can there be anyone left who doesn’t know the twist in The Empire Strikes Back? Apparently so, if this clip of a four-year-old reacting to the previously unknown revelation is for real. I saw that one not long after it was released, but that particular spoiler was already common knowledge. Would I have been as gobsmacked as that kid if it had been news to me, too?

A bit more recently, is there anyone left who doesn’t know the twist endings to M Night Shyamalan’s early movies? I was about a year late seeing 1999’s The Sixth Sense; by then, the ending had entered common culture so thoroughly that I’d found it out in, of all places, an article in Boyz magazine. I still enjoyed the movie, but again, how much better might I have enjoyed it had the end come as a surprise? Similarly, I’ve never actually watched his 2004 film The Village; though that’s less because I’ve found out the twist and more because I’d seriously started to go off his work after the nonsensical Signs. I did manage to catch David Fincher’s Fight Club (1999) before its ending became common knowledge, and that was certainly effective – but again, 13 years later, I’m willing to bet that that’s become cultural common currency.

But now, an old problem for films is very much a current problem for TV shows. Some websites have hidden text sections for spoilers; others, like Facebook and Twitter, rely on the (apparently infrequent) discretion of their users to avoid releasing spoilers into the public domain. It comes back to: how long should you wait? One friend has a self-imposed limit of a week – which seems reasonably fair. Some would say not – after all, how many of us now catch up with TV shows on DVD box sets long after the original broadcast?

Realistically, if you’re desperate to avoid spoilers, it looks like your only pragmatic choice is to stay away from the internet. Completely. Because if something’s popular enough, any major plot developments end up being referenced anywhere and everywhere. I recall rushing through the final Harry Potter book for precisely this reason, and avoiding Facebook et al for fear of finding out the ending before I reached it. It’s not ideal, I know, but unfortunately it’s a more sensible solution than expecting everyone else in the world to be sensitive to your viewing (or reading) habits.

Of course, some people take a perverse, trolling delight in spoilering. One old friend of mine had an irritating habit of flicking to the last page of whatever book I was reading in order to tell me that (character X) made it to the end. Others use it as a status-building ego reinforcement – “look how important I am, I know something you don’t, and I can prove it!” Unfortunately, if you’re a true spoiler-phobe, complaining is like a red rag to a bull for this kind of person; you’re actually better off not encouraging them. Just try to close your ears, or step away from the internet.

So can there ever be a ‘statute of limitations’ for spoilers? I’d have to conclude not, pragmatically. If you find them annoying, then your only recourse in the real world is to do as much as you can to avoid them, because, sadly, they’re not going to go away. On the flipside, if you have a friend who shares your interests, it might be courteous to refrain from discussing plot points unless you know your friend knows them too. But for how long is between you, your community, and ultimately, your conscience. Everyone has different standards, and in the end, if you want to avoid spoilers for as long as you deem fit, the final responsibility has to be your own. We could wish for a more polite, considerate world where that’s not the case, but somehow I don’t see it happening soon…

Internet of Truth

“You can’t rewrite history. Not one line.” The Doctor, The Aztecs

“Who controls the past controls the future: who controls the present controls the past.” George Orwell, 1984


The truth is out there…

A couple of days ago, Charlie Brooker’s sporadically brilliant Guardian column ran a piece on the current politics meme of the moment – the ‘Miliband loop’. For the one or two unfamiliar with this chortlefest, it refers to a pooled news interview with the less than charismatic current Labour leader, in which he manages to answer five different questions with exactly the same, verbatim answer, the only variety being the order of the phrases – “these strikes are wrong… negotiations still ongoing… government… reckless and provocative… get round the negotiating table… so it doesn’t happen again”.

Obviously all Miliband was attempting to do was ensure that the soundbite he wanted would be selected from the interview for the tiny excerpt that would undoubtedly be played out on the TV news coverage of the public sector strikes. It’s a sad indictment of the current state of political journalism that he felt the need to do it in this way, and he’s probably rueing the fact that the BBC News website chose to display the raw footage unedited, as it makes him look like a robot iPod stuck on repeat. But for me, what was slightly more interesting on rereading Brooker’s piece was that its headline was doing quite the reverse of looping: far from repeating itself, by the end of the day it was on its third regeneration.

What Brooker is saying in the piece is that it’s by no means new for this to happen; it is in fact an emerging trend, and he points to similar displays by both George Osborne and Alistair Darling. Logically, then, the original title of the piece didn’t single out any politician in particular – it referred to ‘Politicians’ identikit responses’. By lunchtime this had morphed into ‘Miliband’s identikit responses’, presumably to capitalise on the hapless leader’s misfortune in going viral on the internet, which made him far more noticeable than the other two examples. This, however, seemed a little dishonest and misleading, when the whole point of the piece was to bemoan a trend rather than attack one particular exponent of it. By the end of the day, though, the headline had morphed again. This time the phrase ‘Miliband’s identikit responses’ had been replaced by ‘the Miliband loop’, a phrase Charlie seems to have coined himself in the article.

While I like Charlie Brooker’s work, I’m by no means an unquestioning follower of his, and this strikes me as a disturbing trend in itself, of which he is now as guilty as anyone else. In short, the increasing dominance of newspapers’ online content means that they get to rewrite history several times a day. It’s like Winston Smith’s job from 1984, at warp speed, and doable by any half-drunk journo at his desk.

Brooker – or his editor – altering his headline is probably a fairly trivial example of this. But there are worse out there. On Friday, the day after the teachers’ strike, the Daily Mail ran one of the most scurrilous headlines I had ever seen – “Tears for girl, 13, crushed to death by a falling branch as she sat on park bench because her teachers were on strike”.

Even by Daily Mail standards, this was a jaw-dropping example of gutter journalism at its worst. Using the tragic accidental death of a child to score cheap political points that support your agenda really is about as low as you can get. Perhaps whoever wrote the piece had some inkling of this; rather than credit the author by name, the website simply tells us this literary masterpiece was penned by ‘Daily Mail Reporter’. As if the headline wasn’t bad enough, ‘Daily Mail Reporter’ had also gone out of his/her way to solicit (or make up) quotes from heartbroken locals about how this accident was all the fault of the teachers for going on strike.

To give them credit, even regular Mail readers were astounded by the effrontery of this, and the comments thread beneath the article rapidly filled up with the sort of disgusted reaction familiar to Mail website habitués – and yet also unfamiliar, because this time the disgust was directed at the Mail itself.

Thus it was that, by about teatime, the headline’s implication of teacher complicity in a tragic accident had been softened somewhat. It now read, “Tears for girl, 13, crushed to death by a falling branch as she sat on park bench as her teachers were on strike” – thus making the teachers’ culpability a rather less direct implication. It was still clear enough, though, and the ‘Disgusted of Hartlepool’ comments continued to flood in. So, by the next day, any reference to teachers had been excised from the headline, which was now simply “Tears for girl, 13, crushed to death by a falling branch as she sat on park bench”. Similarly, the quotes blaming the teachers in the article itself were edited or excised altogether, and a quote from the girl’s family was inserted, in which they pleaded (rather more reasonably than I might have done under the circumstances): “Our beloved daughter’s death was a tragic incident, which occurred only 24 hours ago, and we do not want it to be connected to any other events.”

Thus, the Daily Mail had effectively, and without comment, rewritten a massively offensive headline and article to, presumably, protect themselves from the Press Complaints Commission – although given how toothless that worthy organisation generally is, I’m surprised they felt the need to bother. Nonetheless, the comments thread was not deleted – most likely because, with the headline changed, the outrage expressed in it now seemed nonsensical – though the article’s URL betrays rather more of its original content: http://www.dailymail.co.uk/news/article-2010193/Teachers-strike-Sophie-Howard-13-killed-falling-branch-school-closed.html.

That’s a far more worrying example than Charlie Brooker (or his editor) altering the headline of a satirical piece to make it more sensationalist – the Mail’s headline was a genuinely obscene bit of journalism that they should have been held to account for. Now, they can simply claim that they altered the headline to acknowledge the offence caused – if they admit to it ever having existed in its original form at all. With no record being given of when and how the website was altered, it might well take a long and dedicated bit of cyber-detection to prove that it had been.

Yesterday, however, brought an even more worrying example of this trend. Yet more examples had come to light, this time in an admittedly gloating piece from the Guardian, of News International’s propensity to hack the voicemails of anyone it considered likely to sell a few more copies of the News of the World. This latest example, though, was rather more sinister than Sienna Miller’s love tryst texts or even Tony Blair’s confidential policy messages. The NOTW, it turns out, had hacked the voicemail of the then-missing 13-year-old Milly Dowler, even going so far as to delete messages when the mailbox was full so as to garner more ‘newsworthy’ material. This had, it seems, the combined effect of giving false hope to Milly’s family, who believed that if she was deleting messages she must be alive, and potentially destroying valuable evidence that could have been utilised in the police investigation. The paper made no particular secret of having done this, either – contemporary articles even referred to information that had come to their attention via voicemails left on the missing teenager’s phone.

Now, it’s been notable that most of the tabloid press has been suspiciously light on coverage of the News International phone hacking stories – presumably proof of the old axiom that no-one wants to deploy a weapon that might be used against oneself. And obviously, there isn’t even a mention of the story in today’s Sun, despite Prime Ministerial condemnation and TV news saturation. Of slightly more worry, though, is the reported allegation that any such articles have now disappeared from the News of the World online archive.

Now, I must hold my hands up and say that I cannot actually verify that. Access to the NOTW web archive depends on registering with News International, something I’m not prepared to do. If true, though, it’s perhaps the most worrying example of this trend in a three-day period that has thrown up just the examples I happened to come across quite casually, rather than actually looking for them. Further embarrassment for News International would be, to say the least, undesirable at a time when parent company News Corp’s full takeover of BSkyB is imminent. Not to mention the fact that News International’s Chief Executive, Rebekah Brooks, happened to be the editor of the News of the World at the time this particular bit of hacking took place.

And it could perhaps be said that, if true, the removal of these stories is a sensible measure at a time when a police investigation is still ongoing, and at a time of such sensitivity for the Dowler family. Nonetheless, if significant stories are disappearing from an online archive which apparently stretches back to 2000, deleted for political or commercial or even personal reasons – without comment – it’s a very worrying trend.

Of course, physical copies of newspapers are still sold, and those are rather harder to alter. And a dedicated researcher would be naïve to rely entirely on web archives to research news stories. But with the print media in decline, replaced by an increasing reliance on online content, how long will this be an option? And how many lazy researchers, or just plain normal people, already take what they read on a news source’s online archive at face value? Some papers at least acknowledge that web changes have been made – the Guardian is one. But even they don’t do it with any consistency – it’s usually only if a factual error has been amended, rather than an editorial change like the one to Charlie Brooker’s headline. Surely there should be, at the very least, an obligation for any organisation claiming to purvey facts to tell us when and how they’ve ‘altered the truth’ – and more importantly, why?

In 1984, Winston Smith’s job at the Ministry of Truth was to alter the past, by cosmetically changing photographs and archived newspapers – inspired by the contemporary practices of Josef Stalin, who did this as a matter of routine. Orwell depicts it as a tedious, lengthy process requiring a degree of skill. Today’s news editors and proprietors can now do it with a couple of passes of the keyboard and a click of the mouse – and that’s very disturbing indeed.

Reality used to be a friend of mine

So, Autumn is upon us again, and with it, the glut of mass-market, cunningly edited ‘talent’ shows to fill the TV schedules, the front of every tabloid newspaper, and, every five minutes of each show’s duration, the status updates of what seems like half of Facebook.

These ultra-staged ‘reality’ shows drive me up the wall. They all seem to blur into one hideous, homogenised entity of tripe with a title like Strictly Dine On Ice with a Celebrity Apprentice Chef. And yet, as my boyfriend pointed out, I find myself talking about them even more than their fans do. What can be the reason? My dislike of the format is probably an overreaction, and yet I can’t stay away from it. The most apt comparison would be to say that they’re like a scab I can’t stop picking.

The growth of ‘reality’ television (I use inverted commas because these shows are transparently the most faked slices of reality you’re ever likely to see) has been an insidious one over the last ten years or so, starting with Popstars and the original Big Brother. But there’s nothing new under the sun, and the irony is that most of the big shows are actually updates of ancient formats that at the time were considered massively uncool.

Strictly Come Dancing is nothing more or less than creaky old ballroom dance show Come Dancing, which the embarrassed BBC used to bury in the schedules at the dead of night while allowing an apparently tipsy Terry Wogan to gently mock the stiff contestants. What the new show does differently is bring a media-savvy propagandist’s method of presentation, all cleverly edited artificial tension and emotional manipulation. Oh, and pander to the increasingly daft cult of ‘celebrity’ by interspersing their actual dancers with the sort of Z-listers that would struggle to find a place in Heat magazine.

In taking on these old formats, the shows have cross-pollinated with each other, learning from and adapting each other’s methods to try and retain the mental stranglehold on Britain’s otherwise mostly sane populace. Undoubted master of all the techniques from these last ten years of brainwashing is The X Factor, a so-called ‘talent’ show that is basically a version of the ancient Opportunity Knocks polished up by Joseph Goebbels – here incarnated as the massively smug and punchable Simon Cowell.

Well might Cowell be smug though – he’s working one of the best con tricks since Barnum. He’s feeding the viewing masses rubbish, and not only are they begging him for more, they’re prepared to pay him for it. So he lines his pockets, allows his ‘discoveries’ a brief, Icarus-like shot at fame with the strategically placed Christmas release of a bland, anodyne single, then rubs his hands all the way to the bank while they shuffle off to a baffled obscurity.

“But, but,” say Cowell’s blinkered defenders, “The X Factor’s all about discovering new talent. Some of the contestants are really good musicians/have really good voices.” The tragedy is that some of them really do. But what Cowell’s trying to do is make the most money possible, and where music is concerned that means smoothing out any trace of individuality so that your product will appeal to the greatest number possible. The songs we end up with are so overproduced and bland that they serve as the musical equivalent of the Ford Mondeo.

And they can’t even be bothered to come up with original songs. The usual material available for cover is mass-produced pop that was trite enough to begin with – hardly an opportunity to display any talent the ‘star’ may have. Even when they use a song that does have some character of its own, they immediately use pitch-shifting, audio filtering and a sub-Phil Spector production style to bludgeon it into mass-market conformity. Witness Alexandra Burke’s cover of Leonard Cohen’s classic ‘Hallelujah’. Burke genuinely does have a good voice, and the song’s an undoubted classic – albeit covered many, many times already. But her version ends up displaying less genuine emotion than a sociopathic Vulcan. It may have been popular, but then so was ‘The Birdy Song’, and I’d like to think ‘Hallelujah’ has a bit more dignity than that. Elsewhere, Leona Lewis took Snow Patrol’s raw, fragile ‘Run’ and turned it into an overproduced dirge that presumably caused Gary Lightbody to take the money and run.

But The X Factor isn’t about music. It isn’t about talent. It’s about money. And the way to maximise the revenue is to shamelessly manipulate the show’s audience with the breathtaking propaganda skills of a latterday Leni Riefenstahl. Anyone who thinks success or progression within the show’s competition format has anything to do with actual talent is being startlingly naïve. The pre-formulated drama of the show demands certain archetypes, and if you don’t fit into one of the pigeonholes then, talent or not, you’re out, mush.

By now, many contestants seem to have learned to exploit the show’s need for caricatured archetypes. Hence the most successful at winning the audience’s sympathy, and those all-important £1-a-minute phone votes, are the ones who have a dead or dying dad/gran/dog etc. “If only he/she/it could have been here to see me,” they tearfully moan as the viewing public collectively goes “Aaah”, seemingly unaware that it’s just been had.

The X Factor though, like all these shows, is not reality. It’s actually drama that, because its characters are unpaid members of the British public, is very cheap to produce – a godsend for an increasingly desperate and cash-starved ITV. And drama can’t function with just a hero; you need a villain too. Ever since Nick Bateman propelled himself, unwittingly or not, into this role in the original Big Brother, reality show producers have realised that they need a baddie. For every show, every year, someone is cleverly manipulated into being the one the viewers love to hate.

If the ‘villain’ is one of the contestants, the irony is that, while they won’t win, they’ll often end up better remembered – Bateman being an obvious example. But it’s more usually one of the judges, a lesson learned from Nigel Lythgoe’s unforgettably spiteful turn on Popstars and honed to sneering perfection by Cowell.

Elsewhere, we have The Apprentice – a concept that, as far as I know, isn’t derived from a creaky, ancient relic of an uncool show. But this too learns from the historical lessons of Big Brother, turning its everyday business drones into gladiatorial competitors hoping to score a ‘proper job’ as some kind of yuppie wanker. And Alan Sugar, originator of the crummy Amstrad brand, is hardly a substitute for megalomaniac tycoon Donald Trump – Sugar doesn’t have a giant skyscraper named after him that tourists come to gawp at. It’s all rather low-rent and British.

The rebirth of the humble cookery show as polished imbecile contest took place even earlier. Loyd Grossman’s 90s drivel MasterChef has been given the same slick polish as the other shows, but remains basically a way to turn food porn into cheap drama. And it allows the viewing masses to bay for the blood of yet more Z-list celebrities to boot. Along the way, Gordon Ramsay – who really should have been a football manager – has managed to become the food porn shows’ equivalent of Simon Cowell, though his ceaseless swearing at least makes him seem somewhat more human than Cowell’s withering, dead-eyed scorn.

Since the advent of Big Brother in 1999 and Popstars in 2001, the reality show has come to dominate British television while simultaneously reducing it to its cheapest, lowest common denominator. It’s Andy Warhol’s fifteen minutes of fame reduced to two seconds. It’s Christians fighting lions in the arena for a bloodthirsty public that distracts them from thinking about anything worthwhile. And more than anything, it’s dishonest. It’s not about ‘reality’. It’s not about ‘talent’. It’s a combination of money making exercise and latter day freak show. How many of the liberal viewers watching it ‘ironically’ would think it was acceptable if it was Siamese twins or bearded ladies put up on their screens to have fun poked at them?

From America, where the reality shows are becoming more insane and surreal by the day, I think the late Bill Hicks encapsulates the phenomenon and my feelings about it best:

“Go back to bed America, your government is in control. Here, here’s American Gladiators. Watch this, shut up, go back to bed America, here is American Gladiators, here is 56 channels of it! Watch these pituitary retards bang their fucking skulls together and congratulate you on living in the land of freedom. Here you go America – you are free to do what we tell you!”

Rant over. For now…

How teenage girls are ruining vampires for the rest of us

“The blood is the life, Mr Renfield.” – Dracula, 1931

“This is the skin of a killer.” – Edward Cullen, Twilight

“It’s like a whole big sucking thing.” – Buffy Summers, Buffy the Vampire Slayer

With hordes of simpering teenage girls dragging their reluctant boyfriends (assuming they have any) to the latest film derived from Stephenie Meyer’s anaemic angst-fest Twilight: Eclipse, I think it’s time to remind ourselves that vampires used to be scary. I remember as a kid being terrified even of Christopher Lee in Hammer’s interminable Dracula series; he had red eyes, fearsome pointed fangs, and bewitched his victims into subjugation before drinking their blood and turning them into walking, thirsting corpses like him. All right, granted he mainly used his powers on a succession of Victorian ladies who were a smidgen too old to be playing the damsel in distress, but it made a huge impression on the nine-year-old me, and my nightmares were often haunted by visions of Lee’s blood-dripping fangs as he burst into my room at night intent on slaking his unholy thirst.

Later, I and my horror-loving contemporaries had our childhoods scarred by, of all things, a vampire television show – Tobe Hooper’s 1979 adaptation of Stephen King’s classic ‘Salem’s Lot. Ferocious Nosferatu Mr Barlow made far less impression than the unspeakably creepy floating little boy scratching to be let in at his brother’s window before draining the life from him. Unlike the Gothic campery of Lugosi’s and even Lee’s Count Dracula, these were vampires living in the real world who cared not if you were a middle aged lady in a Victorian nightdress; everyone was meat and drink to them, even little boys like us.

And now what do we have? The simpering, emasculated Cullen clan, toothless, bloodless and sexually neutered, brought to us courtesy of a starstruck Mormon intent on spreading the message of romance via sexual abstinence. Edward Cullen might be the dream of millions of contemporary teenage girls, but a proper vampire he is not. The Twilight “saga” is the end result of an ever-diminishing spiral of vampire worship that appears to dominate the current reading lists of vapid teenage girls with a hint of old-fashioned goth and absolutely no sense of humour. They’re everywhere: Vampire Academy, The Vampire Diaries, and even the actually rather good True Blood are the best representations of vampires around us right now. No longer a monstrous, erotically charged, walking dead man intent on draining you dry until you’re like him, the vampire of 2010 is an insipid sub-Byronic hero who, like Pinocchio, desperately wants to be human. And he probably looks like he should be in one of the emo bands who provide the near-identical soundtracks for shows that are basically Dawson’s Creek with tastefully trimmed fangs.

So what changed? How did we get from the menace of Count Dracula to the whimpering, neutered high school stalker that is Edward Cullen? Well, sad to say, there are two rather talented people to blame, though I’m sure neither envisioned the end result of their innovative tinkering with a long established mythology.

The first is Anne Rice. Rice’s 1976 novel Interview with the Vampire revolutionised the genre of vampire fiction, and it’s never been the same since. For the first time, the vampire wasn’t an unknowable, nightmarish monster that had to be destroyed for the good of humanity; he was a person, trapped by his own predatory nature, with regrets and feelings like our own. Even if those feelings were mostly self-pity, characterised by endless Romantic moaning like a sort of low-rent Coleridge. Louis de Pointe du Lac was the first vampire we were meant to sympathise with – even if many of us had been cheering the vampires along even when they were the bad guys.

The effect of Rice’s novel on the genre was immediate and seismic, and suddenly even good old Vlad Dracula wasn’t just a monster, but a misunderstood romantic. Dan Curtis’ TV adaptation of Dracula – produced in 1974 and starring Jack Palance – was among the first to use the by now well-worn plot device of Mina Harker being some sort of reincarnation of Dracula’s lost love. It doesn’t really work in Curtis’ version, principally because Jack Palance has a total of two facial expressions, but it became established with variations like John Badham’s 1979 film of the stage play and even Francis Ford Coppola’s sumptuous and otherwise very faithful 1992 film, reverentially entitled Bram Stoker’s Dracula lest we think it was written by Jackie Collins.

That’s not to say that romance had never been present in the noble count’s soul before; the very first adaptation, FW Murnau’s Nosferatu, sees the legally distinct Graf Orlok trapped by his insatiable desire for Mrs Harker, vaporising in the first rays of the morning sun. The wellspring of almost all movie vampire lore, Nosferatu was the first piece to show vampires being killed by sunlight – an Achilles heel now so firmly established, it’s easy to forget that Stoker’s novel had the villain walking around quite happily in the daytime, albeit with his power somewhat diminished.

Tinkering with the myth is fine – every vampire story has changed the creatures’ characteristics to suit its own plot. I can hardly hold it against Twilight that its vampires can move around in the daylight – though I do hold it against it that if caught in direct sunlight they look ‘magical and beautiful’. But Graf Orlok, while he may have been a romantic (or just extremely frustrated) was never going to set any lonely girl’s heart alight. He looked like a shaved rat, with his bat ears, elongated incisors and bald head.

Rice’s Louis, on the other hand, was like all of her characters – dead, but impossibly good-looking. That fact wasn’t lost on Neil Jordan when he made his rather po-faced film adaptation, casting Brad Pitt as the long-lasting moaner for whom death is just an excuse to mope. Louis, naturally, gets away at the end of the novel, leaving the way open for a sequel. And sequels there were, though it took Rice several years to work up the confidence to write one. But once she started writing, she seemed totally unable to stop, so that now it seems even the most minor characters from the original novel each have a novel devoted entirely to them.

The most important of these, though, and the one that set the dynamic for conflict in every anthropomorphised vampire story since, was the subject of her very first sequel – Lestat de Lioncourt, otherwise known as The Vampire Lestat. Fun-loving, blackly humorous and utterly amoral, Lestat was everything whinging Louis was not. Having an absolute ball being undead, thrilling to the hunt and considering humans lesser beings put on the planet solely for sport, he was the very essence of the villainous vampires of the past – but now the story was being told from his point of view. Revelling in what I suppose you’d have to call joie de mort, Lestat was the polar opposite of Louis, and yet despite their frequent conflicts, nothing could quite tear them apart. They were drenched in the sort of doomed homoerotic subtext previously reserved for the incumbents of Tennessee Williams plays – and together, they set the template for how vampire stories would go from there on in.

So – Louis and Lestat. One hates being undead, the other can’t get enough of it. They hate each other and they love each other. So far, so kinky, and horror literature seized on the concept, heightening the always-present sensuality of the vampire and turning what used to be a sexual subtext into just text. SP Somtow’s excellent Vampire Junction simultaneously sexualised and castrated – literally – his vampire protagonist, while Poppy Z Brite’s more-Southern-gothic-than-Anne-Rice Lost Souls has the logical progression of a vampire teenager having a homosexual relationship with his own beautiful, immortal father.

But horror literature – Stephen King and James Herbert aside – is rather a niche market, especially when it gets that kinky. The likes of Somtow and Brite took Rice’s template to an extreme, but it would take more than that to make it popular. It would take… well, let’s see, a long-running hit television series with mass appeal, smart writing and a groundbreaking mix of everyday drama and comedy with fantasy and horror. Step forward, Buffy the Vampire Slayer.

Yes, the other person primarily to blame for the glut of squeeing fangirl vampire romance – quite unintentionally – is the very talented Joss Whedon. Buffy was a surprise sleeper hit, taking Rice’s ‘mournful, brooding vampire’ template and adding a new ingredient – a totally empowered, if often shallow and vacuous, girl heroine, who was no mere damsel in distress. Buffy Summers was, basically, a superhero vampire hunter, like Marvel Comics’ Blade. But unlike Blade, she liked to flirt with the dark side, and here was where the ‘brooding, melancholy vampire’ came in. Angel was an undead creature cursed with a soul to make him regret and torture himself over all the blood he’d spilled – Rice’s Louis, almost to the life (or death).

But there was no Lestat to balance him out. That balance was redressed as early as season two, when we met William the Bloody – forever to be known as Spike. Spike was almost exactly like Lestat, even down to the (dyed) blond hair, but filtered through a modern sensibility reminiscent of Lestat’s rebirth as a rock star: Spike was deliberately styled after a 1970s British punk, despite his 19th century origins and distinctly wobbly accent. Apparently defeated at the end of the season, Spike was too perfect a balance to abandon, and he returned the very next year, then became a regular the year after that. Unrepentant but controlled by a chip in his head, you could rely on Spike for a sneering putdown or a bit of the old ultraviolence – providing it wasn’t against humans, or the chip would give him a blinding headache. The difference between Spike and Angel was that Angel didn’t want to be a monster and had to fight constantly against his nature, while Spike wanted to be one and couldn’t.

And the difference between Rice and Whedon was a sense of humour – the one thing lacking in the overly earnest, angsty drivel of the Twilight series. Almost from the start, the pomposity of Rice’s vampire archetypes was punctured by Whedon’s witty dialogue and insightful characterisation. Angel’s brooding moods were constantly mocked, at first by the other characters and eventually by Angel himself – by the time he got his own spin-off series, he’d admitted to a fondness for Barry Manilow and at one point got turned into a felt muppet, none of which undercut the believability of the character. Spike, on the other hand, was artificially neutered but lost none of his menace, even when he fell for Buffy. And the show got distinctly darker when she not only reciprocated his advances, but broke his cold heart by admitting she only wanted him for sex.

Buffy the Vampire Slayer was a constantly evolving, emotionally complex and surprisingly relevant piece of fantasy television. It perhaps dragged on two seasons too long, though even those last two seasons had gems like Once More With Feeling – a musical episode that actually addressed character motivation through song – and Normal Again, which posits the (unresolved) idea that the whole series is a dream Buffy’s been having while incarcerated in a mental institution. But after seven years Whedon called it a day, and Buffy came to a dignified end. Then the network pulled the rug out from under spin-off show Angel, and that came to a more abrupt, but still heroic, conclusion. And popular vampires retreated back into the aether – or perhaps the coffin.

But Buffy, by dint of the nature of its central character, had created a surprising new fanbase for vampire stories – teenage girls. Girls wanted to be like Buffy Summers – and while some wanted nothing more than the kickass superpowers, still more, it seemed, wanted a doomed, Byronic romance with a mopey immortal tortured by his own demonic nature. Books started to appear. LJ Smith’s Vampire Diaries series, written in the early 90s, was resurrected (pun intended) and extended, while Richelle Mead gave an unwilling world the Vampire Academy series, and Charlaine Harris weighed in with the rather better Sookie Stackhouse series, adapted for TV as True Blood. All of these, you’ll note, are written by women, generally women of an age to have been teenage viewers of Buffy. But the one that caught the imaginations of more emo-loving, self-harming teenage girls than any other was Stephenie Meyer’s dreary Twilight series – the ultimate extension of the trend of defanging the vampire to make him a safe plaything for teenage girls who wanted something a little bit more Byronic than the singer from Dashboard Confessional.

And the true nature of that defanging is to emasculate the vampire. Traditionally, vampires have been steeped in sensuality, if not outright sexuality. Stoker’s Dracula scandalised late Victorian society with its (at the time) overtly sexual tone, the title vampire playing on the repressed sexual desires of the two main female characters. Rice’s Louis and Lestat shagged like satyrs, Louis with his usual doomed, nihilistic air and Lestat with full-on lust. Even rat-faced old Graf Orlok in Nosferatu basically dies because he can’t resist the lure of getting his leg over.

But such things are not for Stephenie Meyer (and why can’t she spell her forename properly?). A devout Mormon, she’s been accused of writing, with Twilight, “abstinence porn”. She, conversely, claims that it’s better to show romance without sex. Why, she argues, does romance always have to equate with sex, especially graphic sex, in literature? That’s actually a fair point – if you’re writing about people. But vampires aren’t people, and a heightened sexuality has been intrinsic to the legend for centuries. Take away their sexuality, and you might as well take away the fact that they drink your blood.

And in fact, Meyer does that too. Her vampire heroes, the Cullens, are as abstinent from blood-drinking as they are from shagging. That’s hardly surprising, as the one is a crudely written metaphor for the other in Meyer’s world. The Cullens have to exert tremendous self-control to keep from drinking blood, as once they’ve started it’s almost impossible to stop. But just in case you didn’t get the profoundly obvious metaphor, simpering hero Edward Cullen literally refuses to have sex with passive heroine Bella – a shame, as her lust for him is the only thing about her that contradicts the 19th-century damsel-in-distress stereotype. In fact, Bella seems to spend 90 per cent of her time having to be rescued, if not by Edward then by thwarted would-be beau and werewolf Jacob Black.

But neither man wants to have sex with her. Oh no, that would send out the wrong message to the teenagers of America. Although the net result of their refusal, coupled with their tendencies (in the films) to stand around looking buff with their shirts off, is a presumably unintentional homoerotic tension that borders on the hilarious. Presumably there’s slash fiction out there in which Edward and Jacob finally consummate their feverish lust for each other – God knows, it’s probably better written than the actual Twilight novels.

So is this the final end for the vampire? From a terrifying walking corpse that wants to kill you and drink your blood to a toothless plaything for pale girls who don’t like to go out much and have a problem getting boyfriends? There are still shreds of hope. True Blood, the TV adaptation of the Sookie Stackhouse series, is a marvellously full-blooded and overblown Southern Gothic melodrama that makes Anne Rice look like Enid Blyton. It still follows the basic Buffy formula of an empowered heroine (Sookie is a telepathic waitress!) caught between a mopey, brooding vampire (Bill Compton) and a sexy blond bad-boy vampire (Eric Northman). But it’s set in a fascinating world where vampires and humans uneasily coexist, and written in a style like Tennessee Williams without the restraint. Not to mention that it features massively gratuitous amounts of sex, violence, swearing and drug abuse; the dark side of Twilight, the mere thought of it probably gives Stephenie Meyer palpitations. And like Buffy, it has a sense of humour – the cardinal sin of the Twilight series is that it takes everything about itself so bloody seriously.

And while we’re on humour, we Brits have waded in with Toby Whithouse’s excellent Being Human, on BBC3. The comic/dramatic tale of a vampire, a werewolf and a ghost sharing a flat in Bristol and trying to fit into normal society, it’s produced some genuinely chilling portrayals of vampirism mixed with moments of pathos and laugh-out-loud humour. Mitchell is another vampire trying to be human, but he keeps failing. He’s genuinely funny when out with his mates in the pub, or trying to hold down a menial job in a hospital; but when he gets really dark, as he does in series one when caught up in a vampire civil war, or in series two when he’s out for revenge, he is one of the most chilling vampires you’ll have seen for ages.

And the kinkier, more niche aspects of horror literature are fighting a rearguard action against the nauseating spectacle of the Twilight series. This is perhaps best demonstrated in the world of comics, where Steve Niles gave us the excellent (and extremely violent) 30 Days of Night (coincidentally adapted into a film directed by David Slade, who has just given us the latest Twilight movie).

So hopefully, this faddish adoption of a monster by insipid doe-eyed teenage emo girls is just a passing thing. The vampire’s been in the doldrums before, and always risen from the coffin again. All we need is to get the fangirls sexually interested in some other classic monster. I suggest they try going on a date with a flesh eating zombie…

Downing Street… The Final Frontier…

So, on Thursday evening I and a group of friends bravely gathered to boldly go where several men have gone before: to stay up all night watching the election, with only the aid of enormous quantities of alcohol.

Election coverage is always fun, as attested to by the numerous parodies of it produced over the years. We watched some of these to get us in the mood. Monty Python’s Election Night Special was followed by Blackadder the Third’s opening ‘Pitt the Younger’ episode, and then some vintage Party Political Broadcasts. Notable were the Green Party one, which seemed to consist solely of children being humiliated by having chemical waste dropped on them; the Conservative one that didn’t need words, just a montage of Maggie Thatcher being great set to stirring music; and the Conservative ‘car metaphor’ one, in which every party was represented by a car. Labour were of course an old-fashioned VdP Princess, the SDP/Liberal Alliance were (of course) a bubble car with two steering wheels, and the Tories somehow thought it would look good if they were an Austin Montego. Plainly they’d never driven one.

Then on to the real thing! It had the potential to be one of the most interesting elections in years, with the televised debates creating a swell of support for the Lib Dems and the other major parties heavily tainted by the expenses scandals – not to mention Labour’s 13 years of discrediting itself, preceded by the Tories’ 18. I and most of my friends were voting Lib Dem, and while not expecting them to actually win were hoping for a big increase in their share of the popular vote, and perhaps their number of seats. My young boyfriend had even been out canvassing for them and manning the local polling station.

9pm: we switched to Channel 4’s Alternative Election Night, which promised a ‘night of comedy’ relating to all things electoral. Unfortunately it was primarily presented by the annoying Jimmy Carr, a man who by dint of his very personality can make a good joke unfunny. On the bright side, he was accompanied by the ever-witty David Mitchell, and for some reason Lauren Laverne was there, perhaps as eye candy. A few varyingly funny routines were followed by a politically themed Come Dine With Me, a show that I actually can’t stand in the first place. It was amusing to see Derek Hatton squaring up to Edwina Currie yet again, whatever the context, though, and Rod Liddle, doing his usual impression of a supremely pissed off bloodhound, was entertainingly rude. Only Brian Paddick, the appropriate Lib Dem voice of reason, failed to make much of an impression.

But it was now 9.55pm, and time for the real thing. Over to BBC One we went, expecting it to be the best of the channels covering events. Immediately David Dimbleby popped up, as reassuring as a comfortable old armchair, generating an instant sense of security. Dimbleby would never steer us wrong, and surely in his capable hands the election coverage would be masterful and insightful.

Ever since Bob McKenzie introduced the Swingometer, election pundits have been trying to top this fairly basic way of patronisingly explaining events to the clueless viewer, and the advent of CG has allowed for an increasingly barmy selection of ways to realise the political situation as a largely inappropriate visual metaphor. This has tended to give election coverage an increasingly sci-fi feel as the years have gone on, and 2010 didn’t disappoint here. As soon as we saw that Dimbleby and co seemed to be wandering around the Operations Centre of Deep Space Nine, it was clear that this was going to be Star Trek: The Political Coverage.

And so it proved. For the first few minutes, sub-Next Generation music played continuously as Dimbleby introduced us to the crew. We met the Away Teams, who would be dedicatedly stalking the party leaders all night. Andrew Marr was assigned to David Cameron, while John Simpson had beamed to a location near Gordon Brown, and Kirsty Wark was to be genetically handcuffed to Nick Clegg. In the Holodeck was Jeremy Vine, ready to generate computer images to explain everything. Standing ready to scientifically analyse the incoming results was Lt Cmdr Emily Maitlis, who had been equipped with a giant touch-screen iPad to illustrate her points. This device, which made intrusive noises reminiscent of a TiVo whenever touched, was quickly dubbed the ‘iPlinth’ in our house, though variants such as ‘iBelisk’ cropped up on occasion.

In a ‘historic first’, Dimbleby then projected several giant phalluses onto the clock tower of the Palace of Westminster. These were to represent how near to the ‘majority line’ each party got as the night progressed. We settled in, beer and nibbles easily to hand, as it all began.

At 10pm on the dot, exit poll results popped up on screen. All immediately looked grim (including us). A Hung Parliament? With the Tories in the lead, and the Lib Dems actually losing seats? Surely not. Notes of caution were swiftly struck: “The real poll has yet to be revealed”. Sadly, the exit poll would turn out to be all too accurate.

Meanwhile, we cut to Andrew Neil, who, in a break from the Star Trek theme, was inexplicably hosting a showbiz party on a boat in the Thames, rather like the Sex Pistols famously did. Unfortunately for us, no police stormtroopers were on hand to break this party up, and we had to endure Neil soliciting the expert political opinions of the likes of Bruce Forsyth and Joan Collins. All the celebs seemed somewhat baffled as to what they were actually doing there, and Brucie even went into his “nice to see you, to see you… nice” routine as a kind of default fallback. Copious amounts of alcohol seemed to be on hand, so that by the time Neil sought the astute political analysis of Bill Wyman, the erstwhile Rolling Stone seemed incapable of speech.

We’d cut back to Neil at various times throughout the night, but back at Deep Space Nine, the real analysis was happening as results started to come in. Houghton and Sunderland South, eager to retain their record as first to declare, had enlisted teams of toned teenage athletes to pass the ballot boxes in a relay, which went down well in our house. The first few results were unsurprising: Labour, Labour, Labour. Safe Labour seats always get their results in quickly because of their urban nature, and we had to explain to our election newbie that this wasn’t the encouraging sign for Labour it might have seemed.

In the mezzanine, Chief of Security Jeremy Paxman was already on the attack. First to be grilled was the ever slimy Peter Mandelson. Paxman tried bravely, but trying to pin Mandelson down was like trying to get a chokehold on liquid shapeshifter Odo. He had better luck with Lib Dem Ed Davey, who was asked the supremely awkward question, “would you be prepared to get into bed with Peter Mandelson?”

In the Holodeck, Jeremy Vine was striding over a giant map of Britain while summoning up a huge vertical chart of each party’s ‘target constituencies’, complete with floating percentage indicators. It was already like being in a low-rent version of Avatar, but Vine would only get more bizarre as the night wore on.

Back in Operations, the ever-reliable Nick Robinson was on hand for any required punditry. Given the already evident Star Trek motif, Nick was inescapably reminiscent of the Emergency Medical Hologram from Voyager: “Please state the nature of the political emergency.” There were signs early on that Nick’s excitement was interfering with his appropriateness gauges, as he began to talk of “hot deals with the Ulster Unionists”.

We also saw signs of the other big story of the night beginning to emerge. It looked as though many potential voters hadn’t actually been able to get into polling stations. Some, particularly students, seemed to have been specifically excluded. Footage was shown of a bedraggled trail of voters trying in vain to vote in Nick Clegg’s constituency. “That’s the queue for a nightclub, surely?” exclaimed young James in our living room.

Back on Andrew Neil’s party boat, the political insight of Mariella Frostrup was being tapped. Mariella was worried; her concern was that “thoughts could be put into Gordon Brown’s hands”. Fortunately, a stunned-looking Ian Hislop was also on hand, to ask an actually pertinent question: why, he wondered, were there no percentage breakdowns for the exit polls? Neil, apparently unprepared for a genuinely relevant question, was nonplussed. “Percentages won’t help you”, he snapped, and immediately buggered off, taking the camera with him.

The politicians fall like dominoes! Back in the Holodeck, Jeremy Vine was explaining the effect of the expenses scandal with the metaphor of a giant CG domino chain in which every domino bore the face of a naughty politician. A tap of his finger and the virtual naughties fell in a nice pattern, littering the floor of Jeremy’s clean white void.

A quick glance at Twitter revealed that, apparently, no one was watching the ITV coverage. “Alastair Stewart could have just gone to bed” opined one Tweeter. Given the results we were now seeing, he actually would have been better off going to the pub.

It was indeed looking bad; looking, in fact, exactly as the exit polls had indicated. But there was still time for more pontificating. “George Osborne puts the ‘Shadow’ into ‘Shadow Chancellor’,” commented one of us as the hapless Tory gloated all over the screen. Meanwhile, election expert Prof Peter Hennessy had been dragged out from a handy cupboard to explain hung Parliaments: “The Queen is only activated under certain circumstances”. This produced the immediate image of the monarch as a Terminator-like cyborg, waiting patiently in a lab until the ‘hung parliament’ switch was pressed.

Exciting results were coming in. Gordon Brown, predictably enough, won his Kirkcaldy seat, but all eyes were on the weirdo candidate immediately behind him. Representing ‘Land is Power’, whatever that was, his bald, pallid, sunglass-clad visage was inescapably reminiscent of one of the Agents from The Matrix, and his arm was fixed in an inexplicable Black Power salute as the results were read out. Meanwhile, David Cameron was opposed by no less a personage than Jesus Christ, at least according to his outfit and beard. It was a sad indicator of how the night would go that not even the Son of Man could defeat Cameron. Book of Revelation, anyone?

We all flagged as the night wore on, hour after hour, and the exit poll more and more clearly became unpalatable reality. After a couple of hours’ doze at about 6am, it became clear that a Hung Parliament was indeed the result, and that the TV coverage would run for at least another day. Dimbleby took a couple of hours off, but Paxman and Robinson continued unstoppably. Even Andrew Neil was continuing to irritate, having abandoned his drunken, celebrity-filled party boat for a scenic pagoda on Parliament Square.

Not even we could continue watching election coverage indefinitely, and at about 3pm we gave up and went to the pub. But not before all the party leaders had shown up to make their hotly anticipated statements. Predictably, both Gordon Brown and David Cameron were seeking the support of the Lib Dems to make a workable government. Nick Clegg, to the irritation of many Lib Dem voters (myself included), stuck to his pre-election guns of offering to cooperate with whichever party had the most seats. It’s fair to say that a large proportion of Lib Dem voters find the Tory party and its policies hugely unpalatable, and despite his integrity I think Clegg risks losing a lot of his core support if he helps David Cameron out in any way at all. Sure, it’s a compromise that might help some of their policies into reality, but in my view the price of also realising Tory policies is too high to pay.

But to return to the coverage – and we did, from time to time – the result had rather tainted the TV experience (the most important aspect, surely?). An election without a clear result is like a sex act without a climax: it all seems to be building to something great that never happens. So we’re stuck with Dimbleby and co for days yet, probably, and an uncertain governmental future. In some ways, these are interesting times politically, with no clear resolution in sight and little constitutional precedent. It’s also clear that some form of electoral reform is vital to avoid this result in future. The only question is: will we get reform before we get another General Election?

What the people who read the papers say

I’m a big fan of overhyped, ill-informed media circuses – they can be so entertaining. And it was with a rosy glow of nostalgia that I followed the recent shrieking newspaper hysteria over ‘legal high’ mephedrone. Nostalgia because it almost looks like they just dug up some old articles on Ecstasy from the early 90s and changed some of the words.

Like Ecstasy, mephedrone has apparently become a staple of the club scene, and, like Ecstasy, it appears to have caused some high-profile casualties that the ravening press have seized on as mascots in their latest cause célèbre. It’s hard to forget the tabloid hysteria surrounding the Ecstasy-related death of Leah Betts in 1995; perhaps easier for many people to forget that she didn’t die as a direct result of taking the drug, but from drinking so much water that her brain swelled up inside her skull. Never ones to learn a lesson about responsible journalism, the press have leaped particularly on the recent deaths of two young men in Scunthorpe to bolster a crusade against mephedrone.

Without wanting to cheapen or denigrate the grief of these men’s families, it should be pointed out that every article on this story (including those from the usually responsible BBC) has ignored the fact that the men in question had also consumed large quantities of alcohol and methadone. The problem was compounded by the fact that ‘methadone’ sounds so similar to ‘mephedrone’ that a number of readers who did notice this seemed unaware of the difference.

I’ve taken to reading the online forums of various papers when I’ve a quiet moment at work, and what was surprising – and even encouraging – was that most people chiming in on the debate thought not only that banning mephedrone was a bad idea, but that banning any drug was a bad idea. Perhaps people genuinely are starting to think that, pragmatically, drug prohibition is an expensive, counter-productive waste of time. If that’s the case, for once the tabloids may have to change their tune. But will they? It’s a chicken-and-egg situation: do the papers form people’s opinions, or reflect them once they’ve formed?

Obviously, it was no surprise to find that leading the charge against what they insist on referring to as “meow meow” is that bastion of common sense, the Sun. Their insistence on calling the drug something which apparently no user ever would is in itself a clue to how ill-informed the paper seems. “Meow meow” has made many people recall Chris Morris’ classic Drugs episode of Brass Eye, which now looks prophetic in its depiction of Morris asking random dealers for ‘Clarky Cat’ and ‘Yellow Bentines’. The Sun have produced such calm, clear-headed pieces as ‘Legal drug teen ripped his scrotum off’, which comes as not much of a surprise, but I couldn’t help smirking at the usually earnest Times giving the world ‘Meow Meow Sank its Claws Into My Mind’. The ever-reliable Charlie Brooker has pipped me to the post with a much wittier article about the hysteria in his Guardian column, so I’ll content myself by stating my view on this ‘problem’.

Mephedrone almost certainly arose as an alternative to other, probably safer drugs which are criminalised. Ban it, as politicians seem intent on doing without thought, and another chemical compound will be synthesised to do the same job. I’ve done my fair share of drug experimentation, but I have no real experience of what this stuff is or what it does, so (unlike many journalists) I wouldn’t presume to speak from a position of knowledge. But as a relatively new substance, legal or not, it’s difficult to know what the risks of taking it are, and the Advisory Council on the Misuse of Drugs should certainly be doing a study. Unfortunately, as the sacking of its former chairman shows, they’re not going to be too keen to produce a study which contradicts the politicians’ and the press’s preconceived ideas concerning this substance.

The bottom line is this: drug prohibition does not work. From a purely pragmatic viewpoint, there is a demand for ‘drugs’, and has been for thousands of years. And where there is a demand, there will be a supply. Make something illegal, and the people who provide that supply are criminals. America’s dalliance with banning alcohol in the 20s is the textbook example, and yet people still fail to learn from it. The relatively benign cannabis is seen as a ‘gateway’ drug – this might be true, but only because you have to buy it from the same shifty dealer who also sells crack and meth. Imagine if you could buy it from your local newsagent, like that government-approved narcotic, tobacco.

So, again from a pragmatic viewpoint, the only way to properly control drugs is to legalise them. All of them. Educate people about them, regulate their sale, and above all, tax them. The benefits are obvious, once you get off your moral high horse. People will get the drugs whether they’re legal or not – if they’re legal, the quality is guaranteed, you save billions in ineffective anti-drug enforcement and gain billions in taxation, with the added benefit that organised crime would be crippled overnight. The anti-drugs campaigners in this country and the US love to bang on about how drug sales fund terrorism – given the quantity of poppies grown in Afghanistan, they’re probably right. So, want to win your ‘War on Terror’ overnight? Take control of their funding by selling the product yourself.

There are any number of other arguments in favour of legalisation, but in the interests of even-handedness, I tried to come up with some logical objections, not produced by the knee-jerk moralising you might see in the Daily Mail. There are a couple of things that count against overall legalisation. Firstly, it might give people the idea that the drugs are now, somehow, ‘safe’. This is the real problem with mephedrone – its legal status seems to convince people that a relatively untried substance won’t cause the sort of damage the illegal ones do. But this is the point where education could step in. After all, we all know how bad for us tobacco is. If you somehow missed that at school, the stark ‘Smoking Kills’ notices on the packets should clue you in. If a health warning’s good enough for Marlboro, why not for crystal meth?

Secondly, it will make actually getting the drugs easier. This may sound like I’m switching position, but actually the illegality of most drugs does tend to make it difficult to get hold of them. Buying from the chemist is considerably easier than locating a dealer, gaining his trust, and running the gauntlet of potential prosecution to purchase something which is probably cut with baby powder anyway. Even so, by removing the rebellious glamour of a drug’s illegality, you’re probably removing a lot of its temptation in the first place. Want to stick it to ‘The Man’? If he’s the one selling the stuff, you’re not going to look like any kind of anarchist buying it.

Don’t get me wrong, I’m not saying that excessive drug use is a good thing. But you could say the same about alcohol abuse, and nobody’s calling for booze to be banned. It’s a sad thing that people feel the need to fill some perceived void in their lives by altering their minds with any substance, be it LSD or Guinness. But, again pragmatically, if people are going to do it anyway, let’s at least try and make it as safe as such an activity can be.

Sadly, while the people posting to the online debates understand the hypocrisy of legally selling the far more dangerous tobacco and alcohol, there still aren’t enough people vocal about this to give any politician the courage to even mention it. Even if they did, it would only work if it were a worldwide policy, and the US is even more unlikely to put away its emotions and think logically about it. But one thing’s for sure – take away “meow meow” and something else will leap up to take its place. Perhaps we could call it “Shatner’s bassoon”…