Friday 24 July 2009

"Psychoville" or Why Thursday is the Best Day of the Week

Somebody said that the first three Star Wars movies changed their life … and the second three changed it back again. So it was with nervously crossed fingers that I sat down to watch "Psychoville" six weeks ago. There was always the possibility that Reece Shearsmith and Steve Pemberton had spent their "League of Gentlemen" interregnum drinking venti froppamochapoppachinos in fashionable Soho venues with John Lydon and Ben Elton, while rigorously keeping up their gym attendance so the ‘Hello!’ piccies would show them and their fragrant families at their best. You can’t blame a person for seizing their success in this life; it’s just that when you realise they are left with as much edge as a mildly underinflated beachball, the wincing can cause neuralgia.

I LOVED "The League of Gentlemen". Certainly, it was just plain funny: no-one could forget Chinnery the vet causing a farting dog to land in an open fire, where it exploded in a convulsion of methane and shit. But its main quality was its grotesquery, the best gargoyles on TV since Rigsby, a quality that has not appeared in such a distilled form in the medium before or since. (If I’m wrong about that, please correct me – I’ll instantly buy whatever you recommend).

Medieval grotesques were incongruous chimeras, strangely (to us) placed in churches as humour, as residual traces of fertility symbolism or as hideous deterrents to evil forces. Perhaps the medieval mind more successfully lived the integration of the entirety of human experience than the post-Renaissance eras, which sought to impose an order of harmony and beauty, an attempt which must ignore much of the ugliness, degradation and misery that walks hand in paw with the kindness and beauty of any normal life. It took the Romantics and the Gothic movement to elevate these dark sides of our lives to a virtue, a necessary facet, to enjoy the frisson of the darkness. This modern grotesque celebrates moral choices and conflicting agendas, between and within individuals.

I think that at the core of the grotesque is firstly that not everything is as it seems: beauty is not necessarily beautiful, ugliness is not necessarily ugly, virtue isn’t always virtuous. And secondly, no spectator is permitted to enjoy a pure emotion for more than a few seconds before the situation is turned around into another. If laughter (as it seems to be) is an expression of tension followed by resolution of the incongruous in a peculiar way, then Pemberton and Shearsmith are masters. They relentlessly alternate the mundane with the atmospheric or exalted to produce a disturbing hybrid: two clowns (in full makeup) in a broken-down clown car discuss who is blackmailing them while the breakdown mechanic clunks behind them, causing the car to release a fart of bubbles; the same clowns also have a desperate fight based on a profound and hateful grudge … in the ball pit at a children’s play park; a cantankerous blind old doll collector sits in a dusty, empty mansion and lifts a chocolate bar to his ear when the phone rings (“Nobody ever phones me on this thing,” he later says plaintively); a woman rushes home from the theatre before the final curtain to breastfeed a doll; a serial-murdering old crone takes the time to play Black Lace’s “Superman” to improve her son’s mood after he has done one of his “strangles” (they do the actions – you really have to see this) …

I won’t spoil any more (although you should look out for the musical number starring Jack the Ripper). Just watch “Psychoville”. It’s great entertainment but it’s also among the best of gothic art, whose point, I suspect, is that the only real ugliness is human moral failing.


PS. If you like “Psychoville”, I can’t recommend “Bad Santa” highly enough.

Tuesday 21 July 2009

The Vampire of Venice (… and, incidentally, why vampires visit in the autumn)

Max Clifford must be moonlighting (ha, ha!) on behalf of the Honourable Society of Vampires, as they’re getting some pretty highbrow press these days. I’m only half joking: the New Scientist carried a piece about a Venetian ‘vampire’ that had a few of their website commentators groaning (in effect) that the publication was carrying the academic equivalent of Jade Goody’s wedding. (A little dismissive of Matteo Borrini of the University of Florence, who presented his thoughts at the American Academy of Forensic Sciences meeting in Denver.)

However, the reality isn’t that bad. The woman found on the small island of Lazzaretto Nuovo in the Venice lagoon could well have been regarded as a ‘vampire’ within her own era, and here’s how.

Let’s start with the term ‘vampire’. It entered English early in the eighteenth century, accompanying a rash of reports from the creatures’ native countries. The word is Magyar (Hungarian), via Slavonic. Regional variations such as "upir" in Russia, "upier" in Poland and "vepir" in Bulgaria all relate back to "vampire". It described a type of revenant – a recently dead person who returned and threatened the well-being of the living. Flashing fangs and sucking direct from the jugular still lay a hundred years in the literary future at this point; only a few revenants were cited as physically rising from their graves. The rest lay in the ground and did their work ethereally. But all could be identified by the fresh blood found at their mouths when disinterred:

“Not without astonishment, I saw some fresh blood in his mouth, which, according to the common observation, he had sucked from the people killed by him”

Imperial Provisor, Gradisk District
about the vampire Peter Plogojowitz in 1725

There were also differently named revenants from other areas whose activities and identifying features were the same as those of the vampire. The German undead were called "nachzehrer" (roughly, "after-devourer"), for example, and their first tendency was to consume their own shroud and extremities before they moved on to their families. Here we can already see that 'vampire' is only one word for a more widespread phenomenon. It may not have been the most appropriate word for a Venetian revenant, but it conveys a sense of the species we're dealing with.

There are two interesting themes here: one is that the revenant’s own extremities were the first to fall prey to his preternatural hunger. The second is that the next natural group of victims was the nachzehrer’s own kin.

To deal with the finger-munching first: in Germanic areas at one time, it was a popular belief that:

"certain persons after death chew in their graves and demolish anything that is near them, and that they can be heard munching like pigs"
Dom Augustin Calmet

This munching occurred especially at times of plague.

Two German publications, "De Masticatione Mortuorum" (That the Dead Chew) (1679) and "De Masticatione Mortuorum in Tumulis Liber" (That Dead Men Chew in their Coffins) (1728), both discussed the phenomenon. One recounted that a woman in Bohemia (modern Czech Republic) ate half her winding sheet in 1345. In the time of Luther, a man and a woman buried at the same time had "gnawed out each other's vitals" (entrails). In Moravia (also modern CR) a dead man had devoured the winding sheet of the woman buried next to him. In areas of Germanic influence, the dead often had stones and gags put in their mouths at burial to stop them biting in this way. There are other funerary practices in other places which speak of immobilising the mouth, suggesting that the issues faced by people in central Europe were shared over a wider territory. The Venetian vampire was found with a 'small brick' in her mouth. Without a better model, it seems people may have felt that impeding post-mortem munching also impeded the metaphysical act of 'vampiric' draining which would, in turn, lead to more death.

Generally, the corpse's bodily parts that were cited as consumed were the parts that one would expect to fall victim either to decomposition (such as finger ends), or carrion eaters (such as entrails).

And why were the smacking, chewing and snuffling sounds especially heard at times of plague? During times of epidemic death, corpses mount up at an unmanageable rate, so burials are likely to be closer to the surface, where higher temperatures can foster bacterial activity and where the resultant popping and movement can be heard. The Venice vampire herself was removed from a mass grave of victims of the plague of 1576, attesting to the dire funereal efficiencies necessary in such dreadful times.

And on to the second matter, of families: they are the natural food of the recently dead in folklore, because epidemic death is contagious. Those in closest contact with the recent victim and his pathogens were prime candidates for disease, an easily observable fact even before the germ theory provided a scientific model of pestilence.

The National Gallery houses a painting by Canaletto (about 1735) which depicts the celebrations of the feast day of St Roch in commemoration of those who died in the 1576 outbreak (which regrettably included Titian). Thanks were also given to the saint for his intercession in bringing the pestilence to an end. St Roch was regularly invoked in relation to the plague and his feast day was August 16th, but whether this is thought to be an anniversary of the end of the outbreak is unclear. An end to an outbreak that early in the year would certainly be unlikely if Venice were a little more northerly.

Plague - and vampires - were notorious in Europe for peaking in the autumn, as a consequence of the fact that plague’s natural vector (the rat flea) thrived during the heat of the summer months, when rats could eat and breed easily. Add a little time for incubation and a little more for the disease to transmit and peak through a human population and there you have it ... why vampires visit in the autumn.

Thursday 16 July 2009

Karen Armstrong: Should We Believe in Belief?

On Monday, Karen Armstrong addressed the Guardian’s question “Should we believe in belief?” with the notion that there has been:

“an accident of history that has distorted our understanding of religious truth”

That accident was the emphasis of the western European Christian tradition upon doctrine rather than practice. Most other religious traditions, apparently, put practice first:

“all good religious teaching … is basically a summons to action”

This statement is, in fact, a bias flying just under the radar. It amounts to: “a religious approach that I don’t like (perhaps doctrinal) is a ‘distortion’ and ‘bad’ religion and therefore not eligible for consideration”.

There has been enough ‘bad religion’ of this variety for most of us to have a few examples to hand. But surely it’s not acceptable to refuse to consider it as valid religiosity. Are we to know ‘real’ religion only by its benign consequences, by whether religious apologists wish to own it or not?

I’ve read a couple of Karen Armstrong’s books and enjoyed their historical content, especially ‘A History of God’. But I think her most basic thinking falls down on three issues, and all are detectable in her reply to ‘Should we believe in belief?’:

Firstly, where one could usefully start with ‘what is religion?’, Armstrong starts with ‘what is religion for?’, and thus confuses causality on many issues, morality and compassion chief among them.

“All religions are designed to teach us how to live, joyfully, serenely, and kindly, in the midst of suffering”

she said in an interview for the same paper in 2007. I beg to differ. Loudly!

The primary quality of religion (or any other non-empirical belief, for that matter), is that it is a metaphysical model of the world. All other things must be secondary qualities. Morality and compassion are certainly not primary features of all religious practice, as burnt witches, religious terror victims and heresy defendants will attest.

This leads to the second point that, for Armstrong, religious practice trumps doctrine:

“Religious doctrines are a product of ritual and ethical observance”

But how can this possibly be? If we started with practice and evinced doctrine therefrom, why would eating a wafer in the name of Jesus Christ be better than picking lilac flowers in a clockwise direction, or hopping every morning until you see a black and white dog? Religion can be something you do, but there must be a set of a priori assumptions in order to inform your practice – a metaphysical map of the universe that instructs your deliberations. This is why religion is not intrinsically moral (although it often is, coincidentally). It isn’t unreasonable to regard practice as the principal religious activity for many, but just because the underlying assumptions or doctrine are sometimes unconscious, it doesn’t mean they’re not there.

Thirdly, I believe that Karen Armstrong makes the error of assuming that all people function religiously in the same manner she does. Perhaps that’s where her touching view of religion as benign comes from. I think she too often regards religion as necessarily an attempt at experience of the transcendent.

Over a century ago, William James suggested that the religious impulse in humans was informed by religious ‘geniuses’ and transmitted to us more earthbound creatures in duller, more suitable forms. Allport and Ross (1967) proposed different religious types: ‘intrinsics’, who are more prone to personal numinous experiences, and ‘extrinsics’, who, for example, see the social value in attending church. The meaning of their scale (and others like it) is vigorously disputed, but the whole area highlights a fact easily observed: that different people receive different types of gratification from religion, and that these sometimes dictate the type of doctrine and observance to which they are attracted. In addition, analysis of transcendent experiences suggests that biology may have a role to play in different levels of receptivity to the transcendent experience: some people may simply not be wired that way. Where one person may experience the divine in prayer, another may find comfort in the conformity that rigorous religious practice brings to his community.

Armstrong’s proposed tension between ‘mythos’ and ‘logos’ makes an appearance too. She claims it as the source of fundamentalism: an attempt to change a tyre with an egg whisk, an inappropriate tool. But it should be remembered that the Roman Church was arguing with Copernicus about physics well before the Enlightenment, and has surrendered its province over most matters slowly and painfully. It has learnt a lesson that has yet to be learned by the eager young evangelical Protestant churches which still foolishly challenge science on its own turf. As Armstrong says:

“in ‘creation science’ we have bad science and inept religion”

Perhaps Armstrong’s insistence on the importance of religious practice derives from her own Catholic background. The Reformation, after all, was fought in substantial part upon the issue of whether your passport to heaven was dependent upon good works (the Roman approach) or predestined/based upon the acceptance of Jesus as your saviour (the Lutheran/Calvinist approach).

Newton and Descartes may have claimed it was possible to prove God's existence, as Armstrong says, but Newton also believed in alchemy, a belief we no longer share. The beliefs of scientists are not always science. Science stands apart from any one person or supernatural entity, and that is its strength.

Armstrong's insistence on religion as a force for good is touching. If only it were true.

Sunday 12 July 2009

Should we believe in belief?

This week, the Guardian will be asking “Should we believe in belief?”

The postulate is that ‘belief’ (for which read ‘religious belief’) does a certain amount of good, and that fact alone is sufficient for its encouragement. This holds even when the substance of the belief is regarded, with good reason by many, as spurious - a ‘noble lie’.

The Guardian credits Daniel Dennett with the modern backlash against ‘belief in belief’, an affliction of many, but hypocritical in ‘soft atheists’. His position is challenged on two grounds, both, as far as I can see, problematic.

The first challenge is that societies need myths for cohesion and a sense of purpose. Perhaps without the binding effect of religion/s, the national sense of ennui would overcome whatever structure, cohesion and morality we have.

But a cursory glance at mythology would tell you that not all myths are religious. The mythic approach seems to be as near to intrinsic to the human condition as you can get, and stripped of our supernatural models we would still look for mythic constructs. A societal consensus of meaning does not need to be supernatural to be compelling. This one falls over quicker than a giraffe in high heels. For that reason it’s not one that I expect this week’s religious commentators to tackle on the nose.

The second challenge is that atheists are patronising in assuming that their approach is correct and that others are at fault for holding one of their own. As the Guardian puts it:

“There is no room in Dennett's scheme for "I think you're wrong, but I cannot prove this, and entirely accept your right to be wrong".”

Ignoring for a moment that science does not set out to prove negatives, it is fair to say that Dennett is not a fluffy atheist. But to embody the whole movement in the approach of one man is unnecessarily harsh on the rest of us. For many (perhaps most?) atheists and free-thinkers, there is sufficient room for the unproven (and unprovable) mental constructs of fellow human beings. After all, I regularly hear the mistaken postulate that Eddie Izzard is the sexiest man in the world - many thanks to my good friend Lisa - when we all know that it’s actually Brad Pitt.

I believe there is plenty of room for the acceptance of others being ‘wrong’, subject to two provisos:

Firstly, an acceptance of the right to free speech. Rights and protections are for humans, not ideas. I can hear Christian evangelists shouting their propaganda on my local high street on Saturdays, and they must accept my right to speak as freely. Religion is not a special case and there is no place for the righteous wounded, a posture which represents the change from a position of strength (“You may not blaspheme or I’ll burn you”) to the status of victim (“How could you? This is a sacred area which you shouldn’t touch”). Many mythographers, incidentally, have tried to claim the same exception: that religion can only be commented upon meaningfully from 'inside'. This approach has been rightly identified on more than one occasion as nothing more than professional protectionism.

Secondly, where others’ unprovable metaphysical models have influence in the public domain, the empirical approach must prevail. I have personally spoken to people who truly believe they have been sexually assaulted by demons. I believe that they believe, respect their right to believe, and don’t even think they’re mad. But I assert my right to disagree with them, and don’t think that the existence of demons should be taught as fact in science lessons, taken as read in legislation, or that demon-slayers should be given grants of public money.

Believing in belief must be, in my opinion, a strictly private affair. I look forward to reading the commentaries this week.