Some Yoricks

Posted in Uncategorized on June 8, 2009 by colindickey
Laurence Olivier as Hamlet

NPR and others reported last week on the recent staging of “Hamlet” with Jude Law in the title role; Law, ever the Stanislavsky man, requested the use of a real skull for the Yorick scene. As NPR explained:

Actor Jude Law is appearing in London’s West End as Hamlet, using a real human skull instead of a fake one. The production bought the skull for $400 from a dealer in Salt Lake City. Barry Edelstein, director of The Public Theater Shakespeare Initiative, says this is not the first time a real skull has been used in a production of the Shakespeare play. “Some actors want to go for authenticity at all costs, and if that means having a real human skull in their hands when they are speaking to Yorick, they’re going to do what they can to make that happen,” he tells NPR’s Melissa Block. Edelstein says that sometimes having a real skull can make a real difference to the actor. “It’s very much like an actor in a film who’s going to play a policeman, saying, you know, ‘I want to ride around with some cops on the streets of New York for a couple of nights,’” he says. But, Edelstein says, Shakespeare wanted the skull in Hamlet to be handled roughly, and Edelstein says he wouldn’t want his own cranium to be knocked around for all eternity. Still, that doesn’t prevent people from donating their skulls to theater companies. The main donors: actors, Edelstein says. “Theater people can be odd sometimes,” he says.

Is it really the same as riding around with a cop for a few nights?  When I was writing the skull book, a couple of people offered to get me a real skull for contemplation purposes (thanks, mom), but I declined.  It’s not that I’m unnerved by the presence of a real head, but it implies a level of responsibility that I wasn’t yet ready to accept.

Anyway, Law’s experience with the anonymous Salt Lake City skull calls to mind various other uses of real heads for the Yorick prop, including this rather famous story, which appeared last year in the Daily Telegraph:

When André Tchaíkowsky died of cancer in 1982 aged 46 he donated his body for medical science. But he added the proviso that his skull “shall be offered by the institution receiving my body to the Royal Shakespeare Company for use in theatrical performance.” Since then it has only been used in rehearsals because no actor felt comfortable enough using it on stage in front of an audience. David Howells, curator of the RSC’s archives, said: “In 1989 the actor Mark Rylance rehearsed with it for quite a while but he couldn’t get past the fact it wasn’t Yorick’s, it was André Tchaíkowsky’s.” Now, unbeknown to the paying public, Dr Who actor Tennant has used the skull in 22 performances of Hamlet in Stratford-upon-Avon. Director Greg Doran explained why he didn’t want anyone to know. He said: “I thought it would topple the play and it would be all about David acting with a real skull.” Polish-born Tchaíkowsky was smuggled out of the Warsaw ghetto in 1942 to the city of Lodz, before settling in Paris and later England. He lived in Oxford for a time and loved going to the theatre in Stratford-upon-Avon. The skull will now travel with the Hamlet production to the Novello Theatre in London.

Interestingly, Doran’s comments turn out to foreshadow the story of the Law production: the news stories about this most recent performance (see also this one in The New Yorker) are not about the production itself but are (to paraphrase Doran) “all about Jude Law acting with a real skull.”

Of course, it also depends on whether or not the identity of the skull is known; Mark Rylance couldn’t get past the fact that he knew his prop’s identity—Tchaikowsky—whereas Law’s skull is nameless, anonymous. But why wouldn’t Law (or anyone else) have similar compunctions? Whoever’s skull it is, it’s not Yorick’s.

Some Distinguished Skulls

Posted in Uncategorized on May 18, 2009 by colindickey

I’m finally getting around to posting some surreptitious shots of the heads of Johann Spurzheim and Phineas Gage; the museum that houses these fine craniums doesn’t allow photography, so these were taken by clandestine means, hence the poor quality.

The skull of Johann Spurzheim

One thing you don’t get a sense of here is the size of Spurzheim’s head; that man had a huge cranium. This was, of course, important for phrenology, and for the prevailing notion that brain volume could be directly correlated with intelligence. Spurzheim’s teacher, Franz-Joseph Gall, the founder of phrenology (or organology, or cranioscopy—whatever you want to call it), had an embarrassingly small head; his brain was recorded as weighing a mere 1,198 grams, well below the supposed average of 1,400 grams for a European white male and even below the averages claimed for Africans or Native Americans (these “averages,” of course, are total nonsense; as Stephen Jay Gould demonstrated—and as I discuss in the book—they were contrived to conform to a pre-existing belief in the hierarchy of races). Spurzheim, however, had a good-sized brain weight of about 1,700 grams, thus “proving” his genius. And it’s not hard to believe, when you see his head, which dwarfed the other skulls on display.

Gage is interesting: he was the first person to accidentally receive a frontal lobotomy, when an iron rod was blasted through his skull. The following description I took from Wikipedia (Gage isn’t in the book, though perhaps he should have been); it’s not for the squeamish:

On September 13, 1848, twenty-five-year-old Gage was foreman of a work gang blasting rock while preparing the roadbed for the Rutland & Burlington Railroad outside the town of Cavendish, Vermont. After a hole was drilled into a body of rock, one of Gage’s duties was to add gunpowder, a fuse, and sand, then compact (“tamp down”) the charge using a large iron rod. Possibly because the sand was omitted, around 4:30 PM: “the powder exploded, carrying an instrument through his head an inch and a fourth in [diameter], and three feet and [seven] inches in length, which he was using at the time. The iron entered on the side of his face, shattering the upper jaw, and passing back of the left eye, and out at the top of the head.”

Nineteenth-century references to Gage as “the American crowbar case” can be misleading. For Americans of the time a crowbar did not have the bend or claw sometimes associated with that term today. Gage’s iron was something like a javelin, “round and rendered comparatively smooth by use”: “The end which entered first is pointed; the taper being [twelve] inches long…circumstances to which the patient perhaps owes his life. The iron is unlike any other, and was made by a neighbouring blacksmith to please the fancy of its owner.” Weighing 13-1/4 lb (6 kg), this “abrupt and intrusive visitor” was said to have landed some 80 feet (25 m) away.

Amazingly, Gage spoke within a few minutes, walked with little or no assistance, and sat upright in a cart for the 3/4-mile ride to town. The first physician to arrive was Dr. Edward H. Williams: “I first noticed the wound upon the head before I alighted from my carriage, the pulsations of the brain being very distinct. Mr. Gage, during the time I was examining this wound, was relating the manner in which he was injured to the bystanders. I did not believe Mr. Gage’s statement at that time, but thought he was deceived. Mr. Gage persisted in saying that the bar went through his head….Mr. G. got up and vomited; the effort of vomiting pressed out about half a teacupful of the brain, which fell upon the floor.”

Dr. John Martyn Harlow took charge of the case about an hour later: “You will excuse me for remarking here, that the picture presented was, to one unaccustomed to military surgery, truly terrific; but the patient bore his sufferings with the most heroic firmness. He recognized me at once, and said he hoped he was not much hurt. He seemed to be perfectly conscious, but was getting exhausted from the hemorrhage. Pulse 60, and regular. His person, and the bed on which he was laid, were literally one gore of blood.”

The fact that Gage survived this unfortunate incident and went on to live another twelve years became a watershed in modern neurology. Many friends and commentators noticed that Gage’s behavior changed dramatically after the accident (though reports differ on how extensive the change was), and it was Gage who first gave anatomists the notion that behavioral tendencies could be altered by affecting different parts of the brain. For many, Gage’s experience ultimately led to the discovery and implementation of frontal lobotomies.

The skull of Phineas Gage

From the photo you can see the path of the iron rod: it went up through the bottom of his head, under his left cheek, angled slightly, coming out the top, where the flap of bone subsequently re-grew. I had long heard an apocryphal story that the pathway of the rod was somehow cleared and maintained, and that Gage subsequently toured as a circus freak, allowing people to insert things through the length of his injury. I can’t remember how long ago I heard that story, or how long I’ve been carrying it around, but clearly no such performances took place. One of the lessons one does learn from Gage’s skull, though, is the way in which medical lore and mythology can quickly develop and take on a life of their own.

The Obsolescence of Mourning

Posted in Uncategorized on April 28, 2009 by colindickey

As Kate mentioned regarding vampires, it may be that a need for mourning has become “culturally irrelevant” these days. Along those lines, I was actually thinking of W. G. Sebald’s essay “Campo Santo,” which discusses mourning practices in Corsica: “The doors and shutters of the house afflicted by misfortune were closed, and sometimes the whole façade was painted black. The corpse, washed and freshly dressed, or in the not uncommon case of a violent death left in its bloodstained condition, was laid out in the parlor, which was usually less a room intended for the use of the living than the domain of dead members of the family, who were known as the antichi or antinati. This was where, after the introduction of photography, which in essence, after all, is nothing but a way of making ghostly apparitions materialize by means of a very dubious magical art, the living hung pictures of their parents, grandparents, and relations either close or more distant, who although or even because they were no longer alive were regarded as the true heads of the family.” In cataloging these archaic rituals of mourning, Sebald subtly links that work to photography, suggesting that the photograph has replaced the corpse in our lives, and that the act of looking at photographs (or being looked at by them) has replaced the act of mourning. He thus concludes his essay with a paraphrase of Pierre Bertaux: “To remember, to retain and to preserve, Pierre Bertaux wrote of the mutation of mankind even thirty years ago, was vitally important only when population density was low, we manufactured few items, and nothing but space was present in abundance. You could not do without anyone then, even after death. In the urban societies of the late twentieth century, on the other hand, where everyone is instantly replaceable and is really superfluous from birth, we have to keep throwing ballast overboard, forgetting everything that we might otherwise remember: youth, childhood, our origins, our forebears and ancestors.”

So, then, perhaps we don’t need mourning in the way we once did, but in this regard Jonathan Shay’s “Achilles in Vietnam” becomes vitally interesting: in it Shay (a psychiatrist who works with veterans suffering from PTSD) argues that the major components of PTSD stem from the loss of rituals once associated with war. Turning to “The Iliad,” he traces the way in which soldiers were openly mourned by their comrades (for example, Achilles’ funeral games for Patroclus), and suggests that modern military training has rechanneled this need for mourning into a “berserker rage,” such that when one’s comrades are killed, one is encouraged not to weep or mourn for them but to turn on the enemy that much more savagely, substituting bloody revenge for loss. Shay calls for a recognition of the importance of mourning and the need to incorporate it into the military as a means of healing psychological wounds among those subjected to combat.

Certainly of all those in our society whose loss we fail to mourn, none is more acute than veterans, who are given treacly tributes and nonsensical platitudes by politicians and newscasters before being swept under the rug. And to continue Sebald’s theme of linking mourning with photography, we as a culture were even (and most pointedly) denied the images of the returning dead—as if to say, there’s no corpse here whatsoever; you can have war without the dead body, without even its ghostly resonance in the photograph.

Joe Dante’s 2005 film “Homecoming” remains to me, in this light, one of the best ideas and biggest missed opportunities along this theme: in it the Iraq war dead come back to life as zombies and plague the United States. They don’t want to eat brains; they only want to vote. The movie too quickly devolves into cheap shots which, while cathartic (the Karl Rove stand-in having his head devoured by zombies is priceless), miss, I think, a larger anxiety about the way in which we as a culture produce dead bodies in search of abstract symbols, and about what to do with those bodies once mourning has been foreclosed to us. It may be that they’re still coming back as vampires and zombies, and that we’ve come full circle once more.

Dead Soldiers Returning from Iraq

Brick Eaters

Posted in Uncategorized on April 22, 2009 by colindickey

Vampire Skull

Several weeks ago National Geographic reported the discovery of a “vampire” skull: that is, the remains of a plague victim into whose mouth a brick had been stuffed, a form of postmortem exorcism that would make these the first identified remains of a suspected vampire: “as the human stomach decays, it releases a dark ‘purge fluid.’ This bloodlike liquid can flow freely from a corpse’s nose and mouth, so it was apparently confused with traces of vampire victims’ blood. The fluid sometimes moistened the burial shroud near the corpse’s mouth, enough that it sagged into the jaw, creating tears in the cloth….Italian gravediggers saw these decomposing bodies with partially ‘eaten’ shrouds….” In such a situation, a brick was inserted to keep the corpse from eating and spreading the plague (other techniques included burying the body of a suspected vampire up to its chin in dirt so that it could not eat its way out of the grave).

I remember learning these various origin myths of vampires when I was in high school: I was told, for example, that malnourished people (plentiful in the Middle Ages) who were acutely iron-deficient could smell iron in the air, and thus when someone accidentally cut themselves, the iron-deficient might literally smell the iron in the blood and begin uncontrollably to salivate—thereby perpetuating the notion of a bloodsucker. Likewise, extreme curvature of the spine (through scoliosis or similar) in a particularly hirsute individual might give the mistaken impression that this person was turning into a four-legged, furry animal—lycanthropy. In other words, the most fearsome were those who had poor eating habits.

The National Geographic story follows a similar trajectory, providing an eminently plausible physical process that could be mistaken for something far more sinister. But there are other ways to think about the origins of vampires, which lately I’ve found a bit more compelling. Lawrence Rickels’ The Vampire Lectures (which, I hasten to point out, is nearly unreadable—stuffed with way-too-arch puns and witticisms juxtaposed uncomfortably alongside theory jargon, and lacking much in the way of cohesion; I could barely get through it) points out that vampires come from a rather different source: not so much the un-nourished as the un-mourned. Prior to Stoker’s Dracula, when much of the modern mythology of vampirism was solidified into its most recognizable tropes, vampires were a common and recurrent feature of most cultures. Rickels describes any number of individuals who might be suspected of being vampires: suicides, for example, or widows and widowers, were likely to be accused, postmortem, of vampirism, as were unbaptized children or apostates. Likewise with entire families that perished at once (say, in a house fire), or bachelors, who were particularly suspect. What Rickels points out is that in each case what unites these candidates is the lack of a surviving family or community who could properly mourn the dead. It’s this lack of mourning that makes a given corpse dangerous, and thus casts the suspicion of vampirism on it. The connection of vampires to plagues therefore makes a certain amount of sense, since a plague would wipe out a large enough segment of the population to obviate mourning. Rickels writes that “vampirism not only serves the exclusion of the different (a kind of double exclusion of whatever is already on the margin), but that it also always covers the need to mourn. That the vampire is someone who was buried improperly also meant…that this special someone was not mourned properly.” Exorcism was a kind of substitute for mourning, then; the dead body demands some kind of rite, one way or another.

It would be great to rescue the vampire from the treacle of kitsch and nonsense it’s suffered of late—from True Blood to Twilight to Blade, ad nauseam—and at least make an attempt to reconnect to this older question of mourning, and how we treat the dead body. Since Stoker the vampire has been thought of almost exclusively in terms of a very obvious sexual metaphor—not just Dracula and Twilight but Interview with the Vampire and too many others to mention—and while that’s great as far as it goes, it seems that there’s a certain potency in marrying this anxiety about sexual contact and transference with an equally troubling anxiety about the corpse and its contagion upon the living.

(Rickels also mentions that the brothers of somnambulists were also candidates for vampirism. I’m still puzzling that one out, though I’m thinking it’s going to make a good title for the eventual work on vampires that I get around to writing.)