It is St George's Day in England today, although why St George should have become the patron saint of England is something of a mystery. However, he is a man who gets about a bit, holding a revered status in places as diverse as Georgia, Egypt, Bulgaria, Catalonia, Rio de Janeiro, Lviv, and Victoria, among many others too numerous to mention.
According to tradition, George (Georgios) was a Greek who was born either in (Roman-controlled) Greek Asia Minor or in the Palestinian city of Lydda (also part of the Roman Empire) in the latter part of the third century AD. Like his father before him, he joined the Roman legions and, so tradition would have it, rose to the rank of Tribune under the Emperor Diocletian; his father is noted as being one of Diocletian's favourite generals.
Diocletian tends to get a bit of a bum deal as an emperor, largely because, in 'Christian' countries, he is seen as a major persecutor of the early Christians. However, he almost certainly saved the Roman Empire from imminent collapse in the late third century with his reforms of the army and the bureaucracy. He was also, as far as I know, the only Emperor to have willingly given up the position 'pour cultiver son jardin'*, although he was, in effect, only the ruler of one quarter of the Empire, having given day-to-day control of the remaining portions to three 'mini-Caesars'.
In 302 AD, Diocletian, a devout pagan, made his first move against the religious beliefs which he believed were undermining the Empire by persecuting the Manicheans, followers of the 'prophet' Mani, a third-century religious fanatic who had preached a gnostic religion with its roots in early Christianity and Judaism but which came with all kinds of added enhancements revealed to him by divine revelation; usual story of a prophet there then. Following the persecution of the Manicheans, Diocletian moved on to the Christians.
It is, to my mind at least, unclear whether Diocletian originally intended to martyr the Christians or merely confiscate their property and bar them from office in the army and the Roman administration. It does not much matter now, as their fate was to become the same as the Manicheans': death... and the confiscation of their property.
Tradition has it that George was raised a Christian by Christian parents and therefore, likely as not, would not engage in apostasy to try to salvage his position as Tribune. Needless to say, he did not, and was put to death after the obligatory torture; the Romans seldom did things by halves, just ask Jesus of Nazareth.
In 311, Galerius rescinded the persecution edicts and, twenty or so years later, Constantine would not only make Christianity the Empire's preferred religion but also hand back the confiscated property to the Christians.
George was canonized as a saint by Pope Gelasius I in 494 and from there on in, his fame was assured, although you do have to feel a little sorry for him; if only he had been born just that little bit later.
While there is a little evidence to suggest that a 'cult' of St George existed in Anglo-Saxon times, he really only comes to the fore in the Middle Ages; returning Crusaders brought back his tales, not the least of which was the slaying of the dragon in defence of the 'maiden', which appealed to the romantic and chivalric notions of the day. As early as 1190, English ships en route to the Holy Land were flying the Cross of St George as protection from the warships of Genoa. (The English made an annual payment to the Doge of Genoa in this early Italian 'protection racket'.)
Whether the dragon slaying has any basis in fact, a chivalrous deed to protect women drawing water from a river infested with Nile crocodiles perhaps, or whether it is a simple allegory of the saint (George) defeating Satan (the dragon) thereby protecting the Church (the maiden) is both largely irrelevant and hugely relevant; that tale has assured him of such widespread fame that it will live long in the minds of the Western world.
* Yes I know that it is 'cultiver notre jardin'; allow me the artistic licence.
Tuesday, 23 April 2013
Richie, Jimi and Max
So, farewell, Richie Havens, master of the one-chord tune. You always seemed to me to look older than you actually were.
You opened Woodstock with 'Freedom'; Jimi closed it with an apocalyptic national anthem. It seems mildly ironic now that a three-day festival dedicated to the musical plagiarism of black Delta blues by middle-class, white, high-school drop-outs should be bookended by two such iconic representatives of the very people whose inheritance they were ripping off.
"Sometimes I feel like a motherless child, a long way from my home," that was your catchphrase.
Postscript:
In case any of you are too young to remember Dick Holler's 'Abraham, Martin and John', to which the title of today's blog makes reference (Max is Max Yasgur, on whose Bethel farm Woodstock was staged), the lyric is here. The Marvin Gaye version is undoubtedly the best but I have a soft spot for the original by Dion DiMucci (who co-wrote 'Runaround Sue' and was the original singer of 'The Wanderer').
Monday, 22 April 2013
Time, he flexes like a whore, falls wanking to the floor; his trick is you and me, boy
The Penguin once noted (back in 2008) that time is a genuinely 'slippery' customer; it is very difficult to pin down a definition of what it actually is. For theoretical physicists and mathematicians, it is simply another 'dimension' to be measured in an infinite multi-dimensionality, no more worthy of the privileged place that we accord to it than any other dimension. For human beings in general, it is a unique, measurable mechanism which charts our passage from birth unto death, from order into chaos. For animals, birds, insects and the rest, it is unknowable, to us at least, a perennial 'what is it like to be a bat?' question (passim), although some of them at least manage to track time as we perceive it, even if they do not do so in the way that we do. My cat is usually bothering me for food around the same time each day even when there is no clock around (so we can safely assume that cats, or Mugwump if you do not want to generalise, do not tell the time that way), and Magicicadas (see previous post) do it on what seems to us an inordinately lengthy scale.
By and large, quantum mechanical equations are time-invariant; it makes little difference whether the value for time ('t') is a positive or a negative number. This raises spectres of time travel at the quantum level but not necessarily at the 'macroscopic' level, the realm of normal human experience. Feynman, following Wheeler, argued that viewing the positron, the anti-particle of the electron, was mathematically equivalent to viewing an electron moving backwards in time. The macroscopic, linear scenario of 'cause' followed by 'effect' does not have any real meaning in the quantum world.
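If you will forgive a small diversion into code, here is a minimal Python sketch of what time-symmetry means in practice (a classical illustration; the quantum statement is subtler, involving complex conjugation, but the flavour is the same). We push a harmonic oscillator forward with a leapfrog integrator, flip the sign of the velocity, push it forward again for the same number of steps, and land exactly back where we started; the equations simply cannot tell which way 't' is running.

# A minimal sketch of time-reversal symmetry in classical mechanics.
# Leapfrog (velocity Verlet) integration is exactly time-reversible:
# reversing the velocity and integrating 'forward' retraces the path.

def leapfrog(x, v, dt, steps, force=lambda x: -x):
    for _ in range(steps):
        v += 0.5 * dt * force(x)
        x += dt * v
        v += 0.5 * dt * force(x)
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = leapfrog(x0, v0, dt=0.01, steps=1000)    # run forwards
x2, v2 = leapfrog(x1, -v1, dt=0.01, steps=1000)   # flip v, run 'backwards'
print(x2, -v2)  # ~ (1.0, 0.0): the original state, recovered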
Strangely enough, although the novel is a fairly recent addition to the literary canon (around 1700 in English), few stories were written about time travel until the middle of the nineteenth century, and almost all of those were written from a present perspective looking into the future. Possibly the earliest known story which travels into the past is one written anonymously for the Dublin University Magazine entitled 'Missing One's Coach: An Anachronism'; for a novel, one has to wait for Mark Twain's 'A Connecticut Yankee in King Arthur's Court' in 1889, although the main purpose of Twain's book was, likely as not, a thinly veiled satire on the works of Sir Walter Scott, 'Ivanhoe' for example, and their imitators.
Perhaps a remedy for the dearth of visits to the past, as opposed to the future, had to wait for the rise of the disciplines of palaeontology and geology and the early stirrings of the multiverse in popular culture to lend credence to a tale of actual time travel into the past, although the so-called 'grandfather paradox' presents a problem - you step back into the past and shoot your grandfather before he sires your father, thus precluding your birth. I have never seen this as a paradox because, so long as one believes in the veracity of cause and effect at the human level, you clearly do not go back with the express intention of murdering your grandfather, nor will you be 'allowed' to; at best it can only be a paradox in that most paradoxical of realities, to human minds at least, the quantum world.
At the very worst, all that a time traveller can do in the past has already happened simply because the world is as it is; history has already witnessed your actions in the past and has written them in 'tablets of stone'. So, even if time travel into the past were possible by some means, you could still not effect even the minutest of changes to your past behaviour or alter events.
However, the position becomes much more interesting if you consider that with every 'event', quantum or macroscopic, the universe 'buds' another universe in the infinity of universes that make up the multiverse. You kill your grandfather, as in the scenario above, and the universe branches into an alternate version in which you do not exist; but since you do exist, when you return to your own present, you return to the universe that you left, not to the alternate one. Despite having a clear and vivid memory of pushing your grandfather off the quatrième étage of the Tour Eiffel and seeing him splatter like so much strawberry jam on the pavement below, you 'know' that you did not do it. The 'protected' nature of the past is preserved.
I often, in the quiet of my time, muse on such things as time, its nature and its passage, but, as Alexander Pope once famously pointed out*, 'A little learning is a dangerous thing; / Drink deep, or taste not the Pierian spring.' I find that it is primarily a function of age, as you realise that three quarters (at best) of your life is past and you should have at least tried to do more than warm your hands under her fur coat.
The title owes its provenance to Mr David Bowie, Esq ('Time', from 'Aladdin Sane'), a line which miraculously escaped an edit in the US release of the single; the poor record company bozos perhaps did not understand the word 'wanking'. That, 'Aladdin Sane' that is, and 'Ziggy Stardust and the Spiders from Mars' probably constitute the high point of Bowie's career, although I have a soft spot for 'Scary Monsters', but only if Fripp is on guitar.
Postscript:
Anyone who has read even a small proportion of this blog will know that the Penguin and I share a deep affection for coincidence. Well, here is another one. The well-known, and loved, BBC series 'Doctor Who', about the time-travelling 'Time Lord', has, since its reboot, been accustomed to having a Christmas 'special' as a way of bridging the six- to nine-month gap between one series and the next.
Between the third series and the fourth, a further mini-episode was inserted in support of the BBC's 'Children in Need' campaign to raise money for worthy causes. In this five-minute episode, the then current Doctor, the tenth, played by David Tennant, meets up with the fifth Doctor, played by a now much older (and paunchier) Peter Davison.
The sixth episode of the fourth series, called 'The Doctor's Daughter' (the 'daughter' is actually a female 'clone'), featured one Georgia Moffett, an English actress who just happens to be the daughter of the aforementioned Peter Davison and Sandra Dickinson, who played Trillian in the TV version of 'The Hitchhiker's Guide to the Galaxy'. David Tennant and Georgia Moffett were married in 2012! Just how spooky is that?
* 'An Essay on Criticism', which was written in heroic couplets and which also includes 'To err is human, to forgive divine' and 'Fools rush in where angels fear to tread'. Damned fine poet was our Mr Pope, for all that he could, at times, be such a pompous ass.
Wednesday, 17 April 2013
Guy Henry, John Searle and the Universal Soldier
Before I start, massive kudos to Guy Henry, an actor too excellent for words, for making 'Henrik Hanssen' both believable and worthy of our empathy. Staggering!
Apropos of nothing in particular today, I was reminded of early evenings spent in front of the televisual magic lantern, in the days when I worked a vaguely normal day and could somehow be home by six, and my partner would have my slippers warmed by the fire, a pipeful of ready rubbed shag (I kid you not, it was really called ready rubbed shag*) tobacco waiting, and the chips (that's fries for you yanks wot dont spik da kings English) frying in the pan. That was a time of true relaxation, a time spent in the company of chips deep fried in lard, eggs fried in beef dripping, Heinz baked beans and Lea & Perrins, topped with duck fat croutons and, holy of holies, Star Trek: TNG on the goggle box. Star Trek, and Patrick Stewart ordering 'Make it so', was, in the eighties, the only known antidote to a heart attack; it also dissolved the rotting fat in your arteries. It is no accident that I only had a stroke after giving up Star Trek: TNG and moving on to Wallander and Battlestar Galactica.
The episode in question ('The Measure of a Man') is one in which, in the space of a mere fifteen minutes, Picard (Stewart) has to defend, by legal argument, Commander Data, the android, against the officer who wants to dismantle him in order to find out how he works and thereby build more like him. (Shades of Tolkien's Gandalf there: "he that breaks a thing to find out how it works has left the path of wisdom.") It is the officer's right because, as an android, Data cannot be afforded any of the rights of the Federation's constitution, essentially the same as the American one - no surprise there then - because he is not 'alive', a living, feeling being.
This, in a superficial way, although not so superficial as it turns out, goes to the heart of an eternal dilemma: how do you tell the humans from the zombies****; in what way can you tell the man from the machine? How do you know something is alive in the way that each and every one of us knows that we are alive? The matter was a largely academic one before the machine age, since zombies were merely a theoretical construct, a 'what if' parable, the province of the professional, and sometimes the amateur down the pub, philosopher. However, with the rise of 'machine intelligence' and a growing belief in the 'intelligence' of other animals, do we need a better way of grappling with this question than the 'Turing test'?
At root, the fundamental question is 'Am I alive?', to which the answer is, as Descartes famously said, 'cogito ergo sum', a resounding 'Yes'; by our very definition of what being alive means, this must be so: no life, no definition! However, is this definition of life valid? Just because we ask the question does not necessarily mean we know what the answer is or what it might mean. The mice assume, as we do, that when they give 'Deep Thought' the perennial question to answer, the question of life, the universe and everything, they will know what the answer means. When it happens to be '42', they have no more idea as to what the answer might mean than we do.
John Searle once conducted a 'thought experiment' in which we feed a Chinese text, say 'The Art of War' by Sun Tzu, into a closed room with which we have no contact except for a small chute, out of which a translation into English will appear in time. Eventually, the book appears, duly translated. Does this room exhibit consciousness, life? No, argues Searle, because the room lacks 'intentionality'; even if there is a little man inside, duly consulting the translations of every character in the Chinese language and rendering the text into grammatical English, it is still automaton-like behaviour which, lacking intent, cannot be alive, cannot have consciousness. Most people would, I think, agree with Searle on the basic principles. However, is this valid?
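For the more computationally minded, here is a deliberately dumb Python sketch of the room's innards: a bare lookup table. (The two entries are a genuine line of Sun Tzu with a rough translation; the 'rulebook' itself is my invention for illustration, not Searle's.) It will render anything in its book into grammatical English, flawlessly and forever, while understanding precisely nothing.

# A toy 'Chinese room': pure rule-following, no understanding.
# The rulebook below is an illustrative stand-in, not Searle's own.
RULEBOOK = {
    "知彼知己": "Know the enemy and know yourself,",
    "百戰不殆": "and in a hundred battles you will never be in peril.",
}

def the_room(text):
    # The 'man in the room' looks each chunk up in his rulebook;
    # anything he cannot find goes out through the chute unread.
    return " ".join(RULEBOOK.get(chunk, "[?]") for chunk in text.split())

print(the_room("知彼知己 百戰不殆"))
# -> Know the enemy and know yourself, and in a hundred battles
#    you will never be in peril.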
Whether one is 'a Catholic, a Hindu, an Atheist, a Jain, a Buddhist, a Baptist or a Jew' (to sort of quote Buffy Sainte-Marie*****), the sense of 'soul', 'spirit', 'mind' pervades every society; the sense that there is something other than the raw materials of a physical body, a 'something' which makes us alive. However, all of the evidence (yes, I know a rational argument simply makes the a priori assumption that I am alive; it is one, of many, reasons that I am not a philosopher) suggests that this is not the case. No evidence has been found for a soul or spirit, a life force which transcends the moment when we are consigned to the furnace to save space in the graveyard; no heaven and no hell, no giant cosmic wheel of reincarnation, no journey to enlightenment, and no evidence to suggest that our wildest dreams and creations are not just the product of the passage of electrically charged ions across a synaptic gap formed by that most wonderful creation of evolution: the brain cell.
Is thought, belief, not, at a basic level, merely automaton-like behaviour and, if it is, what gives us the right to pronounce on 'alive, not alive'? Is our translation of Sun Tzu any better, any more valid, than Searle's closed room because we apply a further 'layer' or 'layers' to it; because we seek to define what was in Sun Tzu's mind (there it is again!) when he first penned the work?
This is, of course, a largely academic, and pointless, exercise, the province of professional philosophers with too much time on their hands, since it is an intractable problem; the definition of the problem already makes too many a priori assumptions, not the least of which is that we know what it is to be alive. However, I do wonder sometimes whether the notion that thought, ethics, morality, justice might just be a product of automaton-like behaviour, might just make us consider that 'alive', 'life' extends beyond the confines of the strictly human and thereby make us somehow more responsible, more caring, more aware of that which we hold in our hands.
By its very nature, science fiction forces us to confront these issues. Whether it be Data's 'humanity' (and Picard does a good job in justifying that - the hologram of Tasha Yar is surely the defining moment of his address), the 'aliveness' of the 'Crystal Entity' or the ship that is 'Tin Man', the question as to whether a cloud of gas or a planet can be alive, whether or not it is capable of conscious thought, impels us to a position where we no longer deem our human-centric view to be the most valid.
Should I consider my computer alive? It seems to be, or at least as much as an amoeba is. I provide the stimulus via a keyboard or a mouse; it provides a response. It matters not a jot whether the response is a product of 'man's invention'; we have become the gods of man's invention.
* I do not expect to have to point out what 'rubbed' might mean in this context; 'shag' is a more polite word for 'fuck'. As in the phrase, 'you shag a slag**' but you make love to your missus*** :)
** Slag. A euphemism for a 'worldly girl'; one who is prepared to 'give it away' for half a bottle of Cava and a bag of pork scratchings, or lardons as they are known in French.
*** The wife.
**** 'Zombie' is a staple of philosophical discussion; the being that appears human but is not.
***** 'The universal soldier', a song, written at the height of the US war in Viet-Nam, which goes to the heart of what it is to be existentially responsible, hippie-chick cod philosophy notwithstanding; the full lyric is here.
Monday, 15 April 2013
Magicicada, prime numbers and an orgy of synchronicity
It looks as if the US East Coast is due a massive influx of visitors this spring; not tourists but cicadas.
I have been fascinated by cicadas ever since I discovered the weird life cycle of a few species. Rather than a life span of between two and five years, some species (Magicicada) have life spans of a much longer duration. However, almost all of their life is spent in the larval, or nymph, phase. After hatching, the larvae drop to the ground and burrow below its surface; there they remain for either 13 or 17 years, depending on the species, feeding on tree root sap. After the allotted time, they all burst from the ground, shed their final larval skin and emerge as fully mature, sexually reproducing adults in an orgy of synchronicity. I have only ever seen video footage of it but it is as scary as hell. Once they have mated and the eggs have been laid, they die.
Quite why they have such a bizarre life cycle is a matter for some conjecture. It may be that, with a long larval stage and the fact that 13 and 17 are prime numbers, divisible only by one and themselves, predators are unable to 'track' them by increasing their own numbers to take advantage of the abundance of cicadas in certain years; the predators quickly become sated by the number of cicadas emerging simultaneously and the vast bulk of the insects breed in peace, relatively free from predation. There are other prime numbers lower than 13 or 17, for example 5, 7 and 11, but these may be easier for a predator to track, offering a larger number of instances where the potential for a 'crossover' is greater. For example, if cicadas emerged every five years, there would be the potential for a bonanza for the predator every 15 years, if the predator were to become locked into a three-year cycle of population growth and decline. With the same predator population cycle and the cicada on a 17-year cycle, opportunities to synchronise would only happen every 51 years. Whether Mother Nature has indeed been so clever we shall probably never know for certain, but such synchronous 'super-abundance' is quite common as a smart move to foil predators; good tricks, once they are discovered, are too valuable not to be shared by as many species as possible.
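The arithmetic there is just lowest common multiples and is easy to check; a minimal Python sketch (the three-year predator boom-and-bust cycle is my hypothetical from the paragraph above, not field data):

from math import lcm  # Python 3.9+

PREDATOR_CYCLE = 3  # hypothetical predator boom-bust cycle, in years

for cicada_cycle in (5, 7, 11, 13, 17):
    # The broods collide whenever both cycles peak in the same year,
    # i.e. every lowest-common-multiple years.
    print(f"{cicada_cycle:>2}-year brood: collision every "
          f"{lcm(cicada_cycle, PREDATOR_CYCLE)} years")

# Output: 15, 21, 33, 39 and 51 years respectively - the longer the
# prime cycle, the rarer the rendezvous with any short predator cycle.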
Species of coral release their eggs and sperm simultaneously over an entire reef, although how they manage that trick without even a semblance of a brain is anybody's guess. Horseshoe crabs do something similar, marching up the selfsame beach at the selfsame time, as do turtles when they lay their eggs. It is a well-worn adage in the kingdom Animalia: there is safety in numbers, and the greater the number, the greater the safety. It is why fish shoal, why birds flock, why ruminants herd, why people band together in communities, despite the fact that the greater the number in one place, the more finite is the food resource or mating opportunity for the single individual. However, as my granny always said: "Don't knock it if it works!" She also said: "Don't make love at the garden gate 'cos love may be blind but the neighbours ain't." I never listened to that one either. (Ah, those days of youthful innocence, when you would snog at the gate, rather than on the porch, for fear of alerting or waking the 'dreaded bugblatter beast of Traal' - her mother. We used to get quite an audience of pre-pubescent young girls peering out of first-floor bedroom windows. Taking notes?)
It is always difficult to write about wildlife, be they insects, fish, reptiles, birds, monotremes, marsupials or mammals. Bacteria, protozoa, jellyfish (almost but not quite), plants and fungi are easy: nobody gives it a second's thought that there might actually be intent, a purpose to their behaviour in any real sense in the way that we understand the word 'purpose'. However, as soon as a definable body plan becomes apparent, we cannot help but take 'an intentional stance', however much we might resist it. We even seek purpose in a blind algorithm based on chance: evolution by natural selection. However many times Dawkins, Gould, Horner, Bakker, Attenborough et al tell us that we should distance ourselves somewhat, take the scientific approach, still we, and they writing for us, make the same basic error time and time again; we just can't help ourselves.
Dinosaurs grew so big to help them conserve heat by reducing the surface-area-to-volume ratio. No, they did not; that would imply intent. By chance they got bigger; there was enough food to sustain them; the climate was warmer; they were too big for all but the largest predators; therefore they survived for aeons.
Falcons have baffles in their nostrils to disrupt the air flow so that their lungs do not inflate to bursting point when they launch into a power dive at over 250 kph. No: the first falcon who tried that trick without baffles blew him/herself up; the first falcon to try it with baffles did not. He/she was thus able to move faster through the air and probably caught more prey as a result. Survival.
The cheetah, unlike other cats, has non-retractable claws, which give it greater traction across the savannah, like a sprinter's spikes. No. For reasons best known to God, a freak was born who could not retract his/her claws. He/she found that his/her hunting success went up from 1 hunt in 15 successful to 1 in 10; therefore more cubs were raised, and fitter animals all round for those that inherited non-retractable claws.
The single most important question to human beings is 'why?' It is why we spend so much time, money and energy on theorising and 'proving' that theorising. When it comes to life, it is really difficult to do that. You can no more account for the disappearance of Archaeopteryx from the fossil record than you can find God in a grain of sand. If an animal, in the broadest sense, is alive now, then it is at least passably suited to its environment. If that situation should change, then perhaps the animal will become extinct, or it may proliferate in numbers unheard of, or it will remain just as it is.
In many ways, popular writing on Life on Earth is merely so many 'Just So Stories': tell the stories that have some 'explanation' and sweep everything else under the carpet of 'neutral mutation'. Of course, there may be all kinds of pressures on mutations, but the mutations have to happen first, quite randomly or maybe not so randomly, before evolution, the environment, can test them and prove them fit or otherwise. It is this point that I think 'populist' writers about science do not emphasise enough.
There is a nice animated GIF of a cicada shedding its old skin which I have nicked from Wikipedia for convenience. The usual Creative Commons blurb applies.
Blodwyn Pig, piercings & tattoos and FGM
I remarked a few posts ago on a chance viewing of a programme about the blues which had featured a band from the line-up of a three-band gig from my youth: Ten Years After, Blodwyn Pig and Stone the Crows (the post is here). Well, would you believe it? I was watching a documentary about 'Are You Being Served?', a hoary old sitcom about a department store, starring amongst others the very camp and effete, and now late, John Inman. Towards the end of the programme, there was a brief explanation of the colourising software they had used to turn a recording of the pilot episode, which was shot in colour but survived only as a black-and-white recording, into full colour. They demonstrated part of the process with a clip of a group of musicians, probably from 'Top of the Pops'; the band was Blodwyn Pig, in what was possibly their only TV appearance. How weird is that!
Having got that little coincidence out of the way: humanity never seems to be satisfied with the body it has been given.
Tattooing has a long history and has undergone a genuine renaissance in the past twenty years or so. Why this should be so escapes me. Make-up or war paint serves much the same function if you wish to adorn your body with designs; I do not understand why anybody would wish to 'paint' their bodies with something so permanent. Whilst I do not have a particular aversion to tattoos per se, or to tattooists, I just find it difficult to comprehend why someone would do something so permanent, or at least so damned difficult to eradicate completely in the future; something that you had better get used to because it is going to be staring back at you in the mirror for the next fifty or sixty years. It is difficult to know where this new-found craze, once the province of seamen and prison inmates, emanated from, although it perhaps stems from the 'gangsta' culture which built up in the eighties and through the nineties and influenced all manner of fashions.
Hand in hand with the rise in tattoos came the rise in body piercings, as if in search of some mystical connection to a more 'primitive' culture. The craze for pierced ears has shown no sign of abating; they are the only option if you want to wear some kinds of earring and, from my irregular forays into dressing up in drag, I can assure you that the alternative, 'clip-ons', hurt like hell after a while. However, I do not yet fully understand the notion of placing rings or studs on any available piece of cartilage, although in the case of piercings at least the holes will, by and large, heal if you grow tired of looking like the back of a biker's leather jacket. And please do not get me started on the more personal items of 'jewellery', lidocaine or no lidocaine (and I do not care whether Prince Albert had one or not).
Of course, marching in lock-step is the growth of elective cosmetic surgery amongst the 'developed' nations. Plastic surgery has come on in leaps and bounds since the early work on burns victims during the Second World War, and much valuable work is done in areas such as reconstruction after surgery, in genetic disorders such as pectus excavatum or Marfan's syndrome, or in cases of deformity, whether skeletal or dermal. However, a huge amount of work is undertaken which is surely unnecessary. You can get just about any part of your body enlarged (except THAT bit :), reduced, straightened, pulled, stretched, 'undimpled'; you name it and there is a surgeon out there who is willing to do it for a fee. Women can even get their labia remodelled (and their anus bleached, but that is perhaps a different kettle of fish)!
I find it hard to put this all down to modern society's obsession with 'image'; with how women, and to a large extent men, must conform to some nebulous 'ideal' which is neither clearly defined nor, for that matter, necessarily desirable. Western societies had largely dispensed with the permanent 'mutilation' of body parts (the binding of children's feet, the elongation of the neck, the distension of the lower lip, circumcision) and replaced it with that most throwaway of notions, fashion clothing. From the alarmingly pointed shoes of the medieval period through the cod-piece, the ruff, the whalebone corset and the bustle to the wonderbra, clothes took on the semblance of who you wanted, or were coerced into believing you wanted, to be.
Societies have always considered some 'forms' to be more desirable than others, and it has been to western society's credit that we no longer exhibit children or adults in travelling 'freak shows' for the normal to gawp at, no longer lock them away*. But in the last twenty years or so, it appears that we want to forget how often we change our minds during the course of a life, and so we strike out for permanence when the urge is seldom other than a whim, a fancy, which would be much better served by something more transitory.
It would go against all my principles to actually tell people that they could not exercise their freedom of choice and have a pair of 54DD hooters, courtesy of the great god silicone, or to deny a seventy-year-old the option of ironing out their wrinkles with a nip and a tuck and a gathering of the excess skin at the nape of the neck with an elastic band, any more than I wish to deny the right of a woman to choose to terminate a pregnancy (although I would hope that she would at least canvass the father's views). I just think it is sad that some people do not seem to be able to come to terms with who they are and wish to be a different person.
I was sometimes asked when I was younger (people do not seem to care any more) why I never got my face 'fixed', since it would have been easy to convince the medical profession of the lasting psychological damage which my appearance would endow me with and, as a result, get it all done for free on the National Health Service :) There were three reasons: plastic surgery around the eyes is tricky because there is no skin on your body that matches the thin skin around the eyes**; I felt uncomfortable about lying with regard to my psychological state of mind, which was OK, thank you very much; but, most importantly, who I am, the person that I have become, is inextricably tied up with what I look like. If you took that away, 'fixed' it, how would I ever justify, to myself or to somebody else, why I am me and not somebody else?
I had every intention of writing a diatribe today about FGM (female genital mutilation, or female circumcision), following a drama production over the last couple of weeks on the subject. I decided against it, not because I do not have anything to say, but because, for someone who believes that even male circumcision is fundamentally immoral (babies are in no position to offer reasoned and informed consent to the 'bris'), it is difficult to know how to begin to put a reasoned and sober argument against the kind of butchery visited upon young women in certain parts of Africa and Asia.
The people responsible do not listen to reasoned and sober arguments and merely hide behind the thinnest of patinas of cultural identity and tradition.
* I believe that there was a permanent 'freak show' at 'Dreamland' on Coney Island, NYC until about 1910. It appears to have been taken over in the eighties or nineties by those who pastiche the kind of performances which could be seen there at the turn of the century.
** Anybody who 'knows' Simon Weston, the soldier dragged off the burning ship during the Falklands War, will know what I mean. He, of course, had no choice in the matter; his eyelids had been burnt off in the fire.
Friday, 12 April 2013
Freedom of the press, protection of sources and fundamental moral dilemmas
I actually wrote two of the most recent blogs in complete ignorance of the freedom-of-the-press issue that has once again raised its ugly head in respect of the 'The Dark Knight Rises' shootings last year, when 12 people were killed at a cinema in Aurora, Colorado.
Jana Winter, a New York journalist, wrote a piece last year in which she cited 'two law enforcement sources' and detailed a notebook allegedly sent by the man accused of the shooting, James Holmes, to his psychiatrist prior to the bloodbath; filled as it was with stick figures shooting other stick figures, it appeared to indicate his intent. What is at stake, it appears, is perhaps six months in prison for Ms Winter but, perhaps more importantly, the well-worn adage of journalists: protect your sources at all costs.
However, it seems to me that this is not the only issue here. The protection of sources only became an issue because Winter defied a court gagging order on pre-trial publicity; it might still have arisen without the order but, with one in place, it was a certainty. There is also the question of whether the 'law enforcement sources' had any right, moral or legal, to speak about something which should have been solely the province of the prosecution and defence counsel and the accused. And there is the question of how far doctor/patient confidentiality should over-ride the obligation of citizens to do their civic duty and report suspicions of potential crime in order to prevent it.
That last aspect is probably the easiest to answer; never. It is a very slippery slope which doctors would have to negotiate were they to be, morally or legally, impelled to voice their own personal suspicions, whether or not they are backed by expertise and experience; that way leads to a 'Minority Report'* situation which can surely never be justified, even on practical grounds, except in a totalitarian regime.
In the same way, there is a paramount need for journalists to protect their sources IN EVERY CASE. Prosecution lawyers and judges and even the Lord Chancellor may disagree, but without a guarantee of anonymity people will be far less inclined, perhaps totally disinclined, to speak to journalists, especially investigative journalists. That undermines not only the journalists' profession and the freedom of the press but also cuts to the heart of what the general public actually wants journalists to do: probe and get to the truth of the matter.
As far as I am aware, the sanctity of that anonymity has been preserved in the UK, although it took an appeal to the European Court to do so (see Bill Goodwin's famous case, which ran and ran through the late 80s and the 90s). However, I believe that this 'rule' may be overturned in the US 'in the interests of national security', which is just another way of saying that if the President and his or her advisors want to know where the information came from then 'spill the beans', otherwise it is a brief spell in chokey for you, sunny Jim/sunny Jane. Journalists, at least the responsible among them, tread a hair's-breadth tightrope between the need to protect possibly sensitive or personal information and the need to make the story public to the very public that wants, and needs, to be informed about events.
Ms Winter's alleged breach of a court order banning pre-trial publicity is very straightforward. If she believes the ban is wrong in some way, then she can defy it. However, she must then suffer the consequences of a breach of the law. There are many instances of people defying an unjust ruling or law, but it is not a defence against the consequences to say that the ruling or law was unjust; one must always be prepared to take the responsibility.
I do, however, have a problem with the 'law enforcement sources' who disclosed the information in the first place. If doctor/patient, journalist/source and lawyer/client privilege are to be maintained, it is surely hypocritical not to maintain the principle that the details of a potential criminal case should be confined to the suspect, the police department and officers involved, and the lawyers until the case comes to court, and even then made public only if the judge does not decide to hear the case 'in camera'.
Perhaps 'the law enforcement sources' believed they were merely hastening a guilty man to the gallows; they knew he was guilty, ergo a conviction by any means, within reason, was justifiable. I do not know. But what I do know for sure is that it is wrong to seek to change the public's perception of an issue because of the strongly held belief of a small group of individuals. I followed the Casey Anthony trial in the American media, much to my dismay, and to an outsider nigh on everyone was saying she caused the death of her daughter, whether by neglect or outright murder. She was acquitted. The jury refused to accept the prosecution's case. But just suppose that the police had leaked letters, text messages and emails to the press which implied an intent to kill. Would the jury have been able to divorce that from their minds? Would they have been influenced in a way that they, ultimately, were not?
As I say, it is a fine line that journalists walk. I sincerely hope that Ms Winter gets no more than she deserves, although I am at a loss to know exactly what that is.
* A short story by the inestimable Philip K. Dick in which precognitives foretell murders before they occur, providing the police with the opportunity to arrest the 'culprit' before the crime and thus save lives. It was subsequently made into a film starring Tom Cruise.
Thursday, 11 April 2013
Balham Barnets (again), Darkness in El Dorado and the value of the scientific method
I said yesterday that I would try and write about what had prompted that day's blog; well here it is.
There is, I think, a great deal of difference between what I 'publish' here on this little blog, whether the author is the Penguin or whether it is MG, and either published non-fiction (in book form) or journalism. (I exclude publications such as the 'National Enquirer' or the 'Sunday Sport' from my definition of journalism for what I hope are obvious reasons.) This blog is simply the ravings of an unbalanced mind and, while some of it may be deadly serious, I do not expect any more credence to be given to it than one might give to a drunk propping up the bar at 10:30pm in one's local hostelry. I do not earn a living from it, I am scarcely committed to most of the ideas presented and time precludes any more than the most cursory research and fact checking.
On the other hand, journalists are expected, and the public does have this expectation, to check facts, make reasoned arguments and clearly signal the measure of objectivity or subjectivity contained in a piece. The same strictures apply to the writing of non-fiction; too little objectivity turns the work into 'faction' or out-and-out fiction.
In 2000, Patrick Tierney published a book entitled 'Darkness in El Dorado' which not only probably ruined two academic careers and reputations but cast doubt on the motives and methods of the academic anthropology community in the USA. Essentially, the readership of the book was being led to infer that the actions of an anthropologist, Napoleon Chagnon, and an accompanying geneticist, James Neel, had 'caused' a measles epidemic among the Yanomami people of Venezuela and northern Brazil, and for somewhat dubious purposes. The Yanomami are among the most studied indigenous peoples in the Americas or anywhere else and were often held to be a good representation of a so-called 'primitive culture' living in isolation.
In 1967, Neel and Chagnon planned a field trip for 1968 to Venezuela and Brazil which sought, among other things, to determine possible reasons for the increased lethality, among indigenous populations, of diseases 'common' among westerners when these were introduced by infected individuals from outside the normal range of the indigenous populations. The story of the colonisation of the Americas, first by the Spaniards and Portuguese and later by the English, is littered with tales of the decimation of local populations by diseases to which the indigenous people had no resistance whatsoever.
By October 1968, when Neel and Chagnon arrived in South America armed with a vaccine for measles, a disease which, left untreated in a population with little or no immunity, kills at more than twice the rate it does among unvaccinated westerners, a measles epidemic had already arisen among the Yanomami people. In keeping with the old adage that there is no smoke without fire, Tierney, along with many others prompted by his book, effectively laid the blame on Neel and Chagnon's administration of the vaccine, which, like all 'live' vaccines, can cause the symptoms of the disease being vaccinated against.
Neel had also been using an older version of the vaccine, which was still in use in the USA although usage was declining due to the development of more attenuated, less virulent, vaccines which offered a better chance of not developing symptoms. Whether cost was a factor in the choice (the newer versions were perhaps more expensive) I have been unable to determine. Tierney effectively tarred the two researchers with the brush of dubious ulterior motives because of that supposed 'smoking gun' of the vaccine.
One of the many lessons taught to us by the scientific method is to weigh all of the evidence before making a judgement, not just the evidence which supports your hypothesis. The classic example is a double-blind trial (no one knows who got what until the results are tabulated) in which a placebo is involved. A new drug has a 'cure rate' of 25% in the patients treated with it, with 3% of patients experiencing adverse side effects. This is better than the 15% success rate of an older, competing drug, which has a higher rate of adverse effects, therefore you should use the new drug. This is quite obvious, would you not agree? On that evidence, it is a no-brainer. However, we also introduced a placebo into the trial, and that has a success rate of 25% with zero side effects. Now which one do you use? It is not quite so simple, is it?
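For the numerically inclined, here is a minimal sketch of that arithmetic in Python. The cure and side-effect rates are the hypothetical ones above; the sample size of 200 per arm, and the old drug's 5% side-effect rate, are entirely my own inventions for illustration:

# Hypothetical trial arms. The 25%/3%, 15% and 25%/0% figures come from
# the example above; the sample sizes and the old drug's 5% adverse rate
# are invented purely for illustration.
arms = {
    'new drug': {'n': 200, 'cured': 50, 'adverse': 6},
    'old drug': {'n': 200, 'cured': 30, 'adverse': 10},
    'placebo':  {'n': 200, 'cured': 50, 'adverse': 0},
}

placebo_rate = arms['placebo']['cured'] / arms['placebo']['n']
for name, arm in arms.items():
    cure_rate = arm['cured'] / arm['n']
    adverse_rate = arm['adverse'] / arm['n']
    benefit = cure_rate - placebo_rate  # effect over and above placebo
    print(f"{name}: cured {cure_rate:.0%}, adverse {adverse_rate:.0%}, "
          f"net benefit over placebo {benefit:+.0%}")

Judged only against the old drug, the new one looks like a clear winner; judged against the placebo, it adds side effects and nothing else. That is the whole point of weighing all of the evidence.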
What Tierney did not take into account, because it did not accord with what he believed, was, first, that the outbreak must have started prior to Neel's and Chagnon's arrival, given the relatively long period, between 7 and 18 days, between infection and manifest symptoms; second, the likelihood that Neel was vaccinating individuals 3-4 days after infection, at which point the vaccine is largely ineffectual; and third, the fact that the epidemic progressed in the opposite direction to Neel's progress through the country. While the vaccine may have contributed to the apparent extent of the epidemic by producing symptoms of its own (the vaccine itself rarely, if ever, kills), it is impossible to determine the extent of such an effect since the inhabitants were not necessarily free from infection when they were vaccinated. There was also an implication of cross-infection by vaccinated subjects, which is unheard of in the literature; you cannot pass on the infection caused by the attenuated 'live' virus by normal means to another human being.
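The timing argument can be made concrete with a little date arithmetic. A crude sketch in Python follows; the dates are entirely hypothetical, since I have none to hand, and only the 7-18 day incubation window comes from the paragraph above:

from datetime import date, timedelta

# Hypothetical dates for illustration only; the 7-18 day incubation
# window is the only figure taken from the text.
team_arrives   = date(1968, 10, 1)
first_symptoms = date(1968, 10, 4)  # symptomatic cases seen just after arrival

shortest, longest = timedelta(days=7), timedelta(days=18)
infection_window = (first_symptoms - longest, first_symptoms - shortest)

print('infected between', infection_window[0], 'and', infection_window[1])
print('infection predates arrival:', infection_window[1] < team_arrives)

Anyone showing symptoms within a week of the team's arrival must have been infected before the team got there; no amount of smoke-and-fire reasoning changes that.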
Wiki lists the claims made by Tierney in his book as:
- That Napoleon Chagnon and James Neel directly and indirectly caused a genocide in the region through the introduction of a live virus measles vaccine.
- That the whole Yanomami project was an outgrowth and continuation of the Atomic Energy Commission's secret program of experiments on human subjects.
- That Chagnon's accounts of the Yanomami are based on false, non-existent or misinterpreted data, and that Chagnon actually incited violence among them.*
- That the French researcher Jacques Lizot, a protégé of the anthropology icon Claude Lévi-Strauss, engaged in sex acts with Yanomami boys (including oral and anal sex, as well as having the boys masturbate him).
- That a researcher married a Yanomami girl who was barely entering her teens.**
* Highly improbable. Chagnon may have overstated the case, but many researchers attest to the normal human willingness to engage in violence to secure finite resources, both food and women. In this, Tierney may have been influenced by a common sentiment: that indigenous peoples lived in some kind of 'natural grace', the 'noble savage' at one with its environment. Goodall experienced much the same reaction when she punctured the myth of the peace-loving chimpanzee.
** Aside from the unwisdom of getting that involved with the subjects of your research, this was a normal age for Yanomami women to be married.
Postscript:
A little photograph of proud Lisa's new addition to 'la famille Booker'; aah, ain't she cute!
J S Mill, E B Hall and Balham Barnets
Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.
As much as I may deride the US and its citizens for their transgressions, minor as well as major, there is probably no more succinct a defence of the basic tenets of democracy than the First Amendment's clearly stated instruction to the elected legislature from the general population. When one considers what has been enacted in European states, for whatever reason, in the name of so-called tolerance, the banning of written or spoken 'incitements to racial hatred', the illegality of 'pro-Nazi' tracts or speeches in modern-day Germany, the 'dress code' in France, it is reassuring to note that the basic freedoms of speech have been preserved in that most reactionary of modern industrialised states, the United States of America. As John Stuart Mill remarked in 'On Liberty':
If all mankind minus one, were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person, than he, if he had the power, would be justified in silencing mankind.
Or to quote Evelyn Beatrice Hall, paraphrasing Voltaire:
I disapprove of what you say, but I will defend to the death your right to say it.
It has become fashionable, nay almost mandatory, to attempt to quash any, or all, voices which dissent from the mainstream. Rather than give people the opportunity and the education to apply critical thinking to any statements made, society now prefers to legislate, or discriminate, against those sections of the population who hold views considered unethical or immoral by the mainstream, and to brand those who support the individual's right to genuinely free speech as supporters of those views, irrespective of whether these so-called supporters actually agree or not.
Governments and legislatures take, as nature herself does, the path of least resistance, the path of least energy. For those who wield power, elected or otherwise, it is easier, and uses far fewer resources, to control the extent to which essential freedoms may be exercised than it is to give individuals the intellectual tools to make a considered and informed judgement.
David Irving is, quite rightly in my view, largely discredited as a historian, mainly due to the establishment's view that (a) he is pro-Nazi and (b) his views do not accord with the facts. However, in one important respect he is absolutely and unequivocally correct; there is no extant, signed order from Hitler either to convene the Wannsee conference or to order the extermination of European Jewry. Nearly all commentators share my own view that the circumstantial evidence is so strong that to deny that Hitler had, at the very least, knowledge of what was proposed stretches credulity to breaking point. 'Mein Kampf', the Enabling Act, the 'Night of the Long Knives', the Reich's racial purity laws, official 'party-sponsored' newspapers and films, Kristallnacht, the Einsatzgruppen in Russia and so on all point to the fact that this was Hitler's intended policy, whether explicitly ordered or not. For the Germans to make him 'persona non grata' denies Irving the right to try to make a case for his beliefs, and everyone else the right, after some cursory research, to laugh out loud at him; surely a worse punishment than a fine or a short term in prison.
I shall try and write about what prompted this little diatribe later but, in the meantime, I leave you with a further quote from Mill:
The beliefs which we have most warrant for have no safeguard to rest on but a standing invitation to the whole world to prove them unfounded. If the challenge is not accepted, or is accepted and the attempt fails, we are far enough from certainty still; but we have done the best that the existing state of human reason admits of; we have neglected nothing that could give the truth a chance of reaching us.
Postscript:
To my niece: congratulations on the birth (19 March 2013) of a new addition to the family, Balham Barnets Booker. It's nice to see you've taken the plunge at last; all good wishes!
Wednesday, 10 April 2013
Cancer, champagne and 'The Return of the Bubbles'
One reads a lot (in the right kinds of newspapers and magazines) about how fewer and fewer children and young adults are studying science-related subjects in schools and universities: physics, chemistry, mathematics, engineering and the like. Speaking from personal experience (because I didn't and couldn't), one of the reasons may be not that people are uninterested in the subjects but that it is seldom possible to combine arts subjects like history or languages with science subjects like physics or chemistry in UK schools at the 'pre-university' stage, i.e. AS and A level. Of course, there is one very good practical reason for this; most pupils have a tendency to excel in either arts subjects OR science subjects, and a good English scholar has, on the whole, different interests and skills from someone with a bent for quantum mechanics and set theory. The difficulty of timetabling lessons to accommodate those few pupils who have more wide-ranging interests and aptitudes in a school of 2,000 is just too great.
You may have noticed that the Penguin and I share a deep interest in science; I gobble up anything and everything that I can about science's investigation of the physical world. I was therefore intrigued by 'POP! The science of bubbles', presented by Helen Czerski, a researcher into bubbles. Ms Czerski is one of that new breed of science presenters on TV: someone who actually knows what they are talking about and has the qualifications to prove it. Think 'female Brian Cox': PhD, a post at a university, good speaker, photogenic and comes from Manchester, although she is now based at Southampton.
I have spent much of my life doing my own research into bubbles, mostly the bubbles generated by Guinness and Krug or Bollinger (for an explanation of why bubbles in Guinness appear to travel downwards go to the July 2010 post here) and so I was more than a little curious to learn more about champagne glasses and nucleation sites (the small imperfections in the glass, often deliberately made, which provide a place for bubbles to form).
I do not intend to bore you with the entire contents of the documentary (champagne did indeed feature, and I hope that the researcher drank the champagne after every experiment; it seems such a waste not to) but a little snippet caught my eye because it was almost too bizarre to be true: research into delivering drugs to cancer tumours using bubbles and ultrasound, thus avoiding unwelcome side effects to the rest of the body.
One of the remarkable things about bubbles is that they have a tendency to carry additional baggage beyond the expected gas encased in liquid. Some liquid-hating (hydrophobic) molecules have an inclination, when in solution, to seek out the driest place that they can find; in a liquid, this just happens to be the surface of a bubble. (This was rather neatly demonstrated by having bubbles in water flow through a small pile of 'glitter', the stuff that children put on their faces; you could quite clearly see the grains of glitter sticking to the underside of the bubbles.)
So far, all well and good; design your drug so that it is attracted to the surface of bubbles and you have a mechanism to transport it. The next tricky part is to get your bubble to the tumour site. The researchers simply bound each bubble to some iron (it was not clear whether the iron is bound to the drug or whether it is separate and gets bound to the bubble in similar fashion to the glitter above) and then used magnets around the site of the tumour to make sure that the bubbles were attracted to that site only.
The pièce de résistance? They used ultrasound over the site to burst the bubbles and so release the drug being carried along with them. The ultrasound effectively heats up the gas in a bubble by imparting energy to it, making the gas molecules vibrate more vigorously and move about much faster, and the resulting pressure eventually punctures the skin of the bubble, causing total loss of bubbelicity.
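As a back-of-the-envelope aside, and not a figure quoted in the programme, the Young-Laplace relation for the excess pressure inside a bubble hints at why a micron-sized bubble is such a delicate, highly pressurised thing. A minimal sketch in Python, assuming the standard surface tension of water and radii of my own choosing:

# Young-Laplace: the excess pressure inside a spherical gas bubble in a
# liquid is 2 * surface tension / radius. Water's surface tension
# (~0.072 N/m) is a standard figure; the two radii are chosen purely
# for illustration.
def laplace_excess_pressure(surface_tension, radius):
    """Excess pressure in pascals, for surface_tension in N/m and radius in metres."""
    return 2 * surface_tension / radius

for label, radius in (('champagne-sized bubble, 1 mm', 1e-3),
                      ('drug-carrier microbubble, 1 micron', 1e-6)):
    print(f'{label}: {laplace_excess_pressure(0.072, radius):,.0f} Pa')

The millimetre bubble carries a trifling 144 Pa of excess pressure; the micron one about 144,000 Pa, nearly one and a half atmospheres, which is why a comparatively gentle nudge from an ultrasound beam is enough to pop it.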
Now, this may seem to have little connection with the Large Hadron Collider at CERN, Brian Cox's particular field of interest, but in one way at least it does. People often bemoan science for science's sake and worry about the vast sums spent on basic research which appears to have little or no practical application and, while one may perhaps see the excitement of uncovering the innermost secrets of the universe in a bound pair of quarks and the Higgs boson, who would want to undertake, or fund, fundamental research into the behaviour of bubbles?
Me, for one!
Postscript
While doing my (fundamental) research on said Ms Czerski, I came across a little site, which is here, which at the moment appears to be a reaction to the EU's infamous 'movie trailer' intended to promote women in science; the site may, however, offer more encouragement or advice in the future.
The Brontës, TB and you've never had it so good
Very occasionally, I remove the cardboard from the broken windows and lying on my bed of long-haired stray cats and covered with sheets of the 'Daily Telegraph' and 'the Guardian', I gaze up in awe and wonder at the night sky, the tiny jewels twinkling through the haze of diesel fumes and charcoal smoke, and I remind myself of how lucky I am!
I was reminded of this by the Brontë sisters, who were born in the first half of the nineteenth century. I must confess to not having an awful lot of time for the literary endeavours of that trio of Brontës, or for that matter Austen or Gaskell, although I suspect that this has less to do with their feminine nature and more to do with a dislike of nineteenth-century English prose in general, although there are a few exceptions; I accept that it is good, I just do not read it for pleasure. What struck me about the Brontës was how short their lives had been, with the notable exception of Papa Brontë, who lived to be eighty-four.
Mama Brontë, no doubt worn out with six children, died at thirty-eight; Maria died at the age of eleven; Elizabeth at ten; Charlotte managed to make it all the way to thirty-eight, although her unborn child died with her; Branwell only managed to get to thirty-one, although his alcohol and laudanum addictions may well have had something to do with that; Emily only just managed to beat her younger sister Anne by a single year, thirty to twenty-nine. Out of eight members of a family, all but one failed to reach forty. It is likely that five of the children, Maria, Elizabeth, Branwell, Emily and Anne, died of tuberculosis, and Charlotte of complications of her pregnancy or possibly typhus.
It is often difficult to conceive of how much we owe to the likes of Jenner, Lister, Calmette and Guérin, Banting and Best, Florey and Chain, Jonas Salk, Christiaan Barnard* and all those unnamed chemists and doctors who have given us such an abundance of ways in which we might combat the infections and diseases to which the human body is prey. In little more than fifty years, so much of what killed our forefathers no longer kills us. Of course, it is not only drugs or surgical procedures, unheard of in my great-grandfather's time, which now save lives. Pasteur's pioneering work on micro-organisms led to the theory that such organisms cause infection and disease, which slowly raised the importance of public health and hygiene during the twentieth century in western industrialised nations and now forms a major part of foreign aid to developing countries. It is difficult to imagine that something which we take so much for granted that we do not give it a second thought, fresh, disease-free running water, is actually a luxury in many parts of the world.
For much of humankind's history, isolated historical examples notwithstanding**, life has been, to quote Hobbes, 'nasty, brutish and short'. Even in these so-called enlightened and technologically advanced times there are still large tracts of the globe in which Hobbes still applies, and yet we in the west continue to demand more with each successive generation. It is, of course, not the fault of the generations that came after mine that they expect more of life; my expectations were necessarily less than theirs are now because the wealth was not available to purchase all that you might desire and, perhaps more importantly, what is available now was not available then. We are rapidly becoming a society in which your every desire should be satisfied and you should, at the very least, expect to live forever!
Prior to the twentieth century, imagined views of future societies were largely Utopian in nature and had occupied moral and political philosophers since Plato; a dystopian*** view of a future society, except in a few isolated cases, 'Rasselas' by Samuel Johnson springs to mind, is almost unknown until after 1918. This is scarcely surprising: life for many in industrialised countries was still 'nasty, brutish and short' at the turn of the twentieth century, and no one had to imagine a less than perfect world; if it had been a dog, it would have bitten most of the population. With the exception of 'Brave New World', dystopian views of a future world mostly postdate the Second World War. It is easy to have a pessimistic view of the future of the world when all is sunshine and roses; to do so when the world is already grim is much more difficult!
In the end, so much depends on the continuing creativity and ingenuity of humankind, a ready supply of raw materials and an avoidance of conflict over resources. Personally, I do not hold out a lot of hope that any of these things will actually materialise. I think the best that we in the west can hope for is that it is not too cold in the Chinese concentration camps and that we can get used to a diet of rice.
* The relevance of the list, which is not intended to be comprehensive, is as follows:
Edward Jenner in 1796 made the first successful vaccination against smallpox. In 1979, WHO declared smallpox an eradicated disease; surely the success story of the twentieth century;
Joseph Lister introduced the first widespread use of antiseptic (carbolic acid) into surgery in the 1860s;
Albert Calmette and Camille Guérin, who began work in 1908 on the vaccine against tuberculosis which bears their name (bacillus Calmette-Guérin, BCG) and which is still the only one in widespread and effective use;
Frederick Banting and Charles Best, who engineered the first successful extraction of insulin from the pancreas, canine and bovine, in 1921, which led in a few short years to large-scale production and the long-term, effective treatment of Type 1 diabetes;
Howard Florey and Ernst Chain, for the development, from 1938 onwards, of penicillin, the first antibiotic in widespread use - Alexander Fleming gets no kudos from me, shared Nobel prize notwithstanding; despite having noticed it in 1928, he did little or nothing with it;
Jonas Salk who developed the first effective vaccine against polio in 1955, just in time for me to get one of the first shots;
Christiaan Barnard, who performed the first human-to-human heart transplant in 1967 in South Africa.
** Patrician Romans, Royalty, the nobility from around 1700 and merchant classes in Europe from around 1800.
*** A term first attributed to John Stuart Mill in the House of Commons speaking against the Irish Land Bill of 1868: 'they ought rather to be called dys-topians, or caco-topians. What is commonly called Utopian is something too good to be practicable; but what they appear to favour is too bad to be practicable.'
Tuesday, 9 April 2013
So, farewell Margaret. 'The lady's not for turning', that was your catchphrase.
So, Margaret Thatcher finally goes the way of all flesh, at last goes off to meet her maker, swings that last kick at the proverbial bucket. It is difficult to think of a more divisive political figure in British history; William I, Oliver Cromwell and Lord Liverpool, perhaps, but it is difficult to exclude Mrs Thatcher from that 'elite' class. Like Marmite on toast, you either loved her or hated her, in equal measure.
She has long been the butt of simplistic jokes and parody by comedians and social commentators keen to display their (champagne) socialist credentials and yet Mrs Thatcher, as with all historical figures who have an impact on the society in which, and with which, they engage, was merely a product of the particular circumstances which marked British, and to some extent western democratic, society at the time.
Britain had approached the 1970s in a spirit of optimism: counter-culture morality and ethics were being adopted in the mainstream; equality of opportunity was becoming a very real, if imperfect and fragile, possibility; Britain had rapidly left behind the after-effects of an economically debilitating war; industry was on the rise, forged in the 'white heat' of Wilson's 1960s technological revolution. However, as the decade wore on, the early successes of currency decimalisation, entry into the EEC (now the EU) and a growing sense of well-being began to be stymied by inflation climbing into double figures; by worsening industrial relations, as the workforce tried to claw back some, if not all, of the depreciation in their earning power with ever higher wage demands and the attendant industrial action, leading to strikes on a nationwide scale; and by unemployment, especially among the young, which reached alarming proportions.
These general conditions fed a disillusionment which was palpable; what made it worse was that it was being presided over by a largely inept 'socialist' Government. 'Sunny Jim' Callaghan was probably the most ineffectual Prime Minister of the twentieth century and presided over shortages, rationing and failures of public services on a scale not seen since the Second World War; living in Britain after 1975 had all the hallmarks of living in a third-world banana republic.
The Liberals (now the Lib-Dems) had ceased to be a viable political alternative since the Great War of 1914-18 and so Britain was faced with a stark choice at the General Election in 1979; to continue with a Government which was clearly unable to govern in any meaningful sense (the Tories had been ousted for much the same reason in 1974 in the wake of the national coal miners' strike) or to choose a more radical, but right-wing alternative. Unfortunately, for anyone with a glimmer of a social conscience, the nation chose self-interest above all else, or at least the 20-25% of 'floating voters' did, those not wedded to the ideals of a particular party.
The national press played an important role in Thatcher's victory in 1979, probably greater than in Blair's election in 1997. In truth, they could scarcely have failed to stand behind Thatcher and the Tories; the newspapers of the preceding couple of years had been filled to overflowing with tales of labour disputes, wildcat strikes, secondary picketing, clashes with the police and double-figure wage demands.
Thatcher in Britain and Reagan in the States were simply a manifestation of an increasing disillusionment with organised labour in the shape of a few very large and strong Trades Unions, a growing sense that it was 'every man/woman for himself' and to hell with social responsibility or conscience, a desire to return in short order to the earlier decade of supposed prosperity. Neither Thatcher nor Reagan caused the flowering of the 'me' generation; they just encouraged it by their policies.
Politics, and economics, are always a tightrope act. However, Thatcher could, quite justly, whatever I or you might believe, claim that she did indeed have a mandate, insofar as it is possible to have one in the constituency-based system of electing officials to office that we have in the UK, for change.
As a Trade Unionist, I regard the anti-union legislation of Thatcher's first two terms of office as, to a large extent, excessive; however, there was little widespread support, except among staunch Trade Unionists, for any watering down of the proposals. Dialogue had failed during the Callaghan years, and so Thatcher might be forgiven for not wasting her and the nation's time in trying to foster a dialogue with the Unions. Pragmatically, the Government, of whatever colour, needed to attempt to curb the worst excesses of certain TU leaders, and it is interesting that the so-called 'socialist' government from 1997 to 2010 hardly repealed any of the so-called 'union-bashing' legislation of the eighties. We do, after all, live in a capitalist democracy; certain compromises have to be made if we do not wish to foment a socialist or anarchist revolution ourselves.
Whilst the introduction of the Poll Tax caused widespread condemnation, demonstrations and, on occasion, riots, the system whereby local government collects local taxes was, as it is now, grossly unfair. I am forced to pay a high price for the property in which I live, and as a result I have to pay a higher local (Council) tax than someone in a similar property elsewhere in the country, because that tax is not determined by how much I earn, my ability to pay, or the extent to which I use, either in reality or hypothetically, local services, but merely by the (rateable) value of my property. The Poll Tax was misguided simply because it was not based on an ability to pay, merely a fixed rate per individual, which therefore disadvantaged the most vulnerable; yet I suspect that the memories of its introduction have considerably slowed the adoption of a 'local income tax', which surely cannot be beyond the scope of computer programmers nowadays.
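For what it is worth, the basic arithmetic really is trivial. Here is a minimal sketch in Python, purely by way of illustration; the flat charge, the tax-free allowance and the single local rate are all invented figures, and a single-band scheme is my own simplifying assumption, not a description of any real or proposed system.

# A toy contrast between a flat poll tax and a simple 'local income tax'.
# All figures below are invented for illustration only.

def poll_tax(income: float, charge: float = 400.0) -> float:
    """A fixed charge per adult, regardless of ability to pay."""
    return charge

def local_income_tax(income: float, allowance: float = 10_000.0,
                     rate: float = 0.03) -> float:
    """A single local rate applied only to income above a tax-free allowance."""
    return max(0.0, income - allowance) * rate

for income in (8_000.0, 20_000.0, 60_000.0):
    print(f"income £{income:>8,.0f}: poll tax £{poll_tax(income):>6.2f}, "
          f"local income tax £{local_income_tax(income):>8.2f}")

The point the numbers make is the one above: the flat charge bears hardest on the lowest earner, while the income-based version scales with the ability to pay.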
Margaret Thatcher's final 'crime' was to become embroiled in what became known as 'the Falklands War', although 'the Falklands Skirmish' might be a better description; as far as I remember it, the actual fighting lasted for less time than it took to make the journey from Southampton to Port Stanley. While she undoubtedly made much political capital out of the (foregone conclusion of) victory, it is difficult to see a viable alternative reaction to the Argentine invasion. Self-determination is a key tenet in any democracy, and the inhabitants of the Falklands, whatever territorial claims the Argentines may legitimately have had on the islands (about as strong, I suspect, as the British ones), wanted to remain British citizens.
We have now had two leaders, Thatcher and Blair, who have both had long periods in power in the past 30 or so years, and both have damaged their legacies badly by (a) adopting an almost dictatorial approach and (b) clinging onto power when reason should have counselled letting it go. It will be some time before an objective and reasoned assessment of both those leaders' legacies can be made.
Thanks go to E J Thribb whose masterly poetic style leads to the title today.
Saturday, 6 April 2013
Arwen, Irulan and planetary ecology
I was struck today by one of those comparisons you sometimes make between events or people that do not, on the surface, appear to have very much in common. The comparison I made was between Arwen (in Tolkien's 'Lord of the Rings') and Irulan (in Herbert's 'Dune'); both characters end up married to a 'hero' (Aragorn and Muad'Dib respectively) and yet scarcely appear in the respective novels as characters in their own right. Arwen is largely reduced to someone Aragorn uses to fend off Éowyn's attentions in Lord of the Rings* and Irulan is, in the main, reduced to quotes from her writings as chapter headings; both characters presage an ending to the novels which does not make much sense until you do indeed reach the end.
Of course, the two characters marry for different reasons, Arwen for love and Irulan for political expediency; however, it still feels quite strange to me that neither author felt it necessary to flesh out the characters. Arwen is simply a cardboard cut-out until you read the appendices, and then the parallels with the tale of Beren and Lúthien Tinúviel in the Silmarillion, which was not published until much later, become apparent. Irulan is scarcely much better; the chapter headings, which are quite obviously a device to provide more information and a perspective on the narrative, could have been written by anybody.
This may have, in both cases, been done as a 'literary device' by both authors, although the reason for it escapes me, and yet in both Jackson's trilogy of films of 'Lord of the Rings' and in the three-part TV mini-series of 'Dune' (Lynch's film is too short to do any kind of justice to the book), the directors or producers felt the need to expand on the characters; give them some depth. Most of the time, I am largely dissatisfied with adaptations of books which run to more than 300 or so pages; too much often gets omitted. Although I have issues with what was left out of both adaptations, the additions in these cases seem to me to be entirely appropriate and fitting.
That comparison aside, the books, though widely different in tone, style and subject matter, one 'high fantasy', the other sci-fi, do share at least one commonality; the well-being of the planet. Tolkien, raised as he was, in part, in the English countryside, had a great love of the 'unspoilt' England that was much more in evidence in the years prior to the Great War (1914-1918), although the landscape even then had already been blighted to some extent by the effects of the industrial revolution. It is scarcely an accident that evil in Tolkien's world is measured not only by the pain and suffering done to the other peoples of Middle-Earth but also by the damage that is done to the environment. Mordor is a volcanic wasteland, devoid of green; Isengard is surrounded by pits containing forges and breeding dens; the Shire, Tolkien's picture of an idealised 'Albion', is transformed after Saruman's expulsion from Orthanc and the hobbits' lengthy absence into a landscape of smoking sheds and desolation. Good, in Tolkien, is always represented by a love of growing things. Elrond, Galadriel, Thranduil and the Ents have their forests, their trickling streams, their wild flowers; Aragorn is well versed in herb-lore and the ways of the wild places in Middle-Earth; even Faramir is stationed in the Vale of Ithilien, with its woodland, ideal for ambush, and, of course, coneys!
Herbert, on the other hand, takes a more pragmatic approach, more in keeping with a writer of sci-fi. While Herbert is content to view Arrakis, the planet-wide desert that it is, as something natural, a product of sandworm activity, still he seeks to change it for the better; at least for humankind. Liet Kynes' work, using simple technologies to condense water from the atmosphere and so start a process of holding back the desert from selected areas, not only gives the Fremen a common goal, something to unite behind, long before Muad'Dib gave them a reason and a way to fight the Harkonnens, but is also a 'call to arms' for our own dilemma; the Sahara has been steadily advancing south since Roman times and we, so far, have done little or nothing to stop it.** Perhaps 'Dune' was not the first book to highlight ecological disaster and simple technologies to alleviate or overcome it ('Silent Spring' was published a few years earlier, albeit non-fiction) but it was probably one of the first to establish ways in which an entirely dystopian vision of the future could be avoided.
* I do not care what the consensus view is; I think Miranda Otto's performance in Peter Jackson's trilogy of films of Lord of the Rings is spot on. It is not easy to play the lovestruck teenager, especially when Tolkien himself did not make that good a job of it.
** There was an interesting idea some years ago, which never took off as far as I know, which involved building cities or towns from the dunes at the southern border of the Sahara to halt the march of the desert. Using a binder, e.g. concrete, to stabilise the dune on the south side and coarse grass to stabilise the dune on the north side, each dwelling would, in effect, be 'carved' into the face of the sand dunes.
Wednesday, 3 April 2013
The silly season comes early!
I do not usually comment on the news, either here in the UK or from around the world; there are far more vox-pop analyses of the news than anybody can usefully digest and I do not propose adding my tuppence worth into the ring. However, there are two stories, one which has been simmering for a while and one recently released, on which I feel I must comment.
The first is the laughable attempt by North Korea to posture itself into a corner from which it will have no alternative but to back down. While one can sympathise with the new leader (after all, any new dictator has to posture to a greater extent than their predecessor, and Kim Jong Un is no exception), one has to temper the posturing and sabre-rattling with a modicum, the merest soupçon, of realism and common sense.
Does North Korea seriously believe that China will come to its aid, as it did in the '50s, for all that North Korea thinks it espouses the same political philosophy as China? Reality check, people: it does not! China is now too far down the 'beholden to western capitalism' road to turn back and is probably only 10 or 20 years away from economic hegemony. Would they seriously think of jeopardising that for the sake of a potential war which would simply see a 5mm shift in the border one way or another, assuming North Korea is not annihilated in the process?
When all is said and done, the position is not the same as in the 1950s. The US, all of the military hype notwithstanding, is in a far better position to strike tactically without necessarily endangering combat troops than at any time since the Second World War, and the prospect of any kind of strategic war would surely do more damage than the North Koreans would be able to bear. It is difficult to imagine that China (or Russia, or even Iran) would risk anything other than minor tactical support for North Korea in the event that one or other of the sides involved should find, or fabricate, a reason to go to war.
There seems to be little reason for the North Koreans not to try to emulate the Chinese; China has demonstrated that a certain degree of pragmatism can pay substantial dividends in the medium to long term without destabilising the internal power bases, at least in the short term. I suspect that it will be a while yet before China marches down the same road as the Russians and allows the oligarchs to wield so much power.
There is only one minor snag with all of this; Kim just might be as barmy as everyone says, and thinks, he is. In which case, Heaven help us all for there is little you can do when the lunatics are in charge of the asylum!
The second piece of news I came across today was the report that the state of North Carolina has a bill before the legislature which seeks to establish a state-mandated religion in defiance of the First Amendment. I am not sure what the purpose of this bill is, except to waste some time and try to screw a few more expenses out of the state government. Even if it is passed by the state (both houses, I believe, are controlled by the Republicans), it will get overturned in the federal Supreme Court in about as much time as it takes to stamp a cockroach underfoot.
I do not live in North Carolina, I do not even live in the USA, so it makes not a blind bit of practical difference to me whether the state enacts a bill to demand that people are only allowed to worship 'the curly-wurly' or that everyone there must pray to the great God 'Snickers bar' three times a day while facing Raleigh. What I do get concerned about is how such tosh is viewed around the world by nations that might think that such a law or bill should be applied in their own jurisdiction; it does rather cut the moral high ground from under the US's feet when supposedly sane senators, or whatever, can even think about proposing such a bill.
When the USA was first founded, it was important to allow the states some independence to govern themselves but to over-ride state law with Federal law in order to (a) protect the fundamentals of the written constitution and its amendments and (b) make a bulwark against secession. Since the divorce between the political arm and the religious arm(s) is explicitly stated in the First Amendment, it seems that the only course open to the bozos in Raleigh (nice English name, that!) is to propose an amendment to the Amendment, or to secede; I wonder if the rest of America would even notice a secession. At the very least it would raise the national IQ average by a significant margin.
Grünewald, Dürer and the vagaries of life
Chance can be enormously fickle sometimes. Why does 'X' win Euromillions or the Powerball jackpot running into hundreds of millions of Euros or Dollars and 'Y' has to survive on £53 (c$80) per week because they are disabled or old? Is it just chance, luck, or does one make one's own luck? Is it my fault that I am poor? Or is it just the way the cookie crumbles? It seems to me that there is no easy answer, black or white, right or left, and even the divine had few, if any, answers; 'For ye have the poor always with you' - KJV, Matthew 26:11 - which does seem to be a mite strong on the 'ducking the poverty issue' front. On the one hand, there will always be poor people suffering in this life while on the other hand 'It is easier for a camel to go through the eye of a needle, than for a rich man to enter into the kingdom of God' - KJV Matthew 19:24 - which is not what I would call a 'win-win' situation for the average punter!*
I was reminded of this not because of the wrecking ball that the Conservatives (and the Mugwumps of the LibDems) are sending through the UK benefits system at the moment, although that is extremely dire, but rather how two broadly contemporary artists have such wildly differing legacies; works that survived their deaths and have made it down to the present day intact and largely attributable after nigh on half a millennium.
Mathis Gothardt Neithardt, known as Matthias Grünewald, although it is not clear whether they were in fact two distinct artists working on the same project at the same time**, worked out of the Main/Rhein region of Germany between about 1503 and 1528 or 1532; only 13 paintings survive and about 25 extant drawings can be confidently attributed to him. The works are exclusively religious in nature, which was not unusual for the period. It is said that some works were lost in the Baltic Sea en route for Sweden as war booty but I cannot track down either the exact number or a description of the works in question. Like the numbered Swiss bank accounts opened by high ranking Nazis both before and during the war, we are unlikely to find out the contents anytime soon.
Even allowing for the flotsam and jetsam in the Baltic, this seems to be an unusually small number of extant works in comparison with other artists at the turn of the sixteenth century, although Grünewald did eschew the trappings of the renaissance. While Grünewald did experience bouts of poverty during his life, of which we know very little, and he may not have recovered the full price for the Isenheim Altarpiece, he appears to have been a respected member of the artistic community and was unlikely to have suffered from a lack of commissions any more than any other 'old master' at the time.
Contrast this, if you will, with Albrecht Dürer, who was active from about 1490 to 1528: nearly 200 paintings in oil and watercolour (Dürer was perhaps the first 'en plein air' watercolour landscape artist)***; about 370 woodcut prints, not including the illustrations for the four books on proportion nor the treatise on fortifications; close on a hundred engravings; a small handful of etchings and drypoints; and over 900 drawings, ranging from the quick pen and ink sketch done with a few lines (e.g. 'mein Agnes') to the highly finished pen and ink working drawings heightened with white which form the so-called 'Green Passion' (because they are drawn on green tinted paper).
The two were wildly different in style: Grünewald stayed true to his late Gothic heritage, while Dürer wholeheartedly embraced both the humanist and Lutheran doctrines of his friend Willibald Pirckheimer and of Philipp Melanchthon, and the Renaissance styles of Mantegna and the Venetian colourists; he visited Northern Italy twice at the start of the sixteenth century and was the embodiment of the 'northern renaissance' man. It is, however, still difficult to explain the wide disparity in the extant works.
Perhaps it was only luck or location. Dürer was closer to 'the action': Nürnberg lay nearer to Northern Italy and the revolution in art taking place there than Grünewald's Rhein/Main valleys. Perhaps his decision to concentrate so much of his output into engravings and woodcuts meant that so many of his prints were in circulation that the law of averages ensured at least one of each print would survive. Perhaps what would seem to be an obsession with never throwing his working drawings away meant that Hans von Kulmbach, one of his assistants, had the opportunity to catalogue them all; it is likely that a good proportion of them passed to Dürer's brother, but he appears to have sold them, since they do not appear in either his wife's or his own estate after their deaths. Some at least (more than a hundred) went to the Albertina in Vienna, possibly via the collection of Willibald Imhoff, who appears to have been the owner of many of the watercolours, the 'hare' and the 'roller' included.
Perhaps, in the end, it was a combination of so many things, the skill Dürer enjoyed but also his location and his willingness to travel, that gave us such a rich bounty of treasures: the multiple copies of woodcuts and engravings; the society that Dürer moved in, amongst some of Luther's most staunch supporters, the intellectuals, collectors and burghers of the city; a desire to travel which saw him visit, for months at a time, the Netherlands, Basel in Switzerland and, twice, Venice.
Perhaps Grünewald was just not modern enough to make his own luck and his own legacy or perhaps he just did not care.
The answers to the recent quiz:
1. The first line of the song 'Question' by the Moody Blues
2. Likewise, the first line of Bob Dylan's 'Blowin' in the Wind'
3. A novel by Primo Levi
4. A common tongue twister
5. A non sequitur; the answer is 'None, phone support; it's a hardware problem!'
6. A poem by me without the (necessary) punctuation
7. The chorus of Tom Jones' 1968 hit single
8. Sonnet no. 18 by William Shakespeare (or Edward de Vere, Earl of Oxford, or Kit Marlowe, or Ben Jonson, or Uncle Tom Cobley and all, etc etc)
9. Track 2, side 1, 'Chicago Transit Authority' by Chicago
10. Short story collection by Heinrich Böll
* Be honest. Has it ever occurred to you that Matthew wasn't perhaps the life and soul of the wedding party at Cana?
** Unlikely, as the individual panels appear to be very consistent in style, but possible, given that the project was the Isenheim Altarpiece, which measures some 11' x 19' and comprises 11 panels. It appears that the work took a little over two years. I would urge anyone visiting South West Germany or South East France to pay a visit to the Unterlinden Museum in Colmar and take a peek at the most harrowing and disturbing vision of the 'crucifixion' in western art.
*** One altarpiece (the 'Heller') is known to have been lost to fire in the eighteenth century; a copy by Jobst Harrich still exists. It is a pity that the original is lost, for the preparatory drawings for it (most notably 'betende Hände', 'praying hands'), in ink on blue tinted paper heightened with white, are some of the best preparatory drawings in Dürer's oeuvre.
George Lodge, Poisson d'Avril and Merlin
I should perhaps explain a couple of things from my last post. Things that I forgot to do as footnotes at the time.
'Old man' was Phillip Glasier's nickname for George Lodge, not mine. Phillip had a painting of a gyrfalcon, some 3' x 2' at least, hanging on his lounge wall, and had known Lodge since the '20s, I believe. Lodge, with his knowledge and practice of falconry, was perhaps the finest painter of birds of prey that any century has produced; in a few lines he could capture the essence, the 'jizz', of any captive falcon, so much so that you could identify it by name.
(If you want to see what PG had on his wall go here and scroll down to the first 'Xmas card'. It's essentially the 'same' picture but in winter instead of summer.)
Lodge was a highly accomplished landscape and wildlife artist but truly excelled only at birds of prey; I only wanted the volume of raptors from Bannerman's monograph. Sadly, from my point of view, I could not afford the whole set of 12 and baulked at breaking a complete collection for the sake of my own personal 'wish list', although the assistant was willing to do so. The set, to the best of my recollection, was on a shelf about 10' from the floor and was well over £300! (And there was no way it would end up in my overnight bag, even if it would fit and I could carry it; it's a long way from Inverness to London!)
Winkler's collection of the drawings of Albrecht Dürer is unique; hence the price. The books (4 in all) were published in Berlin between 1936 and 1939; the original plates, and a few of the actual drawings, were destroyed in the allied bombing raids. Only facsimiles of the originals are available now, and these are, in essence, illegal: an infringement of copyright. The books share pride of place with my signed copy of 'Sketches and Studies...' (no 415 of 500) by Ray Ching.
I was singularly disappointed with the April Fools pranks this year; or at least the 50 or so that I heard about. I spent the first hour of that day pinning paper fish to the backs of hapless shoppers in Tesco as they queued for the checkout. (Poisson d'Avril; look it up on Wiki!)
I have recently been re-reading Malory's 'Tales of King Arthur' (Caxton, the famous printer, was responsible for calling it 'Le Morte d'Arthur') and something struck me as slightly strange. In contrast to Malory's original tales, synthesised around 1470, late twentieth and early twenty-first century renditions of the tales centre not on the once and future king (a steal from Malory: 'Hic iacet Arthurus, rex quondam rexque futurus') but on his mentor, Merlin. From (Lady) Mary Stewart's trilogy and Robert Nye's ribald and bawdy 'autobiography' to the recent five-season BBC TV series and the mini-series of the late '90s (in which Arthur scarcely appears at all), Merlin is the main attraction.
Part of Arthur's fall from 'grace' must, I fear, be laid at the door of scholarship. While Arthur was firmly trapped in the medieval prose of Malory, the poesie of Chrétien de Troyes and the epic poetry of Wolfram von Eschenbach, he could be seen by later generations as being locked into the code of chivalry, the concepts of courtly love and the search for the Holy Grail. As scholarship on mid to late Anglo-Saxon and Welsh literature improved and the roots of the tale, whether based on fact or not, extended deeper and deeper into the past, the 'magic' of Arthur's reign was somehow lost, or at least diminished. It is not difficult to conceive that, outside of a few isolated incidents where the waves of Angles, Saxons and Jutes were halted for a time, there was little organised resistance to the invasions from the east. At best, it seems likely that any historical 'Arthur' would have been confined to Wales and the west of Cornwall, since that is where, traditionally, the Britons had retreated to following the Roman invasion by Claudius in 43AD. By the time of the Roman withdrawal in about 410AD, the majority of England was Romano-British and, as far as it is possible to glean from what sources remain, Gildas and possibly Nennius and, later, Bede, what had been the province of the Emperor, via his Governor, had fragmented into petty kingdoms.
If Arthur is thus to be removed from the 13th- and 14th-century tales, when Christendom reigned supreme, and replanted into an earlier century, when the spread of Christianity was patchy at best, then the bulk of the Arthurian legend, the quest for the Holy Grail, was likely to be fictitious; the tale of Launcelot and Guinevere was simply a rehashing of the triangle between Mark, Tristan and Isolt; and therefore there isn't a lot left except a round table and Merlin the Magician.
It is, I think, not beyond the bounds of conjecture to see in Merlin's ascendancy the legacy of the late '50s resurgence in 'magic' and the 'supernatural' in the form of one 'man': Gandalf. In many ways, Gandalf plays exactly the role of Merlin to Aragorn's Arthur; he provides wisdom, counsel and the odd piece of magic along the way. While Aragorn is the kingly, courageous face of the victory, as Frodo is the face of the little, common man, so Gandalf is the face of the Maia, the angel, the miracle worker; the wielder, like Merlin, of a power beyond understanding.
Interestingly enough, Tolkien wrote an (unfinished) alliterative poem concerning the Arthurian legend in the early '30s. It appears to have been abandoned after 'The Hobbit' was published and work started on 'Lord of the Rings'. It is due for publication in late spring 2013 (edited, as always, by his son Christopher).
Finally a small quiz. Answers on a postcard please to the usual address:
1. Why do we never get an answer when we're knocking at the door?
2. How many roads must a man walk down before he is called a man?
3. If not now, when?
4. How much wood would a woodchuck chuck?
5. If it takes 10 men, 10 hours to dig a trench 20m long, 1m wide and 2m deep then how many software engineers does it take to change a light bulb?
6. Why does it take her so long to get ready to go out?
7. Why, why, why, Delilah?
8. Shall I compare thee to a summer's day?
9. Does anybody really know what time it is? Does anybody really care?
10. Wanderer, kommst du nach Spa?