Monthly Archives: September 2013

Would you serve wine to a pregnant woman?


So imagine you are tending a bar. A conspicuously pregnant woman asks for a large glass of Chardonnay. You do nothing. “I said can I have a large glass of Chardonnay, please … today!” she repeats, obviously irritated by your lack of action. You hesitate, then summon up a considered response. “Is it for you or are you expecting a friend?” “What? Just give me the wine.” “Are you sure?” you ask, irritating her even more. “The wine please. Now!”

Your next line rocks her back: “I refuse. I am not going to have it on my conscience. I don’t want to serve alcohol to a pregnant woman. The National Health Service advises that pregnant women should drink no more than one or two units of alcohol once a week. That’s the equivalent of a 125-mil glass of wine. Our large glasses are double that. So: no, you can’t have your wine. Can I serve you a soft drink?”

That’s pretty much what happened in Liverpool’s Pi bar last week. The pregnant customer – or prospective customer – demanded her wine, was refused again and walked out in embarrassment. The bar’s management later apologized for their bartender’s overzealous paternalism, even though, technically, he was within his rights. The management reserves the right not to serve customers, so he was merely exercising that right. All the same, the incident forces us to wonder: was the bartender morally right? Should he have assumed the position of moral guardian? A similar case occurred earlier this year in Australia. Under Australian law, bartenders are bound to serve customers unless they suspect a customer is intoxicated, though they still have every right to inform pregnant customers of the dangers to themselves and their unborn children.

Here’s the case for the bartender’s defence. He or she was a thoughtful, caring, considerate human being, obliging and accommodating, but mindful of the dangers to which the customer would be exposed should she drink the wine. Unselfishly, the bartender offered helpful advice on the dangers of alcohol and the potential damage to the unborn baby. The refusal was a generous, compassionate and altruistic act by a person who exemplifies all the wholesome qualities of a culture in which people cooperate for the common good. After all, societies are based on collective endeavour, not conflict. The case against the bartender is simpler: it was none of his or her business, and the impromptu policy of restricting the freedom and responsibilities of customers was arrogant and paternalistic — reflecting all that’s wrong with the cradle-to-grave nannyism that characterizes Britain today.

On reflection, the bartender could perhaps have opted for a slightly different approach: issuing friendly advice, but emphasizing that he wouldn’t prevent the customer buying the wine. Then, if the customer insisted, the bartender would have a clear conscience. And, if he couldn’t bring himself to pour it, he could call a colleague. But these options fudge the issue: does a bar have the right to assume the position of guardian, or is it just there to sell booze? It’s no use comparing this to the smoking ban, which is only partly designed to protect would-be smokers; it’s also there to support the comfort and wellbeing of non-smokers. Car seatbelts? In the UK, the compulsory wearing of these was debated for twenty years before being drafted into law in 1991. The moral debate concerned whether you should advise, counsel, recommend and provide guidelines for the benefit of citizens, or just compel them to take care of themselves. The same question can be asked of many other aspects of social and personal life where harm is involved. Do we, as individual controllers of our own destinies, have the right to engage in behaviour that we know is potentially, or actually, damaging to our wellbeing? Or is society under an obligation to protect us from ourselves? We each have our own answers. My guess is that many of you, placed in the position of the bartender, would have poured the wine and moved on to the next customer, thinking, “none of my business.” But maybe it is our business.

The World Health Organization defines public health as: “Organized measures (whether public or private) to prevent disease, promote health, and prolong life among the population as a whole. Its activities aim to provide conditions in which people can be healthy and focus on entire populations, not on individual patients or diseases. Thus, public health is concerned with the total system and not only the eradication of a particular disease.” Maybe our seemingly intrusive bartender in Liverpool was in fact just a principled advocate of public health.


The niqab controversy and the Nairobi massacre: unrelated but linked by perception

We can understand how global conflict provides a model, not just for British Muslims, but for Muslims anywhere. Believing you are the victim of an oppressive, prejudicial system makes sense if you accept that you are subject to an unjust exercise of force that forms part of a wider pattern. Some have suspected that the renewal of Muslim identity is the effect rather than the cause of conflict in Britain; there is evidence that forms of allegiance focused on place, nationality, class or profession “can appear to lack an encompassing world-view and are impoverished … Islam [is] clearly the most powerful and reaffirming element,” as the writer Sofia Chanda-Gool puts it.

A college near where I live recently retracted its ban on the niqab, the veil covering all of the face apart from the eyes worn by some Muslim women. The college initially cited security reasons for prohibiting students from wearing the garments, and then reversed its policy. As readers will know, this made national news and reheated the argument over Muslim dress. A great many commentators, including several prominent Muslim women, oppose the wearing of a garment they believe signifies the subordination of women. My view is that women should be able to wear the niqab or even the burkha (the long, loose garment covering the whole body from head to feet) if they wish. They pose no threat to me, or anyone else, and, if a young woman wishes to clothe herself in this manner, that’s her business. In my experience, Muslim women will disclose their faces if the situation demands. I have seen them unveil for airport security officers and in banks, for example. The fact remains: many non-Muslims are hostile, resentful and suspicious of women who wear such clothing. Why?

Call it synchronicity – the simultaneous occurrence of events that have no discernible causal connection – but the events in Birmingham will, by some weird mental alchemy, appear significantly related to what has happened in Nairobi, Kenya, over the past few days. The militant Islamic group al-Shabaab has killed at least 40 people and held others hostage in a shopping centre. It’s thought Muslims were allowed to leave safely. Some non-Muslims who tried to talk their way past the fighters, who were armed with grenades and AK-47s, were told to identify the mother of the Prophet; those who could were released; those who couldn’t were shot on the spot. The attackers are from Somalia, which shares a border with Kenya. At the time of writing, their demands are not known, though it is thought the attack is part of jihad, the struggle against unbelievers on behalf of God and Islam.

The Muslim students at Birmingham Metropolitan College may have no tangible connections with events in Nairobi – but, whether they like it or not, they are connected. Reactions in Britain and elsewhere in Europe to the wearing of niqabs and burkhas can’t be understood in isolation. Over the years, the country has witnessed all manner of exotic clothing, some of it from homegrown youth cultures, some imported. It’s also witnessed flamboyant body ornamentation like tattoos and piercings. The British often look askance at these decorations, but they assimilate them. Niqabs are proving more resistant to assimilation. The reason lies in the garment’s associations: the niqab is a symbol as much as an item of clothing. The Daily Telegraph has described it as “a symbol of segregation,” though I think this is a misunderstanding. Others think the niqab is a symbol of women’s oppression. Perhaps. But my feeling is that perceptions such as these would not elicit the kind of ferocious response we’ve seen. Wearing Muslim veils is not a fashion statement, but a visible signifier of a person’s faith and commitment. Global events, though, mean it will be understood by those outside Islam as a sign of association. Islamic renewal may well be, as Chanda-Gool argues, reaffirming, and I still support the right of Muslim women to wear the clothes they wish, though I fear global events could leave them frighteningly isolated. Islamophobia, the hatred or fear of Islam or Muslims, feeds on global events like the Nairobi massacre. Its victims are innocently removed from the conflict, but linked to it by a perceptual thread.




Give a dog a bad name … or another chance?

Marlon King vs Antwerp

Do you believe in rehabilitation? I mean after prison: do you think training, therapy and other methods can restore an ex-offender to a normal, functioning life after a period behind bars? I do. Not so much because I have great faith in the power of our probation service or cognitive behavioural therapy, but simply because the alternative is so defeatist.

So why am I writing about this? Two reasons. This week thousands of probation workers joined nationwide protests to claim that public safety will be jeopardised by the Government’s plans to transfer the community supervision of most former offenders to private companies. Britain’s National Probation Service has been around in one form or another since 1907, but it faces the prospect of being dismantled and having its functions contracted out to private companies, such as G4S and Serco. Probation officers’ unions argue that handing the responsibility for monitoring offenders after release to private firms will not only put communities at risk, but will threaten jobs in the probation service.

The argument would have a lot more force if the National Probation Service were doing a bang-up job itself. The truth is that it is not: six out of ten people who leave prison are re-convicted within two years. This is far from the worst recidivism rate in the world, but it’s not encouraging either. By contrast, the recidivism rate for prisoners in Norway is around 20 per cent, while about 67 per cent of America’s prisoners are re-arrested and 52 per cent are re-incarcerated. A recidivist, by the way, is a convicted criminal who breaks the law again after punishment.

Marlon King is a recidivist. He is also a professional footballer. He was released from prison in July 2010, after being sentenced to eighteen months for sexual assault and assault occasioning actual bodily harm; his employer Wigan Athletic sacked him after his conviction. He was placed on the sex offenders’ register for seven years. Earlier this year, he was arrested in connection with a hit-and-run incident which had left a man injured. This week, he signed to play football for Sheffield United of League One. Many fans have reacted furiously, insisting that they will return season tickets and boycott games while he remains on the club’s books. King has experienced this before: immediately on his release he played for Coventry City amid fans’ protests. Interestingly, Birmingham City fans did not react when he moved to their club in 2011. You can see Sheffield fans’ point of view: King has a long line of offences dating back to 1997; he is, in many ways, a model reoffender – and by “model” I mean a representative example, not a good example. Sheffield fans have decided he is beyond redemption. Are they right?

My feeling is that King knows only one way to earn a living. He has played professional football since 1998 and, at 33, has only a few competitive years left in the sport. So far, he has shown few signs of a successful rehabilitation. But if there is a sliver of hope, it lies in football. The sport could still be his salvation. Forgive the pious tone: readers will know I am a practical man and I am still being practical. I don’t know what King would be doing now were it not for his ability to play football. I do have strong suspicions, though. And I think anyone who is familiar with his life will share those suspicions.

He’s certainly no role model, but nor is any footballer, no matter how virtuous they may seem. King is just plying his trade in an industry that does not demand impeccable moral qualities or spiritual uprightness: footballers are not high-minded, right-thinking, incorruptible, scrupulous, guiltless, respectable people; they just play football. But that argument can be saved for another time. My point here is simple: if, as a nation, we have decided that justice is important, we should pay attention to one of its central precepts: fairness of treatment. King has served his sentence and should be allowed to resume his professional life as best he can. Football may yet be his method of rehabilitation. Readers might reply, “don’t hold your breath,” but I am certain this is a better option than the alternative. Ex-offenders are stigmatized in the sense that they’re labelled as worthy of disgrace or social disapproval, and they find it difficult to re-integrate into society. That alone accounts for the return of many to prison: they just can’t find work or any kind of acceptance. It’s difficult to shrug off a bad reputation, even if it’s unjustified. We have an expression for this, of course: give a dog a bad name and hang him. King is fortunate enough to be granted a chance to redeem himself. He may not take that chance. But I for one prefer a world in which chances are offered and spurned to one in which they are not offered at all.

Twitter is selling YOU


You have a Twitter account, right? Silly question: of course you have. You and about 200 million others, at the very least. The microblogging phenomenon has only been around since 2006, but, in a sense, it already compares with television and the internet as a media innovation that changed the way we spend our waking hours. A couple of weeks ago, I wrote about how conceptions of privacy have changed so drastically in recent years and how, in many ways, we no longer understand our private lives in the same way as previous generations did. Twitter has played no small part in this: with Facebook, it has turned private lives inside out – encouraging us to reveal details of our lives that other people find either fascinating or slightly less than fascinating, but never, it seems, dull. We devour information about what other people are doing or thinking or intending to do. What people are doing at any given moment may not seem very important, but we value this kind of information. Twitter has enriched millions of lives. Can it enrich them even more – this time with hard cash?

Twitter has filed paperwork for what the world of finance calls an initial public offering (IPO), which means it will invite anybody with enough money to buy a chunk of the company. Appropriately, it announced the news in a tweet. This means that you or I or anyone else can become a part owner of a company that is already part of our lives and engages all of us for at least a portion of our day. Last year, Facebook floated on the stock exchange. Its founder Mark Zuckerberg had put this off, probably fearing that he would have to surrender his hoodie credentials and become a corporate head, answerable to his shareholders. Zuckerberg seemed concerned that people would stop thinking of Facebook as a cultural service provider and start seeing it as a profit-driven business. The initial price per Facebook share was $38 (£24); it is now about $44 – an increase of nearly 16 per cent in around 18 months, which is not bad, despite a rocky start. Encouraged by this, Twitter is following its networking cousin into the market.

Twitter seems to be everywhere, all of the time: people are always tweeting, or reading tweets, or retweeting. But there are actually three times as many Facebook users as tweeters. Twitter’s revenue is also lower than Facebook’s was in 2011, Facebook’s final year as a private company. So there are bound to be suspicions when it arrives on the market. All the same, the sheer prominence of Twitter in contemporary culture will persuade investors that this is a company with a future. Won’t it?

Twitter has been revolutionary. But the question investors will ask themselves is: will it be revolutionary like television, or revolutionary like ITV, Britain’s first commercial television company? Since the 1950s, tv has grown into arguably the most influential innovation of the twentieth century (I’ll accept a counterargument from advocates of the internal combustion engine). It’s adapted to changing environments and, in the process, changed us in myriad ways. ITV launched in 1955. The BBC was the national public broadcaster and, as such, was funded by licence payers, not advertising. ITV’s remit was more populist and it operated as a commercial organization, charging for advertising spots between its programmes. It shared the market with the BBC until 1964, when BBC2 came into being. ITV’s market share was shaved a little in 1982 when Channel 4 arrived, but in the late 1980s and 1990s it was thrust into an open market with any number of satellite and digital channels all competing for advertisers. And it’s struggled ever since. So what is Twitter? A unique medium like television, or a service that has the market to itself, but only for the moment?

Will we all be tweeting in five years? Probably. But in ten? And beyond? Twitter may be, like television, a medium that morphs with cultural changes, or it may be just one service provider that has caught the zeitgeist – and zeitgeists change. Twitter has no doubt already started planning for this possibility. It has, for example, the video-sharing app Vine and Twitter Music, the music-discovery service, and it will probably launch new services. But the product Twitter will be putting up for sale is actually you. OK, you see yourselves as users, but, as far as advertisers are concerned, you are potential customers – and 200 million of them make an attractive market. Unlike tv stations, Twitter has no portfolio of programmes, such as Corrie, The X Factor or Downton Abbey (all ITV, of course): it just has people who like tweeting and are liable to be influenced by ads.

At the moment, none of us minds the occasional pop-up; after all, it’s a free service. Twitter currently reckons it pulls in about 380 million quid a year from advertising. But, once on the stock market, Twitter will be under pressure from shareholders to pull in as much revenue from advertisers as it can and this could affect the experience that tweeters currently find so engaging. Would Twitter dare risk alienating users with too many ads? This is essentially the question potential investors will be asking themselves.

Celebrity afterlife

Film immortalizes more surely than human memory. This week sees the release of two films, each dealing with the life of a dead person on whom we confer enduring fame. Diana, as we all know, is the already-panned biopic focusing on the last two years of the Princess’s life. Rush is about James Hunt, the F1 champion, who led an epically hedonistic life and died of a heart attack in 1993, aged 45. Diana died in 1997, aged 36, after a road accident in Paris. But they both live on in the popular imagination, not in a morbid kind of way, but in a spirit of reverence and, in Diana’s case, adoration. We imagine Diana as everlastingly radiant, not as the 52-year-old she would have been had she lived. And, while Hunt would have been around the same age as Bruce Springsteen, David Bowie and Jeff Bridges – all still relevant figures, of course – we think of him as the rakishly handsome roué he was in the 1970s.

Diana and Hunt are not alone: our imaginations are full of famous figures who seem as real and relevant today as they did when they dominated the headlines. There are many, many more famous characters whom we think about, not as historical figures, but as contemporary presences. “Our contact with celebrities is so limited that we view them as mirages until the one event that restores them their real physical presence, their deaths, the moment of our greatest intimacy with them,” writes the American scholar Daniel Harris in his 2008 essay “Celebrity deaths.” Harris’s argument is that the death of celebrities is “the ultimate democratic epiphany” in that, in a sudden moment of revelation, their demise reminds us that, despite their status, they are “as liable to physical misfortunes as the best of us.”

The reaction to death serves to reinforce what Harris calls solidarity, by which I presume he means a unity or harmony that endures long after the death itself. Posthumous exposés may lay bare aspects of a celebrity’s life that change our evaluations, but a dead person can’t actually do anything to alter a bond forged by death. Marilyn Monroe may have set a deplorable example of ostentation and promiscuity in the 1950s, but on her death she was beatified. Indeed, later revelations made her seem more a victim than she ever did in life. Elton John and Bernie Taupin memorably used T. H. White’s 1958 phrase “candle in the wind” to capture her fragility in their 1973 song; they modified the lyric in 1997 to eulogize Princess Diana, who was also worshipped more in death than in life.

Norma Jeane Mortenson may have died, but Marilyn lived on, making hers the first death to lead to a renewal and, for this reason, the first celebrity death. (James Dean died earlier, in 1955, aged 24, and his image was borne on countless tee-shirts and posters. But his life was never probed and exhibited, and he was respected as much for the postwar rebellious spirit of youth he personified as for the man himself.)

Wheeler Winston Dixon, a professor of film studies, observes how images of dead celebrities become frozen in time, surrounded with manufactured fantasies, immune from aging. The everlasting image of Marilyn, who, like Diana, died aged 36, is of a lucent-eyed, smolderingly vivacious and affectingly shallow blonde. Her depths were plumbed only after her death. Hers was a death that guaranteed immortality. And there were others: Jimi Hendrix (1942-70), Elvis Presley (1935-77), John Lennon (1940-80) and Tupac Shakur (1971-96) were all sanctified in a secular sense. “Any negativity [about their lives] has long been digested by the popular culture – and they’ve stood the test of time,” writes historian Robert Klara.

Helping them stand the test are corporations with interests in resurrecting them via film, music and merchandise. Digital technologies have facilitated their appearance in advertising and, in the case of Frank Sinatra (1915-98), on stage – in the form of a moving holographic image. All have been the subjects of biopics, in Diana’s case several times over. Her death started a cycle of renewal as writers, film makers and corporations revived not just her image, but her existence in any exploitable form. Journalists Ross D. Petty and Denver D’Rozario have produced a cold-hearted analysis of the bonuses offered by the departed: “Living celebrities are both expensive and risky … Deceased celebrities have the advantage of being both less expensive and less likely to suddenly lose popularity.”

Exposing our private parts

Keith Lemon thought

Privacy. Has it vanished? Is there part of your life that you jealously protect, don’t want observed or discussed with other people and restrict to yourself and perhaps very close confidantes? Or do you live a life that’s pretty much open to inspection by all and which you’re happy to share with others, even people you don’t know and will probably never meet?

In the 1980s, when Through the Keyhole launched, it was a daring innovation: the host Loyd Grossman led viewers into the homes of famous people, scrutinizing the décor and furniture in an effort to disclose aspects of their character. The show was predicated on the intellectually respectable assumption that the physical places in which people live offer a reliable reflection of aspects of their “real” personality rather than the public persona they present to their audiences. It was a legitimate invasion of privacy and offered viewers a rare sight of the largely hidden side of the rich and famous.

Last Saturday, ITV revived the concept, replacing the vowel-strangling gastronome with “Keith Lemon,” alter ego of Leigh Francis. Unsurprisingly, the show removed any intellectual pretensions or ingenuity. The formula was camped up, but the pleasure it offered viewers was essentially the same.

At the time of the original series, most people would have felt slightly uncomfortable about wandering into the homes of other people and poking around their personal belongings. But only slightly. And when viewed through the filter of television, the whole experience seemed completely wholesome. The beauty of the show was that it effectively turned us into shameless peeping toms. No one felt guilty about invading others’ privacy.

Since then, we have come to have less respect for other people’s private lives. Celebrity culture is founded on our curiosity: we don’t just want to know about other people’s private lives – we demand they don’t have private lives at all. We insist on having access to all areas of their lives. And, in exchange, we’re prepared to share our own lives. Facebook, Twitter and other social media have painlessly removed any semblance of privacy – or perhaps, more accurately, they have turned it inside out. Many people provide minutely detailed logs of their daily lives, complete with accounts of their own views, opinions, feelings, emotions and all kinds of personal states that they wouldn’t have dreamt of discussing in public in the 1980s. In recent decades the old-school privacy has receded. Television has both initiated and responded to this. Just look at the Jeremy Kyle Show: people clamour to appear on telly to reveal the most intimately embarrassing details of their lives in front of 1.5 million viewers.

Privacy has been under assault in all sorts of other ways: CCTV cameras surround us, many of our newspapers and magazines are dedicated purely to discovering dirty little secrets, and credit card companies store an astonishing amount of detail on us. And we don’t seem to mind; we just accept that today’s society is like a vast panopticon – a circular prison in which prisoners can at all times be observed.

We’re both parts of and creators of a voyeuristic culture: we neither object to being watched and infiltrated, nor mind admitting that we enjoy watching and infiltrating others. Ravenous for information on other people, not just celebs but anyone we care about, we’ve become nosey parkers. The logic seems to be that, if you don’t probe others’ lives, you can’t really care about them at all. No one, it seems, feels embarrassed about tweeting the kind of information that would have made them squirm a few years ago.

The new show, in this sense, catches the zeitgeist much more than the original did. Back in the 1980s every scene set a question and we, assisted by Grossman and, later, the recently deceased David Frost, were invited to supply an answer. Lemon is less complex. The problem is: does the new show still have the power to surprise? After all, part of the pleasure of the first show lay in the little thrill of penetrating someone else’s private domain. Now we know full well the homes may be owned or rented by someone else, but we also know the occupants are allowing cameras free entry because they have to: they are just fulfilling their side of a bargain. That’s part of the deal in celebrity culture: anyone with aspirations to become a celebrity has to surrender their private life. In a way, we all surrender our private lives.

Consumers today insist on a constant stream of information and, if they don’t get it, they lose interest. Once that interest has gone, the celebrity is effectively consigned to oblivion. This is a problem for the new show: it’s going to have a tough time presenting us with anything new; so it can’t really surprise, still less shock us, in the way the Grossman show managed. We’re no longer peeping toms who need our pangs of guilt assuaged. We’re inquisitive, intrusive, snooping eavesdroppers, and not the least bit embarrassed by our nosiness.