In this edited extract from 1966 and Not All That, Sanaa Qureshi discusses the relationship between football and nationalism, and whether football can be used subversively to achieve social justice.
1966 and Not All That (paperback + free ebook and free shipping to the UK) is currently half price as part of our World Cup sale, which runs until England are knocked out of the tournament!
“I enjoy making revolution! I enjoy going to football!” — Antonio Negri
Despite an increasingly globalised world, where, in the last ten years, football clubs such as Manchester United, Real Madrid, Barcelona and Bayern Munich have become global brands, international football remains understandably pinned to the idea of triumphant nation states. Thus state orders are reinforced and a popular nationalism is commodified. As part of this, England’s World Cup win in 1966 remains a great source of national pride. The memories of that victory continue to fuel contemporary ideas of what success for the national team looks like.
In 1996, thirty years after the famous win, England hosted the European Championships and St George’s flags, for the first time in my short life, were inescapable. Perhaps it was because England were facing Scotland in the group stages or because England were hosts, but somehow, the Union flag, with all those colonial traumas stitched into its fabric, was replaced by the St George’s flag as the English patriot’s symbol of choice.
However, somewhere amidst the fervour of late 1990s “Cool Britannia” and New Labour, the St George’s flag also became the divisive symbol of the bullish, racist nationalism of the British National Party. Virtually no broadcast or news story about the BNP came without the familiar sight of the red and white flags or supporters adorned in England football shirts. With the far-right party picking up council seats and their candidates contesting parliamentary elections and holding on to their deposits, their rhetoric became mainstream. Thus, the St George’s flag became synonymous with modern English fascism. Moreover, the formation and subsequent rise of the English Defence League (EDL) was closely linked to a subculture of football fans coming together against their imagined enemy of Islam. Its members, often draped in football paraphernalia, specifically England shirts, routinely take part in violent street demonstrations.
It is undeniable that English nationalism, footballing or otherwise, is viscerally bound up with an aggressive racism that demarcates who belongs and who is the unwanted Other. Invariably, there are groups of people, particularly those who have been targets of the BNP or EDL, that are reluctant to embrace the English national team and the associated aggressive patriotism.
Yet despite, or perhaps because of, how vigorously England, the St George’s flag and the support of the national team have been hijacked by the isolating politics of nationalism, people believe the potential for subversion is even greater. Individuals and groups from minority communities in Britain have sought to reclaim the idea of Englishness from the far-right and to broaden the understanding of what it means to identify as English. Progressive ideals have been situated underneath a banner of nationalism that purports to be inclusive, welcoming and multi-cultural. England shirts have been worn proudly, the red and white a signal of support for a new, refreshed demonstration of Englishness.
Movements to reclaim and rebrand words and cultural associations have been favoured as a means of asserting alternative theories and ideas, albeit incrementally. However, it is difficult to assess how useful or sustainable this attempted reclamation can be without the accompaniment of the wholesale reframing of the issue, in this case, the concept of the English nation state. Further, can it be considered realistic for people of colour to reform an identity whose values are intrinsically bound up with whiteness?
Feelings of alienation are often exacerbated when international football tournaments come around and the success of the nation state appears to be so heavily hinged on the success of the national football team. Fifty years on from England’s famous victory, football has irrefutably shaped what English national success looks like and at the same time continues to provide a tool to interrogate what Englishness looks like. The commodification of nationalism through international sporting events thus serves as a useful means to understand how collective identities are fractured, formed and expressed through football.
The relationship between nationalism and football is complex and often fraught with reactionary politics. However, the sport also serves an important role in the formation of collective national identity in post-colonial states. Famously, Algerian players in the French league formed the Front de Libération Nationale (FLN) team in 1958, which exported the desire for Algerian self-determination and liberation from the French throughout North Africa, the Middle East, and parts of Europe and Asia. Similarly, the Iraqi national team’s achievements in the 2007 Asia Cup were set against a backdrop of continued violence, occupation and logistical difficulties. Victory brought together an often fragmented country and a sense of Iraqi national pride was keenly felt throughout.
French success in the 1998 World Cup, which was also held on home turf, was heralded as a defining moment in the previously difficult narrative of integration. With a team led by the inimitable Zinedine Zidane, French-born to Algerian parents, and composed of players whose backgrounds told a story of French colonialism, this was supposed to be a turning point in French race relations. Instead, it has come to signify the shallow understanding of racial, economic and social inequality that continues to plague French society. The World Cup victory of a multi-cultural France allowed the French state and media to temporarily plaster over deep fissures without addressing the root causes of discontent. Owing to the universal nature of the game, the cohesion and success of a football team can be very easily translated into populist notions of unity and togetherness. Likewise, these concepts are often inspired by the collective spirit integral to team sports.
Despite its limitations, international football is situated in a unique position, where individual relationships with the state coalesce into either a collective sense of belonging or unbelonging. To be able to understand how people and communities relate to their national football team is to gain an insight into how they relate both to themselves and where they live.
The burgeoning cultural and financial potency of world football has benefited from an increasingly networked, globalised world. Football has grown in stature as a worldwide game, transcending borders, languages, races and religions. The simplicity and the aesthetics of play have contributed to its ascendant popularity, whilst free-market economics have encouraged both the corporatisation and commodification of the game.
Investment in stadium infrastructure in England was kick-started by the birth of the Premier League and the virtual end of live-televised league football on free domestic channels. Ultimately, the transformation of how the sport was both accessed and managed lay in the increased exposure provided by Sky TV. Principal income streams switched from match-day takings, including tickets and merchandise, to the ever-growing sums from TV deals, while football stadiums became sites for executive boxes, naming rights and touchline-to-touchline advertising and sponsorship.
In line with neoliberal economics, the influx of new money did not remove inequality but instead exacerbated it. Well-established football clubs that already had money were able to tighten their financial grip on professional football, whilst those at the lower end of the spectrum continue to drift further away, facing administration and a future of financial uncertainty. As television money pours into the English Premier League, the maldistribution of wealth is a salient reminder of the society it is situated in.
Alongside the increase in capital flows, labour flows have also predictably broadened, bringing players from all over the world to the top European leagues. This mobility of labour not only diversified talent but also undoubtedly improved the standards of football across the world, especially in Europe, where many of these players sought to forge careers. However, with this movement arose ample opportunity for exploitation, particularly of young Africans, who were trafficked on false promises of jobs that didn’t exist. With such vast quantities of money thrown around at the highest levels of professional football, the potential for injustice is amplified, particularly in the search for social mobility and economic security. For many young men in Africa, from countries still recovering from the economic and social destruction suffered under British colonialism, professional football in Europe is considered a viable route out of poverty. Not dissimilar from the movement of migrant labour into often precarious, low-paid work in bad conditions in Western Europe, football is not exempt from its role in oppressive labour practices, nor is it very far removed from the spectre of colonialism. Flows of labour from the African continent also mirror neo-colonial resource-extraction models, with players viewed as raw materials whose value is added in the European academies. With fortress Europe recklessly weaponising borders as a means of deterring incoming migrants, it is likely that this will only worsen in the coming years, with only the most economically profitable allowed entry.
In England, the very foundation of the Premier League is the well-functioning football club, complete with the consistent exploitation of the lowest-paid workers, from club cleaners to catering staff. Despite the millions pocketed by star footballers, those at the other end of the spectrum, those who make matches possible, often scrape by on minimum wage. The astronomical increases in player wages and commercial revenues have not yet trickled down in any meaningful manner. After a lengthy and well-fought campaign by the Living Wage Foundation, the Premier League committed to ensuring all top-flight clubs will pay workers a living wage. However, this was stipulated to cover only directly employed workers, excluding contracted staff, who are often on precarious, zero-hours contracts. This short-sightedness on the part of Premier League Chief Executive Richard Scudamore demonstrated a real lack of desire to effect lasting change in labour policies, not just in football but as an example to all employers. With burgeoning social and cultural influence both in England and the rest of the world, campaigners should use the football industry as a soapbox from which broader social change can be encouraged. Furthermore, for football to remain the most popular sport in the world, it should be willing to recognise its complicity in upholding systems of oppression, especially those from which it directly profits.
Amongst all the debates about whether football can be subverted to achieve social justice or if there is even a space for radical politics in a multi-billion-pound industry, one key thing stands out. Belonging. Whose game is it? Who should be most invested in the redemption of this beast? Those who are the architects of the spectacle or those who watch on, delighted? Numerous campaigns and movements speak about returning football to its roots, nostalgic for a time when football was the preserve of the working classes. Although it is true that football has been made successful through working-class labour, the sport has always been controlled by the wealthy, capitalist classes. Codified in public schools, football was initially introduced to working-class men as a means of civilising them. It is clear therefore that any reclamation of the game cannot take place at the top level — community-focused, fan-owned clubs are outliers, exceptions. To accept that this global game is too powerful, an uncontrolled monster, is not to give up on it. Instead, it allows us to focus on our communities, our local teams, our supporters’ groups, to direct our resources where we find utility. It offers up the potential for strength and solidarity beyond tribalism, to join up movements of resistance, from Palestine to Algeria to the militarised borders of Europe. It gives people a space in which to create a game in their own image.
Football is not separate from the society that supports it; rather it is irrevocably tied up with the most unjust systems of racism, patriarchy, homophobia, transphobia and capitalism. This, however, is precisely why the potential for resistance is so huge and necessary. The collective spirit in football, whether it’s on the street or in the stadium, is unparalleled. This is what must be harnessed to unsettle and destabilise systems of power, to liberate occupied peoples and to imagine a game that we can be proud of.
In The Wretched of the Earth, Frantz Fanon’s classic account of colonialism, he wrote:
If sports are not incorporated into the life of the nation, in the building of the nation, if we produce national sportsmen instead of conscious individuals, then sports will quickly be ruined by professionalism and commercialism.
If it is too late for football to be used to build, we must be willing to use it to destroy.
1995 may have been the year that Britpop burst through, but 1996 was the year in which it loomed largest and was most overbearing, Oasis in particular, despite not releasing an album that year. 1996 was still a year of Conservative government, but so commanding was Tony Blair’s lead in the polls that it was clear he was Prime Minister-elect. It was possible, in 1996, for him to bask in the unspoiled glow of his triumph in bringing the long Tory nightmare to an end, untarnished by the many compromised decisions he would make almost immediately on taking office in 1997, beginning by accepting a £1 million donation from Formula 1 supremo Bernie Ecclestone, only months later to grant exemption to the motor racing organisation from a general ban on cigarette advertising. All of that was to come; in 1996, he was still practically an honorary Oasis band member.
1996 was also the year of Euro ’96, in which English footballing hopes were bound up with the worlds of both comedy and music. It wasn’t just Baddiel and Skinner’s collaboration with The Lightning Seeds, “Three Lions”, but the sanguine, laddish, retrograde mood engendered by Britpop and Loaded. It wasn’t just football that was coming home, but the general sense that after the dark Seventies and the fragmented Eighties, Britain (led by England, of course) had rediscovered its mojo, the spring in its step, the spirit of Hurst and McCartney, the white heat of a bygone era.
“Sport is a battle” is the metaphor we are now required to live by as football fans. The club must survive and prosper at the cost of everything else. However, this formula changes somewhat in international football, where the “need” for victory is often sutured unquestioningly to the national cause. Curiously, this relationship seems to intensify even as the sense of common purpose between clubs and communities fades. This came to light in a peculiarly candid way during the predictable period of recrimination following England’s equally predictable early exit from the 2014 Brazil World Cup. Even before the players had set off for home, Harry Redknapp, the geezerish and journalist-friendly cockney who had been passed over for the England manager’s job in 2012 because of a pending court case, turned up in the press claiming that a number of English internationals were in the habit of begging their club managers to withdraw them from the national squad for friendly games. The allegation was stark: that some English players regard playing for their country not as an honour, but as an annoyance. England coach Roy Hodgson and his outgoing captain Steven Gerrard cannily took the sting out of Redknapp’s comments by asking him to name names, but the matter did not drop entirely. Former England striker and current light-entertainment go-to Ian Wright wrote in his column in the Sun newspaper that any player found to have shirked international “duty” without good reason should be required to phone the parents of a soldier killed in Afghanistan to explain their decision to drop out.
Twitter duly produced plenty of bleakly funny versions of how the transcript of such a call might read. Palpably, the suggestion was a piece of attention-seeking on the part of Wright, who has never, it seems, got over his early-career rejections or his marginalisation in the 1990s England team by more rounded strikers such as Alan Shearer. However, it spoke to something in England’s present-day ideological make-up, namely a resurgent patriotism of symbols which regards Englishness, whatever that might mean, as somehow under threat. The role the football player takes in this set of beliefs is intriguing. Wright was playing to the idea that the default setting for footballers is a patriotic one, that they feel a sense of pride in national symbols which extends beyond their utilitarian, team-bonding value. By linking this version of patriotic obligation to that of the soldier’s, he tacitly insists on the relative unanimity of nationalistic sentiment amongst the working-class communities that both footballers and the rank-and-file military are drawn from.
While one does find the occasional player, such as Serbia’s Siniša Mihajlović or Croatia’s Zvonimir Boban, for whom patriotism is obviously a very real and visceral thing, it seems plausible and even likely that the average international player uses it as a motivational tool, a way of rationalising responsibility to the footballing cause. There’s a ludicrous misrecognition on the part of the right-wingers doing their Queen-and-country act in the stands who think the men on the pitch automatically share their blood-and-soil mentality: footballers, like most sportspeople, tend to focus themselves out of any formal political identification and even, in some cases, vaguer political affects. Presenting footballers as exclusively patriotically motivated is a form of fantasy about working-class politics, which is to say that it suits certain agendas to treat the “proles” as intrinsically nationalistic, thus implicitly turning anti-nationalistic (typically socialist) politics into an illegitimate bourgeois charade. And here lies the true equivalence between footballers and soldiers. The majority join the military because of the route it offers out of poverty, regardless of the narrative which states that they do so through an unmediated love of the patria. This narrative has, both in the UK and the US, a double function, simultaneously masking socio-economic inequality and lending affective “credibility” to those countries’ ridiculous joint-enterprise neo-imperial wars. The linking of footballers to soldiers, then, has as its ultimate outcome an intensification of the militarisation of British society, the same phenomenon, in fact, that we witness when, on the occasions when England score a goal at an international tournament, the footage cuts away to show soldiers watching the game from whichever theatre of operations they have been sent to in the latest stage of the quixotic War on Terror.
That said, the determinations of an intensified seriousness in the visual language of the football media are not limited to society’s broader militarisation. One thinks of the way that various England internationals from the present and the recent past, such as the aforementioned Gerrard, John Terry and new captain Wayne Rooney, seek to present themselves to the nation. The media consensus around the England team emphasises their surfeit of passion, which supposedly compensates for a shortfall of technical ability and tactical nous, but to actually watch an England game is, very often, to be struck by the cowed performances and expressions of players we are supposed to think of as possessed of leonine bravery and aggression. These are rarely performances full of sound and fury but lacking in signification: in fact, they are bereft of all these attributes.
Gerrard’s career is almost precisely coterminous with the Blair – Brown – Cameron era in British politics. In this period, the affective aspect of politics has intensified in counterpoint to a more generalised “waning of affect”: being seen to “care”, or to share in spuriously “common” desires which have replaced genuine collective purpose, seems to be regarded as a far safer bet electorally than possessing either proven competence or the potential for developing it. At the same time, and this is something which takes us once again to those portentous kit advertisements, the tenor of branding has changed significantly, with the governing maxim no longer “this product is great” but “this product is invested with passion”. We’re passionate about conservatories! We’re passionate about crisps! We’re passionate about dog food! However much this cloys, it is hard to believe in an individual who is not to some extent invested in aspects of these values, for who would want to be perceived as not caring?
To be regarded as wrongly or cynically motivated is something which footballers must deal with constantly: no wonder Gerrard, Terry, Rooney and the like must seek not only to play football well, but to come across as adequately invested, when they and the rest of their profession are subject to constant slights about the essential worthlessness of what they do. For all the substantial material recompense playing the sport earns them, there are few jobs which invite more clamorous accusations of social irrelevance and metaphysical inanity. This is an issue which comes up every time there is a big international competition. Of course, football becomes unpleasantly ubiquitous during the World Cup, with the main sufferer of this ubiquity being not those who don’t enjoy the game but, counterintuitively, those who do. The unpleasantness is a consequence of ubiquity’s tendency towards dilution, which has the consequence of football being turned into “footie”, that abstracted version which lends itself to all kinds of dismal exercises in masculinist and nationalistic identity formation. Watch the footie on telly last night, mate? Well, no, I went to the football last night. If you spend every weekend of the season following a team, it is pretty easy to come to feel alienated during the World Cup or European Championships, when the sport becomes the preserve of geezerish dilettantes and the themed ladvertising kicks in. It’s at this stage that I usually start to feel sympathy for people who dislike football entirely.
That’s until things turn up like the irritating 2014 meme imagining an alternate reality in which archaeology, rather than football, dominates the media and archaeologists are paid thousands of pounds a week. In the weeks leading up to the World Cup in Brazil, this became the definitive plaint on behalf of the non-believers, the document tasked with articulating to football fans just what it means to be on the outside of the festivities. Despite the fact that I can imagine what it must be like, for the precise reason that I largely feel the same, for the non-football fan to be bombarded with “footie” for a whole month every second summer, I couldn’t identify at all with the meme.
Let’s think, first of all, about why it is specifically archaeology which replaces football in this ostensibly harmless thought experiment. Why not, say, “shopping” or “military history”? I have no score to settle with archaeology – who would, honestly? – and appreciate the discipline’s substantial, if not politically unproblematic, contribution to the sum total of human self-understanding. However, the field does have certain connotations which are useful in particular forms of self-presentation. Archaeology carries with it an image of wistful past-gazing, of laudable knowledge-foraging, of being the kid who ignored football in the playground because they were too busy digging away in the corner looking for clay pipes or Neolithic man. For all of its fascinations, it is also a realm in which the humblebragging, self-anointed geek enjoys considerable social capital.
In other words, it’s just the kind of thing which appeals to that online constituency Jacques Lacan anticipated when he said that thing about how les non-dupes errent, how the non-dupes are mistaken. That it is the not-fooled, the people who “see through stuff”, who are the most taken-in ideologically, has always had a considerable degree of appeal, but never more so than in the era of internet atheism, an age in which meme factories like the smug I Fucking Love Science pour out quotable rationalism seemingly by the second. Lovely archaeology coming on as a substitute for aggressive, alpha-male, avaricious, irrational (and, though the piece would never dare mention it, largely working-class) football seems to me the kind of notion that really speaks to the aren’t-bees-more-fascinating-than-Jesus, calling-Valentine’s-Day-Hallmark-Holiday, Stop-Kony crowd.
But, lest we fail Practical Criticism 101, let’s go back to the text itself. The point of the meme, remember, is to induce some sort of artificial parity between football and archaeology – to ask us to imagine if archaeology, presented without additional ideological freight, and football, presented likewise, swapped places in the cultural imagination. However, the writer cannot resist the opportunity to start introducing other elements into the equation almost as soon as it has been established, finding subtle ways of embedding value judgements. Here, it’s imagined that archaeologists acquire the same, “worst possible” behavioural traits that the media at large attributes, with consummate dishonesty, to all footballers. The rationale for doing this is not, as it purports to be, to get us to imagine archaeologists on an alcohol-fuelled rampage in Mayfair, but to remind us that football players are uncouth (working-class) louts who provoke “scandal”.
Then there’s another dig. Having hypothesised an archaeologist who would “act” like a footballer, the writer reminds us what an archaeologist would be doing when they’re not up to no good, namely “searching the past for answers”. That’s to say that their professional activity would still be of considerable value, inviting a comparison to the implied “pointlessness” of football. Such purported pointlessness is a classic canard of a hypocritical utilitarianism which locates value (or “point”) in, say, BBC4 documentaries about archaeology or Scandinavian crime dramas, but not in competitive sport. This, I suspect, is an aspect of that classic piece of political equivocation by which utilitarianism is good for the working-class goose, but not appropriate for the middle-class gander, one which seems to be reserved largely for football.
As the campaign in Ireland to Repeal the 8th reaches its climax, here’s a guide to some movements from the US that are combining feminist tactics, social media and political strategies to challenge abortion stigma.
Abortion Stigma: From a Whisper to a Shout
This is an edited extract from From a Whisper to a Shout: Abortion Activism and Social Media, out now from Repeater.
Abortion. Can we finally stop whispering about it?
Abortion has been around almost as long as pregnancy. Historical records show separate references to abortion techniques as early as 3000 BCE in Egypt, Greece, and China, and major world religions did not forbid it. Even the Catholic Church permitted abortion until the moment of ensoulment, believed to occur at the time of quickening, the first time the pregnant person can feel movement of the fetus. No one before the nineteenth century believed abortion was taking a human life. Traditional Hebrew religious law did not consider a woman pregnant until forty days after conception, allowing a window in which abortion was morally acceptable. Some classical interpretations of the Quran permitted abortion before ensoulment, defined here as the moment when an angel breathes the spirit into the fetus at 120 days, while other interpretations approved abortion conditionally for valid reasons. Still other schools of Islam forbade abortion entirely.
In the US, a secular society, the history of abortion is mostly the history of its regulation. Pregnancy termination was widely practiced but completely unregulated in the eighteenth and early nineteenth centuries. An induced abortion before quickening wasn’t even considered an abortion but a pregnancy that had “slipped away”. The earliest regulations, such as Connecticut’s 1821 law, the first on record, were legislated to protect women from being poisoned by dangerous abortifacient drugs sold by unscrupulous vendors. These early laws were poison-control measures, not abortion restrictions, and they did not challenge the concept of quickening or women’s right to make decisions about their pregnancies. Women’s autonomy over their bodies was preserved by what the law omitted. By the mid-nineteenth century, the emerging medical profession sought to restrict abortion to take control of the procedure — and everything else related to pregnancy — away from women and midwives.
The American Medical Association (AMA) was formed in 1847 and provided physicians with an infrastructure from which to organize an anti-abortion campaign. The campaign was rooted in professional concerns, as the organization moved to medicalize pregnancy and childbirth; but from the vantage point of the twenty-first century, racist and anti-feminist messages are easy to read in some early anti-abortion campaigns. Dr Horatio R. Storer, a prominent advocate of banning abortion, expressed anxiety about Mexicans, Chinese, Blacks, Indians, or Catholics spreading into the American West instead of native-born white Americans. He was opposed, as well, to the concurrent push by women to enter medical school. Storer also took issue with quickening: he argued that it is not a fact or a medical diagnosis “but a sensation”, based on women’s bodily experience, leaving doctors dependent on women’s judgment and self-understanding.
By 1890 abortion was criminalized in every state, although it wasn’t treated exactly the same in each. In some places it was permitted only when necessary to save the life of the woman, while in other areas women could receive an abortion in a doctor’s office or even at home. The latter practice ended in the mid-twentieth century as new methods of controlling abortion were implemented by medical and legal authorities. By the 1950s, somewhere between 200,000 and 1.2 million illegal, unsafe abortions were performed every year, according to Dr David Grimes, a former chief of the Abortion Surveillance Branch at the Centers for Disease Control and Prevention. It was also in the mid-twentieth century that activists began to work for the repeal of abortion laws and restrictions. This grassroots movement pre-dates what is now known as second-wave feminism.
The first activists to advocate for abortion access in terms of women’s rights, Patricia Maginnis, Lana Phelan, and Rowena Gurner, came together in California in the early 1960s — first Maginnis and Gurner in 1961, with Phelan joining in 1965. The group they formed, the Society for Humane Abortion (SHA), worked tirelessly for more than a decade to educate the public about the issues. They delivered presentations about abortion at conferences and in classes, and in 1969 published the satirical guide The Abortion Handbook for Responsible Women. To maintain SHA’s tax-exempt status, they created a second group, Association for the Repeal of Abortion Laws (ARAL), which eventually grew into the National Association for the Repeal of Abortion Laws (NARAL). After the 1973 US Supreme Court decision legalizing abortion, the group’s new leadership shifted its focus to keeping abortion “safe and legal” and changed its name to National Abortion Rights Action League (still NARAL). In 2003, the organization changed its name to NARAL Pro-Choice America, stating that they faced a hostile political climate for abortion and the new name “underscore[s] that our country is pro-choice”, according to then-president Kate Michelman. It’s surely no coincidence that the new name completely omitted the word abortion.
In 1970, several US states decriminalized abortion and allowed termination of pregnancy before twenty weeks: first Hawaii, then New York, Alaska, and Washington. Also in 1970, a young, impoverished Texas woman named Norma McCorvey found herself unintentionally pregnant for the third time and challenged the state law prohibiting abortion; Texas laws were then the most restrictive in the nation. Because both abortion and unmarried pregnancy were considered so shameful, she was referred to in court documents as Jane Roe to protect her privacy. By the time the US Supreme Court ruled in her favor, on January 22, 1973, it was too late for her to abort, but Roe v. Wade made it legal for every pregnant person in the country to end a pregnancy in the first two trimesters, for any reason.
It must be recognized, however, that Roe was decided on the basis of the right to privacy, grounded in the Fourteenth Amendment and judicial precedent, not on the basis of equality. Numerous critics, including current Supreme Court Justice Ruth Bader Ginsburg, have commented on how the decision “is as much about the doctor’s right to recommend to his patient what he thinks his patient needs. It’s always about the woman in consultation with her physician and not the woman standing alone in that case.”
Abortion opponents soon began working to limit abortion access, and in 1976 Congress passed Illinois representative Henry Hyde’s proposed prohibition of Medicaid funds for abortion for indigent women. The constitutionality of the Hyde Amendment has been upheld repeatedly, even its prohibition of medically necessary abortion. Activists working today from a reproductive justice framework note that while the right to privacy is now well established, it is essentially a negative right; that is, a right to be left alone. No positive right to abortion is established, making it functionally accessible only to those who can afford it.
In the forty-four years since the 1973 Supreme Court ruling, individual states have enacted 1,074 abortion restrictions. Of these, 353 (27%) have been enacted just since 2010. Restrictions include mandating medically inaccurate or misleading counselling prior to the procedure; requiring a waiting period after abortion counselling, thus necessitating at least two trips to the facility; mandating a medically unnecessary ultrasound exam before an abortion; banning Medicaid funding of abortion (except in cases of life endangerment, rape or incest); restricting abortion coverage in private health-insurance plans; requiring onerous and unnecessary regulations on abortion facilities; imposing medically inappropriate restrictions on medication abortion; and imposing an unconstitutional ban on abortion before viability or limits on abortion after viability. In this climate, talking about abortion is ever more important — to break abortion stigma and to keep abortion safe, legal, and accessible, abortion must be visible. Instead, abortion has been increasingly stigmatized and shamed since Roe v. Wade was decided.
Organizational and structural stigma includes the physical separation of abortion from other healthcare, including gynecological and obstetric care, and the inconsistent training available in medical schools. At the community level, stigma includes the risk of being labelled promiscuous or careless or worse, another illustration of how highly stigmatized abortion is in the US. Individual-level stigma may be the most variable, as personal experience varies; the discourses of stigma in the community, organizations, government/structures, and media frequently influence individual psyches and experiences. All of these are discursive; that is, created and maintained through language and communication. This discursivity is integral to how they work.
For instance, consider the idea that abortion is stigmatized because it contradicts “ideals of womanhood”. Kumar et al. identify three archetypes that characterize the so-called essential nature of woman in the popular imaginary: a sexuality focused on procreation rather than pleasure; the inevitability of motherhood; and a nurturing instinct. When a woman voluntarily terminates a pregnancy, she is believed to be rejecting all of those archetypes — even though a majority of women who have abortions (59%) are already mothers and the most common reasons cited for seeking abortion are about family responsibilities. Yet voluntarily terminating a pregnancy challenges the moral order of patriarchal culture. In a recent interview with the Washington Post, Cecile Richards, president of Planned Parenthood, named other beliefs about women that abortion decisions and access challenge: “There’s this thought that women are just too scattered, we’re too impulsive, we are too hormonal, we can’t make good decisions for ourselves”.
Most central, according to theories of abortion stigma, is the implicit tension about female sexuality: when, why, how, with whom. Did she use contraception? (That means she intended to have sex!) Is she too young? (She’s not ready to make this decision!) Did she choose the wrong man? (Perhaps someone others think is wrong for her.) And so on. Legislation regulating abortion is the most overt example of language that judges women for deviance from these feminine archetypes and further stigmatizes abortion with the lingering stain of criminality.
This is not to suggest that women are themselves ashamed of sex or sexuality, or even that all women feel stigma about abortion. Those who have terminated pregnancies are a heterogeneous group — in no small part because they are a large group. More than one million women in the US have abortions each year and estimates are that by age forty-five one-third of women will have had an abortion. Demographers also believe that abortion in the US is underreported. But norms of abortion secrecy are well documented: two of three abortion patients anticipate experiencing stigma and a similar number (58%) feel the need to keep their decision from friends and family — an unfortunate trend, as social support is a mitigating factor in experiencing stigma. The pressure to keep abortion secret is strong enough that even when the procedure is covered by private insurance (about 30% of cases) two-thirds of those patients will pay out-of-pocket rather than have it appear on their medical and insurance records. This may be another factor in heightened stigma in the US; anecdotal evidence suggests that abortion is less stigmatized in the UK, where it is covered by the National Health Service.
Norris et al. (2011) suggest that additional reasons for abortion stigma may lie in the discourse of personhood attributed to the fetus. Advances in fetal medicine, such as 3D fetal photography and advanced fetal surgery, have facilitated this, as has the proliferation of anti-abortion legislation.
Morality and medicine, like fact and fiction, are entangled in many of these state laws enacted to restrict access to abortion. The Guttmacher Institute (2016), a research and policy organization focused on sexual and reproductive health and rights, reports that at least half of the fifty states have imposed regulations designed to deter women, such as mandatory counselling, required waiting periods, required parental involvement for minors, mandatory ultrasound imaging, and prohibitions on the use of Medicaid funds. A separate report documents the proliferation of regulations known as TRAP laws (Targeted Regulations of Abortion Providers), which focus specifically on clinics and providers, mandating clinic requirements in such categories as corridor widths, distance from hospitals, admitting privileges for providers, and more. A 2016 US Supreme Court decision (Whole Woman’s Health v. Hellerstedt) struck down the latter type of regulations. The state law in question had already closed nearly two dozen clinics in Texas since 2013. In the three years since the law took effect, an estimated 100,000 Texas women have self-induced abortions. The number could be more than twice that, depending on how it is calculated. The court’s ruling in Whole Woman’s Health v. Hellerstedt concluded “that neither of these provisions offers medical benefits sufficient to justify the burdens upon access that each imposes. Each places a substantial obstacle in the path of women seeking a previability abortion [and] each constitutes an undue burden on abortion access”, as well as being violations of the Constitution.
Many state regulations seem frankly designed to shame and stigmatize, as well as impose additional burdens on the process. Some regulations appear intended to influence women against abortion with bad science. For example, six states require abortion providers to advise women that the procedure can result in severe mental health consequences, despite repeated research showing this is not true. Yet no state requires that pregnant people be informed of the research linking unintended pregnancy and childbearing with adverse mental health outcomes. Kansas, Texas, South Dakota, and Arizona require that the pre-abortion counselling session include a statement that abortion may harm a woman’s ability to conceive in the future. The American Congress of Obstetricians and Gynecologists (ACOG) has affirmed that “one abortion does not affect your ability to get pregnant or the risk of future pregnancy complications”. Five states still mandate that materials or counselling be provided informing women of a potential link between abortion and breast cancer — a link that was debunked almost fifteen years ago when the National Cancer Institute convened a workshop of more than a hundred of the world’s experts on the subject, who concluded that abortion does not increase a woman’s risk of breast cancer. This finding has been affirmed by ACOG and the American Cancer Society, as well as a panel convened by the British government in 2004.
Equally egregious and insulting are the mandatory waiting periods, required in twenty-seven states. These “cooling-off periods” range from eighteen hours to three days after pre-abortion counselling before patients can receive an abortion, and usually exclude weekends and holidays. These regulations are based on the belief that women choose impulsively to terminate an unintended pregnancy, or perhaps have not considered the full impact of their choice; that is, that upon termination, they will no longer be pregnant. State legislators presume that women cannot make good decisions. Research has shown that women are actually less conflicted in abortion decisions than people making decisions about other medical procedures. A recent study of Utah’s seventy-two-hour waiting period found that most clients (91%) were just as certain after the three-day wait; 17% reported that waiting made them more certain. Previously published studies by the same researchers have shown that waiting periods increase the cost of the abortion and cause logistical hardships for patients. Civil rights lawyer Danielle Lang (2016) writes, “The argument goes as follows: Women, whose natural role is mother, would never in their right mind seek to terminate a pregnancy. Their choice therefore must be a result of bad influence, coercion or undue pressure.” Fifteen states even require pre-abortion counselling to inform the woman that she cannot be coerced into obtaining an abortion. It’s difficult to imagine other medical procedures being treated in this fashion, such as a three-day waiting period to have an aching tooth pulled or gaping wound stitched up, with a warning that no one can compel you to have those stitches. In his recently published memoir, abortion provider Dr Willie Parker extends the comparison to cancer, writing:
A woman who decides not to pursue treatment and to shorten her life in order to be clear-minded for as long as possible is considered “brave”. A woman who decides to take radical action, to undergo surgeries and try every experimental drug in the pipeline is a “warrior”. Even patients with lung cancer are not blamed and judged for smoking in the same way that women who seek abortions are blamed for having sex.
These regulations may individually seem minor, but their impact and reach are significant. Gold and Nash (2017) point out that seventeen states have at least five types of these restrictions that flout scientific evidence, and 30% of US women of reproductive age live in those states. Their review examines ten categories of restrictions; 53% of US women live in a state that has at least two of these restrictions.
In addition, abortion stigma has been effectively weaponized by anti-abortion activists. Protestors have used shaming and insulting language — such as calling patients “murderer” and “slut” as they enter clinics — and created overt barriers at clinics across the country. Despite the FACE Act (Freedom of Access to Clinic Entrances) and the presence of volunteer escorts, anti-abortion zealots regularly show up at clinics to engage in prayer vigils and so-called sidewalk counselling, as well as to mob the cars and taxis of clinic patients and their companions and to photograph patients and providers without permission. “Sidewalk counselling” often consists of repeated interrogation about whether the patient has Jesus in her life, or slut shaming. A 2017 mini-documentary about clinic protestors produced by Rewire News shows anti-abortion activists acknowledging that they write down license plate numbers of clinic staff, which prompts anxiety and fear among clinic employees.
Abortion stigma reaches beyond those who have the procedure: abortion providers and clinic workers also experience it, albeit of a somewhat different nature. This has to do in part with structural forces of current medical practice. For instance, most abortions occur in women’s health clinics, which increases their isolation from the rest of healthcare. This was initially a strategic choice by proponents to maintain sensitive and women-controlled care of patients. Today it has made providers easy targets – literally and metaphorically – for abortion opponents. They are subject to harassment and violence and their clinics are singled out for regulations that other kinds of outpatient medical facilities are not. The scarcity of abortion providers today results in some doctors becoming de facto abortion specialists (which is not inherently troublesome, as many physicians and surgeons develop expertise in particular procedures). The stigma is greater if they perform second-trimester abortions. Although providers experience these sources of abortion stigma on a continual basis, they can counter them with belief in the value and necessity of their work. On the other hand, unwillingness to disclose their work in public or social settings can increase the perception that abortion work is unusual or deviant, and reinforce the stigma. Family and friends of the patient may also experience abortion stigma. Male partners have reported ambivalence, guilt, sadness, anxiety, powerlessness — the same range of emotions felt by women who seek abortions.
Kumar has cautioned researchers to avoid “conceptual inflation” in defining abortion stigma, noting the importance of separating stigma analytically from prejudice and discrimination as it is both a cause and a consequence of inequality. To understand the power dynamics, the concept must be narrowly defined. When abortion is highly politicized, as in the US, “greater conceptual clarity is needed on the power differentials that create and maintain abortion stigma such as those related to race, age, and class”.
The precise nature of abortion stigma comes into clearer focus upon examination of women’s stories of abortion experience. Two documentary films released in 2005 featured personal stories of abortion: I Had an Abortion by Jennifer Baumgardner and The Abortion Diaries by Penny Lane. Both films had the explicit purpose of making abortion narratives public, widely shared, and without shame. Both films feature individuals stating that the shame of unintended pregnancy is greater than the shame of abortion. Joh in The Abortion Diaries tells her interviewer it was upsetting to her “that somebody knew that I’d messed up. ’Cause that’s what it was considered, that if you got pregnant, you messed up.” Baumgardner reports that a male friend, who has been part of more than one abortion (not uncommon for an urban, single, heterosexual dude approaching forty), says it’s not the abortion that’s shameful, it’s the pregnancy. A 2016 HBO film, Abortion: Stories Women Tell by Tracy Droz Tragos, shows little change in the intervening decade. While the new film differs from those from 2005 in that it includes stories of women who elected to continue their pregnancies and stories of abortion protestors as well as abortion stories, the theme of abortion stigma is still strong. In a Newsweek interview, Tragos references all the women she spoke with who could not tell their stories on camera, fearing repercussions at work or at home, saying they found “some solace in even talking about [it] and being heard”.
Smith et al.’s recent interview study of reproductive decisions among young women in the American South found stigma attached to both pregnancy and abortion. Within their Alabama communities, “young women faced with an unintended pregnancy were often deemed ‘fast’ and labelled ‘heathens’ or ‘whores.’ Participants described how women can be the target of accusations and gossip regardless of their pregnancy decision”. A few participants reported losing friends when others, especially parents of their friends, learned of their pregnancies. Interview studies of adult women have also found women reporting disapproving attitudes from friends and family members, as well as loss of friends and boyfriends over abortion decisions. Reproductive justice activist Loretta J. Ross (2016) suggests that respectability politics today leads to more abortion shaming in our era of accessible birth control and legal abortion because now women are seen as both moral and intellectual failures for not using contraception, whereas a previous generation of young women with few options might be chastised only for their risky abortions.
Shame and secrecy surround nearly every reproductive health event; menstruation and menstrual disorders, miscarriage, and breastfeeding all follow widespread cultural norms of concealment and secrecy. For instance, Seear (2009) reported average diagnostic delays of eight years in the UK and eleven years in the US for endometriosis, due largely to stigma that normalizes menstrual pain and enforces silence. Miscarriage is also difficult to discuss and widely misunderstood, yet common: it occurs in about one-fourth of pregnancies in the US but 55% of Americans believe that it is rare. A national survey about miscarriage found that while most respondents knew that genetic malformation is a possible cause of miscarriage, substantial numbers agreed with the following incorrect causes: lifting heavy objects (64%), having had a sexually transmitted disease in the past (41%), past use of an IUD (28%), past use of oral contraception (22%), or getting into an argument (21%). Among survey respondents who had experienced miscarriages, 47% reported feeling guilty, 41% reported feeling that they did something wrong, 41% reported feeling alone, and 28% reported feeling ashamed.
This reproductive secrecy imposes a norm of silence around any reproductive experience that does not meet patriarchal ideals of femininity, such as adhering to the roles of good wife and good mother; in other words, there is no acceptable social discussion of miscarriages, abortions, or the decision to be childless by choice. The invisibility of these common experiences belies how normal they are. For example, many women in Gelman et al.’s study did not learn how common abortion is until they had one. They told researchers how surprised they were at the packed waiting rooms, expecting that “it would be me and maybe like one or two other people”, as one informant said.
Ellison suggests that this “cultural censorship of an experience shared by so many women reinforces an inflexible tension between cultural ideals and women’s lived realities”. This can be seen easily among Smith et al.’s young informants, who report that unintended pregnancy is both shamed and a common occurrence in their communities. These young women learn to regard abortion as even more shameful, reporting that they’ve been told abortion is irresponsible, immoral, selfish, and “you’ll get sick”, making abortion invisible in these communities. The women told the researchers stories of friends and family members who had obtained secret abortions and of miscarriages that they suspected were “hidden” abortions.
While the rate of unintended pregnancy has declined considerably over the last decade, it is still a common occurrence everywhere in the US, at 45% of all pregnancies. Twenty-one percent of all pregnancies end in abortion, leading to an annual total of more than one million pregnancy terminations. Both abortion and unintended pregnancy show a downward trend, with the most likely explanation “a change in the frequency and type of contraceptive use over time”, especially long-acting hormonal methods such as IUDs. At current rates, one of every four American women is likely to have an abortion by age thirty, and one in three by age forty-five.
But abortion stigma and the shaming of women who seek abortions, their supporters, and abortion providers show little decline. There is, however, a bold new movement of feminist activists challenging and resisting abortion stigma. These groups rely heavily on the internet and social media platforms to share stories and inform others about pending legislation and judicial decisions, as well as to promote participation in traditional activism, such as protests, meetings, and rallies. One of their shared goals is to normalize abortion by talking about it. These groups include #ShoutYourAbortion, Lady Parts Justice, We Testify, The Abortion Diary, 1 in 3 Campaign, Sea Change, Shift (affiliated with Whole Woman’s Health Alliance), My Abortion My Life (a project of PreTerm clinics of Ohio), Rewire, URGE (Unite for Reproductive and Gender Equality, which uses the hashtag #AbortionPositive), Abortion Story Project, Project Voice, and probably many more. In the interest of time and space, I will focus here on #ShoutYourAbortion, Lady Parts Justice, We Testify, The Abortion Diary, and 1 in 3 Campaign. These five were selected partly for convenience, but also because they are among the most visible of these groups and they represent use of diverse social media channels: Twitter, YouTube, Facebook, websites, and podcasts.
#ShoutYourAbortion (SYA) is, according to their website, a “decentralized network of individuals talking about abortion on their own terms and creating space for others to do the same”. The SYA slogan says, “Abortion is normal. Our stories are ours to tell. This is not a debate.” The hashtag emerged in 2015, from a Twitter conversation between Amelia Bonow and Lindy West, in response to a congressional vote to defund Planned Parenthood. Bonow wrote on Facebook about how grateful she was to have had abortion accessible at her local Planned Parenthood affiliate when she needed one the previous year, and with her permission, West shared her story on Twitter, with the hashtag #ShoutYourAbortion.
Lady Parts Justice (LPJ) was founded by comic Lizz Winstead, filmmaker Arun Chaudhary, and marketing CEO Scott Goodstein. LPJ and their sister organization, Lady Parts Justice League (LPJL), produce satirical, informative videos about anti-reproductive rights legislation, including parodies of popular songs and television shows re-worked with messages related to current abortion legislation. LPJL provides USO-style support to independent clinics all over the country, performing comedy shows that serve as entertainment for clinic employees and as fundraisers for clinic needs, often in regions that seldom see popular comedy performers tour. LPJL has also helped escort patients, planted a protective prickly bush in a fencing gap to block protesters, painted fences, counter-protested anti-choice activists, and thrown dance parties in parking lots for escorts and staff.
We Testify, a program initiated by the National Network of Abortion Funds, “is dedicated to increasing the spectrum of abortion storytellers in the public sphere and shifting the way the media understands the context and complexity of accessing abortion care”. It is a leadership program designed to support people of color in building their power and telling their stories. Renee Bracey Sherman, We Testify Program Manager, is the author of a resource guide for public abortion storytelling published by Sea Change.
The Abortion Diary podcast was started by Melissa Madera in the summer of 2013, after Madera had kept her abortion experience secret for thirteen years and seen the impact that finally telling her story had on her family, her friends, and herself. She chose the platform of podcasting to create a safe space for sharing stories “because, more than anything else, I wanted to listen”. By the beginning of 2017, she had collected more than two hundred abortion stories.
The 1 in 3 Campaign is the oldest abortion story project on the internet; it features the largest set of abortion stories published online and the only bilingual collection, with stories in both English and Spanish. More than a storytelling platform, 1 in 3 is also an organizing hub for college students working on reproductive justice issues on their campuses. It was created in 2011 by Advocates for Youth, an organization dedicated to supporting sexual and reproductive rights for young people.
Each has unique strengths and emphases. For instance, #ShoutYourAbortion centers visibility and politics of recognition, while Lady Parts Justice uses feminist humor as a tool of political analysis, and We Testify and the 1 in 3 Campaign use story-sharing to politicize themselves and others. All are working to normalize abortion, as they politicize supporters through narrative, online outreach, activism, and consciousness raising. For imagining a world without abortion stigma does not require that we celebrate abortion, only that we acknowledge it openly and without shame. A world without abortion stigma is also a world in which women can experience a vibrant sexuality, including access to comprehensive birth control and reproductive health services without shame, and one in which talking about those experiences is neither forbidden nor required.
An edited extract from Authentocrats by Joe Kennedy, out on 21st June from Repeater
The Nine Yorkshiremen of the Apocalypse
On the Friday evening before the June 2017 United Kingdom General Election, a special edition of the BBC’s political debate show Question Time was broadcast in which Theresa May, the leader of the Conservative Party, and Jeremy Corbyn, whom we’ve met already, were invited to York to answer questions from a curated studio audience. Because May had refused a head-to-head debate, the would-be prime ministers spoke separately, with Corbyn going second. May had already endured a vexing time, being forcefully challenged over Conservative cuts, particularly to the NHS, and a related public-sector pay-freeze in a way she clearly found difficult to parry. As the audience had been handpicked for balance’s sake, it was clear that Corbyn would have to endure a similar temperature of scrutiny, but the themes of his interrogation were pointedly different.
By lunchtime the following day, the image of those who took Corbyn to task had imposed itself on the consciousness of not only many on the British left, but on the public at large. Nine audience members’ faces had been screen-grabbed and corralled in a composite image, swiftly circulated on Twitter, Facebook and beyond, which was designed to demonstrate the uniformity of the debate’s conservative
Next week we’ll be publishing Ryan Alexander Diduck’s Mad Skills: MIDI and Music Technology in the Twentieth Century, a cultural history of MIDI and its impact on the ways music is made and consumed.
From today you can read an extract from the fourth chapter, “Synthesizer, Sampler, Mixmaster, Spy”, in The Wire.
In the beginning, there was the word. The word was a voice. The voice had a speaker. And the speaker knew the magic words. Fast-forward thousands of years to a time when humans behave like robots and robots behave like humans. Nobody knows the magic words anymore. Computers don’t distinguish between messages of love or hatred. Microchips make music and war with indifferent equivalence. All word, every voice, is now code. It has been for years.
You can read the rest of the extract here.
To commemorate the passing of Mark E Smith, below is Mark Fisher’s analysis of The Fall’s Grotesque (After the Gramme), from The Weird and the Eerie (2016).
“Body a tentacle mess”: The Grotesque and The Weird: The Fall
The word grotesque derives from a type of Roman ornamental design first discovered in the fifteenth century, during the excavation of Titus’s baths. Named after the ‘grottoes’ in which they were found, the new forms consisted of human and animal shapes intermingled with foliage, flowers, and fruits in fantastic designs which bore no relationship to the logical categories of classical art. For a contemporary account of these forms we can turn to the Latin writer Vitruvius. Vitruvius was an official charged with the rebuilding of Rome under Augustus, to whom his treatise On Architecture is addressed. Not surprisingly, it bears down hard on the “improper taste” for the grotesque: “Such things neither are, nor can be, nor have been,” says the author in his description of the mixed human, animal, and vegetable forms: “For how can a reed actually sustain a roof, or a candelabrum the ornament of a gable? Or a soft and slender stalk, a seated statue? Or how can flowers and half-statues rise alternately from roots and stalks? Yet when people view these falsehoods, they approve rather than condemn, failing to consider whether any of them can really occur or not.”
— Patrick Parrinder, James Joyce
If Wells’ story is an example of a melancholic weird, then we can appreciate another dimension of the weird by thinking about the relationship between the weird and the grotesque. Like the weird, the grotesque evokes something which is out of place. The response to the apparition of a grotesque object will involve laughter as much as revulsion, and, in his study of the grotesque, Philip Thomson argued that the grotesque was often characterised by the co-presence of the laughable and that which is not compatible with the laughable. This capacity to excite laughter means that the grotesque is perhaps best understood as a particular form of the weird. It is difficult to conceive of a grotesque object that cannot also be apprehended as weird, but there are weird phenomena which do not induce laughter — Lovecraft’s stories, for example, the only humour in which is accidental.
The confluence of the weird and the grotesque is nowhere better exemplified than in the work of the post-punk group The Fall. The Fall’s work — particularly in the period from 1980 to 1982 — is steeped in references to the grotesque and the weird. The group’s methodology at this time is vividly captured in the cover image for the 1980 single, “City Hobgoblins”, in which we see an urban scene invaded by “emigres from old green glades”; a leering, malevolent cobold looms over a dilapidated tenement. But rather than being smoothly integrated into the photographed scene, the crudely rendered hobgoblin has been etched onto the background. This is a war of worlds, an ontological struggle, a struggle over the means of representation. From the point of view of the official bourgeois culture and its categories, a group like The Fall — working class and experimental, popular and modernist — could not and should not exist, and The Fall are remarkable for the way in which they draw out a cultural politics of the weird and the grotesque. The Fall produced what could be called a popular modernist weird, wherein the weird shapes the form as well as the content of the work. The weird tale enters into becoming with the weirdness of modernism — its unfamiliarity, its combination of elements previously held to be incommensurable, its compression, its challenges to standard models of legibility — and with all the difficulties and compulsions of post-punk sound.
Much of this comes together, albeit in an oblique and enigmatic way, on The Fall’s 1980 album Grotesque (After the Gramme). Otherwise incomprehensible references to “huckleberry masks”, “a man with butterflies on his face”, “ostrich headdress” and “light blue plant-heads” begin to make sense when you recognise that, in Parrinder’s description quoted above, the grotesque originally referred to “human and animal shapes intermingled with foliage, flowers, and fruits in fantastic designs which bore no relationship to the logical categories of classical art”.
The songs on Grotesque are tales, but tales half-told. The words are fragmentary, as if they have come to us via an unreliable transmission that keeps cutting out. Viewpoints are garbled; ontological distinctions between author, text and character are confused and fractured. It is impossible to definitively sort out the narrator’s words from direct speech. The tracks are palimpsests, badly recorded in a deliberate refusal of the “coffee table” aesthetic that the group’s leader Mark E. Smith derides on the cryptic sleeve notes. The process of recording is not airbrushed out but foregrounded, surface hiss and illegible cassette noise brandished like improvised stitching on some Hammer Frankenstein monster. The track “Impression of J Temperance” is typical, a story in the Lovecraft style in which a dog breeder’s “hideous replica” (“brown sockets… purple eyes… fed with rubbish from disposal barges…”) stalks Manchester. This is a weird tale, but one subjected to modernist techniques of compression and collage. The result is so elliptical that it is as if the text — part-obliterated by silt, mildew and algae — has been fished out of the Manchester ship canal which Steve Hanley’s bass sounds like it is dredging.
There is certainly laughter here, a renegade form of parody and mockery that one hesitates to label satire, especially given the pallid and toothless form that satire has assumed in British culture in recent times. With The Fall, however, it is as if satire is returned to its origins in the grotesque. The Fall’s laughter does not issue from the commonsensical mainstream but from a psychotic outside. This is satire in the oneiric mode of Gillray, in which invective and lampoonery become delirial, a (psycho)tropological spewing of associations and animosities, the true object of which is not any failing of probity but the delusion that human dignity is possible. It is not surprising to find Smith alluding to Jarry’s Ubu Roi in a barely audible line in “City Hobgoblins”: “Ubu le Roi is a home hobgoblin.” For Jarry, as for Smith, the incoherence and incompleteness of the obscene and the absurd were to be opposed to the false symmetries of good sense. We could go so far as to say that it is the human condition to be grotesque, since the human animal is the one that does not fit in, the freak of nature who has no place in the natural order and is capable of re-combining nature’s products into hideous new forms.
The sound on Grotesque is a seemingly impossible combination of the shambolic and the disciplined, the cerebral-literary and the idiotic-physical. The album is structured around the opposition between the quotidian and the weird-grotesque. It seems as if the whole record has been constructed as a response to a hypothetical conjecture. What if rock and roll had emerged from the industrial heartlands of England rather than the Mississippi Delta? The rockabilly on “Container Drivers” or “Fiery Jack” is slowed by meat pies and gravy, its dreams of escape fatally poisoned by pints of bitter and cups of greasy-spoon tea. It is rock and roll as working men’s club cabaret, performed by a failed Gene Vincent imitator in Prestwich. The what if? speculations fail. Rock and roll needed the endless open highways; it could never have begun in England’s snarled-up ring roads and claustrophobic conurbations.
It is on the track “The N.W.R.A.” (“The North Will Rise Again”) that the conflict between the claustrophobic mundaneness of England and the grotesque-weird is most explicitly played out. All of the album’s themes coalesce in this track, a tale of cultural political intrigue that plays like some improbable mulching of T.S. Eliot, Wyndham Lewis, H.G. Wells, Philip K. Dick, Lovecraft and le Carré. It is the story of Roman Totale, a psychic and former cabaret performer whose body is covered in tentacles. It is often said that Roman Totale is one of Smith’s “alter-egos”; in fact, Smith is in the same relationship to Totale as Lovecraft was to someone like Randolph Carter. Totale is a character rather than a persona. Needless to say, he in no way resembles a “well-rounded” character so much as a carrier of mythos, an inter-textual linkage between Pulp fragments:
So R. Totale dwells underground / Away from sickly grind / With ostrich head-dress / Face a mess, covered in feathers / Orange-red with blue-black lines / That draped down to his chest / Body a tentacle mess / And light blue plant-heads.
The form of “The N.W.R.A.” is as alien to organic wholeness as is Totale’s abominable tentacular body. It is a grotesque concoction, a collage of pieces that do not belong together. The model is the novella rather than the tale, and the story is told episodically, from multiple points of view, using a heteroglossic riot of styles and tones: comic, journalistic, satirical, novelistic; it is like Lovecraft’s “Call of Cthulhu” re-written by the Joyce of Ulysses and compressed into fifteen minutes. From what we can glean, Totale is at the centre of a plot — infiltrated and betrayed from the start — which aims at restoring the North to glory, perhaps to its Victorian moment of economic and industrial supremacy; perhaps to some more ancient pre-eminence, perhaps to a greatness that will eclipse anything that has come before. More than a matter of regional railing against the capital, in Smith’s vision the North comes to stand for everything suppressed by urbane good taste: the esoteric, the anomalous, the vulgar sublime, that is to say, the weird and the grotesque itself. Totale, festooned in the incongruous Grotesque costume of “ostrich head-dress”, “feathers/orange-red with blue-black lines” and “light blue plant-heads”, is the would-be Faery King of this weird revolt who ends up its maimed Fisher King, abandoned like a pulp modernist Miss Havisham amongst the relics of a carnival that will never happen, a drooling totem of a defeated tilt at social realism, the visionary leader reduced, as the psychotropics fade and the fervour cools, to being a washed-up cabaret artiste once again.
Smith returns to the weird tale form on The Fall’s 1982 album, Hex Enduction Hour, another record which is saturated with references to the weird. In the track “Jawbone and the Air Rifle”, a poacher accidentally causes damage to a tomb, unearthing a jawbone which “carries the germ of a curse / Of the Broken Brothers Pentacle Church”. The song is a tissue of allusions to texts such as M.R. James’ tales “A Warning to the Curious” and “Oh, Whistle, and I’ll Come to You, My Lad”, to Lovecraft’s “The Shadow over Innsmouth”, to Hammer Horror, and to The Wicker Man — culminating in a psychedelic/psychotic breakdown, complete with a torch-wielding mob of villagers:
He sees jawbones on the street / advertisements become carnivores / and roadworkers turn into jawbones / and he has visions of islands, heavily covered in slime. / The villagers dance round pre-fabs / and laugh through twisted mouths.
“Jawbone and the Air Rifle” resembles nothing so much as a routine by the British comedy group the League of Gentlemen. The League of Gentlemen’s febrile carnival — with its multiple references to weird tales, and its frequent conjunctions of the laughable with that which is not laughable — is a much more worthy successor to The Fall than most of the musical groups who have attempted to reckon with their influence.
The track “Iceland”, meanwhile, recorded in a lava-lined studio in Reykjavik, is an encounter with the fading myths of North European culture in the frozen territory from which they originated. Here, the grotesque laughter is gone. The song, hypnotic and undulating, meditative and mournful, recalls the bone-white steppes of Nico’s The Marble Index in its arctic atmospherics. A keening wind (on a cassette recording made by Smith) whips through the track as Smith invites us to “cast the runes against your own soul”, another M.R. James reference, this time to his story, “Casting the Runes”. “Iceland” is a Twilight of the Idols for the retreating hobgoblins, cobolds and trolls of Europe’s receding weird culture, a lament for the monstrosities and myths whose dying breaths it captures on tape:
Witness the last of the god men
A Memorex for the Krakens
You can now read an extract from Aaron J. Leonard and Conor A. Gallagher’s A Threat of the First Magnitude – FBI Counterintelligence and Infiltration from the Communist Party to the Revolutionary Union 1962-1974 on Truthout!
In their new book, A Threat of the First Magnitude, Aaron J. Leonard and Conor A. Gallagher explore the ways in which the FBI was able to place informants in the top layers of organizations deemed threats to US internal security. While these efforts — in the cases of the Communist Party USA and the Maoist Revolutionary Union — were successful, another initiative, an attempt to “flip” prominent Black activist James Forman, was not. The following excerpt from Chapter 7, “The Never-Ending Campaign Against James Forman”, explains.
Read more at http://www.truth-out.org/news/item/43191-the-fbi-s-failed-plan-to-make-black-activist-james-forman-an-informant
In November we will be publishing a collection of Mark’s work – K-punk: The Collected Writings of Mark Fisher, edited by Darren Ambrose and with a foreword by Simon Reynolds.
This is the second of two blogs, each containing two essays included in the forthcoming collection.
We will all remember Mark Fisher.
This Movie Doesn’t Move Me
(13th March 2005)
As I nervously anticipate the new Doctor Who (although after McCoy, after McGann, what more can there be to fear?), it is worth thinking again about the appeal of the series, and also, more generally, about the unique importance of what I will call “uncanny fiction”.
A piece by Rachel Cooke in the Observer two weeks ago brought these questions into sharp relief. Cooke’s article was more than an account of a television series; it was a story about the way broadcasting, family, and the uncanny were webbed together through Doctor Who. Cooke writes powerfully about how her family’s watching of the programme was literally ritualized: she had to be on the sofa, hair washed, before the continuity announcer even said the words, “And now…” She understands that, at its best, Dr Who’s appeal consisted in the charge of the uncanny – the strangely familiar, the familiar estranged: cybermen on the steps of St Paul’s, yeti at Goodge Street (a place whose name will forever be associated with the Troughton adventure, “The Web of Fear”, for Scanshifts, who saw it whilst living in New Zealand).
Inevitably, however, she ends the piece on a melancholy note. Cooke has been to a screening of the first episode of the new series. She enjoys its expensive production values, its “sinister moments”, its use of the Millennium Wheel. “But it is not – how shall I put this? – Doctor Who.” Faced with an “overwhelming sense of loss”, she turns to a DVD of the Baker story Robots of Death for a taste of the “real” stuff, the authentic experience that the new series cannot provide. But this proves, if anything, to be even more of a disappointment. “How slow the whole thing seems, and how silly the robots look in their Camilla Parker-Bowles-style green quilted jackets… Good grief.”
Let’s leave aside, for a moment, all the post-post-structuralist questions about the ontological status of the text “itself”, and consider the glum anecdote with which the article concludes:
Before Christmas, when it became clear that my father’s cancer was in its final stages, my brother went out and bought a DVD for us all to watch together. Dad was too ill, and the box went unopened. At the time, I cried about this: yet another injustice. Now I know better. Some things in life can’t ever be retrieved – an enjoyment of green robots in sequins and pedal pushers being one of them.
This narrative of disillusionment belongs to a genre that has become familiar: the postmodern parable. To look at the old Doctor Who is not only to fail to recover a lost moment; it is to discover, with a deflating quotidian horror, that this moment never existed in the first place. An experience of awe and wonder dissolves into a pile of dressing up clothes and cheap special effects. The postmodernist is then left with two options: disavowal of the enthusiasm, i.e. what is called “growing up”, or else keeping faith with it, i.e. what is called “not growing up”. Two fates, therefore, await the no longer media-mesmerised child: depressive realism or geek fanaticism.
The intensity with which Cooke invested in Doctor Who is typical of so many of us who grew up in the sixties and seventies. I, slightly younger than her, remember a time when those twenty-five minutes were indeed the most sacralised of the week. Scanshifts, slightly older than me, remembers a period when he didn’t have a functioning television at home, so he would watch the new episode furtively at a department store in Christchurch, silently at first, until, delighted, he found the means of increasing the volume.
The most obvious explanation for such fervour – childhood enthusiasm and naïveté – can also be supplemented by thinking of the specific technological and cultural conditions that obtained then. Freud’s analysis of the unheimlich, the “unhomely”, is very well known, but it is worth linking his account of the uncanniness of the domestic to television. Television was itself both familiar and alien, and a series which was about the alien in the familiar was bound to have a particularly easy route to the child’s unconscious. In a time of cultural rationing, of modernist broadcasting, a time, that is, in which there were no endless reruns, no VCRs, the programmes had a precious evanescence. They were translated into memory and dream at the very moment they were being seen for the first time. This is quite different from the instant – and increasingly pre-emptive – monumentalization of postmodern media productions through “making of” documentaries and interviews. So many of these productions enjoy the odd fate of being stillborn into perfect archivization, forgotten by the culture while immaculately memorialised by the technology.
But were the conditions for Dr Who’s colonizing presence in the unconscious of a generation merely scarcity and the “innocence” of a “less sophisticated” time? Does its magic, as Cooke implies, crumble like a vampire seducer in bright sunlight when exposed to the unbeguiled, unforgiving eyes of the adult?
According to Freud’s famous arguments in Totem and Taboo and The Uncanny, we moderns recapitulate in our individual psychological development the “progress” from narcissistic animism to the reality principle undergone by the species as a whole. Children, like “savages”, remain at the level of narcissistic auto-eroticism, subject to the animistic delusion that their thoughts are “omnipotent”; that what they think can directly affect the world.
But is it the case that children ever “really believed” in Doctor Who? Žižek has pointed out that when people from “primitive” societies are asked about their myths, their response is actually indirect. They say “some people believe...” Belief is always the belief of the other. In any case, what adults and moderns have lost is not the capacity to uncritically believe, but the art of using the series as triggers for producing inhabitable fictional playzones.
The model for such practices is the Perky Pat layouts in Philip K Dick’s The Three Stigmata of Palmer Eldritch. Homesick offworld colonists are able to project themselves into Ken and Barbie-like dolls who inhabit a mock-up of the earthly environment. But in order to occupy this set they need a drug. In effect, all the drug does is restore in the adult what comes easily to a child: the ability not to believe, but to act in spite of the lack of belief.
In a sense, though, to say this is already going too far. It implies that adults really have given up a narcissistic fantasy and adjusted to the harsh banality of the disenchanted-empirical. In fact, all they have done is substituted one fantasy for another. The point is that to be an adult in consumer capitalism IS to occupy the Perky Pat world of drably bright soap opera domesticity. What is eliminated in the mediocre melodrama we are invited to call adult reality is not fantasy, but the uncanny – the sense that all is not as it seems, that the kitchen-sink everyday is a front for the machinations of parasites and alien forces which either possess, control or have designs upon us. In other words, the suppressed wisdom of uncanny fiction is that it is THIS world, the world of liberal-capitalist commonsense, that is a stage set with wobbly walls. As Scanshifts and I hope to demonstrate in our upcoming audiomentary london under london on Resonance FM, the Real of the London Underground is better described by pulp and modernism (which in any case have a suitably uncanny complicity) than by postmodern drearealism. Everyone knows that, once the wafer-thin veneer of “persons” is stripped away, the population on the Tube are zombies under the control of sinister extra-terrestrial corporations.
The rise of Fantasy as a genre over the last twenty-five years can be directly correlated with the collapse of any effective alternative reality structure outside capitalism in the same period. Watching something like Star Wars, you immediately think two things. Its fictional world is BOTH impossibly remote, too far-distant to care about, AND too much like this world, too similar to our own to be fascinated by. If the uncanny is about an irreducible anomalousness in anything that comes to count as the familiar, then Fantasy is about the production of a seamless world in which all the gaps have been monofilled. It is no accident that the rise of Fantasy has gone alongside the development of digital FX. The curious hollowness and depthlessness of CGI arises not from any failure of fidelity, but, quite the opposite, from its photoshopping out of the Discrepant as such.
The Fantasy structure of Family, Nation and Heroism thus functions, not in any sense as a representation, false or otherwise, but as a model to live up to. The inevitable failure of our own lives to match up to the digital Ideal is one of the motors of capitalism’s worker-consumer passivity, the docile pursuit of what will always be elusive, a world free of fissures and discontinuities. And you only have to read one of Mark Steyn’s preppy phallic fables (which need to be ranked alongside the mummy’s-boy stories of someone like Robert E Howard) to see how Fantasy’s pathetically imbecilic Manichean oppositions between Good and Evil, Us and (a foreign, contagious) Them are effective on the largest possible geopolitical stage.
(16th April 2005)
Well, I’m still enough of a neophyte to be thrilled by a mention in Village Voice. I suppose it is ironic that Geeta describes k-punk as “cultural studies”, given my notorious antipathy to cult studs. On the other hand, though, k-punk is cultural studies as I’d always thought it should be practised (much of my hostility to cult studs stems from a disappointment when faced with the depressing, guilt-mongering reality of cultural studies in the academy).
Anyway, here is the full text that I sent to Geeta:
1. Why I started the blog? Because it seemed like a space – the only space – in which to maintain a kind of discourse that had started in the music press and the art schools, but which had all but died out, with what I think are appalling cultural and political consequences. My interest in theory was almost entirely inspired by writers like Ian Penman and Simon, so there has always been an intense connection between theory and pop/ film for me. No sob stories, but for someone from my background it’s difficult to see where else that interest would have come from.
2. Because of that, my relation to the academy has always been uh difficult. The way in which I understood theory – primarily through popular culture – is generally detested in universities. Most dealings with the academy have been literally – clinically – depressing.
3. The Ccru as an entity was developed in hostile conditions as a kind of conduit for continuing trade between popular culture and theory. The whole pulp theory/ theory-fiction thing was/ is a way of doing theory through, not “on”, pop cultural forms. Nick Land was the key figure here, in that it was he who was able to hold, for a while, a position “within” a university philosophy department whilst dedicatedly opening up connections to the outside. Kodwo Eshun is key as someone making connections the other way – from popular culture INTO abstruse theory. But what we all concurred upon was that something like jungle was already intensely theoretical; it didn’t require academics to judge it or pontificate upon it – the role of a theorist was as an intensifier.
4. The term k-punk came out of Ccru. “K” was used as a libidinally preferable substitution for the California/ Wired captured “cyber” (the word cybernetics having its origins in the Greek kubernētēs, “steersman”). Ccru understood cyberpunk not as a (once trendy) literary genre, but as a distributive cultural tendency facilitated by new technologies. In the same way, “punk” doesn’t designate a particular musical genre, but a confluence outside legitimate(d) space: fanzines were more significant than the music in that they allowed and produced a whole other mode of contagious activity which destroyed the need for centralized control.
5. The development of cheap and readily available sound production software, the web, blogs means there is an unprecedented punk infrastructure available. All that is lacking is the will, the belief that what can happen in something that does not have authorisation/ legitimation can be as important – more important – than what comes through official channels.
6. In terms of will, there has been an enormous retrenchment since 1970s punk. The availability of the means of production has seemed to go alongside a compensatory reassertion of Spectacular power.
7. To return to the academy: universities have either totally excluded or at least marginalized not only anyone connected with Ccru but also many who were at Warwick. Steve “Hyperdub” Goodman and Luciana Parisi are both Ccru agents who have managed, against the odds, to secure a position within universities. But most of us have been forced into positions outside the university. Perhaps as a result of not being incorporated (“bought off”), many in the Warwick rhizome have maintained an intense connection and robust independence. Much of the current theoretical drift on k-punk has been developed via a collaboration with Nina Power, Alberto Toscano and Ray Brassier (co-organizer of the NoiseTheoryNoise conference at Middlesex University last year). The growing popularity of philosophers like Žižek and Badiou means there is now an unexpected if rogue and fugitive line of support within the academy.
8. I teach Philosophy, Religious Studies and Critical Thinking at Orpington College. It is a Further Education college, which means that its primary intake is 16-19 year olds. This is difficult and challenging work, but the students are in the main excellent, and far more willing to enter into discussion than undergraduates. So I don’t at all regard this position as secondary or lesser than a “proper” academic post.
11 July 1968 – 13 January 2017
In November we will be publishing a collection of Mark’s work – K-punk: The Collected Writings of Mark Fisher, edited by Darren Ambrose and with a foreword by Simon Reynolds.
This is the first of two blogs, each containing two essays included in the forthcoming collection.
We will all remember Mark Fisher.
Abandon Hope (Summer is Coming)
(11th May 2015)
So it was to be a re-run of 1992, after all. It seems that even elections are subject to retromania, now. Except, this time, it is 1992 without jungle. It’s Ed Sheeran and Rudimental rather than Rufige Kru. Always ignore the polls, wrote Jeremy Gilbert late on election night. “You get a better sense of what’s going on in the electorate by sniffing the wind, sensing the affective shifts, the molecular currents, the alterations in the structures of feeling. Listen to the music, watch the TV, go to the pubs and ride the tube. Cultural Studies trumps psephology every time.”
Contemporary English popular culture, with its superannuated PoMo laddishness, its smirking blokishness (anyone fancy a pint with Nigel?), its poverty porn, its craven cult of big business, has become like some gigantic Poundbury Village simulation, in which nothing new happens, forever… while ubiquitous “Keep Calm” messages, ostensibly quirky-ironic, actually function as They Live commands, containing the panic and the desperation…
England is a country in which every last space where conviviality might flourish has been colonised by a commercial imperative…. supermarket check-out operatives replaced by crap robots… unexpected item in bagging area… every surface plastered with corporate graffiti and haranguing hashtags… no trick missed to screw every last penny out of people… exorbitant parking charges in NHS hospitals (exact amount only, no change given), all the profits going to private providers…
Everything seen through a downer haze… “Mostly you self-medicate”… comfort eating and bitter drinking… What’s your poison?
The suburbs are hallucinating, England is hallucinating. Monster Ripper and Smirnoff, Brandy Boost, oversized glasses of chardonnay at Wetherspoons Monday club, valium scored for a few quid in the pub, the stink of weed drifting from portakabins, red eyes and yellow bibs. The pharmaceuticals industry is one of UK Plc’s biggest success stories (along with arms dealing and loans companies) as prescriptions for antidepressants are kept on repeat. (Laura Oldfield Ford)
Time for one more, Nigel?
Time, gentlemen, please…
There is no time… Time is on your side (yes it is)…
In any case, Shaun Lawson is to be congratulated – if that is the word – for what turned out to be an astonishingly accurate prediction of how the election would go. My attempts to refute the parallels with 92 in my last post were as much wishful thinking as anything else. I suppose at some level I knew after the BBC Leaders Debate how things would go – which is why I found watching it so dejecting. (Another rhyme with the past: Ed’s stumble at the end of his interrogation by the petit-bourgeoisie was a minor echo of Kinnock’s tumbling into the sea in 1983.)
It seems that the very thing which gave us hope – the possibility of vacillating Labour being pulled to the left by an alliance with the SNP – might have been what motivated Tory voters to come out in such numbers in England. (Another echo of 92: fear as a hyperstitional force.) The truth is what many of us have long suspected: Labour lost this election five years ago, by failing to challenge the Tories’ narrative. Yet this failure wasn’t about the wrong leader, PR strategy or even policies; it is ultimately rooted in Labour’s disconnection from any wider movement, and this is in turn rooted in the wider emergence of capitalist realism. Blairism may have won Labour three elections, but the unfolding of its logic could well lead to the destruction, in the not so far distant future, of the party. As Paul Mason acidly summarises, “Labour no longer knows what it is for, nor how to win power.” With Blairism, Labour knew how to win power, but in acquiring this knowledge, it forgot what it was for.
That existential quandary is bitterly ironic given that there is a large proportion of the population in England – I still believe it is the majority – which feels it has no party which represents it. I maintain that the shift to UKIP is ultimately much more to do with this sense of disenfranchisement and despair than with any intrinsic tendency towards racism or even nationalism in its supporters. Everyone has chauvinistic potentials of one kind or another which can be activated by particular sets of forces. Ultra-nationalism is a symptom of the failure of class politics; or, class politics emerges through the ultra-nationalist lens in a distorted and displaced way.
As Paul Mason also points out, a return to Blairism will certainly not win back those Labour supporters who turned to UKIP. In England, as in Scotland, it was Blairism’s taking for granted and abandonment of its working-class base that produced the sense of betrayal which led to so many former Labour supporters losing patience with the party on Thursday. In Scotland, the response to betrayal took a progressive form; in England, it assumed a reactionary mode. Partly, this is because there was no progressive outlet available in England. Working-class English voters alienated from Labour’s Oxbridge elite were left with a choice between a UKIP that deliberately talked up its appeal to working families, and an array of small left-wing parties to whose message they were not exposed and which had no chance of being elected. UKIP were also practically forced on them by a political media so decadent, so boring, that it counts Nigel Farage as a charismatic flash of colour. Hence what Tim Burrows calls “the curiously mediated entity of Farage, a man whose direct manner, coloured tweed and pints of ale seem made for meme-politics. UKIP are more popular on Facebook than Labour and the Liberal Democrats put together.”
It would be easy to fall into despair about England after Thursday; it would be easy to conclude that the country is full of selfish, mean-spirited and stupid individuals. Yet we have to remember that most people’s engagement with politics is quite minimal; thinking in political terms, framing everyday life in terms of political categories, is now a minority pursuit. This is not a moral or intellectual failing on the part of the electorate: it is a consequence of a neoliberalism which has largely succeeded in its aim of disabling the mechanisms of mass democracy. Overworked and told they need to work harder, busy, but still feeling that they can’t get everything done, many are too drained to care. (Too knackered to think, just give me time to come round…) How many Tory voters are committed Conservatives, really? Mostly, they are jaded and detached, maybe voting out of fear as much as self-interest (and self-interest is often experienced as fear).
Capitalist realism is not about people positively identifying with neoliberalism; it is about the naturalisation and therefore the depoliticisation of the neoliberal worldview. The Tories’ pitch is in tune with this ambient neoliberalisation, with its apparently commonsensical emphasis on choice, opportunity and the dignity of labour, and its emotional appeal to negative solidarity. To break out of this, you need a repoliticisation, and this requires a popular mobilisation, just as we saw with the SNP.
The Tory success depended upon a popular de-activation (the days of Thatcher’s rallies are long gone). There was no enthusiasm for either of the two leading parties. The only party that could call on massive popular enthusiasm in the UK was the SNP. That popular enthusiasm – an enthusiasm that capitalist realism is set up to prevent emerging – is the rushing in of something that, for a long time, there hasn’t seemed to be any glimmer of in England: the future.
Don’t be depressed …
What hope for a country where people will camp out for three days to glimpse the Royal Couple? England is like some stricken beast too stupid to know it is dead. Ingloriously foundering in its own waste products, the backlash and bad karma of empire. (William Burroughs)
So we shouldn’t take the Tories’ victory as a sign that we are totally out of sync with the majority of the population in England. As Jeremy remarked to me on Thursday, it is not as if the equivalent of Syriza or Podemos had lost. (Although that was part of what was so devastating – our expectations were low, but reality contrived to go even lower.) Given the serious weakness of Labour’s offer, given the ferocity of the attack on Labour from the right-wing media machine in the UK, given the failure of supposedly neutral popular media such as the BBC to offer the public an adequate account of the banking crisis and its aftermath, it is actually surprising that the Tories’ victory was not even more comprehensive. Those who voted Tory aren’t necessarily indifferent to the suffering of the poor, or to the plight of the vulnerable – most merely accept (why wouldn’t they?) the capitalist realist story about there being “no money left” and the need for “difficult choices”. No doubt, their acceptance of this is somewhat self-serving; no doubt, it depends on keeping those who suffer out of sight or in their peripheral vision.
But it is also a fundamentally depressing and depressive outlook. There is a connection between capitalist realism and depressive realism. The idea that life is essentially drudgery (and that therefore no one should get a free ride) is a depressive conception of fairness (if I have to be miserable, so should everyone else), which has a particular traction in a burnt-out post-protestant culture like England’s… (England is the oldest capitalist country, don’t forget…)
All Cameron offered was more of this depression: a vision of a man chipping ice off his windscreen and going to a job he hates, forever. Yet Labour not only failed to offer a narrative about how the economy had gone wrong, it also failed to offer any positive vision of what society would look like if it had its way. I’m convinced that even the most minimal sense of this might have been enough to inspire people to reject the Tories. Yet the fact that Labour couldn’t offer it was not some mistake (a few more focus groups and meetings with advertising people, and they’d have been there!). It was one more symptom of the way in which the party has been completely colonised by capitalist realism.
The Tories quickly abandoned the “Big Society” after the 2010 campaign, but the concept did actually point to what neoliberal culture has corroded: the space between “individuals and their families” and the state. In addition to its clunky and uncommunicative name – it was a kind of anti-meme – the problem with the “Big Society” was that, in the Tories’ hands, it was a transparent ruse to dismantle the welfare state. To resocialise a culture that has been individualised to the extent that England has demands massive resources – it requires time and energy, the very things that capital (especially the contemporary neoliberal, English version of capital) strips us of most thoroughly.
Real wealth is the collective capacity to produce, care and enjoy. This is Red Plenty. We, and they, have had it wrong for a while: it is not that we are anti-capitalist, it is that capitalism, with all its visored cops, its teargas, all the theological niceties of its economics, is set up to block Red Plenty. The attack on capital has to be fundamentally based on the simple insight that, far from being about “wealth creation”, capital necessarily and always blocks our access to this common wealth. Everything for everyone. All of us first.
Labour has allowed election after election to be fought not on the Red terrain of resocialisation, but on the Blue territory of identitarian community, with its border guards (we’ll have as many as you!) and barbed wire fences (they will be as high as yours!). The genius of the progressive forces which have seized the SNP, meanwhile, was to have moved from the Blue of identitarian community – and the nationalism of colonised peoples is of course very different to the nationalism of the colonisers – to the Red of internationalist cosmopolitan conviviality.
Red belonging offers something different to traditional forms of belonging (faith, flag, family – so many corrupted forms of the commons, as Hardt and Negri have it). Jodi Dean has movingly described how the Communist Party in the US
gave some Americans the feeling that the world was of one piece, their work meaningful as the work of a class, their struggles significant as part of a global struggle to liberate collective work from those claiming it for their own private profit. For desperately poor and barely literate immigrants, communism is a source of knowledge and power – the knowledge of how the world works and the power to change it.
The sense of belonging here could not be reduced to the chauvinistic pleasures that come from being an insider in any group whatsoever; it was a special sense of involvement that promised to transfigure all aspects of everyday life in a way that, previously, only religion had promised to, so that even the dreariest task could be imbued with high significance.
Even those engaged in the boring, repetitive work of distributing leaflets or trying to recruit new members as the official line changed, or chafing against the smugness of higher-ups, experienced their life in the party as intensely meaningful.
As opposed to the essentially spatial imaginary of Blue belonging – which posits a bounded area, with those inside hostile and suspicious towards those who are excluded – Red belonging is temporal and dynamic. It is about belonging to a movement: a movement that abolishes the present state of things, a movement that offers unconditional care without community (it doesn’t matter where you come from or who you are, we will care for you anyway).
But don’t hope either …
“There’s no need to fear or hope, but only to look for new weapons”, Deleuze writes in “Postscript on Societies of Control”. He was no doubt thinking of Spinoza’s account of hope and fear in the Ethics. “There is no hope unmingled with fear, and no fear unmingled with hope”, Spinoza claimed. He defines hope and fear as follows:
Hope is a joy not constant, arising from the idea of something future or past about the issue of which we sometimes doubt.
Fear is a sorrow not constant, arising from the idea of something future or past about the issue of which we sometimes doubt.
Hope and fear are essentially interchangeable; they are passive affects, which arise from our incapacity to actually act. Like all superstitions, hope is something we call upon when we have nothing else. This is why Obama’s “politics of hope” ended up so deflating – not only because, inevitably, the Obama administration quickly became mired in capitalist realism, but also because the condition of hope is passivity. The Obama administration didn’t want to activate the population (except at election time).
We don’t need hope; what we need is confidence and the capacity to act. “Confidence”, Spinoza argues, “is a joy arising from the idea of a past or future object from which cause for doubting is removed”. Yet it is very difficult, even at the best of times, for subordinated groups to have confidence, because for them/us there are few if any “future objects from which cause for doubting is removed”.
“Class disadvantage is a form of injury inflicted on the person at birth”, David Smail explains.
The confident slouch of the hands-in-pocket, old Etonian cabinet minister speaks not so much as a current possession of power (on some measures the union boss might possess as much) as of a confidence which was sucked in with his mother’s milk.
(Even if the milk he fed on was unlikely to have come from his mother.) The welfare state was supposed to be a structure which removed some of this doubt, while the imposition of precarity is a political project designed to remove the confidence that the working class had attained after years of struggle. (See Jennifer M Silva’s heartbreaking Coming Up Short: Working-Class Adulthood in an Age of Uncertainty – a book to which I shall certainly return in future posts – for an account of the devastating impact of precarity on the emotional lives of young working-class men and women in the US.)
Whereas hope and fear are superstitious (although they may have some hyperstitional effects), confidence is essentially hyperstitional: it immediately increases the capacity to act, the capacity to act increases confidence, and so on – a self-fulfilling prophecy, a virtuous spiral.
So how are we to rebuild our confidence? While the conditions are difficult – and in England, they are about to get much more difficult – we can still act, and act imminently and immanently. How?
Socialisation beyond social media
The answer of course is that many groups are already doing what is necessary. But these processes will become more powerful when they are logistically coordinated (which is not to say “unified” – unity is a strategic weakness, not a strength) and bound together by stronger common narratives and fictions. Jason Read’s essay “The Order and Connection of Ideology Is the Same as the Order and Connection of Exploitation: Or, Towards a Bestiary of the Capitalist Imagination” explains why narrativisation is so important. In his account of two neo-Spinozist thinkers, Frédéric Lordon and Yves Citton, Read reminds us that “our desire, our loves and hates, are already shaped by narratives, by scripts inherited through television and books. We enter into a world already scripted, and, as Spinoza argues in his definition of the first kind of knowledge, our life is defined as much by signs and images as things experienced.”
that the scenarios that we imagine, the stories and narratives that we consume, inform our understanding of reality, not in the sense that we confuse fiction with reality, but that the basic relations that underlie our fictions shape our understanding of reality. It is not that we confuse fiction with reality, believing everything that we see, but that the fundamental elements of every narrative, events, actions, and transformations, become the very way that we make sense of reality. Fiction exists in a permanent relation of metalepsis with reality, as figures and relations from one constantly inform the other.
This is why the intensification and proliferation of the capitalist technologies of reality management and libidinal engineering in the 1980s was not merely some happy coincidence for neoliberalism; neoliberalism’s success was inconceivable without these technologies. It is also the reason that direct action, while of course crucial, will never be sufficient: we also need to act indirectly, by generating new narratives, figures and conceptual frames.
By first of all imposing a particular set of narratives, figures and frames which it then naturalised, capitalist realism hobbled what Jason Read identifies as the “particular power of humanity (and the linchpin of our emancipation)”: “our faculty to reorder differently the images, the thoughts, the affects, the desires and the beliefs that are associated in our mind, the phrases that come out of our mouths, and the movements that emanate from our bodies.” Cultural Studies was also based on this account of the capacity for reordering (which it derived partly from Spinoza, via Althusser). The reordering of images, thoughts, affects, desires, beliefs and languages plainly cannot be achieved by “politics” alone – it is a matter for culture, in the widest sense.
Seen from this point of view, the locking of popular culture into repetition that I describe in Ghosts Of My Life – and which Simon Reynolds also describes in Retromania – is therefore a very serious problem. Popular culture’s incapacity to produce innovation is a persistent ambient signal that nothing can ever change. Sometimes, it can seem fiendishly difficult to account for what has happened to popular culture, but the explanation for its sterility and stasis is ultimately quite simple. Innovation in popular culture has overwhelmingly come from the working class. Neoliberalism has been a systematic and sustained attack on working-class life – the results are now all around us.
Furthermore, the incursion of capitalist cyberspace into every area of life and the psyche has intensified the processes of de-socialisation. This is not to say that there are no progressive potentials in the web, but these have almost certainly been overrated, while the impact of cyberspace in de-socialising culture and subjectivity has been massively underestimated. Here I merely rehearse Bifo’s account of semiocapitalism and Jodi Dean’s critique of communicative capitalism, but it is important to operationalise this critique.
Blogs and social media have allowed us to talk to ourselves (but not to reach out beyond the left bubbles); they have also generated pathological behaviours and forms of subjectivity which not only generate misery and anger – they waste time and energy, our most crucial resources. Email and handhelds, meanwhile, have produced new forms of isolation and loneliness: the fact that we can receive communications from work anywhere and anytime means we are exposed to work’s order-words when we are alone, without the possibility of support from fellow workers.
In sum, the obsession with the web, its monopolisation of any idea of the new, has served capitalist realism rather than undermined it. Which does not mean, naturally, that we should abandon the web, only that we should find out how to develop a more instrumental relationship with it. Put simply, we should use it – as a means of dissemination, communication and distribution – but not live inside it. The problem is that this goes against the tendencies of handhelds. We all recognise the by-now clichéd image of a train carriage full of people pecking at their tiny screens, but have we really registered how miserable this is, and how much it suits capital for these pockets of socialisation to be closed down?
Knowing someone in this life feels as desperate as me
Some folk in Plan C have been talking about consciousness raising, and for many reasons, I believe that it is crucially important to revive and proliferate this practice (or range of practices) now. Consciousness raising is partly about the discovery and production of subjugated knowledges, but it is also about the immediate production of socialisation, of forms of subjectivity antithetical to the always-on, always-lonely mode of contemporary capitalist individuality.
Consciousness raising opens up the possibility of living, not merely theorising about, a collective perspective. It can give us the resources to behave, think and act differently at work (if it makes any sense to talk about being “at” work any more), where capitalist realism has become second nature. The roots of any successful struggle will come from people sharing their feelings, especially their feelings of misery and desperation, and together attributing the sources of these feelings to impersonal structures, albeit impersonal structures mediated by particular figures to which we must attach populist loathing.
In the harsh conditions of cyberspatialised capitalism – conditions that, as Jennifer M Silva demonstrates, have produced a “hardening” of the self, especially in the young – consciousness raising can produce a new compassion, for others and for ourselves. Neurotic-Oedipalising capitalism responsibilises, harshly blaming us, while – in its therapeutic mode – telling us that we have the power as individuals to change anything and everything: if we’re unhappy, it’s up to us to fix it. Consciousness raising, meanwhile, is about positive depersonalisation: it’s not your fault, it’s capitalism. No individuals can change anything, not even themselves; but collective activation is already, immanently, overcoming individualised immiseration.
So I present below a number of strategies, practices and orientations, starting from the most immediate (something groups can do right now) and moving towards the more remote. The list is of course not exhaustive, and I can’t claim credit for coming up with any of the strategies myself. The point is to share them, add to them, elaborate them.
The chief obstruction to all of these steps is what, in a trenchant and clear-eyed analysis, Ewa Jasiewicz calls “time poverty”:
Our time is under attack. Work will be intensified, worse paid, and more casualised – if we don’t have it, we’ll be working to have it; mandatory and supervised job searches and workfare will see people forced to spend their time locked into coerced, computerised distraction. A real, diverse, working class self-representative movement needs to include people facing and living these experiences, but how will that happen when we’re too tied up working?
Access to time and our own labour is key and will determine participation and the ability to organise. If we can’t have our own time to organise, we can’t organise, we can’t meet each other, we cannot find each other. Work and the benefits regime – which is work under different conditions and profit margins – are key sites of struggle. Solidarity will need to step up if we are to win workplace disputes and strikes, refusals of workfare and support for people getting sanctioned, so that people have more control over their time and labour.
All our commons are under attack. The condition of time poverty and its roots – intensification of labour, welfare repression, criminalisation and incarceration – have to be recognised as major obstacles to movement, diversity and power. These obstacles need to be tackled if we want to overcome the ideology of wage labour as a determinant of human value on a popular level.
The problem is that, in order to struggle against time poverty, the main resource we require is time – a nasty vicious circle that capital, with its malevolent genius, now has… This problem is absolutely immanent – writing this and the other posts I have completed this week has meant that I have fallen enormously behind on my work, which is storing up stress for the next week or so.
The first thing we must do in response to all this is to put into practice what I outlined above: try not to blame ourselves. #Itsnotyourfault We must try to do everything we can to politicise time poverty rather than accept blame as individuals for failing to complete our work on time. The reason we feel overwhelmed is that we are overwhelmed – it isn’t an individual failing of ours; it isn’t because we haven’t “managed our time” properly. However, we can use the scarce resources we already have more effectively if we work together to codify practices of collective re-habituation (setting new rules for our engagement with social media and capitalist cyberspace in general for example).
Anyway, here goes:
- Talk to fellow workers about how we feel. This will re-introduce care and affection into spaces where we are supposed to be competitive and isolated. It will also start to break down the difference between (paid) work and social reproduction on which capitalism depends.
- Talk to opponents. Most people who vote Tory and UKIP are not monsters, much as we might like to think they are. It’s important that we understand why they voted as they did. Also, they may not have been exposed to an alternative view. Remember that people are more likely to be persuaded if defensive character armour is not triggered.
- Create knowledge exchange labs. This follows from what I argued a few days ago. Lack of knowledge about economics seems to me an especially pressing problem to address, but we could also do with more of us knowing about law, I suspect.
- Create social spaces. Create times and spaces specifically dedicated to attending to one another: not (yet more) conferences, but sessions where people can share their feelings and ideas. I would suggest restricting use of handhelds in these spaces: not everything has to be live-tweeted or archived! Those with access to educational or art spaces could open these up for this purpose.
- Use social media pro-actively, not reactively. Use social media to publicise, to spread memes, and to constitute a counter-media. Social media can provide emotional support during miserable events like Thursday. But we should try to use social media as a resource rather than living inside it at all times. Facebook can be useful for discussions and trying out new ideas, but attempting to debate on Twitter is absurd and makes us feel more stressed. (He says, thinking of the time when, sitting on a National Express coach, perched over his handheld, he tried to intervene in an intricate discussion about Spinoza’s philosophy – all conducted in 140 characters.)
- Generate new figures of loathing in our propaganda. Again, this follows up from what I argued in the Communist Realism post. Capitalist realism was established by constituting the figure of the lazy, feckless scrounger as a populist scapegoat. We must float a new figure of the parasite: landlords milking the state through housing benefit, ‘entrepreneurs’ exploiting cheap labour, etc.
- Engage in forms of activism aimed at logistical disruption. Capital has to be seriously inconvenienced and made to fear before it yields any territory or resources. It can just wait out most protests, but it will take notice when its logistical operations are threatened. We must be prepared for them cutting up very rough once we start doing this – using anti-terrorist legislation to justify practically any form of repression. They won’t play fair, but it’s not a game of cricket – they know it’s class war, and we should never forget it either.
- Develop hub struggles. Some struggles will be more strategically and symbolically significant than others – for instance, the Miners’ Strike was a hub struggle for capitalist realism. We might not be able to identify in advance what these struggles are, but we must be ready to swarm in and intensify them when they do occur.
Summer is coming
The Lannisters won on Thursday, but their gold has already run out, and summer is coming. What we saw in the debates dominated by Nicola Sturgeon was not a mirage – it is a rising tide, an international movement, a movement of history, which has not yet reached an England sandbagged in misery and mediocrity. Comrades, I hope (ha!) for the sake of your mental health and your blood pressure that you didn’t see the right-wing tabloids over the weekend (tw for class hatred): middle England crowing over its “humiliation” of “Red” Ed. Well, if they think Ed was Red, wait until they see the coming Red Swarm. Outer England has been sedated, but it is waking from its long slumber, carrying new weapons…
Choose Your Weapons
(12th August 2007)
People are often telling me that I ought to read Frank Kogan’s work, but I’ve never got around to it. (Partly that’s because, Greil Marcus apart, I’ve never really tuned into much American pop criticism at all, which in my no doubt far too hasty judgement has seemed to be bogged down in a hyper-stylized faux-naif gonzoid mode that has never really appealed to me.) The - again, perhaps unfair - impression I have is that, in Britain, the battles that Kogan keeps on fighting were won, long ago, by working-class autodidact intellectuals. No doubt the two recent pieces by Kogan that Simon has linked to are grotesquely unrepresentative of his work as a whole (I certainly hope so, since it is difficult to see why so many intelligent people would take his work seriously if they weren’t), but it’s hard not to read them as symptomatic, not only of an impasse and a malaise within what I now hesitate to call “Popism”, but of a far more pervasive, deeply-entrenched cultural conservatism in which so-called Popism is intrinsically implicated.
Remember, in the immediate wake of 9/11, all those po-faced Adornoite proclamations that there would be “no more triviality” in American popular culture after the Twin Towers fell? There can be few who, even when the remains of the Twin Towers were smouldering, really believed that US pop culture would enter a new thoughtful, solemn and serious phase after September 11th - and it’s surely superfluous to remember, at this point, that what ensued was a newly vicious cynicism soft-focused by a piety that only a wounded Leviathan assuming the role of aggrieved victim can muster - but would anyone, then, have believed that, only six years later, a supposedly serious critic would write a piece called “Paris [Hilton] is our Vietnam”… especially when, in those years, there has, like, been another Vietnam. What we are dealing with in a phrase like “Paris is our Vietnam” is not trivia – this isn’t the collective narcissism of a leisure class ignorant of geopolitics – but a self-conscious trivialization, an act of passive nihilistic transvaluation. Debating the merits or otherwise of a boring heiress has been elevated to the status of a political struggle; and not even by preening aesthetes in some Wildean/Warholian celebration of superficiality, but by middle-aged men in sweat pants, sitting on the spectator’s armchair at the end of History and dissolutely flicking through the channels.
The end of history is the nightmare from which I am trying to awake.
At least the “Paris is Vietnam” piece laid bare the resentment of resentment that I have previously argued is the real libidinal motor of “popism” - “we love Paris all the more because others hate her (but luckily we loved her anyway, honest!)” But this latest piece Simon has linked to is, if anything, even more oddly pointless and indicative. Unlike the pleasantly mediocre Paris Hilton LP, the ostensible object of the piece, Backstreet Boys’ single “Everybody (Backstreet’s Back)”, is actually rather good. Practically everyone I know liked it. The problem is the idea that saying this is in some way news in 2007. No word of a lie, I had to check the date on that post, assuming, at first, that it must have been written a decade ago.
The article makes me think that, if the motivating factor with British popists is, overwhelmingly, class, with Americans it might be age. Perhaps those a little deeper into middle age than I am were still subject to the proscriptions and prescriptions of a Leavisite high culture. But it seems to me that popists now are like Mick Jagger confronted with punk in 1976: they don’t seem to realise that, if there is an establishment, it is them. Even if the “Nathan” with whom Kogan debates exists – and I’ll be honest with you, I’m finding it hard to believe that he does – his function is a fantasmatic one (in the same way that Lacan argued that, if a pathologically jealous husband is proved right about his wife’s infidelities, his jealousy remains pathological): for popists to believe that their position is in any way challenging or novel, they have to keep digging up “Nathans” who contest it. But, in 2007, Nathan’s hoary old belief that only groups who write their own songs can be valid has been refuted so many times that it is rather like someone mounting a defence of slavery today - sure, there are such people who hold such a view, but the position is so irrelevant to the current conjuncture that it is quaintly antiquated rather than a political threat. There may be a small minority of pop fans who claim to hold Nathan’s views; but, given the success of Sinatra, the Supremes, Elvis Presley and the very boybands that popists think it is so transgressive to re-evaluate, those views would in most cases be performatively contradicted by the fans’ actual tastes. (Kogan does grant that the problem is not so much fans’ tastes as their accounts of them – but the unspoken assumption is that it is alright, indeed mandatory, to contest male rock fans’ accounts of their own tastes, but that the aesthetic judgements of the figure with which the popist creepily identifies, the teenage girl, ought never to be gainsaid.)
(The other irony is that, if you talk to an actual teenager today, they are far more likely to both like and have heard of Nirvana than they are the Backstreet Boys.)
The once-challenging claim that for certain listeners, the (likes of) Backstreet Boys could have been as potent as (the likes of) Nirvana has been passive-nihilistically reversed - now, the message disseminated by the wider culture – if not necessarily by the popists themselves – is that nothing was ever better than the Backstreet Boys. The old high-culture disdain for pop cultural objects is retained; what is destroyed is the notion that there is anything more valuable than those objects. If pop is no more than a question of hedonic stim, then so are Shakespeare and Dostoyevsky. Reading Milton, or listening to Joy Division, has been re-branded as just another consumer choice, of no more significance than which brand of sweets you happen to like. Part of the reason that I find the term “Popism” unhelpful now is that it implies some connection between what I would prefer to call Deflationary Hedonic Relativism and what Morley and Penman were doing in the early Eighties. But their project was the exact inverse of this: their claim was that as much sophistication, intelligence and affect could be found in the pop song as anywhere else. Importantly, the music, and the popular culture of the time, made the argument for them. The evaluation was not some fits-all-eras a priori position, but an intervention at a particular time designed to have certain effects. Morley and Penman were still critics, who expected to influence production, not consumer guides marking commodities out of five stars, or executives spending their spare time ranking every song with the word “sugar” in it on LiveJournal communities that are the cyberspace equivalent of public school dorms.
Whereas Morley and Penman (self-taught working-class intellectuals both) complicated the relationship between theory and popular culture with writing that - in its formal properties, its style and its erudition, as well as in its content – contested commonsense, Deflationary Hedonic Relativism merely ratifies the empiricist dogmas that underpin consumerism. More than that. Owen Hatherley has astutely observed that, in addition to reiterating the standard Anglo-American bluff dismissal of metaphysics, the Deflationary Hedonistic Relativist disclaiming of theory (“we just like what we like, we don’t have a theory”) uncannily echoes the dreary mantras of the average NME indie band: “we just do what we do, anything else is a bonus”, “the music is the only important thing”. In the UK, the rhetorical fight between “Popists” and indie is as much a phoney war as the parliamentary political Punch and Judy show between Cameron’s Tories and Brown’s New Labour: a storm in a ruling-class tea-cup. In both cases, the social reality is that of ex-public schoolkids carrying on their inter-House rivalries by other means. In the case of both indie and Popism, there is a strangely inverted relationship to populism and the popular. While the “Popists” claim to be populist but actually support music that is increasingly marginal in terms of sales figures, the indie types claim to celebrate an alternative while their preferred music (Trad skiffle) has Full Spectrum Dominance (you can’t listen to Radio 2 for fifteen minutes without hearing a Kaiser Chiefs song). In many ways, because it was attempting to analyse a genuinely popular phenomenon, Simon’s defence of the Arctic Monkeys was more genuinely popist than all of the popist screeds on Paris Hilton’s barely-bought LP - but of course much of the impulse behind them was the ultra-rockist desire to be seen thumbing one’s nose at critical consensus.
Witness the genuinely pathetic - it certainly provokes pathos in me - attempt to whip up controversy about the workmanlike plod of Kelly Clarkson, on a blog which, in its combination of hysterical overheating and dreary earnestness, is as boring as it is symptomatic - though, I have to confess, I have never managed to get to the end of a single post, a problem I have with a great many “popist” writings, including the magnum opus of popism, Morley’s Words and Music.
Much as he occasionally flails and rails against popist commonplaces (see, for instance, his recent - I would argue unwarranted - attack on Girls Aloud), Morley is as deeply integrated into Deflationary Hedonic Relativist commonsense as Penman is excluded from it. What was the strangely affectless Words and Music if not a description of the OedIpod from inside? All those friction-free freeways, those inconsequent consumer options standing in for existential choices… Yet Morley is still a theorist of the ends of History and of Music, still too obviously in love with intelligence to be fully plugged into the anti-theoretical OedIpod circuitry. Even so, Ian’s silence speaks far louder than Morley’s chatter, and, after my very few dealings with Old Media, I’m increasingly seeing Ian’s withdrawal, not as a tragic failure, but as a noble retreat.
All of UK culture tends to the condition of the clip show, in which talking heads - including, of course, Morley – are paid to say what dimwit posh producers have decided the audience already thinks, over footage of what everyone has already seen. I recently had dealings with an apparatchik of Very Old Media. What you get from representatives of VOM is always the same litany of requirements: writing must be “light”, “upbeat” and “irreverent”. This last word is perhaps the key one, since it indicates that the sustaining fantasy to which the young agents of Very Old Media are subject is exactly the same as the one in which popists indulge: that they are refusing to show reverence to some stuffy censorious big Other. But where, in the dreary-bright, dressed-down sarky snarky arcades of postmodern culture, is this “reverence”? What is the postmodern big Other if it is not this “irreverence” itself? (Only people who have not been in a university humanities dept for a quarter of a century - i.e. not at all your bog-standard Oxbridge grad Meeja employee/leisure-time popist - could really believe that there is some ruthlessly-policed high culture canon. When Harold Bloom wrote The Western Canon it was as a challenge to the relativism that is hegemonically dominant in English Studies.) I’ve quickly learned that “light”, “upbeat” and “irreverent” are all codes for “thoughtless” and “mundanist”. Confronted with these values and their representatives - who, as you would expect, are much posher than me – I often encounter a cognitive dissonance, or rather a dissonance between affect and cognition. Faced with the Thick Posh People who staff so much of the media, I feel inferiority - their accents and even their names are enough to induce such feelings - but think that they must be wrong. It is this kind of dissonance that can produce serious mental illness; or - if the conditions are right - rage.
Anti-intellectualism is a ruling-class reflex, whereby ruling-class stupidity is attributed to the masses (I think we’ve discussed here before the ruse of the Thick Posh Person, whereby they make a show of pretending to be thick in order to conceal that they are, in fact, thick). It’s scarcely surprising that inherited privilege tends to produce stupidity, since, if you do not need intelligence, why would you take the trouble to acquire it? Media dumbing down is the most banal kind of self-fulfilling prophecy.
As Simon Frith and Jon Savage long ago noted in their NLR essay, “The Intellectuals and the Mass Media”, which Owen Hatherley recently brought to my attention again, the plain common-man pose of the typical public school and Oxbridge-educated media commentator trades on the assumption that these commentators are far more in touch with “reality” than anyone involved in Theory. The implicit opposition is between Media (as transparent window-on-the-world transmitter of good, solid commonsense) and Education (as out-of-touch disseminator of useless, elitist arcanery). Once, Media was a contested ground, in which the impulse to educate was in tension with the injunction to entertain. Now - and the indispensable Lawrence Miles is incisive on this, as on so many other things, in his latest compendium of insights - Old Media is almost totally given over to a vapid notion of Entertainment - and so, increasingly, is education.
In my teenage years, I certainly benefited far more from reading Morley and Penman and their progeny than from the middlebrow dreariness of much of my formal education. It’s because of them, and later Simon and Kodwo et al, that I became interested in Theory and bothered to pursue it in postgraduate study. It is essential to note that Morley and Penman were not just an “application” of High Theory to Low Culture; the hierarchical structure was scrambled, not just inverted, and the use of Theory in this context was as much a challenge to the middle-class assumptions of Continental Philosophy as it was to the anti-theoretical empiricism of mainstream British popular culture. But now that teaching is itself being pressed into becoming a service industry (delivering measurable outputs in the form of exam results) and teachers are required to be both child minders and entertainers, those working in the education system who still want to initiate students into the complicated enjoyments that can be derived from going beyond the pleasure principle, from encountering something difficult, something that runs counter to one’s received assumptions, find themselves in an embattled minority. Here we are now, entertain us.
The credos of ruling-class anti-intellectualism that most Old Media professionals are forced to internalise are far more effective than the Stasi ever was in generating a popular culture that is unprecedentedly monotonous. Put it this way: a situation in which Lawrence Miles languishes, at the limits of mental health, barely able to leave his house, while the likes of Rod Liddle swagger around the mediascape is not only aesthetically abhorrent, it is fundamentally unjust. Contrary to the “it’s only hedonic stim” deflationary move that both Stekelmanites and Popists share, popular culture remains immensely important, even if it only serves an essential ideological function as the background noise of a capitalist realism which naturalises environmental depredation, the mental health plague and sclerotic social conditions in which mobility between classes is dwindling towards zero.
A class war is being waged, but only one side is fighting.
Choose your side. Choose your weapons.