Yesterday I delivered the final manuscript of a book to the publisher. It represented quite an important moment for me, bringing together the insights gleaned from half a century of research on labour.

The last time I remember finishing a complete book in this way was back in 1981, in that burst of energy that high blood pressure produces in mid pregnancy. That one was never intended as a magnum opus, with a title (‘Your job in the Eighties’) that screamed that it had a sell-by date as well, of course, as a write-by date, imposed by the impending birth.

Little did I realise then how pressured the ensuing decades would be. I published edited collections and wrote an awful lot of book-length reports but the only really serious writing I did was in the form of relatively short essays, produced in the brief intervals between the pressing demands of meeting the deadlines for the work that paid the bills.

Thanks to Monthly Review Press, some of these essays, originally written for very different audiences, and with the first dating back to 1978, were published together as a book in 2003, and another collection followed in 2014. But the essay format does not really allow you to build an argument slowly from the beginning and follow it through. On the one hand, you cannot presuppose that the reader has read anything else you have written beforehand, so you have to go back to square one to explain certain things each time (leading to repetition if the essays are read sequentially); on the other, the length limit means you cannot go into as much depth and detail as you would ideally like.

I was constantly urged by friends to ‘write a proper book’ and, indeed, told that I only had myself to blame for any lack of recognition or acknowledgement because I had not done so.

So at last, I bit the bullet and decided to write one, hoping that it might be my last word on this subject that has occupied so much of my time and allow me to move on to other things. I found it quite hard to write in some ways. Partly because, as ever, there were other demands on my time (among others the need to babysit my grand-daughter) but mainly because of the difficulty of avoiding self-plagiarism. If you have been saying something for fifty years (even if this is to very small or uncomprehending audiences) it does not feel fresh when you repeat it. As John Berger memorably said, ‘the first time you say something, you’re discovering a truth. The next time, it’s a little less true’. I would spend hours trying to find a new way to write something only to discover that I had put it much more succinctly, years ago.

Nevertheless, and despite a little bit of (duly acknowledged) recycling here and there I did, I think, manage at least to build a coherent argument starting in Chapter 1 and ending in Chapter 8, with a clear conceptual framework that I hope will be useful to other researchers and students (and maybe even some general readers) in years to come.

But oh, as they say, the irony.

There was, however, one thing I was not prepared for. Even after a working lifetime of playing Cassandra, I was taken by surprise by the way that this ‘book’ is going to be published. In a particularly ironic twist, this provides one of the most vivid (and cruel) examples of precisely the kind of fragmentation (of thought processes, of labour processes, of social interaction…) that I have been writing about all these years and, indeed, forms part of the book’s subject matter.

Palgrave Macmillan, the publishers, who are now part of the Springer empire, are in the process of introducing a new way of publishing books online, one that integrates them with the way that academic journals are increasingly published. While hard-copy ‘proper books’ printed on paper will no doubt remain, albeit increasingly expensive, they expect the majority of readers to purchase their contents online. And with that in mind they are putting together packages that enable subscribers to pick and mix from a suite of content. Instead of buying a whole book, readers will be able to download chapters, one at a time, and bundle them together with chapters from other books. Thus, at a stroke, they destroy the coherence it has taken me so long to craft and introduce all sorts of new scope for incomprehension for the reader who comes in at, say, Chapter 5.

In this new environment, I suppose that the old derided essay format, so criticised by my friends and blamed for my relative invisibility in the academic world, at least in the UK, will turn out to be the best way to communicate after all. Even assuming that readers are credited with the attention span to read 6,000 words consecutively, I fear that the future may be worse still: a literature made up of individual nuggets, each with an abstract that will be all that most people read, arranged interchangeably in a two-dimensional mosaic in which the genealogy of ideas, the logical sequence of an argument, deep scholarship and, yes, even the quality of writing, are flattened out of existence.

It will be a world where the relationship between reader and writer, that sharing of ideas which matters so much to me, in both capacities, is reduced to a purely instrumental one. Writers are expected to produce a series of discrete, easily explained ‘contributions to knowledge’ (as the reviewers for the academic journals like to put it) which can be harvested as quickly as possible by readers whose only interest is in assembling them, along with others, like so many Lego bricks, to produce their own, equally simplified, ‘contributions’ – in a process that resembles nothing so much as a dating website, something I wrote about, as it happens, only a couple of weeks ago in my last blog entry.

Researchers, be warned. The fragmentation fairy is waving her wand and you are about to be transported to Academic Tinder. Where, if you have done your homework, you will know that the only successful swipes are those that go to the right.



The glitterball life

One of the many contradictory aspects of being a social researcher is the profoundly ambivalent feelings it throws up about the filling in of questionnaires – that constricted sorting of the nuanced intricacies of life into simple yes/no categories, rating abstractions on a scale of one to ten or trying to place the incommensurable into rank order. I make much of my living from the results of other people doing so but also, in my capacity as a filler-in, have grown to detest the process, which sometimes reduces me to a state of almost speechless rage.

As a filler-in, you might want to answer ‘yes but’ or ‘sometimes’ or ‘both’ or ‘it depends’ or ‘good in some ways but not in others’ but can only do this if the questionnaire designer has been able to imagine such a possibility in advance. But the researchers in charge of the questionnaire design (sometimes including me) have other things in mind: cheapness; brevity; compatibility with other surveys; ease of analysis; the questions the policy-makers want answers to. Then there are the people (also sometimes including myself) who want to use the results – people in search of answers to new questions, hoping that somewhere out there, the ‘facts’ exist that can prove, or disprove, our hypotheses.

I can remember my fury, back in the 1970s, when, doing some research on the impact of technological change on women’s employment, I discovered that the only people who had gone back through all the census data since the beginning of the 20th century to examine changes in occupation by industry had not seen the importance of gender as a variable (perhaps they even thought it sexist to draw attention to it?). They had simply added up the figures for the men and the women and presented them in an undifferentiated way, rendering their research entirely useless for anybody interested in gender segregation in the workforce by occupation and industry, and requiring it to be done all over again.

But I was equally furious, if not more so, when a few months ago, having spent a long time filling in a detailed online questionnaire intended to collect information about academics’ experiences of the REF (Research Excellence Framework) process, I got to the last page only to discover that it would not let me ‘submit’ my response until I had told them whether I was (a) heterosexual, (b) homosexual or (c) bisexual. There was no option to say ‘prefer not to say’ (or ‘none of your business’), ‘other’, ‘celibate’, ‘not sexually active’ or even ‘don’t know’, and I crossly gave up. But of course it is quite possible that my annoyance at being forced into this limited range of options, and my (unexpressible) opinion that my sexual preferences should not matter in relation to the stated aim of the survey, might on some future occasion be mirrored by an equally strong annoyance on the part of some researcher who wants to analyse the results by sexual orientation – a reaction exactly like my own to those 1970s gender-suppressing academics.

For me, as a subject, abstracting various aspects of my identity as standardised units that can be combined in various permutations may be experienced as deeply dehumanising. For me, as a researcher, being able to analyse and compare these units across a population is a necessary precondition for spotting patterns that may not only enable us to understand changes that are taking place in society but also, if we are lucky, try to come up with solutions that can make that society more responsive to people’s needs. A contradiction, if ever there was one.

If this were just a problem peculiar to social research it would be easy to live with and we could rationalise it as a case of the good outweighing the bad; that participating in social research is in general a good thing, and, even if it isn’t, easy to opt out of. But alas, this is not the case. Digitalisation has made the filling in of forms an all-pervasive feature of everyday life. There’s no escaping it. Every time you apply for a job, open a bank account or book a flight, you have to enter information about yourself into a pre-defined form. And even when you are not consciously filling in a questionnaire you are still supplying information to somebody just about every time you navigate any website or enter a search term into Google. We are all now familiar with the sorts of targeted advertising that result from the profiling that is based on the information we may have supplied voluntarily (e.g. age, postcode, relationship status, weight, height, number of children) when it is put together with the data generated by our online searches, our Facebook likes, our retweets and Alexa’s eavesdropping on our conversations over dinner.

It is a situation that has given rise to some of the most extraordinary paradoxes of our time. Total strangers can, via algorithms, have unrestricted access to the most intimate details of your life but you cannot even talk to your bank about your own money if you cannot remember the 3rd and 5th digits of your ‘memorable word’. All ‘for your own protection’, you must understand. The public and the private switch places constantly. You can sit on a bus and hear the overworked care assistant next to you talking on her mobile phone at the top of her voice to her (deaf?) patient about extremely private matters, but woe betide you if you take a photograph of your grandchildren in your local swimming pool. We are used to hearing the phrase ‘data protection’ used as an excuse for anything from failing to inform parents about the suicide attempt of their 18-year-old to refusing to let you log on to a website after forgetting your password.

Social researchers have to jump through all sorts of hoops to collect data. You cannot interview somebody without their informed consent; you have to agree not to use the data for any other purpose than your stated intention; to anonymise it strictly; and to destroy it when the research is finished. Yet airlines can, without your consent, supply all your personal details to US Border security, right down to your meal preferences. And who knows who is listening in to your phone calls and reading your emails?

This sort of double standard is something it is easy to rant about, and many have done so, much more eloquently than I could. But that is not really what I want to write about here. What I am interested in is our ambivalence to the ever-multiplying supply of data which increasingly mediates the way we negotiate and understand our world – the ambivalence I discussed earlier in relation to being a social researcher, but of which that forms only a tiny part. Most of us have come, if not to love, at least to rely on the way in which an individual item can be identified as a unique configuration of standard ingredients. Although it may be tempered by some sentimental nostalgia for the way things used to be, on the whole we welcome not having to trudge up and down the high street, asking in every likely shop for the obscure thing that we want, but instead being able to google it.

When I was a child you could go into a department store and find the department that sold cardigans and ask if they had a black one, in your size, made of wool, with long sleeves and pockets, and an assistant would get them out and display them for you on a counter so you could decide which, if any, you wanted to try on. By the end of the 20th century, those same department stores, or at least the few that had survived, were organised as a series of concessions, by brand. So you would have to wander round, investigating, label by label, whether any such cardigans existed, getting more tired and frustrated at every step (and with assistants few and far between). How nice, then, to be able simply to enter the search terms and find the very thing, without even having to get up from your chair. And yes, it might be a bit annoying to be targeted with ads for black cardigans every time you log on for the next few weeks (despite your ad blocker), but you might think that’s actually a small price to pay, when the convenience is multiplied across the whole range of goods and services you purchase.

What is perhaps more worrying is that, in this digital society, you are not just a shopper. Indeed, you are increasingly somebody who is shopped for. On the job market you offer yourself, using a standard form, as an assemblage of increasingly standardised ingredients (your qualifications, the languages you speak, the software packages you know, the companies and clients you have worked for, what you have earned, what you have made, or published) struggling to find some way of expressing your uniqueness (often via one of those excruciating cover letters in which you describe your ‘bubbly personality’, ‘passion for [whatever the ad said]’, ‘willingness to go the extra mile’ and ‘strong communication skills’). These days this self-advertising is more and more likely to be on an online platform in which not only are your past assignments visible, but so too are the ratings you have been given by the clients you carried them out for. (This is something I am doing research on right now; I won’t bombard you with details here.)

The old occupational identities are increasingly fractured as we become, for the purposes of the labour market, bundles of interchangeable attributes, each of which has to be described in standardised terms and be measurable even if the combination is unique. As such, we are also increasingly comparable to each other (and therefore potentially substitutable for each other) in a market which (at least in terms of information) is geographically unbounded. If you, in Dhaka, have the same skills profile as me, in London, and if the work is digitisable, then what is going to determine which of us is given the job?  As on the job market so in other aspects of life. Just as we may become used to understanding our employable selves as bundles of standardised skills and competences, we may also start to classify our social selves in terms of standardised sets of tastes and consumer choices. Does this lead to the same sorts of social anxiety, I wonder?

If I, as, let us say, a translator from German into Mandarin with a specialist knowledge of polymer science and experience of preparing texts for academic journals, feel competitively threatened by somebody else with the same skill set, might I also, as a person wanting to be loved and appreciated, feel competitively threatened by the thought that, however much I want to stand out in the crowd, there are lots of other people out there who like the same kind of music as me, wear the same brand of clothing and like watching the same movies? If I can be pinpointed so easily by a marketer’s algorithm, wherein lies my uniqueness? I might try to reassure myself with the feedback of others. Which, in these digital times, is now very easily quantifiable. But what if others get more ‘likes’ than me on Instagram, or swipes on Grindr? What if their Facebook posts are shared, or their tweets retweeted, more than mine? What then is my value?

A world in which identities can be described as collections of attributes which can be broken down ever more precisely into separate facets feels to me like one in which personalities are turned inside out. This is particularly visible in the process of finding friends and lovers. In the past, getting to know somebody might have been modelled as a process of peeling off outer layers, like the skin of an onion, to get to some kind of hidden internal essence (or ‘soul’, even). You might scan a crowd of strangers, uniformly grey, to catch a sudden flash of eye contact that hinted at a possible connection which could then be explored tentatively. There were many false starts and a lot depended on chance. People whose lives were constrained socially might never meet a soulmate. But if you did it could take you completely by surprise (though the cliché of falling in love at first sight was probably vanishingly rare). One should not sentimentalise this, of course. Lots of matches were made by arrangement, built slowly into strong companionship from pretty loveless beginnings, or were never very happy at all. But I think it is fair to say that if and when one did ‘fall in love’, it was unpredictable, exciting and private.

Nowadays, significant numbers of people use online dating sites to find their partners. In the USA in 2018 the proportion of people who said that they had met their spouse or partner online was 12% among 18-30-year-olds, 13% among 30-44-year-olds and only fell below 10% among the over-65-year-olds. Nearly one internet user in five (19%) admitted to using dating websites or apps in 2017. And why not? In a world in which one shops online for everything else, it is completely logical to do so for sexual partners too, especially if you are too busy or temperamentally averse to seeking them out in noisy night-clubs, dubious bars or other physical venues. I know a number of happy couples who met each other that way. And yet, and yet… As the algorithms get more sophisticated I find myself recoiling ever more from the idea of being shopped for in this way. Instead of exploring people, one by one, from the outside in, what these sites do is abstract each separate facet and present them, searchably, for you to choose from, with as much as possible of what is internal carefully catalogued and displayed on the outside. You are asked to define what you want by skin colour, age, height, weight, income, politics, sexual preference, tastes in food and art and music, and innumerable other variables, often, these days, linked together by artificial intelligence to produce sophisticated psychological profiles which can be tested against yours for compatibility. Every aspect of your personality that can be captured is displayed publicly for inspection, like the tiny mirrors on a glitterball. A glitterball in which others might see their own characteristics diffracted and reflected. A glitterball among glitterballs. In a global mosaic of standardised attributes.

For me, this flies in the face of everything I want to believe in about human attraction and love and taste and the human ability to learn and change. The last thing I want is to be pinned down, for example, as somebody who likes a particular kind of music. One day I might love listening to Sarah Vaughan, another day I might be moved to tears by a Schubert sonata or stirred by the bell-like clarity of the opening notes of a Sam Cooke song. But more importantly I want to be open to be surprised by some other kind of music I haven’t even encountered yet, or learn to listen attentively to something I might have rejected out of hand in the past. And there is music in every category that bores or annoys me. In just the same way I have no idea who I might fall in love with. Whether it is a man or a woman, someone black, white, tall or short. Who knows? I might think I don’t like scientists and then find myself suddenly and inexplicably enchanted by one. What makes somebody attractive (or not) is profoundly mysterious. To label myself in advance as locked into a particular pattern of preference feels profoundly wrong. And it feels equally wrong to exclude others on the basis of some superficial (and possibly temporary and changeable) attribute.

I cannot think of anybody I really value in my life whom I would even have found had I selected them using conventional search terms. It is the accidents of synchronicity that, in my experience, lead to the best friendships as well as the greatest moments of creative inspiration. Am I a freak? Or do others feel the same? And should we be trying to find ways to put rounded and complete human beings back together in all their malleability and unpredictability and inconsistency and the essential unknowability that constitutes their deepest attraction? If so, how?

And more on the future of work

In the new spirit of reblogging things I have already blogged elsewhere, here is a piece that appeared today on the LSE blog (their headline, not mine).

Future of Work: taking the blinkers off to see new possibilities

Anybody relying for their information on the current headlines would find it hard to make sense of what is happening in the labour market. On the one hand, the news media are awash with apocalyptic forecasts, often backed up by studies from reputable organisations such as the US National Bureau of Economic Research, the Oxford Martin School or the Bruegel think tank, that robots, machine learning, drones, 3D printers, driverless cars and other applications of Artificial Intelligence are going to eliminate very large numbers of jobs, not just in manufacturing but also in service industries, ranging from low-skill tasks like picking and packing in warehouses and home delivery right up to high-skill professional tasks like legal research or stockbroking.

On the other hand, the employment rate in the UK is at an all-time high of 74.6 per cent, with the unemployment rate, which averaged over 7 per cent from 1971 to 2016, having fallen to just 4.7 per cent in January 2017.

So, are we facing mass unemployment or not? Here we are, nearly a decade after a major financial crisis that led to job losses, austerity and waves of corporate restructuring including bankruptcies, mergers and acquisitions, seeing the emergence of new winners, with new business models and the birth of new industries, with new technological applications playing a key role. If we take a broad historical view, this is actually quite a familiar story.

We could look, for example, at the development of new industries based on the spread of electrical power and mass entertainment after the 1929 crash, or of computerisation after the 1973 energy crisis, or the explosive growth of the Internet in the decade after the infamous 1987 Black Monday. Each of these technologies was also, of course, instrumental in displacing large numbers of jobs in older industries. And with each wave, livelihoods were irrevocably damaged, because the new jobs were not created in the same areas, or for the same people, as the old ones.

The elderly look on in amazement at the desirable new labour-saving appliances their grandchildren buy, remembering the back-breaking drudgery of the old methods. But for every gleaming new factory in one part of the world, there are piles of rusting machinery in others, along with devastated lives and communities. Such ‘creative destruction’, as Schumpeter called it, is, surely, part and parcel of capitalism as usual.

So why, in the second decade of the 21st century, are so many commentators, on so many different parts of the political spectrum, convinced that this time things will be different: that we are, in Paul Mason’s phrase, moving into a period that could be described as ‘postcapitalist’?

Part of the explanation might lie in the way that capitalism is often seen, especially by the young, as a single, monolithic system that embraces all aspects of life. Perhaps a more useful way of understanding it is as a somewhat messy assemblage of different capitalists competing with each other, scrabbling for market share, experimenting with new business models and often failing. In times of crisis, when many are going to the wall, technologies (including some that have been around for a while) may be seized on, not as part of an orchestrated general plan, but in much more piecemeal ways, by particular firms looking for means to restore profitability: to reduce labour costs, develop new products or services or enter new markets.

Obvious first targets for automation are processes where labour costs are high, usually because they require scarce skills or workers are well organised. So it is not surprising that skilled print workers were first in the firing line for digitisation, or auto factories for robots. The first companies to introduce innovations can make a killing – getting ahead of their competitors with a step change in increased productivity.

But such advantages do not last long. Once the technology is generally available, it is open to any competitor to buy it at the lowest market price and copy these production methods. A race to the bottom is started, which can only be sidestepped by firms that continue to innovate. It is fanciful to imagine that it would be possible to populate the world’s factories with 2017 state-of-the-art robots and then just leave them to get on with production. Leaving aside the question of how these robots are to be assembled and maintained, there is no conceivable business model that would make this profitable over any sustained period of time.

A much more likely scenario is that vast new industries will grow up to manufacture these new means of production which, like today’s laptops and mobile phones, will rapidly become obsolete and need replacing. These industries will also give birth to new service jobs, involved in their design, distribution, maintenance and in dealing with the unintended consequences of their widespread adoption (such as cyber-crime and new safety hazards).

Current technologies do not just create new kinds of jobs, they also change the way work is organised, managed and controlled. My research has shown that 2.5 per cent of workers already get more than half their income from online platforms. These new organisational models do not just change the way existing jobs are managed but also bring new areas of economic activity within the direct orbit of capitalism, for instance by drawing into the formal economy the kinds of cash-in-hand work done by window-cleaners, dog-walkers, baby-sitters or gardeners. They may not be jobs in the traditional sense, but they are work, with the potential to be organised differently in the future, that can form the basis of profitable new industries.

Another factor that blinkers thinking about the future of work is a failure to see beyond the boundaries of the existing industrial structure and imagine where other new industries will emerge from. Whether it’s the DNA of plants, the human needs for entertainment, sociality and health or outer space, the universe is full of new opportunities for commodification. The question is, can the planet sustain them?

Intellectual jamming

When the news of B.B. King’s death reached me earlier this summer, I turned, as I’m sure many others did too, to Google and YouTube to find recorded performances to remind me of the greatness of this inspirational blues guitarist. I had known that he was extraordinarily prolific and catholic in the company he kept but it was still astonishing, in this overview, to see the range of people he performed with over his long and hard-working career: singers ranging from Etta James, Aretha Franklin, Van Morrison and Bonnie Raitt to Tracy Chapman, Susan Tedeschi and Chaka Khan, not to mention other guitarists influenced by him, like Jimi Hendrix, Eric Clapton, Ronnie Wood (and even the likes of U2 and Mick Hucknall).

What shines through from many of these performances, as well of course as King’s talent, is an extraordinary generosity of spirit that is always open to new dialogue: that attentive, respectful listening and voicing, breathing in and breathing out, call and response, giving the other people the right amount of time to express themselves before answering, that musicians call jamming and that characterises not just human communication at its best but also how other forms of art are made co-operatively.

I suppose this represents some sort of ideal of collaborative creative labour, exhibiting how new wholes can be made that are so much greater than the sum of their parts. For it to work, each participant has to have skills that are recognised and admired by the others, but such interdependence, especially when it involves taking unrehearsed risks in public, also entails making oneself very vulnerable and has to be underpinned by strong mutual trust. It set me thinking about how rare this kind of jamming is in intellectual work these days. Rare but not non-existent.

I remember it very clearly in my teens and early twenties, in long profound conversations that went on till dawn about the meaning of life in which each insight from one person seemed to spark an even brighter response from the other. The morning afterwards, of course, many of these insights were forgotten, or understood to be clichés, or at least less original than one had supposed, but they nevertheless left residues that led to further thought, or reading, or even works of art. But it was not just private conversations that had this quality. In my first ‘proper’ job, at Penguin Education, which I joined in 1970, I was lucky enough to work with a team of people who collaborated in a way that was more characteristic of a film crew than many publishing projects. Here, series editors, commissioning editors, copy editors, authors, picture researchers, graphic designers, typographers, photographers and illustrators, each with a clearly defined role but also willing to learn from each other, collaborated on several series of illustrated books and audio-visual materials for schools, several of which were groundbreaking at the time.

Among the most famous were the Voices (and its supplement for primary schools, Junior Voices) anthologies of poetry, prose and pictures (here is a link to a recorded version of some of them). Another was Connexions, published, under the editorship of Richard Mabey, when the leaving age for secondary school pupils was raised from 15 to 16 in 1972, to introduce these final year students to contemporary discussions in a groovy way. Celebrated here, it was probably the first time the technical potential of offset litho printing was used (by designer Arthur Lockwood) to bring the ‘feel’ of a magazine to what was still in theory a school text book. The first time a kid was spotted reading one on a bus, as I recall, a bottle of champagne was cracked open. Whilst there were of course hierarchies in the organisation of work what I remember most clearly is a strong sense of joint endeavour and shared satisfaction.

It is a model that does not guarantee success. There are always risks: of creative disagreements; incompatible personalities; competitiveness overpowering collaboration; sharp-elbowed scrambling for recognition; the usual tensions between democracy and efficiency; all combined with the pressures of time and budget. Some of these have been addressed in the film industry by strict mechanisms of attribution (though invisible power battles underpin even those endlessly rolling credits). But it is clear that, despite these many difficulties, our culture would be very much poorer (if, indeed, it could be said to exist at all) if people were not prepared to open up their imaginations to each other in this free and generous way in the faith that, by doing so, they will create something that no individual could accomplish alone. Each has taken the personal risk that the gesture might be seen as clumsy, the solo might dissolve into incoherence, the joke be unfunny, the sentiment mawkish, or the whole thing met with blank incomprehension; all this has been braved in the hope that if it all works, something glorious will emerge.

I could write at length about the complicated relationship between being alone, and being with others, reflecting and expressing, that is entailed in so many creative processes, but that is not what I want to do today. Nor do I want to get too deep into a discussion of the ways in which ‘project-based working’, whilst drawing on many of the traditions of how teams work together in creative industries, is also used as an instrument of casualisation, keeping workers in a state of perpetual insecurity, with a constant need both to beg and to brag (I have a chapter on ‘begging and bragging’ in this book). No, what prompted me to write this post was the simple regret that this collaborative spirit is so singularly lacking in academic life, despite the rhetoric of collegiality that still haunts university campuses.

Far from being places where colleagues freely share ideas and inspire each other to generate new collective understandings, many universities now feel more like prisons for ideas, corralled into separate schools and disciplines – places where non-competitive behaviours and disrespect for hierarchies and boundaries may actively be punished. The unsuspecting new entrant may arrive with a starry-eyed vision of common rooms and high tables where ideas are aired for general appreciation, to be met with wit, informed debate, recognition and a sense of having contributed to the development of a larger body of knowledge. But, like a cow discovering the limits of a field through a series of shocking encounters with electric fences, you will soon learn the reality. Send an article unsolicited to a senior colleague for an opinion? FSSSTTTT-KKK*! You didn’t really expect them to have time to read it, did you? Co-author an article with a student for publication in a non-ranked journal? FSSSTTTT-KKK! What’s that going to do for your department’s ‘excellence’ score? You do realise you have performance targets to meet, don’t you? Talk about some ideas at a conference that you haven’t yet published in an article? FSSSTTTT-KKK! You have given valuable intellectual property away to your department’s rivals, what were you thinking of? Put your deepest thoughts into a research report that is a ‘deliverable’ for a collaborative project? FSSSTTTT-KKK! You just gave that well-known professor from a Russell Group university the material for his next article! Do you seriously think you’ll be properly acknowledged? Discover that there is someone in a different department of your university whose ideas really chime with yours and suggest a joint project? FSSSTTTT-KKK! You have started a major row between warring deans about who will own the outcome. How COULD you? Explain what you mean in really, really simple language? FSSSTTTT-KKK! Oh, come on. Be serious!

People being the curious, creative, idealistic beings that they are, there is clearly now a continuing hankering for alternative spaces in which intellectual jamming can take place. It is evident in the profusion of blogs and postings on mailing lists by young scholars, in the setting up of new networks and attempts to find ways of organising conference sessions that go beyond the sequential delivery of over-rehearsed pre-prepared texts. Not least, I see it in the enthusiastic participation of large numbers of, mainly young, researchers in the events organised by the Dynamics of Virtual Work network I am currently leading.

But these opportunities for dialogue increasingly feel like small gaps in the electric fences through which hands can be grasped occasionally and a few ideas at a time can be smuggled. Where is the wide open landscape, the public realm in which an independent intelligentsia can converse openly? We are all, of course, free, within certain circumscribed limits, to make use of the means put at our disposal by global corporations to express ourselves, but, with no independent source of livelihood, this is increasingly looking like Anatole France’s famous freedom to sleep under bridges and beg in the streets. Apart from a lucky few, those inside the academy have no time, and those outside it no money, to create opportunities for unhurried, focussed collaboration. The intellectual common, such as it is, is now a minefield of contradictions. On the one hand it provides the main means for expression and collaboration for an exponentially growing proportion of the world’s citizens, but on the other it is also increasingly a site for the accumulation of new capital. We navigate it at our peril.

* Several people have asked me what this strange acronym represents. It is actually my – clearly rather feeble – attempt to evoke the sound that is made when living flesh comes into contact with an electric fence. Here is a recording of what it actually sounds like.


I finished another research proposal yesterday. Clicking the ‘submit’ button induces the same sense of irrevocability that used to accompany the posting of a bulky manila envelope through the maw of a post box: anxiety that it might contain some terrible mistake combining acutely with an overwhelming urgency to get it over with. I can remember times in the past when the thought of leaving it till the next morning was so unbearable that I would go out in a raincoat in the grey pre-dawn just to be shot of that bundled offering, that peculiar combination of boasting and supplication that a proposal embodies, which I have written about elsewhere as ‘begging and bragging’. This time it was a more collegiate effort than it often is and someone else performed the fateful deed, so the moment of release was a little modified – a somewhat anticlimactic transmission of an email with a pdf attachment, though the moment of hitting ‘send’ still had its poignancy.

Nevertheless, it is an event to be marked and an excuse to give myself some time away from all the other pressing oughts.

And also a moment to reflect on this writing of proposals which has consumed so much of my time over the last four decades. And with this, to reflect on some of the contradictions in intellectual life which the writing of proposals brings to the surface, contradictions that seem to become much more acute with each passing year.

First there is the matter of the sheer volume of writing required. This latest effort was about 45,000 words, shorter than some. In this case not all written by me but nevertheless taking up time – to chase up the authors, edit etc. Multiply that by – say – twice a year (I’ve sometimes done more – my record was 11 in one year – but I’m trying to be conservative here) and multiply that by thirty-five (allowing for a few years when I was less productive, or the proposals were smaller) and that’s getting on for 3 million words. I’d rather not think how many books that could be! I suppose about a quarter of them turned into funded projects which provided me – rather unevenly – with a living all those years. I could only have written (or, to be fair to my collaborators, co-written) all those books if I had a private income of some sort (and, of course, strong motivation, which might not have been there if I’d been living the life of Riley at someone else’s expense). So it can’t really all be regarded as wasted effort. But it does sometimes feel like it.

A curious feature of research proposals is that they have no public visibility as anyone’s intellectual property. If the research is commissioned, they ‘belong’ to the client. If, like so many European proposals, they are put together by a team of collaborators from different countries, then they become the collective property of the team, regardless of how much, or little, effort any given member has put into the writing of them. If the proposal fails, large chunks of it are liable to be cut and pasted without acknowledgement into other proposals which may, or may not, involve the original authors of those pieces of text. If the proposal succeeds, then all members feel free to use it as they like. This should not matter in principle but can be annoying in practice. In one large European project I worked on a few years ago, we developed some rules designed to help junior researchers gain some recognition for their work. The project involved carrying out a large number of case studies. According to these rules, anyone wanting to draw conclusions from the case studies was obliged, when referring to one, to cite the original case study report written by the researcher who had done the interviews, rather than any synthesised analysis on which their name might not appear. Fair enough, you might think. But in practice this failed to take account of the genealogy of the interpretative text. I had written quite a lot of the original research proposal, under some time pressure, and, when doing so, had lifted some text from other work of mine in progress (some of which formed hypotheses that were tested in the fieldwork).
Parts of this text then got reused word for word (with no acknowledgement) by the case study authors and reappeared several times in various reports analysing the fieldwork results, so when I finally got round to publishing something based on the text I had drawn on for writing the proposal in the first place, I was condemned for having ripped off the work of these junior researchers and, apologetic and bad at standing my ground as I am, ended up littering my ‘new’ text with references to the work of others which had used my own earlier unacknowledged language, in a sort of double expropriation.

A proposal is a sharp reminder that, as intellectual property lawyers constantly remind us, you cannot legally own an idea. A friend once told me an anecdote about a now-retired BBC producer who used to bring to brainstorming meetings a postbag containing letters that had come in from viewers with ideas for programmes. These would be emptied out on the table for the assembled professionals to pick through for inspiration with, of course, no reference to the innocent viewers who had submitted them. Things are not so different in the world of research evaluation. When doing evaluations for the European Commission I was once – disturbingly – advised ‘Never put in a proposal in the first call for a new Framework Programme. Just look out for good ideas you can use in the second call’. And I have certainly had the experience on more than one occasion of witnessing good proposals rejected only to see remarkably similar ones succeed a year or so down the line.

But this raises much more general moral questions of how ideas should be attributed. We are all, of course, immersed in other people’s ideas from childhood. It never occurs to anyone to acknowledge the understandings of the world derived from the explanations given by parents or teachers in answer to those early questions ‘how?’ or ‘why?’. This carries over into adulthood. The penny-dropping moment that occurs when a student suddenly ‘gets’ an idea when it is expounded by a gifted lecturer is experienced as part of his or her education, something to be absorbed from the surrounding culture as easily and naturally as a tune played on the radio or a joke heard in the pub or even a sermon. Few if any would dream of ‘citing’ it. The conscientious lecturer’s role, duty even, is to pass on understanding in such a way that the student internalises it and makes it his, or her, own. But this responsibility runs into headlong collision with the increasingly powerful imperative also placed on that lecturer, to publish and be cited, which implies becoming the visible public owner of a set of ideas that are privately held and deserving of attribution. These ideas have to be hoarded, as private intellectual property, until the moment of publication, for fear that they will be stolen and published under someone else’s title. Academics must therefore both share, and not share. They must also both collaborate and compete. And they must aim both for ‘excellence’ and ‘impact’. Of such contradictions are nervous breakdowns made (see this Guardian article for some scary evidence on mental illness in academia).

The process of assembling a research proposal embodies many of these contradictions, albeit often in ways that are unspoken. Take the matter of competition. A European proposal represents a collaboration between scholars in different European countries. Indeed the rationale for funding research rests in no small part on the principle that knowledge and experience will be transferred from one partner to another through the process of collaboration. So far so good, you might think. But no one country should dominate, so in practice you should not have more than one partner from the same country in the same proposal without a very strong rationale. So this puts people from the same country into direct competition with each other. And the more expertise on any given topic there is in any given country – the larger that country’s academic community – and the more pressure there is to secure external research funding in that country, the more intense that competition is. And there are many who could bear testimony to the internecine environment in some disciplines in, for instance, the UK and Germany, resulting from this. But of course there are also strong pressures to collaborate nationally, for instrumental if no other reasons. Careers depend on peer review, on favourable evaluations from national funding sources, on friendly people to act as external examiners and sit on appointments committees. Who knows when you might need an ally? This academic terrain is a minefield; it would take a Stendhal or a James Clavell to do justice to the intricacy of negotiating it safely.

The citation becomes a sort of currency in this game. Although the algorithms for assessing citations are becoming ever more sophisticated, this is still primarily a quantitative matter. The more citations you have, the greater your standing. So in deciding to cite someone you are not just positioning yourself as someone who respects (or disagrees with) that person, you are also adding to their pile of points. Consciously, or perhaps not, academics form themselves into little gangs (often grouped round particular journals or conferences) within which there is a tacit agreement to cite each other’s work, but ignore that of others. Unsurprisingly this has a strongly gendered character, as Daniel Maliniak, Ryan M. Powers and Barbara F. Walter found in their study of the gender citation gap, with women much less likely to be cited than men. I have not studied this systematically but anecdotes suggest that it is evident even in the field of gender studies. Discussions I have had with women who know the field better than I do suggest that when ‘men’s studies’ first emerged as a distinctive field in the 1980s the first writers referred back to feminist authors of the 1970s but as soon as there was a second generation of publications, the authors chose only to cite the men from the first generation (the fathers, so to speak, rather than the grandmothers).

At its nastiest, selective citation can be a way of covering up plagiarism. This trick involves author A reading author B and citing all the people author B cites but not author B’s own work (except perhaps some trivial aspect of it which is rubbished). Author A can then claim ownership of all author B’s ideas without ever acknowledging them. And yes this does seem to be something that happens much more to women than to men. But generally speaking, I think it is done not from malice but in ignorance or from an unconfident need to gain approval by copying the people seen as successful. Up against a deadline, with huge pressure to publish on top of a heavy workload of teaching and marking and administration, the harassed academic skims through the literature that other people have already cited, taking this to be the ‘state of the art’. The article being written will, of course, have to go through anonymous peer review so uppermost in the author’s mind may be an anxiety that the reviewer – or members of that reviewer’s gang – may actually appear in this bibliography, or expect to do so, so nothing must be left out. You mustn’t, after all, appear ignorant of anything already cited in the field. Often there isn’t even time to read the articles in question and the citation is made on the basis of an abstract – but you omit it at your peril. The end result is clear. Each time a work is cited, its stature as an important text in the field is enhanced. Thus are some reputations built. But in the same process others are left invisible. Is this, maybe, another example of the way the gender division of labour manifests itself? Are the parts of being an academic that involve teaching and administration and proposal-writing – the intellectual equivalents of childcare and housework – regarded as less entitled to reward or recognition than those that are formally theorised and published in academic journals?

Twenty years ago, this citation-seeking culture, a culture in which intellectual activity is increasingly commodified, seemed peculiar to, or at least much stronger in, the English-speaking countries. It is now much more broadly pervasive, perhaps because the global academic world is expected to be English speaking; the values have been smuggled along with the language. So there is now a second question hovering behind every invitation to participate in a new research proposal. In addition to ‘how much money will we get out of it?’ is ‘how many articles will I get out of it?’. Sadly, in addition to increasing the tension between the ‘we’ and the ‘I’ – this pushes ever further into the background that old simple motivation for doing research: to find stuff out.

And this raises yet another tension: between the empirical and the theoretical, with ‘impact’ generally measured by the results of the former, and ‘excellence’ by the latter. But perhaps that should be the subject for another blog.

In the meanwhile, I should end by saying that there is a silver lining in all this. When you find yourself working with people you can trust, who share their knowledge freely, are serious about carrying out new and original research and care about what use is made of the results, then this is something to be treasured and celebrated. As I do today. Thanks, colleagues!


Size Queens, consumption work and the unpredictable paths that ideas travel

Last week I received an email saying,

‘My band, The Size Queens, are about to release our 5th album, in part due to your work. The cybertariat was the inspiration for it — though we’ve been progressively moving in this direction, to try and understand why the economic promise of weightlessness seems heavier than before. Our entire project, to be released on Election Day in the States, is a song cycle and accompanying video …  no guarantee you’ll like the project at all. But we like you. .. The new record [is] “Consumption Work: Tammy, Cybertariat, At The Aral Sea.” I hope you’ll find it edifying to see that your work in economics has inspired those of us working in music.’

I’m not sure that ‘edifying’ is quite the word, but I was certainly very pleased and flattered. And this reminded me that the concept of ‘consumption work’ that played such an important part in my thinking in the late 1970s first came to me as a result of listening to music. So the idea could really be said to have come full circle.

The original inspiration, so far as I remember it, was Lord Buckley’s Supermarket. His work seems little known now, but, although he died in 1960, so I never had a chance to hear him live, for me, and for the group of friends (I have now forgotten which) in whose company I first heard his records, he was an important figure, not least because he first introduced us to a kind of Black American hipster slang we had not come across before, although much of it later entered the hippy mainstream. I think he was the first person I heard referring to the police as ‘the fuzz’ and mentioning in public, in various lightly-coded ways, the smoking of marijuana. There was something irresistibly cool – intelligent and funny in equal parts – about his semi-improvised verbal riffs, performed against a jazz background. I suppose nowadays he would be thought of as a performance artist, or even a proto-rapper. In his eloquent monologues, Jesus was resurrected as ‘the Nazz’, Shakespeare as ‘Willy the Shake’ and Gandhi as ‘the Hip Ghan’. With a typical touch of genius, in Supermarket the store owners are referred to as ‘Greed heads’.

The observation that stayed in my head described the experience of self-service in a supermarket, a phenomenon that must have been pretty recent in the 1950s when he performed it.

‘Remember the first supermarkets?’ he asks. And, after describing the process of getting a cart he says,

‘And there you are pushin’ in the supermarket with the cart.
You grab the cart and you go strolling up and down the aisles
and you load up all your jazz
and you’re working for them, see?’

At first, he explains,

‘It’s alright, because you’re getting –
this is the beginning –
very, very, low, low, low, low prices.
Saving, you see.
So you don’t mind, you know, pushing a little bit.’

But then, after the prices have risen (or, in his immortal words, ‘Prices – whhhhooooo!’)

‘you’re still pushin’ the mother cart.’

This idea that employers save money by getting consumers to carry out, without payment, the work that was previously done by paid workers lodged somewhere deep in my brain. Nearly fifty years later, I can still summon up the exact intonation, rhythm and self-parodying sexiness of tone of his ‘You’re working for them’.

The phrase ‘consumption work’ came from a 1976 article in Monthly Review by Amy Bridges and Batya Weinbaum, called ‘The other side of the paycheck: monopoly capital and the structure of consumption’, a socialist feminist analysis of the relationship of housework to capitalism.

These two insights came together for me when, in 1978, as a member of a study group on new technologies, organised by the journal Capital and Class, I was trying to solve two intellectual puzzles. The first of these was how it is that the amount of time people spend doing housework carries on going up despite the ever-increasing number of ‘labour-saving’ products they buy. The second was how it is that prophecies that automation will lead to permanent mass unemployment have never been fulfilled.

The resulting article (reprinted 25 years later in my 2003 book, The Making of a Cybertariat) made singularly little impression at the time (it was not included in the book the group produced). In fact I had more or less given up hope that anybody would take the idea and run with it*. Though, of course, it remains an active part of my thinking and I have developed the idea further over the years. If ever I find time to write it there will be a book….

So it is a really wonderful surprise to discover that the idea has spread so far, and helped inspire such creativity. And these guys make good music too. The video, which can be found here, is a knockout! Their main site is here.

*The term was taken up by one academic who did not acknowledge my work at all, although I had given her quite a bit of my material. (I should have been suspicious when she asked me ‘who knows your work?’. But I was feeling very intellectually lonely at the time and anxious to discuss the concept and its implications with someone at last, and I misread the clues and thought, in my naiveté, that perhaps she was asking this because she wanted to help promote my ideas. I was more or less unemployed at the time and she had a senior academic post and it would certainly have helped my career). I was in two minds about stating this here. It does sound a bit bitter and paranoid. But I discussed it with a friend this morning who thought I should put it on the record, so here it is. Thinking about it again now I realise that I am myself partly to blame: for acting like a kid in a playground holding out my toy saying ‘please play with me’ to the other kids and then being hurt when they grab it and make off with it; for not taking sufficiently into account the incredible damage done to any idea of sisterhood or collaborative working by three decades of attempts by neoliberal governments to destroy the radical tradition of British social science and discipline its practitioners into habits of competition and commodification and marketisation of intellectual property; and finally for neglecting to play the game of self-promotional publishing in A-list academic journals.

Found Art (or the delights of negative entropy)


Cream on cream. Multiple layers of slightly mismatched paint covering graffiti on this wall produce an effect that reminds me of the visual experimentation of early 20th century artists like Kazimir Malevich or Ben Nicholson*.

A sophisticated awareness of graffiti is now part of the essential intellectual armoury of any East London resident or visitor with pretensions to hipness or gentility. Tourists take guided tours of the street art of Shoreditch, Islington home-owners trying to sell their £1 million houses proudly point out the Banksy at the end of the road to their potential buyers and no art bookshop is complete without a table of expensive glossy books on street art (some, rubbing in the irony, with names like ‘The Art of Rebellion’). There is even an iPhone app called ‘Street Art London’, celebrating the work of the likes of (pseudonymous yet would-be famous) Phlegm and ROA.


A found ‘Frank Stella’*

Conferring this formal status as ‘art’ onto something that used to be regarded (by society at large) as a nuisance and vandalism and (by a minority of intellectuals) as a transgressive form of Art Brut creates troubling contradictions both at the aesthetic and the social level.

At the aesthetic level, the self-conscious artistry of a Banksy or Phlegm deprives us as viewers of our ability, in the tradition of Marcel Duchamp or Jean Dubuffet or  Kurt Schwitters, to become an artist ourselves by being the person who ‘sees’ the art in our environment. Their graffiti are defiantly presented to us as ‘already’ art and we are put immediately onto the back foot, with the choice of passively enjoying it, thus constituting ourselves as fans of the artist, thereby conferring status on him or her, or of being cast as killjoy philistines. Either way we lose that intellectual superiority that comes from asserting the dominance of our own vision that goes along with the identity of artist.


A found ‘Mark Rothko’*

At the social level, the public authorities or property owners who are keepers of the urban landscape are faced with the practical dilemma of defining what is, or is not, public art in their daily decisions about what to leave untouched and what to paint over. This gives them an unacknowledged role as arbiters of taste.

Another found ‘Rothko’*

The results of their painting-over have become a great source of visual pleasure to me. There is a cream-painted wall opposite my house that is frequently graffitied on and, just as frequently, painted over by Council workmen. Each time they do so, another rectangle of slightly differently coloured cream paint is layered over what already exists, creating a subtle patchwork of cream-on-cream that I love. It reminds me of early 20th century experiments in abstraction.

Also near my house is a derelict pub one of whose walls is overlaid with similarly overlapping layers of shades of red and pink. I think of it as a wall of Rothko paintings, though the other side of the building is more reminiscent of Jasper Johns or Frank Stella.


A found ‘Jasper Johns’

I am of course seeing them through modernist-trained 20th century eyes. But no eye is innocent. Sometimes I wonder about the vision of those Council workmen whose job it is to go round implementing the zero-tolerance-of-antisocial-behaviour-including-graffiti policy. For all I know, some of them could themselves have spent teenage evenings with a spray can leaving their personal mark on the drab neighbourhoods they grew up in. Or some might be spare-time artists in a more socially recognised sense. Or might some even be doing the work as Community Service, enforced punishment for past crimes, perhaps even seeing what they have to do as a brutal desecration of forms of cultural expression that they identify with and cherish? Or could this repeated repainting be work done, not under the Council’s jurisdiction at all but by some artist-squatter?


Another detail from the ‘Rothko wall’*


Most jobs, of course, involve some sort of pride in the craft being exercised (I have written about this here) and I am sure that any conscientious worker with a paint roller in hand must be exercising some sort of judgement about how the paint is applied, perhaps even with some sense of leaving an individual stamp on the finished work. I wish I knew more about the labour process of these workers. Is the defiant patch of grey on my local Rothko wall (pictured here) the result of a conscious aesthetic decision, perhaps? Or had they simply run out of reddish paint that day and abandoned the attempt at a colour match?


An artefact created in a complex interaction between weather, plant life, neglect and human intervention both sanctioned and unsanctioned*

Whatever the intention, my pleasure as a viewer is tempered with a certain unease. Haunting each such wall is its complex history: the pristine wall, the graffitied wall, and the overpainted wall, with perhaps many intervening layers of deterioration, repair, alteration and restoration. Each of these might provoke a different aesthetic response: admiration, regret, celebration, aversion. In taking a picture of such a wall, am I responding to it as intended art (like a tourist taking a photo of the Taj Mahal) or as unintended art (like someone ‘finding’ unexpected beauty in nature)? Or might I be ‘discovering’ some form of ‘naive art’, like a Cubist coming across an African mask, or a feminist historian a patchwork quilt? And if so, am I perhaps patronising the people who made it, imputing to them an ignorance of their own creativity or even appropriating and commodifying the results of their aesthetic labour? Am I entitled to see my representation of it as an original artistic work?


View from train window, mid 1990s*

I first became aware of the beauty of the overpainting of graffiti when I took this picture from a train window in Brighton, some time in the 1990s. When I took it, I was most conscious of the pattern of black verticals and horizontals against the different reds and oranges on the station platform. It was only when I looked at it afterwards that I realised that the lovely subtle colour patterns, whose irregularities had puzzled me, must be the result of such overpainting. Despite the streaks (caused by the hairy plate on an old scanner on which a cat used to sit) I still like it as an image, especially the serendipitous way it has captioned itself with the word ‘private’.

There is something both moving and optimistic in this continuous human effort of renovating and remaking our urban landscapes. It gives us a visual representation of the dialectical relationship between originality and inherited aesthetic values, between individual transgression and collective social control, between the private and the public and between the past and the present. Unfortunately, the conditions that sustain this delicate dynamic balance are now under threat. It could easily be lost: if public spending cuts continue; if the anger of unemployed youth spills out of control; if more of our common public space is privatised and placed behind locked gates; or if ‘development’ is allowed to bulldoze our communities. Cherish it while you can.

another rothko

Another luscious ‘Rothko’ to end on*

Postscript: All this was triggered by the fact that I was burgled last week and my handbag stolen, leaving me temporarily not only without any formal means of identification or of conducting any financial transactions but also without my bus pass. As a result, I was obliged to walk along a route I usually travel by bus, giving me a chance to take photographs of the ever-evolving ‘Rothko wall’ I so often enjoy through its windows.

* click on these images to see them in greater detail.

The cannibalisation of the NHS continues

Poster displayed in NHS clinic by private audiology firm advertising the wider range of products it can supply to private patients (single NHS product on the left; six private alternatives on the right)

Every encounter with the health service seems to bring on the one hand an example of the dedication and sympathy of individual health workers and on the other yet another instance of the way in which the NHS is being cannibalised by private companies.

Last week brought another example, perhaps trivial but nevertheless indicative. I had to pick up a new hearing aid from the private company with whom the part of the NHS to which my GP belongs has taken out a contract to supply audiology services.

My first appointment with these people was, I have to say, arranged with commendable speed – presumably to make sure I got onto their books and was not treated by some remnant of the publicly provided service.

My hearing test took place in a trailer parked in the car park of a GP practice not far from where I live and was conducted with great efficiency. My second appointment, to be fitted with the hearing aid, was rather less efficient. The wait was longer and I was the first ever patient of the guy who did the fitting, who had a lot of trouble with the software and kept explaining that he had not been trained on this model before popping out to consult a more experienced colleague. The only one they had in stock was the wrong colour but they said this could be changed at a later date.

That later date, six months later, is now. Since the original fitting the service has gone downhill further. The only way one can arrange to get new batteries or book another appointment is via a call centre staffed by people with Scots accents (in Glasgow?) who seem to find any kind of communication an unwelcome chore. For three days the whole system was down and you couldn’t make any appointments at all. When I finally got through to someone, the conversation went like this:

Call centre operator: ‘I’m afraid we don’t have any appointments between now and July 6th at your local centre’

me: ‘What about after July 6th?’

CCO: ‘The system only goes up to then’

me: ‘Can someone call me back when the later dates are available?’

CCO: ‘We don’t offer that service’.

me: ‘Can you offer me an appointment anywhere else?’

CCO: ‘We can only offer you one in Haringey’

me: ‘Isn’t there anywhere nearer?’

CCO: ‘That’s the only one near you’.

me: ‘Actually it isn’t near me at all. Can you tell me where else I might be seen?’

CCO (getting very grumpy) starts to read out a list of centres in London, one of which is in a part of Islington near the Hackney border, very easy to get to.

me: ‘That’s MUCH nearer to where I live than Haringey. Why did you tell me that was the only one near me? Don’t you have a map you can check these things on?’

CCO: ‘We don’t offer that service’.

Anyway, I got my appointment and went to pick up my new hearing aid, from the same company, this time from an office based inside a GP practice. There I found that, not content with creaming off routine NHS work for their own profit, they had actually put up a sign in the waiting area (shown above in a blurred photo taken on my iPhone) advertising the superior range of aids they can offer to private patients. So they even get a chance to advertise freely to a captive target audience at the taxpayer’s expense!

Here’s my article from the current Socialist Register about the commodification of public services

And a great piece by Stuart Weir comparing privatisation with the enclosure movement:

And a recent article by Colin Leys, the tireless commentator who has exposed so much about what is rotten about NHS privatisation, about another very nasty aspect of the interpenetration of the public and the private in contemporary Britain:

The gender agenda

The new issue of the journal is at last published. The ninth in the series, it is the first to focus explicitly on gender, although of course many previous issues have included articles that address it.

Volume 6 no 1 of Work Organisation, Labour and Globalisation

Gender and the global division of labour

In writing the introduction (which can be downloaded here), I found myself revisiting questions I used to think about – and discuss in women’s groups – forty years ago and this churned up an unexpectedly powerful mix of emotions. I think it is time to re-examine my relationship with socialist feminism.

The wind has certainly changed recently. About three or four years ago it became clear that Marxism was becoming intellectually respectable again (rather in the same way that modernist architecture is back in vogue), and that post-modernism had finally become passé. This is surely a cause for rejoicing among those of us who mistrusted its relativism and saw it as an excuse for political fence-sitting among a cowardly generation of academics fearful of losing their jobs – or at least their research funding – under neoliberal regimes.

It’s now OK again to use the word ‘capitalism’ and, in some circles, even ‘the labour theory of value’ or ‘class consciousness’. Conferences with names like ‘Historical Materialism’ are full to bursting with competitive young academics. The sea of grey hair at public meetings of the left is now speckled with other colours. Videos of David Harvey lectures go viral on student Facebook pages. Capital Reading Groups are being set up. Ten years ago, this would have seemed impossible. For someone of my generation, emerging from three decades of feeling not understood, jeered at, patronised as quaintly old-fashioned or shunned as dangerous intellectual company to keep, what’s not to like?

Well, to judge by my subjective reactions, quite a lot, actually.

Although I am feeling vindicated in much of the work I have done over the years and getting more recognition for it than would have seemed possible at the turn of the millennium, I have found myself over the last three or four years firing off more angry emails to Marxists than ever before in my life. Indeed, I suspect I am acquiring a reputation for irrational and paranoid irascibility that goes way beyond the mild tetchiness that is generally tolerated in someone of my age. And these emails overwhelmingly relate to issues of gender. So what is making me so cross?

To be honest, this is something I am exploring as I write, so my reasoning may not be perfectly structured. However, I hope it will not just come across as a rant. I would really like to have been able to discuss it first with some of the women with whom I had such intense discussions about these things in the 1970s, but, alas, some are no longer with us, some have moved to other continents, some have changed their lifestyles and politics in such a way as to put them beyond easy reach of such discussions, some are too burdened to spare the time for such things and others I have simply lost touch with. So here come some first observations, in the raw – and in no particular order.

I am sure that many men I know will read this and feel baffled, hurt or misunderstood. I’m sorry about this. I don’t mean to belittle your efforts and am truly grateful for the support and recognition that some of you have given me, and other feminists, over the years. But these things do need to be said.


One very irritating feature of the new Marxism (which was also present, with a bit more excuse, in the older versions) is the conviction among its masculine adherents that they have the theoretical overview. Their particular version of Marxist theory explains the whole universe and its workings, and all that remains to be done is to dot the ‘i’s and cross the ‘t’s and argue about how exactly it should be applied to current circumstances. The ‘woman question’, as it was traditionally known, comes very much into the ‘i’-dotting category and forms a minor sub-branch of the overall theory. The idea that Marx and Engels might have left some questions unanswered, and some contradictions unresolved, seems unthinkable to them (even though it is obvious that Marx himself thought there was a great deal more to be done). Reading Marx through a feminist lens actually makes it quite easy to identify some of these unanswered questions, but addressing them seriously implies a need to rethink the orthodox ‘overview’. This they cannot imagine. So they are deaf to the argument that perhaps it is feminist political economy that has the overview (looking as it does at both production and reproduction, and at both men and women) and that it is the narrow study of male activity that constitutes the sub-specialism.

One of the concepts that becomes problematic viewed in this way is that of ‘necessary labour time’. Concepts such as commodification and the reserve army of labour also need rethinking. (I have made a small start on some of these questions in my introduction to ‘The Reproduction of Difference’, which can be read online here.)

Like other women of my generation, I have found in the past that work that I have done in these areas has either been ignored, or has been appropriated without acknowledgement (sometimes the first guy to ‘get’ my argument will cite my work, then those who come after will cite only his) or has been consigned to the dusty box labelled ‘the woman question’ with its broader implications unattended to.


Another extraordinarily irritating characteristic of many of the new generation of Marxists is their assumption that feminism has already been done. These guys (some of whom must have been brought up by feminist mothers) believe that they are sensitive to differences of gender, race, sexuality and disability – much more so, indeed, than the general run of men, whom they may take to task for infelicitous use of ‘inappropriate’ language. They usually react with pained incomprehension or denial if accused of insensitivity on any of these fronts. When that doesn’t work, knee-jerk defences tend to kick in: if they think they are accused of sexism they will indignantly refer to the importance of race (and vice versa), in the dominant group’s intuitive instinct for maintaining its power by strategies of divide and rule. Sometimes they even give the impression that only a white man can really enforce social justice, because he is in the enviable position of being able to exercise impartiality in any cat-fights that may break out between rival ‘minorities’. They point to their equality committees and gender studies departments as evidence that all these concerns are being properly taken care of, in their rightful places. Their conviction that they have nothing new to learn is unshakeable.


Whether in academic departments or policy development circles, it is generally assumed that the big new issues can be identified in gender-neutral ways. Whether the topic is colonialism or modernism or epistemology or structural adjustment policies, it is the job of masculine Great Minds to map out the terrain, and the job of feminists to follow behind, writing articles or setting up courses in which the Big Abstract Concept is preceded by the words ‘gender and’ or ‘women and’. Thirty-five years ago, there was some logic to the frenetic intellectual activity which subjected all the ‘isms’ to feminist critique. It was done in the hope that this would be a one-off task and that these critiques would be taken on board in what is now known as the ‘mainstream’. It is, however, abundantly clear by now that all this achieved was to create new subsidiary fields of ‘gender studies’ whose existence, whilst it did provide a reasonably protected home for some important women thinkers, let the male scholars off the hook, absolving them of any need even to read this stuff and allowing them to get on with their Boy’s Own theory building. Meanwhile, anyone with an interest in gender had to read twice as much: the original ‘path-breaking’ scholarship of the Great Minds AND the feminist critiques of their work. Some of course found it easier and more satisfying to look inward and operate intellectually entirely within the world of gender studies (rapidly spreading to include its own sub-fields, such as queer studies). But those who still wanted to inhabit the disciplines of economics, philosophy, history, geography, development studies, sociology, politics or whatever had either to forget their feminism altogether or content themselves with the very traditional role of following behind the men, tidying up after them and carrying the heavy loads.


Related to these assumptions that feminism belongs in gender studies departments, and that the only pioneering intellectual work that women are capable of is in this field, is a complementary notion that all women are de facto experts on gender (in this conception, men, of course, don’t really have a gender, any more than white people have a race). This plays out in exchanges like this one:

HIM: ‘We are organising a conference about Important Topical Issue (ITI) and Abstract Noun (AN) and we’d like you to be a keynote speaker’

ME: ‘OK. I’ve written a book/done a research project/taught a course on ITI and AN. I would be happy to speak about this’.

(It then gets put in my diary, travel arrangements are made, a title is agreed for my presentation etc.)

Six months later

HIM: ‘We are now finalising the programme and we’d like you to speak in a session on ITI and gender.’

ME: ‘Well I was actually not planning to talk about gender, except very incidentally. I was planning to speak quite generally about ITI and AN and present the conclusions from my latest work in this field.’

HIM: ‘We have been lucky enough to persuade Professor Very Famous to speak and he will be giving the overview about ITI and AN. But we really need someone to cover the gender angle. I am sure that you, of all people, must agree that this is very important’.

ME: ‘Well actually I haven’t really done any recent work in this field that focuses particularly on gender; my work has addressed other broad questions. If you want someone to speak on gender, could I suggest that you invite Person A, Person B or Person C?’ (Thinks: ‘And furthermore I have been working in this field for much longer than Very Famous who actually plagiarised some of my work several years ago and his work is very shallow. And he never thanked me for the help I gave him with his first project.’)

HIM: ‘I’m afraid it’s too late to invite anyone new and our budget won’t run to it. We are really relying on you for this’.

There are several alternative endings to this scenario. In the first, I meekly comply. In the second, I pull out. In the third, I stand my ground and insist on giving a general keynote speech (as originally proposed) that is not treated as a subsidiary category of Professor Famous’s overview. For this I am treated like a difficult prima donna, removed from the first day’s agenda altogether and put into a ‘closing plenary’ session, which is delayed and takes place after most of the conference participants have already left for the airport. I am introduced as a ‘feminist professor’ by a man who handles the word with verbal tongs, makes a sexist joke and mispronounces my name. There are more possibilities, which I won’t bore you with now.


In the present revival of interest in Marxist theory, the work done by Marxist feminists in the 1970s seems to have been completely forgotten. The thought of all those new Capital Reading Groups having to start again from scratch is a deeply depressing one. And of course whenever I raise this point in conversation with someone they say that this work really needs to be done and suggest that I write a book about it.


Is it really the best use of my time to go back and revisit the thinking that we did forty years ago? There are so many important new questions to explore. But it has to be done by someone, I suppose, to avoid a new generation of women having to go through the same struggles all over again. Like the washing up, someone has to do it. Any offers? Now don’t all put your hands up at once, will you, guys.

The price of knowledge (and the knowledge of price)

It’s the start of another academic year – the last in which those students who took a gap year will have remotely affordable fees to pay and, for most, the first in which the cost of higher education will tip them into serious life-changing debt. With a variety of combinations of anxiety, regret and relief, parents are contemplating empty teenage bedrooms whilst, with complementary combinations of exhilaration, anomie and panic, their offspring are unpacking their laptops and hanging their clothes in unfamiliar wardrobes. Over them all hangs the question: how will it all be paid for?

My daughter (born in 1982) was part of the first cohort in which parents had to pay fees at all, a generation that was walloped by government policy every step of the way through childhood. Too young to benefit from New Labour’s grudging concessions to working parents (childcare costs were not even tax deductible, let alone subsidised by the state, when she was a toddler) yet too old to benefit from the more generous funding of state services for parents in the pre-Thatcher era. Every year, something seemed to be withdrawn that had been available to those just a year older. The previous year at her primary school had a one-week trip to Wales for their school journey; her year had to settle for Essex. Hers was the first year to lose the free music tuition that had been available under the old Inner London Education Authority and the first year not to be given free bus passes for travel to secondary school – but also the last year not to benefit from the reduced fares for 16-18-year-olds introduced by New Labour, along with Education Maintenance Allowances. This meant paying the full adult fare for commuting from Zone 2 to Zone 4 in her final two years at school. Hers was also the guinea pig generation for much experimental interference in schools: the first cohort to have to do SATs examinations at primary school, the first to have the choice of GCSEs constrained in such a way that it was impossible, for instance, to specialise strongly in languages, or in visual and performing arts (though in theory producing more all-rounders and reducing the gender gap in subject choice). Not only did this forgotten generation suffer right through their education. They also entered a labour market in which the concept of a job for life had vanished. Apart from a lucky few, there were no apprenticeships or protected graduate trainee positions to be applied for.
Suddenly, they had to compete, not just with the contemporaries they had been educated with, but in a global labour market, with similarly qualified kids from all over the world. Without experience it was almost impossible to be employed and the only solution to this Catch 22 on offer to the majority was ‘work experience’ – the unpaid internship that was supposed to confer ’employability’ (in the process, of course, further undermining wages and conditions for the lucky workers who were actually paid).

The punishment of this squeezed and neglected generation was also, of course, a punishment for their parents, particularly those who, like me, were bringing them up on a single income. We are often presented (including by our children) as privileged baby-boomers, on the one hand blocking the career ladder for our ambitious juniors at work, on the other a demographic time-bomb representing an unsustainable cost to the state and an impossible burden on our childrens’ generation who will have to support us in our decrepitude.

For some, this may well be the case, but for many it is possible to see the economic role of this generation very differently. When we were young we entered a labour market based on a different set of welfare norms. The tax and national insurance contributions we paid went, not towards our own individual pensions, but, in a solidaristic model, to provide for the pensions of our parents’ generation and the benefits paid to those who didn’t work in a more forgiving welfare state (including the cost of maintaining the mentally ill in – albeit sometimes harsh – mental hospitals).

This model changed when most of us were in mid-career, in the Thatcher years. We were now supposed to be saving for our own futures. But these were also times when the labour market was harsh for many of us, especially women, and even more especially for the growing number of women who were no longer living with a male breadwinner. Far from being able to put money aside, most of us were hard put to find the money just to survive and support our children. And many of us are still having to support them, well after they have reached the age at which their parents were economically self-sufficient, in the phenomenon I once heard an Italian statistician describe as ‘Hotel Mamma’. It is not only infantilising and frustrating for them to have to go on living under the parental roof into their thirties and even forties; it is also both expensive and tiresome for their parents: a generation who left home in the 1960s and 1970s to gain some emotional privacy are now deprived of it by the ever-present critical scrutiny of their children – and sometimes grandchildren.

I have been following the debates over the past few years about student funding with some astonishment. At some point – probably during the 1980s – a seismic upheaval took place in the consensus that had existed between all political parties in the post-World War II period that the cost of higher education should be borne by society as a whole, since society as a whole would of course benefit from the results. If graduates ended up earning more than their peers then, according to this post-war consensus, there was a perfectly simple way for the state to claim back its share of this additional wealth: through income tax. I am still puzzled by how this – to me self-evident – logic broke down. The new consensus, taken for granted as much in the Labour party as in the Coalition government, is that it is unfair for the rest of society to ‘carry the cost’ of tertiary education. Never mind the fact that many graduates are highly unlikely ever to earn more than the average (think, for instance, of where a degree in theology, or archaeology, or mediaeval history might lead you). Never mind that many of the brightest will be encouraged to leave the country altogether and seek their fortunes elsewhere to avoid paying back their loans; the new common sense holds that it is right and proper that students should spend the most productive period of their adult lives after graduation paying back the cost of their tuition fees and their living costs as students, whilst they attended institutions that, to add insult to injury, are rapidly becoming production lines of standardised forms of learning.

How did this come about? Whatever happened to the idea that collectively passing on knowledge and wisdom to the next generation has a general social value that may not necessarily be measured in high salaries? Might we not all benefit from intelligent discussion on the radio, well-informed local government, compassionate public service delivery, thought-provoking poetry, joyful music and inspiring sermons from the pulpit at the weekend? Won’t well-educated adults make better and more responsible parents? When did the notion of income tax suddenly become a dangerously subversive political no-go area?

It seems as though the idea of social redistribution which is both intra-generational and inter-generational across a whole society is well and truly dead. The debate has narrowed to a kind of bickering about how the costs are to be redistributed over an individual’s own lifetime (which carries implications of inter-generational subsidy within the family): graduate tax versus various different forms of loan with a few means-tested subsidies for those who can prove themselves exceptionally needy.

In thinking about the unacceptability of the income tax solution, it struck me that there is a real basis for a psychological rejection of it as unfair. And this lies in the reality that, to some extent, students really ARE privileged, and always have been. This is not necessarily a financial privilege. Indeed, it can plausibly be argued that the downgrading of the value of an undergraduate degree in the labour market, combined with deteriorating job prospects and the burden of paying back a student loan, will lead to the value of many undergraduate degrees being financially NEGATIVE. Rather, students (or at least those students who do not have to combine studying with paid employment or put in long hours in laboratories) are privileged in having a period of three or more years in their lives when they have the leisure to read and reflect and develop ideas, the opportunity to meet and get to know – and if they are lucky find soulmates among – a variety of people from different backgrounds, to follow a thought to its conclusion, to experiment socially and sexually, to experience the satisfaction of seeing creative effort fulfilled and to enjoy relatively unstructured time that permits them to sit up till four in the morning talking about the meaning of life. This is an idealised view and many never achieve a fraction of these things. We know that students are increasingly likely to suffer from depression and anxiety (with an alarming increase in the suicide rate) and that the pressure to earn whilst studying is constantly growing. Yet there is enough truth in it to rouse some resentment in those – still a statistical majority – who do not go to university, or at least to allow politicians to whip up such resentment.

Saddling students with a choice between crippling debt or emigration does not, however, seem like any kind of a solution. Wouldn’t it be better to ask that this privilege is repaid to the rest of society by putting the knowledge and wisdom that students acquire to good use? How about requiring all students to put in – say – 30 days a year of voluntary work: acting as handymen/women or gardeners for elderly and disabled people, cheering up residents in care homes, helping organise holiday playschemes for children, redecorating dilapidated community centres or whatever. Or, for those lacking social skills or not to be trusted around the vulnerable, manual work improving the environment? Maoist-style Red Guards or Cameronian envoys of the Big Society? Either way, I suspect both they and the rest of us would benefit far more from this kind of social redistribution of knowledge and time than by channelling their debt (and that of their parents) through the dubious conduits of the banking system.