The glitterball life

One of the many contradictory aspects of being a social researcher is the profoundly ambivalent feelings it throws up about the filling in of questionnaires – that constricted sorting of the nuanced intricacies of life into simple yes/no categories, rating abstractions on a scale of one to ten or trying to place the incommensurable into rank order.  I make much of my living from the results of other people doing so but also, in my capacity as a filler-in, have grown to detest the process, which sometimes reduces me to a state of almost speechless rage.

As a filler-in, you might want to answer ‘yes but’ or ‘sometimes’ or ‘both’ or ‘it depends’ or ‘good in some ways but not in others’, but can only do this if the questionnaire designer has been able to imagine such a possibility in advance. But the researchers in charge of the questionnaire design (sometimes including me) have other things in mind: cheapness; brevity; compatibility with other surveys; the questions the policy-makers want answers to. Then there are the people (also sometimes including myself) who want to use the results – people in search of answers to new questions, hoping that somewhere out there the ‘facts’ exist that can prove, or disprove, our hypotheses.

I can remember my fury, back in the 1970s when I was doing some research on the impact of technological change on women’s employment, on discovering that the only people who had gone back through all the census data since the beginning of the 20th century to examine changes in occupation by industry had not seen the importance of gender as a variable (perhaps they even thought it sexist to draw attention to it?). They had simply added up the figures for the men and the women and presented them in an undifferentiated way, rendering their research entirely useless for anybody interested in gender segregation and requiring it to be done all over again. But I was equally furious, if not more so, when a few months ago, having spent a long time filling in a detailed online questionnaire intended to collect information about academics’ experiences of the REF (Research Excellence Framework) process, I got to the last page only to discover that it would not let me ‘submit’ my response until I had told them whether I was (a) heterosexual (b) homosexual or (c) bisexual. There was no option to say ‘prefer not to say’ (or ‘none of your f..ing business’), ‘other’, ‘not sexually active’ or even ‘don’t know’, and I crossly gave up. But of course it is quite possible that my annoyance at being forced into this limited range of options, and my (inexpressible) opinion that my sexual preferences should not matter in relation to the stated aim of the survey, might on some future occasion be mirrored by an equally strong annoyance on the part of some researcher – a reaction exactly like my own to those 1970s gender-suppressing academics.

For me, as a subject, the abstraction of various aspects of my identity as standardised units that can be combined in various permutations may be experienced as deeply dehumanising. For me, as a researcher, being able to analyse and compare these units across a population is a necessary precondition for spotting patterns that may not only enable us to understand changes that are taking place in society but also, if we are lucky, to come up with solutions that can make that society more responsive to people’s needs. A contradiction, if ever there was one.

If this were just a problem peculiar to social research it would be easy to live with: we could rationalise it as something that is in general a good thing and, even if it isn’t, easy to opt out of. But alas, this is not the case. Digitalisation has made the filling in of forms an all-pervasive feature of life. There’s no escaping it. Every time you apply for a job, open a bank account or book a flight, you have to enter information about yourself into a pre-defined form. And even when you are not consciously filling in a questionnaire you are still supplying information to somebody just about every time you navigate any website. We are all now familiar with the sorts of targeted advertising that result from the profiling that is based on the information we may have supplied voluntarily (e.g. age, postcode, relationship status, weight, height, number of children) when it is put together with the data generated by our Google searches, our Facebook likes, our retweets and Alexa’s eavesdropping on our conversations over dinner.

It is a situation that has given rise to some of the most extraordinary paradoxes of our time. Total strangers can, via algorithms, have unrestricted access to the most intimate details of your life, but you cannot even talk to your bank about your own money if you cannot remember the 3rd and 5th characters of your ‘memorable word’. All ‘for your own protection’, you must understand. The public and the private switch places constantly. You can sit on a bus and hear the overworked care assistant next to you talking on her mobile phone at the top of her voice to her (deaf?) patient about extremely private matters, but woe betide you if you take a photograph of your grandchildren in your local swimming pool. We are used to hearing the phrase ‘data protection’ used as an excuse for anything from failing to inform parents about the suicide attempt of their 18-year-old to refusing to let you log on to a website after forgetting your password.

Social researchers have to jump through all sorts of hoops to collect data. You cannot interview somebody without their informed consent; you have to agree not to use the data for any other purpose than your stated intention; to anonymise it strictly; and to destroy it when the research is finished. Yet airlines can, without your consent, supply all your personal details to US Border security, right down to your meal preferences. And who knows who is listening in to your phone calls and reading your emails?

This sort of double standard is something it is easy to rant about, and many have done so, much more eloquently than I could. But that is not really what I want to write about here. What I am interested in is our ambivalence towards the ever-multiplying supply of data which increasingly mediates the way we negotiate and understand our world – the ambivalence I discussed earlier in relation to being a social researcher, but of which that forms only a tiny part. Most of us have come, if not to love, at least to rely on the way in which an individual item can be identified as a unique configuration of standard ingredients. Although it may be tempered by some sentimental nostalgia for the way things used to be, on the whole we welcome not having to trudge up and down the high street, asking in every likely shop for the obscure thing that we want, but instead being able to google it. When I was a child you could go into a department store, find the department that sold cardigans and ask if they had a black one, in your size, made of cotton, with long sleeves and pockets, and an assistant would get them out and display them for you on a counter so you could decide which, if any, you wanted to try on. By the end of the 20th century, those same department stores, or at least the few that had survived, were organised as a series of concessions, by brand. So you would have to wander round, investigating, label by label, whether any such cardigans existed, getting more tired and frustrated at every step (and with assistants few and far between). How nice, then, to be able simply to enter the search terms and find the very thing, without even having to get up from your chair. And yes, it might be a bit annoying to be targeted with ads for black cardigans every time you log on for the next few weeks (despite your ad blocker), but you might think that’s actually a small price to pay when the convenience is multiplied across the whole range of goods and services you purchase.

What is perhaps more worrying is that, in this digital society, you are not just a shopper. Indeed, you are increasingly somebody who is shopped for. On the job market you offer yourself, using a standard form, as an assemblage of increasingly standardised ingredients (your qualifications, the languages you speak, the software packages you know, the companies and clients you have worked for, what you have earned, what you have made or published), struggling to find some way of expressing your uniqueness (often via one of those excruciating cover letters in which you describe your ‘bubbly personality’, ‘passion for [whatever the ad said]’, ‘willingness to go the extra mile’ and ‘strong communication skills’). These days this self-advertising is more and more likely to take place on an online platform on which not only are your past assignments visible, but so too are the ratings you have been given by the clients you carried them out for. (This is something I am doing research on right now but I won’t bombard you with details here.)

The old occupational identities are increasingly fractured as we become, for the purposes of the labour market, bundles of interchangeable attributes, each of which has to be described in standardised terms and be measurable even if the combination is unique. As such, we are also increasingly comparable to each other (and therefore potentially substitutable for each other) in a market which (at least in terms of information) is geographically unbounded. If you, in Dhaka, have the same skills profile as me, in London, and if the work is digitisable, then what is going to determine which of us is given the job?  As on the job market so in other aspects of life. Just as we may become used to understanding our employable selves as bundles of standardised skills and competences, we may also start to classify our social selves in terms of standardised sets of tastes and consumer choices. Does this lead to the same sorts of social anxiety, I wonder?

If I, as, let us say, a translator from German into Mandarin with a specialist knowledge of polymer science and experience of preparing texts for academic journals, feel competitively threatened by somebody else with the same skill set, might I also, as a person wanting to be loved and appreciated, feel competitively threatened by the thought that, however much I want to stand out in the crowd, there are lots of other people out there who like the same kind of music as me, wear the same brand of clothing and like watching the same movies? If I can be pinpointed so easily by a marketer’s algorithm, wherein lies my uniqueness? I might try to reassure myself with the feedback of others. Which, in these digital times, is now very easily quantifiable. But what if others get more ‘likes’ than me on Instagram, or swipes on Grindr? What if their Facebook posts are shared, or their tweets retweeted, more than mine? What then is my value?

A world in which identities can be described as collections of attributes, broken down ever more precisely into separate facets, feels to me like one in which personalities are turned inside out. This is particularly visible in the process of finding friends and lovers. In the past, getting to know somebody might have been modelled as a process of peeling off outer layers, like the skin of an onion, to get to some kind of hidden internal essence (or ‘soul’, even). You might scan a crowd of strangers, uniformly grey, to catch a sudden flash of eye contact that hinted at a possible connection which could then be explored tentatively. There were many false starts and a lot depended on chance. People whose lives were constrained socially might never meet a soulmate. But if you did, it could take you completely by surprise (though the cliché of falling in love at first sight was probably vanishingly rare). One should not sentimentalise this, of course. Lots of matches were made by arrangement, built slowly into strong companionship from pretty loveless beginnings, or were never very happy at all. But I think it is fair to say that if and when one did fall in love, it was unpredictable, exciting and private.

Nowadays, significant numbers of people use online dating sites to find their partners. In the USA in 2018 the proportion of people who said that they had met their spouse or partner online was 12% among 18-30-year-olds, 13% among 30-44-year-olds and only fell below 10% among the over-65-year-olds. Nearly one internet user in five (19%) admitted to using dating websites or apps in 2017. And why not? In a world in which one shops online for everything else, it is completely logical to do so for sexual partners too, especially if you are too busy for, or temperamentally averse to, seeking them out in noisy night-clubs, dubious bars or other physical venues. I know a number of happy couples who met each other that way. And yet, and yet… As the algorithms get more sophisticated I find myself recoiling ever more from the idea of being shopped for in this way. Instead of exploring people, one by one, from the outside in, what these sites do is abstract each separate facet of a person and present them all, searchably, for you to choose from, with as much as possible of what is internal carefully catalogued and displayed on the outside. You are asked to define what you want by skin colour, age, height, weight, income, politics, sexual preference, tastes in food and art and music and innumerable other variables, often, these days, linked together by artificial intelligence to produce sophisticated psychological profiles which can be tested against yours for compatibility. Every aspect of your personality that can be captured is displayed publicly for inspection, like the tiny mirrors on a glitterball. A glitterball in which others might see their own characteristics diffracted and reflected. A glitterball among glitterballs. In a global mosaic of standardised attributes.

For me, this flies in the face of everything I want to believe in about human attraction and love and taste and the human ability to learn and change. The last thing I want is to be pinned down, for example, as somebody who likes a particular kind of music. One day I might love listening to Dexter Gordon, another day I might be moved to tears by a Schubert sonata or stirred by the bell-like clarity of the opening notes of a Sam Cooke song. But more importantly I want to be open to being surprised by some other kind of music I haven’t even encountered yet, or to learn to listen attentively to something I might have rejected out of hand in the past. And there is music in every category that bores or annoys me. In just the same way, I have no idea who I might fall in love with. Whether it is a man or a woman, someone black, white, tall or short. Who knows? I might think I don’t like scientists and then find myself suddenly and inexplicably enchanted by one. What makes somebody attractive (or not) is profoundly mysterious. To label myself in advance as locked into a particular pattern of preference feels profoundly wrong. And it feels equally wrong to exclude others on the basis of some superficial (and possibly temporary and changeable) attribute.

I cannot think of anybody I really value in my life whom I would ever have found had I selected them using conventional search terms. It is the accidents of synchronicity that, in my experience, lead to the best friendships as well as the greatest moments of creative inspiration. Am I a freak? Or do others feel the same? And should we be trying to find ways to put rounded and complete human beings back together in all their malleability and unpredictability and inconsistency and the essential unknowability that constitutes their deepest attraction?

 

 


Dilemma for democracy

It was pointed out to me over Christmas that I only posted once on this blog in 2018. Like that one, this post was triggered by reflections on current debates on the left: this time the fraught discussions about what MPs should do to avoid the trap set by Theresa May of having to choose between a no-deal Brexit and the disastrous dog’s dinner of a Brexit deal that she has negotiated. I would love to be writing about something else, but this is what feels most urgent.

Withdraw Article 50, engineer another general election, hold another referendum. I am surely not alone among my friends and contemporaries in being happy to embrace any or all of these options if they would get us out of this horrible, horrible mess. But what has been occupying my thoughts has not been the advantages of one over another, but rather the deeper questions which this whole debacle has raised – in particular, the very delicate issues it touches on relating to democracy, which can be awkward for socialists to air in public. And in the depths of this do-not-probe-too-deeply-or-stamp-too-hard quagmire sits the political mechanism of the referendum.

The referendum is often hailed as the ultimate expression of democracy, though many question its compatibility with parliamentary democracy. Parliamentary democracy of course has many faults, but it does incorporate two important principles that are negated by a referendum. First, the results are always reversible. You elect your MP for a limited period and can then change your mind if you don’t like what she or he has been doing and elect somebody else after the term is up. Second, it recognises that the public does not have perfect information or prescience about unexpected new developments. Although you take account of what policies your parliamentary candidate stands for, you vote for a person, not a policy, on the basis that, once elected, the MP will make informed decisions on complex issues based on a combination of her or his political views, the latest information and expert advice.

A referendum does not just force a choice between crude binary options; its result is also deemed irreversible, however ill-informed the basis on which people made that one-off binary choice. Once the people have spoken, goes the rhetoric, what they have said should stand for all time.  And because the logic underpinning the holding of a referendum is fundamentally populist, any criticism or second thoughts about it must be regarded not only as anti-democratic but also elitist. And what socialist wants to go around sporting those two labels?

It is perhaps no accident that Switzerland, the country that makes most use of referenda, was just about the last in the world to give women the vote, or that in Germany, with its first-hand experience of the horrors that crude populism can lead to, the referendum is viewed with great suspicion.

We have all known for a long time that a lot of public opinion, shaped as it is by right-wing media scaremongering and misinformation, is profoundly reactionary. It is no secret that at any time in the second half of the 20th century, if there had been a popular vote on it, the British people would have wanted to bring back capital punishment, for example. However, there was a quiet consensus at the time (which some populists might consider elitist) that they should not be given the chance to do so. And – at least in the Labour Party – that there was a responsibility to try to educate the general public in the interests of encouraging more humane values. Parliament was often ahead of public opinion on many issues, such as the legalisation of homosexuality, the prevention of violence against children and religious tolerance.

Had Cameron, in his desire to throw a bone to the rabid right of the Tory Party, organised a referendum not on Brexit but on bringing back hanging, and had the public voted for it, I wonder, would we still hear Labour MPs saying ‘the people have spoken’ and investigating ways to bring it back in a nice, moderate form?

The same could be said for any number of other issues, including race and immigration. Any referendum runs the risk of establishing such (uninformed) views as ‘the views of the people’ and casting them in concrete for all time. The parliamentary system,  despite its many defects, does at least allow for mutual education and, as already noted, presupposes that informed politicians will modify their views in the light of new information in the knowledge that if their constituents really don’t like the decisions they make then they always have the choice to vote them out next time round.

The nettle that has to be grasped by the Labour Party is this: at what point should socialists come out and insist that we stand for certain values which the majority may well not yet hold but which we nevertheless think are important enough to keep campaigning for? In other words, at what point do we prioritise leading over following? Faced with an unpalatable opinion poll result, should we just kowtow to dominant opinion and change the manifesto to accommodate it? Or make a principled case for the other view, and campaign to change people’s minds?

Avoiding grasping this nettle has led to a great deal of fudging over the last two and a half years, not helped by nasty in-fighting within as well as between parties. The horrible mess we are in seems to me to be a direct outcome of this. It is going to be extraordinarily difficult for parliamentary democracy to survive.

What should the strategy be? I am no wiser than when I started writing this blog. Two wrongs don’t make a right and, having written what I have about referenda, it would seem profoundly illogical to say ‘let’s have another one’. But if this is the only way to avoid a nasty Brexit then I would personally go for it as the least bad option. If you are stuck in a stinking bog, sinking by the minute, you grab any rope that is thrown to you.

Holiday greetings

[Photograph: LIQUOR]

This year I am posting a very old photograph – taken in Goa around 20 years ago and badly scanned – because it still conveys a message about Christmas that amuses me.

Wishing you all a happy holiday wherever you may be and however you choose to celebrate it (or not). And hoping 2019 will be better in every way than its predecessor.

Thoughts on anti-Semitism

It is a long time since I wrote in this blog. There is a repeating pattern whereby a period of illness (in this case recovering from some surgery just before Christmas) sets me back in my ‘proper’ writing, rendering me too guilty to indulge in non-commissioned work until I have cleared the backlog. But eventually I feel so strongly about something that I have to break the self-imposed taboo. Something similar happened a couple of years ago when I found myself compelled to break the silence during the build-up to the Brexit referendum because of concerns about how it was reported. Today’s impulse also comes from a kind of horror at what is going on around me politically, but this time the context is the whisked-up media attention currently being paid to anti-Semitism in the Labour Party, which, it appears to me, wearing my Cassandra hat, may be leading us towards something that is profoundly harmful, politically and socially (even though it might in the long run have some positive impacts by bringing what is hidden or taken for granted to the surface).

Despite (or perhaps because of) the huge media coverage, there seem to be important questions that are not being asked, or answered, adequately. There is of course a lot of discussion about where the attacks are coming from and why now: smoking guns aplenty for conspiracy theorists. The Tories, looking for any ammunition to use against Labour in the run-up to the local elections; the Israeli Government, happy to have pro-Palestinian voices silenced while they shoot unarmed civilians in Gaza; Blairites in the Labour Party who seize on any opportunity to attack Corbyn, regardless of its impact on Labour’s election chances; the mainstream mass media, drifting ever further rightwards, with the BBC (fearful of the axe) rivalling the Murdoch press in its pandering to the Tories. This morning, on the Today programme, the fact that Corbyn had attended a meeting of Jewdas (‘a left-wing Jewish group critical of more mainstream Jewish organisations’, as the BBC put it) was treated as evidence that he was failing to address anti-Semitism in the Party, in a piece of doublethink worthy of Orwell. The question that, it seems to me, is not being addressed sufficiently is how and why this particular charge is so difficult for serious socialists to counter. How is it that the energetic, resilient left, which has successfully fought back over the last couple of years against so many anti-Corbyn smears, can be so easily silenced when accused of something which that very socialist left has, over the years, done more than any other political grouping in this country to counter? What notions of good and bad, what intersections of fear and shame, what confusions, what extremes of guilt-by-association, have brought us to this pass?

So here I am again, adding yet another voice to the conversation.

Like last time, when I wrote about Brexit, I have been very hesitant about putting my thoughts online. On both topics there are many people much better-informed than I am. I feel a bit like the fool stomping in, in my muddy wellies, onto a polished parquet floor where even the angelic dancers hesitate before taking a tentative step. Not only is it likely that my generalisations will be quibbled with and my attitudes questioned as old-fashioned, badly informed or politically incorrect, but, in the case of anti-Semitism, there is also the undeniable fact that I am not Jewish and therefore perhaps genuinely insensitive to what is going on.

So I will start with a self-interrogatory personal narrative of what Jewishness means to me. I grew up with a strong awareness that anti-Semitism existed and was not easy to fight. My father was a student in Vienna in the 1930s and witnessed extreme forms of it first hand. He was a close friend of Muriel Gardiner (who became Muriel Buttinger after her marriage to Joseph Buttinger, the leader of the Austrian Socialist Party). Taking advantage of her family’s wealth and her US passport, Muriel played an active role in sheltering, from the Nazis, socialists who were wanted by the police, and in helping to find ways for Jews to get out of the country safely – a role that was lightly fictionalised in the 1977 film Julia, where her part was played by Vanessa Redgrave. My father played a small role in this (including helping to acquire British passports that could be used to get people across the border) and Muriel remained a life-long friend of his and, later, an inspiration to me. I have vivid memories of the last time I met her, over dinner in an Italian restaurant in Bloomsbury in the 1970s, in the company of my father and my friend Nick Redgrave, where she talked about her American socialist youth, collecting money for the defence of Sacco and Vanzetti in 1920. It was on this occasion that, when I admired the chunky Venetian baroque pearl necklace she was wearing, with typical impulsive generosity she immediately took it off and said, ‘Here, have it!’.

Anyway, suffice it to say, this experience cemented for me an association between Jewishness and socialism. Like the rest of their generation (my father was born in 1902, my mother in 1907) my parents were by no means free of racial stereotypes. For them, Jews were clever, sensitive, musical and studious. They were good fathers but unlikely to be interested in sport or the consumption of alcohol. Some of these stereotypes were challenged when applied to the complex personalities of Jewish people I knew personally but many were not. I really did meet a lot of Jewish psychoanalysts, writers, university lecturers, publishers, musicians and artists. And an awful lot of them really were socialists. Most were also part of a shared culture which was secular and humanist and their Jewishness did not seem particularly important.

During the 1960s and 1970s, Jewishness, like other identities, felt to me like part of a historical legacy that would become less and less important with the progressive advance of universal education, and the spread of democracy and egalitarian welfare regimes. Religions, it seemed then, formed part of superstitious heritages designed to bolster hierarchies (including patriarchal ones), reinforce obedience to power, provide hope to people facing intolerable adversity and give them spiritual sustenance in their sorrow. While of course their practices should be tolerated and their rituals celebrated, their roles would increasingly be taken over by other communitarian agencies in the brave socialist future we imagined.

But I was sometimes reminded of the strength of Jewish identity, even among secular Jews. I remember a conversation with Michael Kidron (then my editor at Pluto Press) in which I was describing how international my family had become (my siblings’ spouses were, variously, Polish-German, Japanese, French and Palestinian – which extended in the next generation to Indian, Chinese and US spouses) and his immediate reaction was ‘Yes, but you’re all Goy’. I was also struck, living in Yorkshire in the 1970s, by the bonds that stretched across the Jewish community there, crossing party-political boundaries. When the National Front wanted to march through Leeds City Centre, this was stopped, so I was told, by Irwin Bellow, the Tory leader of the City Council and former owner of a company that made sewing machines (subsequently elevated to the peerage for his services selling council houses as a minister in the Thatcher government), after a phone call from Lou Baruch, the communist leader of Bradford Trades Council (‘the textile workers’ champ’) – the anti-union capitalist and the trade union leader cheerfully colluding to thwart anti-Semitism.

In the early 2000s, when organisations like Jews for Justice for Palestinians were formed in the UK, it came as quite a surprise to me how many friends, whom I had up to then simply thought of as fellow socialists and feminists, decided to identify themselves publicly as Jews. It had never even occurred to me in many cases that that’s what they were. I actually found it quite difficult to place myself, as a non-Jew, in relation to such campaigns, which seemed to construct people like me as outsiders. If one was campaigning on the basis that everybody was equal and religious distinctions should not matter, then it seemed on one level contradictory to insist on such distinctions. On another level, of course, it was very understandable. In a world where every non-Jew runs the risk of being (consciously or unconsciously) anti-Semitic, just as all white people run the risk of being (consciously or unconsciously) racist, Jewish voices that stand against Israeli state policy have a unique chance of being heard. Nevertheless, it left me, and perhaps others, with a sense of having nowhere to put my solidarity – a silencing of sorts.

This is not the whole story, of course. The atrocity of the Holocaust hung like a pall over my childhood, as I suppose it did for most of my generation. I didn’t even realise how deeply it affected me until I became a mother and found myself haunted by detailed and very concrete imaginings of the experience of deportation and the death-camps. What did you do, I kept wondering, crammed, standing, into a cattle-truck with nothing but the clothes on your back, when your baby needed a nappy change? When it cried? When it wanted to crawl? How did you cope with the leaking breast-milk when that baby had been snatched from you? Such questions, I realised, had formed part of the mental sound-track of my life since childhood, breaking their way into consciousness only at moments of emotional stress or vulnerability and playing who knows what convulsive riffs while they remained unconscious.  I can still remember, with great vividness, an experience at my North Wales primary school in the mid-1950s. Two boys were approaching kids in the playground with knowing ‘I’ve got a secret’ smirks, asking if anyone wanted to see their pictures. One by one, we were shown two well-thumbed black-and-white photographs, cut from a magazine, of the piled-up emaciated bodies that were found at the liberation of Belsen. It is hard to exaggerate the shock of this – not just the obscene reality that was represented in those pictures but also the voyeuristic frisson that these two boys seemed to experience, as if it were pornography, and the air of secrecy, as though these images revealed something so shameful that children were forbidden to view them. I am sure I am not the only person whose nightmares were invaded by these images. Such experiences confirm the idea of the Holocaust as something uniquely awful, incommensurate with other atrocities. Even to mention it in the same breath as other genocidal massacres can feel like somehow trying to diminish its importance. Small wonder that denying it is a criminal offence in many countries.

And then there is Israel. My early view of Israel was partly shaped by second-hand accounts of kibbutzim, which provided gap-year experiences to many more or less idealistic kids in the years after National Service was abolished for young British men. They seemed like a foretaste of socialism – sexual freedom and communal living amongst the orange groves. Israel was, in this view, the happy ending that awaited those who were lucky enough to have survived the horror and brave enough to fight for freedom, a view that was reinforced by the 1960 film Exodus, directed by Otto Preminger, one of the first I ever saw on a wide screen.

Since then, I have become only too aware of how necessary it is to unpick such facile narratives and explore their contradictions, not least through my first-hand contact with the descendants of Arabs for whom the foundation of Israel meant being turfed off their ancestral lands. But such unpicking is extraordinarily difficult to do when the narratives are so highly charged, both emotionally and morally.

I became acutely aware of this when I visited the United States Holocaust Memorial Museum. By coincidence, this, my first (and so far only) visit to Washington, took place only a couple of weeks after the 9/11 attack on the Pentagon. I was on one of the first flights into a city where at least one of the other airports was still closed. There was a strange jumpy hysteria in the atmosphere, with military aircraft zipping overhead and people with their heads down hurrying to get home. Flags fluttered everywhere. Streets were almost empty. My academic hosts had arranged for me to have dinner with their Dean in a fashionable restaurant that served up expensive versions of poor people’s foods (things I had only ever read about, like hominy grits). The restaurant was almost deserted and the Dean wanted to spend the minimum possible time there, departing in the middle of the main course after asking the waiter to box up his food so he could take it home with him, ratcheting up the already high level of awkwardness for those of us who had to sit it out until dessert had been consumed.

Rejecting polite offers to entertain me, I decided, in the free day before my seminar, to visit the Holocaust Museum, about which I had heard a great deal. The experience was immersive. As you entered, you were given a card with the details of a Holocaust victim with whom you were encouraged to identify (I am using the past tense here because things may have changed in the sixteen years that have elapsed since this visit). The first spaces that greeted you gave a historical account of Hitler’s rise to power, with photographs of mass rallies, the swastikaed flags in the photographs uncannily echoing all those stars and stripes waving outside. It was made clear that the Nazis attacked socialists and trade unionists, as well as Jews, but the main story was about anti-Semitism. The next section of the museum reinforced this, with a historical account of anti-Semitism in Europe and lots of artefacts showing the rich cultural heritage of European Judaism. Then you were taken, step by harrowing step, through the detail of the Holocaust – the roundings up, the transport, the conditions in the camps, the death chambers. Incidental mention was made of non-Jewish victims (the gypsies, the gays, the mentally handicapped, the socialists) but overwhelmingly the story was about Jews, and hatred of Jews, and the unspeakable consequences of that hatred. It was all made concrete and vivid, not just through identification with the avatar-victim on one’s personal card but also by the volume of material evidence. Everyone I have ever spoken to who has visited that museum remembers the enormous pile of worn, discarded shoes, heartbreaking in its very banality. Emerging, trembling, from the emotional impact of all this, you entered the final part of the permanent exhibition, intended, I suppose, to be uplifting, covering the liberation of the camps and the resistance. The last room celebrated Israel.

I came out into the glare of the Washington sunlight feeling shaken and moved. But also, confusingly, a little bit tricked. It took a lot of thought to unravel this feeling and I ended up concluding that it was the result of the slow elision of oversimplified dualistic oppositions, a slippery my-enemy’s-enemy-is-my friend/if-you-are-not-with-me-you-are-against-me logic that, when extended, led one along a path that was too narrow, too exclusive and not quite where one intended to go. This is a logic that conflates the political and the moral and, by virtue of the power of that morality, creates a stage in which everyone must be a victim, a villain or a hero (not unlike Stephen Karpman’s victim-rescuer-persecutor ‘drama triangle’). It is a world of goodies and baddies with very little scope either for shades of grey or for personal change.  The logic goes something like this: Hitler = evil; Jews = victims; Allied troops = heroes. Since Hitler was bad, Jews must be good, therefore Israel must also be good. Anybody who is against Israel must therefore be bad (like Hitler) – including Arabs. Socialists fit very awkwardly into this logic. According to the Nazi logic, they are as bad as Jews (indeed they are often assumed to be Jews, or manipulated by them) who must be stamped out, which makes them, by anti-Nazi logic, victims and/or heroes. Their historical role as opponents of anti-Semitism and racism in many European countries also renders them good. However if they use the same reasoning that enabled them to identify dispossessed Jews as victims to recognise dispossessed Arabs as victims too, that makes them anti-Israel which renders them bad. They become like those optical illusions of which the eye can only see one version at a time, toggling wildly between good and evil.

There is a sense in which we all want to be heroes of our own biographies, casting others as fellow victims or persecutors, allies or opponents. But in a political landscape so shot-through with moral righteousness and outrage it is extraordinarily difficult to step forward with confident conviction of one’s own heroism, especially if one is not a central protagonist in the story. Indeed, the greater one’s self-awareness and knowledge of history, the more difficult this becomes.

I had a Catholic upbringing which impressed on me the importance of the nightly ‘examination of conscience’ in which you reflected on everything you had done that day and, if any of it was bad, resolved how you would put it right tomorrow: a sort of memory-scan for shame. This kind of self-examination is of course not unique to Catholics. Variants of it can be found in the practices of psychotherapy, for example, or the consciousness-raising that went on in women’s groups in the 1970s.

Many of us, perhaps especially on the white left, are acutely aware of our own inadequacies. When it comes to racism and anti-Semitism, there are few, I suspect, who can put their hands on their hearts and proclaim themselves entirely not guilty. My generation was brought up in a culture that was profoundly racist and homophobic. Did we really never snigger at the camp gay stereotype played by John Inman in ‘Are You Being Served?’ or laugh at the jokes in ‘It Ain’t Half Hot Mum’ or ‘Love Thy Neighbour’? And how much of it rubbed off on us? We have had to admit that, even if we never consciously discriminated against a black person ourselves, we probably owe our relatively advantaged social positions in the British middle class at least in part to the history of slavery and imperialism. Even as we try to uncover our own hidden racism, we become more and more aware of how our society is steeped in it, how it takes myriad forms and changes over time, how difficult it is to disentangle concepts of cultural difference from those of discrimination, how complicated are the interactions between the past and the present in the formation of identities.

As with so many things in life, the more you know, the more you understand how complicated something is, and the more hesitant you may become about laying down the law to others. Yet alongside this growing comprehension of the complexity of human group inter-relationships comes an increasing awareness of how much unfairness and suffering and injustice there is out there. The impulse to remain silent is countered by an equally important impulse to do something about it (that’s what makes people join organisations like the Labour Party). But the interplay between these two impulses might create a sort of paralysis, or at least wrong-foot those who try to enter the public debate without having thought out their position carefully.

Justice is not a card game in which one kind of victimhood trumps another, rendering it irrelevant. We need a broader moral frame that recognises the co-existence of different forms of oppression, even the possibility that the same person, or group of people, might be simultaneously both an oppressor and a victim.

But articulating such a programme requires a degree of nuance that is beyond the binary logic of the mass media to cope with. And which of us, we might ask, has the right to propose such a programme? We seem to have arrived at a situation where non-Jewish socialists feel both unentitled to do so and held back by their very awareness of their own imperfections. I am not sure I am right about this but am wondering how much this might be the explanation for the diffidence (or perhaps even cowardice?) which non-Jews on the left feel about speaking out in the current debate. But speak out, I believe, we must. Somehow.

 

 

 

Not such good work, Matthew Taylor

The long-awaited Taylor Review of Modern Working Practices has now been published, under the title Good Work, and it is, I am afraid, very disappointing indeed. In terms of its concrete recommendations it goes beyond being a missed opportunity, out of kilter with its times, to posing an active threat to workers’ rights and undoing past advances.

As might be expected from a lead author who was appointed head of Tony Blair’s Number 10 Policy Unit in 2005, it is not short on spin. It speaks repeatedly of ‘enduring principles of fairness’, nods often to the idea of good work as an essential ingredient of happiness and wellbeing and claims to be focusing ‘not just on new forms of labour such as gig work but on good work in general’. Pious mission statements, such as ‘We believe work should provide us all with the opportunity to fulfil our own needs and potential in ways that suit our situations throughout our lives’ sit alongside nods to the inevitability (and benignity) of technological progress. In the classic contradictory formula of centre-left neoliberalism it manages simultaneously to say that ‘Good work is something for which Government needs to be held accountable’ and ‘The best way to achieve better work is not national regulation but responsible corporate governance’!

Why was it no surprise to discover this morning that Taylor’s co-investigator, Greg Marsh, was a former investor in that most visible of gig economy companies, Deliveroo?

Out of kilter with the times

In light of recent events, the report seems oddly old-fashioned. It is little more than six months since the Inquiry was established (in October 2016) but during that period there have been unprecedented developments on the ground, with an upsurge in organising by casual workers in the UK (and elsewhere). New trade union organisations, such as the UPHD (United Private Hire Drivers) and the IWGB (Independent Workers of Great Britain), have sprung up to represent drivers for platforms like Uber and delivery workers for companies like Deliveroo, as well as casualised workers in other sectors, such as outsourced cleaning workers, porters and foster carers. A series of test cases brought by these organisations, sometimes with the support of traditional trade unions like the GMB, has established in case after case that workers for companies like CitySprint, Uber and Pimlico Plumbers are not the ‘independent contractors’ these companies claimed they were but ‘workers’, entitled to such rights as the minimum wage and paid holidays. As a result of these, and other well-publicised cases of exploitation of low-wage workers, such as at Sports Direct, there has been a sea-change in public attitudes to fairness at work, evidenced by the popularity of the demand for an end to zero-hours contracts in the Labour Party manifesto.

The British public seems, at last, to have seen beyond the rhetoric that elides what is ‘flexible’ for the employer (in the form of a just-in-time workforce, waiting to be summoned at short notice by an app) with the older demands raised fifty years ago by the Women’s Movement for a ‘flexibility’ that responds to the unpredictable demands of family. Having lived it in their own lives, or watched their kids do so, most people now see only too well that being available on demand makes it very hard indeed to manage your own life, especially when childcare is involved. But the report shows no awareness that workers and employers may have different interests, merely stating vacuously that ‘Encouraging flexible work is good for everyone and has been shown to have a positive impact on productivity, worker retention and quality of work’.

While public opinion seems to have been saying ‘enough is enough’, the court judgements have been saying, in the words of Jason Moyer-Lee, General Secretary of the IWGB, ‘“gig workers” already have rights – all we need to do is enforce them’.

A rational response to this situation – the opportunity that this report misses – would take the existing principles as a starting point and work to ensure that there are clear guidelines for their implementation, putting the onus of proof not onto vulnerable workers but onto those who dictate their working conditions and profit from their services. But this is very far from the Taylor approach.

Missed opportunity

The report quite rightly recognises that the employment status of casual workers is confusing and poorly understood. This is partly because it is dealt with separately under the tax system and in employment law. Under the tax system, unless you have some other legal status such as being a limited company or a partnership, you are either an employee or self-employed. Many workers living hand-to-mouth think it is preferable to be self-employed because that way they can defer the payment of income tax and set expenses against it. Under employment law, being an employee brings a range of rights and protections, including such things as maternity and paternity pay, sick pay, parental leave and pension coverage. These are probably worth much more to most workers in real terms than whatever tax savings they make by being self-employed, but of course they can only be claimed if your employer actually agrees that you are indeed an employee and fulfils his or her part of the bargain. There are, however, some rights guaranteed under employment law to all workers, regardless of whether they are formally classed as employees. These include the right to the minimum wage and to paid holidays.

The difficulty of establishing employee status is not new. Back in the 1970s and 1980s, when I was doing research on homeworking, this issue came up again and again. Frightened women, unaware of their rights, were told firmly that they were not employees (often believing – usually wrongly – that what they were doing was not quite legal and that, if found out, they would become liable for tax or national insurance payments and fined for being in breach of health and safety or tenancy regulations), so they would accept that they had no rights. The law then had no single test for being ‘genuinely self-employed’. Tribunals or courts were supposed to weigh up a lot of different factors, such as who determined what work should be done and what should be paid for it, whether or not the worker had the right to employ somebody else to do it, how continuous it was, who paid for the materials and so on. Little has changed since then, although the case law has moved on. The most crucial principle is whether a relationship of subordination can be said to apply.

In the case of most platform companies, there is little doubt that the workers are indeed subordinate. Although practices vary from company to company, workers are usually told precisely what to do, with each ‘task’ well defined and costed. Not only are their pay and work process laid down, there are also typically detailed rules about quality standards to be met. While there may be some limited right to turn a few jobs down, there are usually strong penalties for doing so repeatedly. They do not have the right to pass the work on to others. And in some cases (Deliveroo being a case in point) they are even required to wear uniforms or sport company logos.

The report could have laid out clear guidelines for defining genuine self-employment and spelled out the obligations of employers of subordinate workers. But what it has done instead is muddied the waters still further by proposing exceptions to the existing principles which could be detrimental not only to workers who are currently working casually but also to other workers, including those currently defined as employees.

 How could its recommendations make matters worse?

  1. Establishing a new intermediate kind of employment status – the ‘dependent contractor’

The report proposes setting up ‘an intermediate category covering casual, independent relationships, with a more limited set of key employment rights applying’. Although this approach has rightly been resisted by British legislators in the past, it is not a particularly original response. Indeed, it is something of a knee-jerk reaction by neo-liberal ‘modernisers’ to the development of new forms of work. It was, for example, strongly promoted in Europe in the 1980s and 1990s (for example by the Belgian labour lawyer Roger Blanpain) as a way of encouraging teleworking without bringing it completely within the scope of existing employment protection laws. Italy provides a particularly extreme example of the ways in which different forms of ‘parasubordinate’ status and sub-categories of self-employment have been created to cover workers, such as call centre workers, who fall outside traditional sectoral agreements and regulatory categories. The overwhelming evidence is that when such new kinds of status are established they do not just result in reduced coverage for the ‘new’ kinds of workers who fall under them but, even more importantly, are then extended across the workforce to bring other, more traditional forms within their scope, resulting in a worsening of conditions across the board. In other words, what they do is provide employers with a new tool for casualisation and the erosion of existing rights, whatever well-intentioned language is used that purports to prevent this.

  2. Undermining the minimum wage

The report also proposes a change in the way that the National Minimum Wage (NMW) is applied: ‘In re-defining “dependent contractor” status, Government should adapt the piece rates legislation to ensure those working in the gig economy are still able to enjoy maximum flexibility whilst also being able to earn the NMW’. What it proposes is complex, and difficult to summarise here. At the headline level it looks like a proposal to increase the NMW by a modest amount for workers with the proposed new ‘dependent contractor’ status. However, the report also wags a stern finger at those who think that workers should be paid for all the time they spend waiting for jobs to come up, which is, they say, unreasonable and open to abuse. Given that many workers in the gig economy spend half their time or more logged on in the hope of work that does not arrive, this could in practice lead to a fall in the time eligible for payment.
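To make the stakes concrete, with some purely illustrative numbers of my own: imagine a courier who is logged on to an app for ten hours in a day but receives paid tasks for only five of them. Paid for those five hours at the 2017 National Living Wage of £7.50, she takes home £37.50 – an effective rate of £3.75 for each hour she has actually made herself available. Whether or not the waiting time counts is, in other words, the difference between earning the minimum wage and earning half of it.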

There is more in the report. I have only scratched the surface here. But am about to board a flight for China so will postpone further discussion for another day.

and more on the future of work

In the new spirit of reblogging things I have already posted elsewhere, here is a piece that appeared today on the LSE blog at http://blogs.lse.ac.uk/businessreview/2017/04/11/future-of-work-taking-the-blinkers-off-to-see-new-possibilities/

(their headline not mine).

Future of Work: taking the blinkers off to see new possibilities

Anybody relying for their information on the current headlines would find it hard to make sense of what is happening in the labour market. On the one hand, the news media are awash with apocalyptic forecasts, often backed up by studies from reputable organisations such as the US National Bureau of Economic Research, the Oxford Martin School or the Bruegel think tank, that robots, machine learning, drones, 3D printers, driverless cars and other applications of Artificial Intelligence are going to eliminate very large numbers of jobs, not just in manufacturing but also in service industries, ranging from low-skill tasks like picking and packing in warehouses and home delivery right up to high-skill professional tasks like legal research or stockbroking.

On the other hand, the employment rate in the UK is at an all-time high of 74.6 per cent, while the unemployment rate, which averaged over 7 per cent from 1971 to 2016, has fallen to just 4.7 per cent in January 2017.

So, are we facing mass unemployment or not? Here we are, nearly a decade after a major financial crisis that led to job losses, austerity and waves of corporate restructuring including bankruptcies, mergers and acquisitions, seeing the emergence of new winners, with new business models and the birth of new industries, with new technological applications playing a key role. If we take a broad historical view, this is actually quite a familiar story.

We could look, for example, at the development of new industries based on the spread of electrical power and mass entertainment after the 1929 crash, or of computerisation after the 1973 energy crisis, or the explosive growth of the Internet in the decade after the infamous 1987 Black Monday. Each of these technologies was also, of course, instrumental in displacing large numbers of jobs in older industries. And with each wave, livelihoods were irrevocably damaged, because the new jobs were not created in the same areas, or for the same people, as the old ones.

The elderly look on in amazement at the desirable new labour-saving appliances their grandchildren buy, remembering the back-breaking drudgery of the old methods. But for every gleaming new factory in one part of the world, there are piles of rusting machinery in others, along with devastated lives and communities. Such ‘creative destruction’, as Schumpeter called it, is, surely, part and parcel of capitalism as usual.

So why, in the second decade of the 21st century, are so many commentators, on so many different parts of the political spectrum, convinced that this time things will be different: that we are, in Paul Mason’s phrase, moving into a period that could be described as ‘postcapitalist’?

Part of the explanation might lie in the way that capitalism is often seen, especially by the young, as a single, monolithic system that embraces all aspects of life. Perhaps a more useful way of understanding it is as a somewhat messy assemblage of different capitalists competing with each other, scrabbling for market share, experimenting with new business models and often failing. In times of crisis, when many are going to the wall, technologies (including some that have been around for a while) may be seized on, not as part of an orchestrated general plan, but in much more piecemeal ways, by particular firms looking for means to restore profitability: to reduce labour costs, develop new products or services or enter new markets.

Obvious first targets for automation are processes where labour costs are high, usually because they require scarce skills or workers are well organised. So it is not surprising that skilled print workers were first in the firing line for digitisation, or auto factories for robots. The first companies to introduce innovations can make a killing – getting ahead of their competitors with a step change in increased productivity.

But such advantages do not last long. Once the technology is generally available, it is open to any competitor to buy it at the lowest market price and copy these production methods. A race to the bottom is started, which can only be sidestepped by firms that continue to innovate. It is fanciful to imagine that it would be possible to populate the world’s factories with 2017 state-of-the-art robots and then just leave them to get on with production. Leaving aside the question of how these robots are to be assembled and maintained, there is no conceivable business model that would make this profitable over any sustained period of time.

A much more likely scenario is that vast new industries will grow up to manufacture these new means of production which, like today’s laptops and mobile phones, will rapidly become obsolete and need replacing. These industries will also give birth to new service jobs, involved in their design, distribution, maintenance and in dealing with the unintended consequences of their widespread adoption (such as cyber-crime and new safety hazards).

Current technologies do not just create new kinds of jobs, they also change the way work is organised, managed and controlled. My research has shown that 2.5 per cent of workers already get more than half their income from online platforms. These new organisational models do not just change the way existing jobs are managed but also bring new areas of economic activity within the direct orbit of capitalism, for instance by drawing into the formal economy the kinds of cash-in-hand work done by window-cleaners, dog-walkers, baby-sitters or gardeners. They may not be jobs in the traditional sense, but they are work, with the potential to be organised differently in the future, that can form the basis of profitable new industries.

Another factor that blinkers thinking about the future of work is a failure to see beyond the boundaries of the existing industrial structure and imagine where other new industries will emerge from. Whether it’s the DNA of plants, the human needs for entertainment, sociality and health or outer space, the universe is full of new opportunities for commodification. The question is, can the planet sustain them?