Mercure Brigstow

To be fair, odds are I would have gone to any counter-demo in any city I happened to be in, even if it had been in front of a hotel I didn’t know, had never been to, and was unlikely to. This is just what I do. In fact, I initially thought this was a different Mercure in Bristol – one farther from the centre – which seemed more in line with the government policy towards asylum seekers, which is to get them out of sight and out of mind, so they can be usefully demonised. It was not until the day before the demo that I realised which Mercure it was, and that I had been there before.

Bristol held me. The first time I came to Bristol (save for a stopover in 2007, when I took a walk on the Quayside, saw a film at the Watershed, and was instantly hooked) was in 2014, for – initially – a seminar that was part of a job I hated and regretted taking. What sounded like a dream – tenure-track postdoc, secure, well-paid in a notoriously precarious academic environment, etc., with the possibility of staying on in what was billed as one of the bastions of social democracy – wasn’t; I was bullied and felt lonely and isolated in the sterile, conformist Danish social environment, ripped out from the precarious but dynamic, ever-changing, international circle of friends and acquaintances at CEU. My primary relationship buckled under the pressure of another international move, combined with my disappointment in the job, and the life, that everyone felt I should be settling into. I felt very much the opposite, but struggled to see anything in the future that did not involve some version of the same. The only bright spots involved the possibility of extended research stays in Bristol and in Auckland, two of our project partners, so when a work event got scheduled in Bristol for February, I decided to bring a larger suitcase and not return until spring.

I still remember that first night. I don’t know why we were staying at the Mercure – I think the organisers at Bristol decided it was convenient for downtown and probably fitted the budget. I remember it being one of the nicer hotels I had stayed in – years of precarity combined with the desire to travel meant that I stayed in very cheap hostel and hotel accommodation well into my 30s; while Denmark was the first time my disposable income meant I did not have to worry about money, the country itself was prohibitively expensive, meaning that I was still living more or less the same lifestyle, just paying much more for it. It was not the hotel itself, however, but sitting outside it, on Welsh Back – I had snuck out for a cigarette (sorry!); the night was surprisingly mild, or at least that is how it appeared to me, my blood frozen by the unforgiving Danish northern winds – watching the glistening lights over the canal, that I felt happy for the first time in months.

Bristol melted me. It was not only my blood that had turned to ice over that first winter in Denmark; being in the southwest made me feel human again. It wasn’t only the casual smiles of staff in coffee shops*, or friends I (quickly, and thank you, you know who you are) made; it was also the fact that it was the only place (and to this day, even after more than ten years in the UK, even with the small exception of London) where I felt truly welcome. It was – still is – the only place where people would (occasionally, and casually) ask if I was from Bristol, rather than where I was from. In honour of that, one of my social media profiles still says I am from Bristol.

This, in a sense, is true – I was born in Belgrade, but Bristol made me. The next time I came, in autumn, I saw Nick Cave’s ’20,000 Days on Earth’ – at the Watershed, where else** – went back to my room, and made a decision on how to live the next ten years. The rest, as they say, is history. While most of that history involved living elsewhere (Cambridge, London, and, for the past five years, the north-east), Bristol always felt like coming home.

It is not only that I came at the right time, at the cusp of the upswing of gentrification, but before the major part of the London fallout began. I lived everywhere – from a shared flat above a shop (yep) on Gloucester Rd (a lease I had taken over from a friend who had split up with her boyfriend) to a shared house in Horfield where I rarely saw anyone else to a horrible HMO in Clifton where they insisted the boiler room was an acceptable place to sleep; I stayed in friends’ flats, houses, gardens (usually lovelier than the boiler room). I went everywhere, walking, cycling, on the bus, and on the railway. In between, I bided my days in Copenhagen and elsewhere, waiting to return to Bristol.

It is also true that I was well-positioned as an outsider-in – I was doing research on how universities were engaging with local communities, so this gave me good access to both, at a time when impact had not yet begun to strangle the milder, less instrumentalised public engagement. This does not mean I did not witness, or was not explicitly told about, conflicts emanating from this; as elsewhere, universities (and particularly elite universities) are almost by definition conduits of gentrification. Even from this perspective, I (almost) always felt welcome; Bristol has no suspicion of ‘outsiders’ the way many other places in England do.

It is also not the proverbial ‘mildness’ of the southwest, memorialised in Banksy’s graffiti over Hamilton House. Yesterday, I watched that mildness scale up very quickly when crowds of angry, shouting men decked out in St. George’s flags showed up on either side of our lines, in front of the Mercure Brigstow. No pasarán.

As I said, I would’ve gone to any anti-fascist demo, anywhere. But the fact that someone is trying to prevent people who are, in a very different but very real way, seeking refuge at exactly the same spot where I found it*** – the Mercure Brigstow – meant there was no place I would have rather been yesterday.

*I write this very much aware of cultural expectations of emotional labour, especially in the so-called ‘hospitality industries’. While Denmark has a bit of a reputation for staff explicitly not performing it, which we can also attribute to decent labour conditions and thus the absence of a need to work for a tip, tipping (especially over-the-counter) was not a thing when I first got to Bristol either. People still chatted away in ways that, at least to my human-contact-starved Scandinavian eyes, seemed genuine.

** Of course, I also saw quite a few films at The Cube, including If a Tree Falls, another film that has been very influential on my orientation.

*** I’ve picked up on social media that one of Britain’s racism-loving publications has apparently used a similar angle to justify the far-right racist attacks on hotels hosting asylum seekers – apparently it’s “understandable” that people who have had their weddings there feel aggrieved to see the same places used to host migrants (as you can imagine, with the requisite set of adjectives/qualifiers added, incl. “off public purse” – despite the fact that it is explicitly the policy of the British government to ban asylum seekers from working – and “lounging”, despite extensive reports on how horrific conditions for asylum seekers in hotels actually are). I don’t think the kind of dour-faced conservatism that sees your ‘joyful’ occasion (= wedding) and someone else’s different kind of ‘joy’ (= being able to escape explicit oppression, persecution, starvation, and likely death wherever it is you are escaping from) as mutually exclusive or even hierarchical (and if the latter, then in my view it certainly wouldn’t be the posh weddings that should be prioritised) is worth commenting on, but I do think there is another kind of resentment fuelling the far-right that does merit more attention. Given some things we know about the social composition of the British far-right (leaving aside for the time being the social composition of those who fund and direct it), I think it is more likely that their resentment stems from their own (perceived) inability to afford the exact posh weddings in the exact same hotels that the said article (which I won’t link to) is nostalgically referring to. Which only confirms what we already know: that one of the aims of far-right mobilisation in the UK is to divert the attention of the working/exploited precarious class away from (very needed) economic redistribution and onto attacking migrants and minorities.

Revolutionary time

I don’t recall the last time I slept through the night. The barricades are almost literally under my window; the fact I am between two occupied faculties means I spend most nights at one of the blockades, then being woken up intermittently by garbage trucks (which are sent to clear out the road blocks) and the police (who come to break the blockades, beat, and arrest people). There is also the general noise of a protest movement – whistles, cheers, slogans. I generally let the bin people do their work (in Serbia, these kinds of workers often don’t have a choice), not the police. I got used to the sound of protest, though not sufficiently not to wake up.

A month or even a few weeks ago, this would have been a problem.

For a long time, I’ve been reluctant to write about Serbia, or even answer questions about what is going on. For ethical reasons, I believe in platforming people who are on the ground, and while I am from Serbia – and from Belgrade – I have not lived there for a long time. I resisted being pinned down (domained, as I called it a lifetime ago when I still cared about things such as knowledge production) as being ‘from Serbia’ or a ‘Balkans expert’ or even a ‘post-socialism scholar’ (despite the fact I’ve written a book on Yugoslavia and its successors). Whenever someone asked me for an interpretation or an opinion, I kindly pointed them to people who are actually in the region and who, I thought, knew what they were talking about.

Almost equally importantly, I believe the usefulness of ‘interpretation’ is very limited. The intellectual tendency to focus on knowledge production – on writing, publishing, speaking – in crisis can contribute to the perpetuation of the status quo; this isn’t to endorse a simplistic ‘words vs. deeds’ dichotomy nor to elevate action beyond the status of questioning (after all, we must think about what we are doing), but to note that, for most of the past decade, most contexts I have encountered have had an overabundance of critique, and a corresponding dearth of action. I have, in fact, written about that too – a whole PhD and some – but in the past year(s), I’ve become very reluctant to offer any interpretations, and have instead focused on calling upon people to act.

Now I must, though, not because I have an investment in the position of an ‘intellectual’ (you can read about my departure from that position here and here), but because the situation calls for it. Not because it needs interpretation, but because it exceeds it.

So, confession number one: this situation has exceeded my interpretative capacity.

I’m conversant in roughly five disciplines, so this isn’t for a lack of options. Sure, I can give you political philosophy names for what’s going on, and sure, I can also give you a Marxist perspective, or maybe a slightly longer-durée historical/political economy explanation. I can waffle on about semi-periphery, extractivism, necropolitics like a champion (I have). Like a retired lute player, my mind occasionally touches the floating signifiers of the concepts others have used: Badiou’s ‘event’…Žižek’s working through of Lacan’s subjective destitution with a ‘radical act’…Butler’s performative theory of assembly…Honig’s morphing of inoperativity (Agamben) and power of assembly (Butler)…and Clover, Osterweil, everyone on riot.

Unimportant.

None of the concepts match stuff I have seen.

Here, another confession.

Yes, as far as identities are concerned, I am an anarchist, and yes, I am an anthropologist, and yes, I am also a Buddhist, but I have always been ambivalent about the idea that people are inclined to do good in situations of crisis. I believe people are inherently capable of acting in any number of ways – selfish, altruistic, appropriative, non-proprietary, exploitative, generous – and that it is a complex set of circumstances and evaluations that decides how they will act in specific situations. Hell, my current research – Uncategorical imperatives – was motivated by the belief we’d better find out how we tend to act (or: morally reason about action) in crises before climate change-driven exploitation and wars collapse into global disorder.

Whoops.

But (trigger warning: a Blade Runner ‘time to die’ monologue)

What I have seen on the barricades (blockades) in Belgrade surpasses what I know about social movements, informal organisation, or human behaviour. Don’t get me wrong – I have been going to protests since my mid-teens, including in 1996/7, 2000, and many, many others. I have been in protests in Budapest, the UK, the US. I also visited the occupied faculties in Belgrade this winter. I’ve been in protests in early June. I’ve watched, half-crazed with worry, the footage of the 15th March and 28th June protests.

This is different.

I have seen examples of mutual aid, solidarity, restraint, and self-organisation that go beyond what textbooks on mutual aid, self-organising, and community building tell you. I have seen examples of courage, protection of the weak, and having each other’s back that I have only seen intermittently before, except that they are now sustained and unquestioned. I have seen or heard, narrated, first-hand, forms of creativity, ingenuity, and resistance that I have read about, but never thought possible. I have seen anarchist theory, in practice.

Confession number three. I never thought this could happen.

Yes, I read the reports, and for a long time, all of that was encouraging, but not impossible. Everything fit the predictable range of human behaviour – the range that follows from identifying a common enemy and organising around a common cause. There was a foreseeable scope of outcomes.

This is different.

Please don’t try to explain, as I still can get angry. It’s different.

One of the things I remember reading (I can’t recall the reference – it might’ve been Rosa Luxemburg? – and anyway it no longer matters) is that the revolution does not involve only changing the system; it involves changing you. You will, literally, not be the same person after the revolution. We are no longer the same selves, and this is coming from someone who already believes identities are an illusion, and not a very interesting one.

But it is still tangibly different.

I know people tend to lose themselves in historical moments. (Here is Badiou’s event again). I’ve read endless accounts of transformative experiences, from 1917 and 1968 (both in the Yugoslav republics – also, my mum participated – and beyond), to Occupy, to the first environmental protests, Seattle, Gezi Park, Tahrir, you name it. I understand that. But this isn’t a ‘change’, be it ‘regime’ or ‘social’, or even ‘system’. I’ve witnessed happenings – some personal, like love, some interpersonal, like death, some communal, like concerts, some spiritual – that have come close to transcendence. This is not it.

It is also not the (a)voidance of doubt, of incertitude, or anything that seems like discomfort. It is the realisation that this is what we do now. This is how we live. There is no ‘end’ or ‘goal’ or even ‘victory’. There is no teleology.

This is revolutionary time.  

Confession number four. I am afraid.

Some days ago – I do not recall when, where, with whom – I was at a protest, and, as is my habit, tried to get the person with me to shift to the fringes. You see, I am uncomfortable in crowds, and my strategy – for close to 30 years now – has been to avoid being kettled, so I can quickly run if the cops descend. It has kept me mostly safe, with one small exception.

Now I stand. Not because “the movement” or “the revolution” is transcendent, worth sacrificing for, or because I’ve lost myself in the adrenaline of the crowd. Because this is a choice. A choice that came earlier, and slightly differently, than I expected (I honestly thought I’d die fighting masked government agents in the UK or the US, but here we are). I stand because this is what we do now, and in perpetuity.

P.S. I thought it important to add a few remarks, lest people start thinking I’ve completely lost critical capacity (no, I’m just very underslept :)) – critical in the sense of a critical friend, not of someone looking to form an intellectual position. These are meant to highlight some areas for further work, especially if/when the blockades morph into more long-term forms of organising:

  • the fact that homophobic language (crowds occasionally chant or spray “gay” as a derogatory term for Vučić) regularly makes an appearance should be addressed immediately. Homophobia is not funny, not even as a throwback to that mid-90s high-school playground vibe. The country should really move on from there, not only because nobody wants to go back to the mid-90s, but also because homophobic violence is still alive in Serbia.
  • the resurgence of ethnic nationalism, not only in terms of rhetoric (which, even if it is your thing – it certainly isn’t mine – is entirely and utterly politically useless, given that the only national(ist) project Serbia could conceivably pursue entails trying to reclaim Kosovo, which no-one in their right mind would want to do), but also in terms of (again) shouts and slurs; e.g. shouting the derogatory term for Kosovar Albanians at the police (reminder: the predecessors of those forces were actually engaged in committing crimes against Kosovar Albanians, so the slur is not only racist/chauvinist, it is self-defeating). The same goes for shouting at police to “go to Kosovo” – it should really go without saying that organising against police repression in your own country makes little sense if you are at the same time encouraging the police to go and repress the people of another.
  • glorifying masculinism and masculinity – while elements of this tend to be prominent in all protest movements that feature substantial physical labour/force (e.g., flipping over heavy garbage bins, etc.), it tends to erase (a) the fact that most of this work is *also* done collaboratively (in fact, the most recent instance I’ve witnessed was performed by a very gender-mixed team) and (b) the relevance of organisational, logistical, and communication labour, most of which seems to be performed by women and non-binary folk. This has been accompanied by an unprecedented platforming of men (as ‘heroes’, speakers, leaders, experts, commentators, whatnot), often with names, while women mostly appear as a generic category (“young women [devojke]”, “student [studentkinja]”). While there are good reasons to stick to anonymity in times like these, this should be equal across the board. There is a good lesson to learn from the Zapatistas here, whose ‘Revolutionary Law of Women’ was the first and integral part of the Chiapas rebellion, not an afterthought.

Refuse, restrict, redirect

On stepping away from the academic treadmill

This post is written at the start of the academic year 2024/25, another year that everyone in academia is approaching with a sense of dread. This is the year in which we are facing institutions’ inability or unwillingness to condemn the genocide that we’ve spent the past year witnessing; their lack of capacity to divest from companies and systems that enable and perpetrate it; and their, conversely, willingness and readiness (kudos to exceptions!) to crack down on students and staff who dare to stand up and at least call out the violent collapse of all political norms. Speaking of collapses, we are also witnessing the acceleration of climate collapse, which institutions sometimes pay lip service to, but do little to stop or challenge.

In the UK, what has been described as a financial crisis but should in fact be dubbed higher education’s crisis of legitimacy is becoming apparent, as institutions introduce redundancies, including cutting the very staff whose publications, careers, or successes they proudly displayed on their home pages; burnout and what has (in another British penchant for euphemism) been dubbed the mental health crisis but should in fact be called the necropolitics of academic labour continues to take those ‘lucky’ enough to escape the cuts; and there is no conversation – absolutely none – about what it is exactly we are doing, how and why, nor where we hope to be in 5, 10, or 25 years.

It is also the academic year I will begin working part-time. The reasons for this decision (as for any decision) are complex, but they mostly have to do with coming to terms with what I believe to be the moral, political and, if you wish, ontological implications of the above. The assumption that you should always want more – money, status, publications, prestige – goes so unquestioned even in parts of academia that like to think of themselves as critical that willingly and visibly choosing less of any or all of these things tends, at best, to elicit incomprehension; at worst, fantastical hypotheses. In light of this, I thought I could share some useful or adaptable1 ideas on how to create space between yourself and this context, to enable you to survive it – and, hopefully, generate alternatives that are healing, constructive, and revolutionary, rather than harmful, destructive, and reproductive of the exact same systems of oppression. This means I hope these ideas can be repurposed for whatever circumstances you find yourself in. They are, however, generated from my specific positionality, values, and experience; this means they are unlikely to apply to your situation verbatim, even if we occupy structurally similar positions.

  • Refuse forms of recognition and validation that tie you to or make you dependent on exploitative2 institutions. 

There are reams and reams of paper written on the neoliberal techniques of measuring, (e)valuating, and fostering competition between people. Somewhat less has been written on the degree to which academics internalize them. We are all guilty of this. I am as well, despite investing a lot of effort to counter this tendency, as well as literally having written a PhD on why it happens, and why we cannot see it (yes, academia makes you stupider). The first step in moving away from the grind, then, is refusing to be judged solely or primarily by these optics and metrics, and developing alternative forms of valorisation, justification or, simply, reasons to exist (and I mean, especially for women, forms of valorisation other than care labour).

For my part, refusing institutional (de)valuation was not exactly a choice: my own institution made it perfectly clear where in their hierarchy of human beings I belonged (about 4 spine points or roughly £4,000/annum below men), and then persisted with differential (de)valuation over the next few years. In this kind of situation, you basically have two options. One, you can accept/internalize the norms of the institution even when they are arbitrary and discriminatory (as my research demonstrates, intersectional bias will persist even in ‘soft’ evaluations), and either doubt your own competence, or work yourself to death by trying to overperform to reach the standard that differently-bodied, -accented, and/or -skinned (select combo) colleagues satisfy just by existing. Or you can choose to develop an internal (moral, intellectual, whatever you wish to call it) compass and decide what kinds of work, output, and engagement you truly value and find compelling; what kinds of topics, causes, and individuals merit your time, and to which you really have something to contribute; and, perhaps, what kind of work will make the world a better place. Of course, no-one (or close to no-one) is lucky enough to be able to do only this sort of work; but you can certainly make the decision to limit your dedication to the mechanisms of your own exploitation and channel that energy into something else. Which brings me to (2):

  • Restrict the access of exploitative institutions (and individuals) to your time and energy.

This, for the purposes of this post, can primarily be coded as time and energy invested in intellectual labour, but the logic is transposable to emotional labour (hint: that one friend who always expects you to help them navigate life’s dramas) or cognitive labour (hint: the amount of time you spend scrolling on social media, both generating income for digital platforms and training their or third-party algorithms – hence labour – and literally expending energy, both by directing attention and by actually consuming resources, from electricity to food and water).

As Marxist political economy teaches us, the nature of capitalism is such that it must generate profit (something it is increasingly failing at). In order to do this, it must extract more and more of your work for at least the same if not lower wage. This means that, even if nominally your working hours remain the same, you are – quite likely – working more. Furthermore, due to the nature of academic labour, it is relatively easy for this work to colonize other aspects of your life. As I’ve written before, your interest in, say, disability justice may be objectively independent of your relationship to your employer. But if a) your increasing awareness of disability justice can be converted into ‘EDI’; b) you will be using it to teach, publish, or cite in any way that reproduces academic capital (for instance, by publishing in peer-reviewed journals, or citing academic publications); c) you will be reading in your own spare time (does your workload feature an allowance for reading or ‘scholarship’?); in other words, if any or all of these apply – congratulations, your employer is benefitting from free labour. Yours.

As the example(s) above demonstrate, it is almost impossible to remain in a paid relationship and not be subject to this form of exploitation. This is why this is about restricting, not refusing entirely; of course, if you are entirely independent of paid (or waged) labour, that’s great, but it is not the reality for most people. Restricting can take a variety of forms (needless to say, they are not mutually exclusive). One pretty standard form is so-called ‘working to contract’, where employees refuse to perform work or tasks outside of those specified in their contract. Members of UCU in the UK have practised working to contract as part of Action Short of a Strike in a series of recent industrial actions pertaining to pay and pensions (the viciousness with which some universities have responded to ASOS is a sad reflection of how much working above contractual obligations has become normalized). Another is ‘quiet quitting’, which has become a buzzword reflecting the growing realization that, to borrow the title of Sarah Jaffe’s brilliant book, work won’t love you back (more on why not to quit quietly on some other occasion).

But even just (‘just’) saying no assiduously to demands that overstep that boundary works. This isn’t about being ‘selfish’, or prioritizing your own gain (academia gives you plenty of opportunities for that). Very simply, when faced with a demand, ask: who does this serve? What purpose does it serve? Is this a purpose I can get behind? What is the best way in which I can contribute to this purpose? My guess is that, in some contexts at least, you will begin to see that the purpose you believe you are contributing to – for instance, making the world a better place – is better served through other forms of engagement (if it’s not, great, you’re lucky). Which brings me to (3):

  • Redirect the labour time, resources, and energy into something else – ideally something that does not serve the reproduction of capitalism.

Now, of course, many people do work two jobs – including two full- (or close to full-) time jobs – either because this is the only way they can make ends meet or because what they are actually passionate about or care about does not really pay (or not yet, or not enough). Equally revered and reviled – depending on which side of neoliberalism you fall – this approach is often contrasted with the security of a full-time job. Under traditional conditions of industrial capitalism, this, of course, makes some sense: a full-time permanent job equalled protections for pay through collective bargaining, benefits, pension and sick leave (and even health insurance, in some cases); in socialism, it even meant collective holidays or access to specific holiday sites (known as ‘corporate perks’ in capitalism); and, of course, it also meant – at least for those working in organizations that are not authoritarian or top-down – the possibility to work together for a future, in other words, to collectively decide what the organization was meant to be about (the meaning of co-op).

I certainly do not need to rehash all the reasons why this is no longer the case. But in addition to oft-repeated diagnoses like ‘neoliberalism’, “Thatcher” or “Toreeeys”, another element appears: the fact that even absent some of these conditions (neoliberalism is visibly dying, though it is hell-bent on taking you with it) few people have the energy, willingness or vision to build a world that’s more than just a dusted-off version of the old one with, you know, slightly better tech (NHS with in-app prescriptions or good pensions with access to online banking).

This has been the slowest and possibly most painful realization for me since moving permanently to the UK, some ten years ago. Most people’s imagination of alternatives is so depleted that the best it can come up with is a slightly less terrible version of the existing order, if not a return to its earlier form (something proponents of geoengineering and other technosolutions realize). Mark Fisher, who had a knack for being a canary in the coal mine, encapsulated it well in the phrase “capitalist realism”. But it’s not (even) that the steady colonization of the lifeworld by forms of economic exchange has proceeded to the degree that few people are able to imagine alternatives; it’s that I strongly suspect they would not know what to do with them.

The problem with alternatives, as you learn if you grow up in (real) socialism and/or live in communities that share labour equitably, is that they are not perfect, and they also require hard work. Visions of a post-capitalist utopia where all work is performed by machines are both ludicrous and unsustainable (if nothing else, in terms of climate-wrecking resource extraction). This work, at times, can feel as uninspiring and as gruelling as in capitalism (let’s be honest, no-one likes cleaning toilets); if it is just and equitable, there are no unseen ‘others’ – migrants, women, underpaid research assistants, good citizens – to offload it to. For a lot of people, the preferred option then begins to be selectively shutting your eyes and pretending not to see your own implication in reproducing these systems, whilst making meek pronouncements about commitments to social justice or equality or even the good of non-human others, provided it can be done from the safety of your own home, Netflix and Amazon accounts, and Deliveroo meals.

What I want to propose as an antidote to this loss of a world-building capacity is a version of what James Scott dubbed, a while ago, ‘anarchist calisthenics’, but with a twist. Instead of imagining challenges to authority/status quo, I believe we must, every single day, engage in practising existing differently. I also think this need not (necessarily) take the form of ‘transgression’ or violation; many ways of existing differently are not explicitly proscribed. Perhaps we could dub this ‘existentialist calisthenics’.

One way to start practising existing differently is engaging in simple acts of not contributing to capitalist reproduction. For instance: instead of going ‘shopping’, go for a walk, but not with an intention or purpose or to ‘exercise’ or to ‘think better’. Just walk. Or do nothing: as Jenny Odell, among others, has written, not succumbing to the dictate of constant busyness can be surprisingly difficult for people who have got used to being constantly plugged into the digital capitalist machinery (I recently learned that zoomers have become so unaccustomed to, as Pascal would have put it, coexisting with their own thoughts that apparently there is a term for not distracting yourself endlessly during car or plane rides – ‘rawdogging’.)

For instance: instead of spending the weekend preparing for the work ahead, or doomscrolling in an attempt to postpone this work, sleep. Or hang out with friends. Or go to a library and pick up a random book, spend ten minutes reading it, and then return it to the shelf. Do this several times over. Do not do this in order to “select one to take out” or “inform yourself about” or “see what else is new in”. Do it without purpose. The whole point is to break the cycle of ‘usefulness’ or ‘purposefulness’, which has, for most people, come to stand for ‘service to the capitalist economy’. You don’t necessarily need to go to the lengths of spending the weekend painting banners or distributing meals to the homeless or protesting the war in Palestine (though, as you learn to reclaim some of your personal time from the circuits of production, you may find out that there are more worthy ways of investing it than doomscrolling or spending money). 

Making a conscious decision not to invest your energy and time into something that feeds the system, and to redirect it into something that does not, is the first step off the treadmill. It is, of course, even better if you do something that helps other people, non-humans, and causes, even if it’s a tiny thing: plant some flowers, pet a cat, chat to a person in the street. These small acts of redirection – out and away from the circuit of capitalism and into something else – will help sustain your ‘world-building capacity’, your ability not only to dream about a different world (which we are all prone to doing, given how terrible the one we inhabit is), but to begin to create it.  

P.S. It’s important to note that I believe these three steps need to go together, and in sequence: just refusing the validation systems, methods and ceremonies of capitalism (How much do you earn? How many followers do you have? How thin, or coiffed, or made-up – by which we mean, how much money have you spent on looking the part – are you? How successfully do you perform the usually unpaid labour of care, whether by parenting, cleaning, cooking, or just making capitalism look nicer?) will probably leave you feeling empty or lacking purpose (plus, possibly, deflated, once you realize how much of your life has been dedicated to them). Just restricting your expenditure on capitalist forms of (re)production will probably leave you with a much larger volume of time and energy, which is obviously fine – most of us have been so wrung out by the constant competitive demands of capitalist overwork that everyone can benefit from a bit of extra time to recover, heal, and care for oneself. After that, however, you will probably feel the need to channel that energy somewhere. Old work demands will be quick to offer you relief from the shocking freedom of your own time. Redirecting this time and energy – even if it’s 10 minutes each day or one hour every month – into something that serves to dismantle these oppressive systems, or that helps other humans, non-humans, or the planet, will both make it easier for other people to exit them, and for you to resist being sucked back in.

More about how to do that in some future post. For the time being, start practising. 


  1. I would ask you to suspend, if only for the time it takes you to read this post, the impulse to think about all the ways in which we are different (“easy for you, you don’t have children” or “maybe you can do that, you don’t have a student loan” or even “ah but it’s different for those in Russell Group institutions”), and focus on what we might have in common – or on what, despite differences, you can use to create your own version. Before I move to these, however, I want to clarify two major structural affordances, which we do not discuss enough: migration status and finance.

    Migration status: migrants on a Skilled Worker (Tier 2) visa in the UK are required to work full-time, for a single employer. This is the visa I have been on since I started working at Durham, having switched from Tier 4 (Doctoral Extension Scheme, which has a similar set of rules). 
    On a Tier 2 visa, your right to reside in the country is dependent on your employment status, which is dependent on your employer. So, for instance, if you lose your job – or for any reason, for instance, injury or partial disability, become unable to perform it on a full-time basis – your right to exist in the UK is automatically terminated. You are also not eligible for benefits, as the little sentence “no recourse to public funds” reminds you. In the eventuality that, say, you contracted Covid in the course of doing your job, developed long Covid, and as a consequence became incapable of working full-time, you would receive a kind letter from the Home Office giving you about ten days to leave the country. This, obviously, puts migrant workers into a slightly disadvantaged position. This is in addition to financial inequality (visa application fees, which few academic employers cover, plus the Immigration Health Surcharge, which, to the best of my knowledge, none do, mean that every single migrant worker is by definition between £2,500 and £5,000 poorer than their hypothetical non-migrant counterpart hired on the same salary – and that’s if they don’t have dependents). It also, needless to say, makes the stakes in retaining our jobs – assuming they even meet the minimum income threshold for Tier 2 visas – quite high.
       
    In 2023, I switched to the Global Talent visa, which allows far more flexibility in terms of employment (in itself a telling reflection of the UK’s tiered immigration system), after which I became eligible for Indefinite Leave to Remain, the legal residence status that gives one rights similar to those of full citizens. The sheer feeling of relief came as a surprise even to me – I had not realized, up until that point, how much anxiety I had carried around my immigration status; as a relatively privileged, white, highly educated and securely employed person, I always compared myself with migrants in significantly less secure positions. Now, as anyone who has worked with me will testify, I am hardly the type not to raise their voice when something is unjust or could be made more equitable. But the difference made by knowing I am not legally indentured to my employer came as a shock, not least because it made me re-appraise the absence of agency among people who did not have the same kind of legal constraint.

    Finance: in the summer of 2024, I was promoted to Associate Professor. This meant I was able to drop my working hours without a significant loss of income (though, of course, I did not know this would happen at the point when I chose to reduce them). It also, of course, meant forfeiting the additional salary. I had done similar things before, on several occasions: one involved leaving a prestigious tenure-track postdoc (in Denmark) to pursue a second PhD (on a doctoral stipend); another involved leaving a tenured position (in Belgrade) for, initially, a visiting fellowship (at an international university in Hungary). More on how to plan for this, what to do, and what not to do, on some other occasion. At this point, one thing worth remembering is that a chunk of your expenditure is probably oriented towards mitigating the effects of (over)work. As Benjamin Franklin said, whenever faced with a choice between liberty and security, choose liberty; otherwise, you end up with neither. ↩︎

  2. We could spend another 10,000 words just discussing the meaning of ‘exploitative’ (a term which, like any other here, I use casually, this being a blog post). If you’re interested in the exegesis of concepts, try my academic work. Given that this isn’t academic work, I would say that ‘exploitative’ does not apply to just any relationship in which you give more than you receive (clearly – in some cases, such as parenting, reciprocity is impossible), but to any relationship that tries to extract more than you had committed to, are contractually obliged to, and had agreed to give (of course, ‘agreed to’ involves a lot of variation, depending on whether we see choice and consent in purely liberal or somewhat more nuanced terms).

    In this sense, exploitative institutions are institutions that, for instance, normalize invisible labour and keep it invisibilized (see: care). Exploitative systems are systems that make your participation in them (for instance, capitalist economy) conditional on willingness to accept some forms of exploitation, regardless of whether done by you or to you, or, frequently, both (see: white feminism and outsourcing of care to migrant, often ethnically-minoritised women, for instance).

    Let me be clear: I don’t think all forms of labour – perhaps even under capitalism, which is a system based on exploitation – need to be exploitative. But I think most are. I also do not think (despite the academic tendency to allocate all responsibility to “management”) that exploitative relations are limited, or necessary, to explicitly hierarchical relationships. You can have non-exploitative supervisors, and you can have exploitative peers and even (though this is rare in hierarchical systems) ‘juniors’. Nor are organizations, institutions or collectives exploitative by necessity. However, under contemporary capitalism, many are. It should also, at least by now, go without saying that certain characteristics mean you are more likely to be seen as exploitable, including by people who may nurture perfectly equitable relations with others. ↩︎

Faraway, So Close

My best friend and I used to finish every party at my place sitting by windows flung wide open, feet propped up on the ledge, smoking, listening to music, and waiting for the dawn to break. Staying behind to help with the dishes was, back in the day, the ultimate token of friendship: my family did not own a dishwasher (ours broke down sometime in the early 1980s, and it would not be until the mid-2000s that political and financial stability were sufficient to buy a new one), and there was always a lot of cleaning up to do after a party. These early-morning moments became our after, where we could watch the day rise, safe in the knowledge that both the mild ignominies and the larger embarrassments of the night before had been put to sleep, together with the dishes.

One of the songs we used to listen to in such moments was U2’s ‘Stay (Faraway, So Close!)’. I’m not sure whether this was before U2 Sold Out or Became Uncool, or whether we were just too cut off from that iteration of the ‘culture wars’, in a country still called Yugoslavia and deep in the throes of an actual war, to notice or care. Or maybe we were just a little too enamoured of Wim Wenders’ ‘Der Himmel über Berlin’ (‘Wings of Desire’ is its English title, sadly probably one of the worst translations ever) or its eponymous sequel, for which the song was recorded.

The period between these two films was also the period during which the events that would mark our childhoods unfolded. ‘Wings of Desire’ was shot in 1987, in a Berlin whose dividing line would soon turn to rubble. ‘Faraway, So Close’ premiered in 1993. Longer-brewing political conflict in what was then known as the Socialist Federal Republic of Yugoslavia surfaced in 1988/9, and transformed into a full-scale war in 1991.

In 1991, Croatia and Slovenia declare independence. The first anti-Milosevic protests happen in Belgrade. Almost everyone I know is at these protests.

Yugoslav army forces enter Slovenia. Two Serb secessionist entities form in Croatia and in Bosnia. All sides are armed.

In 1992, the siege of Sarajevo begins.

It will take another three years until the Dayton Peace Agreement, and another ten until the war is effectively over. I was eight when I confidently declared to my father that I thought Slovenia would secede from Yugoslavia, and twenty when the war ended. I spent most of my childhood and teens alternating between peace and anti-regime protests, and navigating the networks of violence, misogyny, and hate that conflicts like these tend to kick up. In my late twenties and early thirties, part of my career would be dedicated to dealing specifically with post-conflict environments; and so, in the broader sense, was my book.

At any rate, as we sat by the window ledge sometime between the second half of the 1990s and the first half of the 2000s, the lyrics of the song precipitated the whole wide world – in stark contrast with the fact that sanctions, visa regimes, and a plummeting economy made it exceedingly difficult to travel. Those who did travel mostly did so in one direction.

Faraway, so close

Up with the static and the radio

With satellite television

You can go anywhere

Miami, New Orleans

London, Belfast and Berlin

Sometimes, we would swap ‘Belfast’ for ‘Belgrade’, just for the fun of it, but also to make clear that we considered our city, Belgrade, to be part of the world. The promise of connection, of ‘satellite television’ (watching MTV through one of the local channels). The promise that there is a world out there, and that just because we could not see it did not mean it had disappeared.

In the intervening years, I would go to London, Belfast, and Berlin (I’ve still not been to Miami or New Orleans). I would live in London – briefly – and also, more permanently, in Oxford, Budapest, Bristol, Copenhagen, Auckland, Cambridge, Durham, and Newcastle. My friend, though she would travel a bit, would remain in Belgrade.

***

It is 3 May 2023 in Belgrade, 8AM Central European Time (CET). CET is one hour ahead of British Summer Time (BST), which is the time zone in the northeast of England, where I normally live. It is also six hours ahead of EDT (Eastern Daylight Time), which is where I am, in upstate New York. I am here on my research leave – that’s sabbatical in British English – from Durham University, at Bard College. It is 2AM, and I am sound asleep.

At this time, at the entrance of an elementary (primary + lower secondary) school in Belgrade, a 13-year-old opens fire with a semi-automatic rifle, killing a security guard and injuring two students, before moving down the corridor on the right to the classroom on the left, where he opens fire again, injuring a teacher and killing another eight students. The classroom is my classroom – my ‘homeroom’ between 1992 and 1996. The school is the elementary school my best friend and I both attended from 1988 to 1996.

It is 7AM EDT in Red Hook; 1 PM CET. I wake up, going through the usual routine of stretching-coffee-breakfast. I go for a run. I do not check social media, because I need to focus on the talk I am giving that afternoon. The talk is part of my fellowship at the Hannah Arendt Center for Politics and Humanities at Bard. It is on spaces and places of thought and violence.

It is 12PM EDT, and 6PM CET. I’m having lunch with Anthony, who’s a friend and also the editor of The Philosopher, the journal whose board I’m on, and another member of the editorial board.

It is 4PM EDT, and 10PM CET. I’m giving the talk. It’s entitled ‘How to think together’, and it’s a product of anything from two months to twenty years of thinking about how to coexist with others, including across political difference. [you can watch the recording here].  

It is 10PM in Red Hook. I have just come back from the post-talk dinner, buzzing from pleasant conversation and the wine. I log on to social media – I see nothing on Twitter, and then, for some reason, I log on to Facebook, which I rarely use (mostly for friends and family in Serbia).

It is 4AM on 4 May in Belgrade. Flowers have been laid; candles lit; vigils held. My friends have hugged and held each other. All of them (quick check on Facebook) are safe, as are their children, who go to the same school. All of them are safe: none of them are OK.

And, for that matter, neither am I.

***

What is the purpose, the value of mourning at a distance? As the week unfolds, I turn this question over and over again in my head, my ethical, normative, political and affective registers crashing and collapsing against each other.

From “I have no right to mourn, I wasn’t even there” to “I wish I could have been there, and I wish I could have taken at least one of those bullets”.

“These kinds of things happen in the USA all the time, why am I suddenly so impacted by this?”

From “Fantasies of self-sacrificing heroism are a wish for immortality/covert fear of death, cut it out” to “There is nothing I can do nor any use I can be of from here, feeling this way is self-indulgent”.

From “I want to go home” to “Home is the north of England, what difference would being there make?”

What right do I have to mourn from a distance?

What does distance do to a feeling?

***

Distance, proximity, detachment and engagement have been among the key themes of my thinking, writing, and, inevitably, life (this blog, for instance, was born out of exploring these themes in both theory and practice). Away is both a mode of escape or distance, and of sustaining desire: being seen but not held (too tight), acknowledged but never (fully) known, alone but never isolated. Or at least that was the ideal. As years went on, it became less and less a moral, ethical or aesthetic choice, and more a simple fact of life. Academic mobility combined with endless curiosity meant I accepted – and, to be honest, welcomed – the constant movement. I regretted that relationships broke apart because of this; I reluctantly accepted that my dislike of heteropatriarchal, monogamous, nuclear family patterns as fundamental social units meant I was likely to struggle to form new ones, especially as more and more friends were having children. A fact of life then became an adaptation strategy: to accept the impermanence of all things; to always have one foot out of the door. Ready to detach and withdraw, there for people should they need me, but not to burden them with my presence, or needs. Or feelings.

Congruent with my other beliefs, being away quietly stopped being a location, and became an answer. 

This mode of inhabiting the world resembles what Peter Sloterdijk, in The Art of Philosophy, frames as being “dead on holiday”: the practice of studied detachment that first came to define the social role of professional thinkers. This position entails the denial not only of bodily functions and of mortality, but also of time itself; to take up semi-permanent residence in the realm of pure forms means exiting human time as it is known. The theme of exiting human space/time is, of course, common to all ‘otherworldly’ practices and belief systems – from Greek philosophy to Christianity to mysticism and whatever happened in between (or outside: after all, we do not have to conform to human time). This, of course, is also what both of Wenders’ films are about: distance, and desire, and time.

Hannah Arendt, who engaged with this dichotomy and its implications before Sloterdijk, notes that this position – as conducive to thinking as it is – also means we remain isolated from others:

Outstanding among the existential modes of truth-telling are the solitude of the philosopher, the isolation of the scientist and the artist, the impartiality of the historian and the judge (…) These modes of being alone differ in many respects, but they have in common that as long as any one of them lasts, no political commitment, no adherence to a cause, is possible. (…) From this perspective, we remain unaware of the actual content of political life – of the joy and the gratification that arise out of being in company with our peers, out of acting together and appearing in public.

(Arendt, ‘On Politics’, 2005: 62).

Arendt argues that this is what keeps the realm of thought – ‘pure speculation’ – separate from politics. Theorizing rests not only on the ability to distance oneself from the immediacy of reality (something Boltanski explores in On Critique), but also on the ability to suspend judgment; that is, to retain a sufficient degree of distance/detachment from the object (of contemplation) so as to be able to comprehend it in its entirety.

The PhD I wrote in 2019 explored this complex operation insofar as it is involved in the production of critical social theory, in particular the critique of neoliberalism as a concept [a concise version, in article form, is here; I drew on Boltanski, Chiapello, Arendt, and Sloterdijk, but also went beyond them]. I called it ‘gnossification’: the tendency to turn complex, ambiguous, and affectively-loaded phenomena into objects of knowledge. This isn’t simply to ‘rationalize’ or ‘explain away’ one’s feelings: we can be blindest about our own feelings when we confront them, as it were, head-on. The point is that gnossification also performs the affective work of creating and maintaining that distance, by the mere fact that it locates our field of vision in our own interiority. It literally produces (affective, perceptive, cognitive) space. And because space is relational (or, as Einstein would have put it, relative), it both requires other objects and cannot but treat them as such.

(If you’d like to hear more about this, I’m always happy to expand 😊).

But doing theory or philosophy is not the only way one can take up semi-permanent residence in the realm of the dead. We can do it through relationship choices (or avoidance of choices). In On Not Knowing, Emily Ogden encapsulates this beautifully and succinctly:  

It is not only in death itself that we encounter the temptation to prescind from life. What it means for death to claim us is that the sterile round of our routines claims us. We no longer see the point or the possibility of a pleasant surprise…Death claims us in the passion some of us have for disposing of our lives, equally in the taking of excessive risks and the settling of marriages. And those two things are not even incompatible: it is possible to ‘sow one’s wild oats’ in the name of settling down. Put me, I beg you, in a rut.

Ogden draws extensively on the work of psychoanalyst and philosopher Anne Dufourmantelle. In In Praise of Risk, Dufourmantelle characterizes this kind of strategy as concerned with avoiding the inevitable ambiguity of existence:

the risk of ‘not yet dying’, this gamble that we will always lose in the end, but only after traversing life with more or less plenitude, joy, and most of all, intensity.

Or, of course, pain.

***

To mourn from a distance: to recognize that no amount of distance – linguistic, conceptual, geographical, emotional – can protect us from the pain of others.

To love at a distance: to know that feeling has no natural connection to proximity, and that this is not the answer but the beginning of a question or, more likely, the question: how to care for others – and to let them care for us – even if we have chosen not to be physically close to them.

To feel at a distance: to understand that it is possible to want to feel the pain, joy, and fear of others, not as a spectator, seer, or helper/healer, but because this is what love – and friendship – is.

***

Friendship, Derrida writes, is a contract with time. In friendship, we make a pact of lasting beyond death. We know our friends will remember us even after we die. And, reciprocally, we accept not only the cognitive but also the emotional task of keeping their memory alive: in simpler terms, we accept we will both remember and miss them.

To love is to accept that there are objects whose presence is felt regardless of whether we have chosen them as objects of contemplation. It is to receive the reminder that things can’t be ‘switched off’, even for those of us with significant training, capacity, and experience in doing so. To love means, essentially, to live with others even if we choose not to live together. For someone whose longest and probably most successful relationship was predominantly long-distance, but who was also taught to associate this tendency with narcissism and avoidance of intimacy, this is a difficult lesson.

Back in the early oughts, on a website called everything2 (think anarchist – no, chaotic – Wikipedia, but with stories, poetry and fiction interspersed with information), there was a post written from the perspective of someone spending the winter at one of the research stations in Antarctica (yes, this was a job I’d considered, and were it not for the unfortunate fact of a Serbian passport, would have still very much liked to do). I can’t reproduce much of the post – I didn’t save it, and repeated attempts over the years have failed to resurface it – but I remember the line on which it ends: “I still see you, and I love you very, very much”. The point being that distance, at the end of the day (or the end of the world?), makes very little difference at all.

Being dead on holiday officially over, I begin to pack to go back to the UK, and thus also to leave – even if temporarily – the US, which now holds most of these realizations for me. Not screaming ‘Behold, I am Lazarus’, because this is not a miracle, not even a tiny one. It is more of a coincidence, a set of circumstances, though thanks will be given where thanks are due, because I owe this to so, so many people. You know who you are, and I love you.

You can’t ever go back home again

At the start of December, I took the boat from Newcastle to Amsterdam. I was in Amsterdam for a conference, but it is also true I used to spend a lot of time in Amsterdam – Holland in general – both for private reasons and for work, between 2010 and 2016. Then, after a while, I took a train to Berlin. Then another, sleeper train, to Budapest. Then, a bus to Belgrade.

To wake up in Eastern Europe is to wake up in a context in which history has always already happened. To state this, of course, is a cliché; thinking, and writing, about Eastern Europe is always already infused with clichés. Those of us who come from this part of the world – what Maria Tumarkin marks so aptly as “Eastern European elsewheres” – know. In England, we exist only as shadow projections of a self, not even important enough to be former victims/subjects of the Empire. We are born into the world where we are the Other, so we learn to think, talk, and write of ourselves as the Other. Simone de Beauvoir wrote about this; Frantz Fanon wrote about this too.

To wake up in Berlin is to already wake up in Eastern Europe. This is where it used to begin. To wake up in Berlin is to know that we are always already living in the aftermath of a separation. In Eastern Europe, you know the world was never whole.

I was eight when the Berlin Wall fell. I remember watching it on TV. Not long after, I remember watching a very long session of the Yugoslav League of Communists (perhaps this is where my obsession with watching Parliament TV comes from?). It seemed to go on forever. My grandfather seemed agitated. My dad – whom I only saw rarely – said “Don’t worry, Slovenia will never secede from Yugoslavia”. “Oh, I think it will”, I said*.

When you ask “Are you going home for Christmas?”, you mean Belgrade. To you, Belgrade is a place of clubs and pubs, of cheap beer and abundant grilled meat**. To me, Belgrade is a long dreadful winter, smells of car fumes and something polluting (coal?) used for fuel. Belgrade is waves of refugees and endless war I felt powerless to stop, despite joining the first anti-regime protest in 1992 (at the age of 11), organizing my class to join one in 1996 (which almost got me kicked out of school, not for the last time), and inhaling oceans of tear gas when the regime actually fell, in 2000.

Belgrade is briefly hoping things would get better, then seeing your Prime Minister assassinated in 2003; seeing looting in the streets of Belgrade after Kosovo declared independence in 2008, and – while watching the latter on YouTube, from England – deciding that maybe there was nowhere to return to. Nowadays, Belgrade is a haven of crony capitalism equally indebted to Russian money, Gulf real estate, and Chinese fossil fuel exploitation that makes its air among the most polluted in the world. So no, Belgrade never felt like home.

Budapest did, though.

It may seem weird that the place where I felt most at home is a place where I barely spent three years. My CV will testify that I lived in Budapest between 2010 and 2013, first as a visiting fellow, then as an adjunct professor at the Central European University (CEU). I don’t have a drop of Hungarian blood (not that I know of, at least, though with the Balkans you can never tell). My command of the language was, at best, perfunctory; CEU is an American university and its official language is English. Among my friends – most of whom were East-Central European – we spoke English; some of us have other languages in common, but English is what we spoke, and still do. And while this group of friends did include some people who would be described as ‘locals’ – that is, Budapest-born and raised – we were, all of us, outsiders, brought together by something that was more than chance: a shared understanding of what it meant to be part of the city***.

Of course, the CV will say that what brought us together was the fact that we were all affiliated with CEU. But CEU is no longer in Budapest; since 2020, it has been in Vienna, forced out by the Hungarian regime’s increasingly relentless campaign against anything that smacks of ‘progressivism’ (are you listening, fellow UK academics?). Almost all of my friends had left before that, just as I did. In 2012, increasingly sceptical about my chances of acquiring a permanent position in Western academia with a PhD that said ‘University of Belgrade’ (imagine, it’s not about merit), I applied to do a second PhD at Cambridge. I was on the verge of accepting the offer when I also landed that most coveted of academic premia: a Marie Curie postdoc position attached to the offer of a permanent – tenured – position, in Denmark****.

Other friends also left. For jobs. For partners’ jobs. For parenthood. For politics. In academia, this is what you did. You swallowed and moved on. Your CV was your life, not its reflection.

So no, there is no longer a home I can return to.

And yet, once there, it comes back. First as a few casually squeezed-out words to the Hungarian conductors on the night train from Berlin; then as a vocabulary of 200+ items that, though rarely used, enabled me to navigate the city, its subways, markets, and occasionally even public services (the high point of my Hungarian fluency was being able to follow – and even part-translate – the Semmelweis Museum curator’s talk! :)). Massolit, the bookshop that also exists in Krakow, which I visited on a goodbye-to-Eastern-Europe trip from Budapest via Prague and Krakow to Ukraine (in 2013, right before the annexation). Gerlóczy utca, home to the French restaurant in which I once left a massive tip for a pianist who played so beautifully that I was happy to be squeezed in at the smallest table, right next to the coat stand. Most, which means ‘bridge’ in Serbian (and Bosnian, and Croatian) and ‘now’ in Hungarian. In Belgrade, I now sometimes rely on Google Maps to get around; in Budapest, the map of the city is buried so deep in my mental compass that I end up wherever I am supposed to be going.

This is what makes the city your own. Flow, like the Danube, massive as it meanders between the city’s two halves, which do not exactly make a whole. Like that book by psychologist Mihaly Csikszentmihalyi, which is a Hungarian name, btw. Like my academic writing, which, uncoupled from the strictures of British university term, flows.

Budapest has changed, but the old and the new overlay in ways that make it impossible not to remember. Like the ‘twin’ cities of Besźel and Ul Qoma in the fictional universe of China Miéville’s The City and the City (the universe was, of course, modelled on Berlin, but Besźel is Budapest out and out, save for the sea), the memory and its present overlap in distinct patterns that we are trained not to see. Being in one precludes being in the other. But there are rumours of a third city, Orciny, one that predates both. Believing in Orciny is considered a crime, though. There cannot be a place where the past and the future are equally within touching distance. Right?

CEU, granted, is no longer there as an institution; though the building (and the library) remains, most of its services, students, and staff are now in Vienna. I don’t even dare go into the campus; the last time I was there, in 2017, I gave a keynote about how universities mediate disagreement. The green coffee shop with the perennially grim-faced person behind the counter, the one where we went to get good coffee before Espresso Embassy opened, is no longer there. But Espresso Embassy still stands, bigger than before. Now, of course, there are places to get good coffee everywhere: Budapest is literally overrun by them. The best coffee I pick up is from the Australian coffee shop, which predates my move; their shop front celebrates their 10th anniversary. Soon, it will be 10 years since I left Budapest.

Home: the word used to fill me with dread. “When are you going home?”, they would ask in Denmark, perhaps to signify the expectation that I would be going to Belgrade for the winter break, perhaps to reflect the idea that all immigrants are, fundamentally, guests. “I live here”, I used to respond. “This is my home”. On bad days, I’d add some combination of the information I used to point out just how far from assumed identities I was: I don’t celebrate Christmas (I’m an atheist, for census purposes); if I did, it would be on a different date (Orthodox Christian holidays in Serbia observe the Julian calendar, which is currently 13 days behind the Gregorian); thanks, I’ll be going to India (I did, in fact, go to India over the Christmas holidays the first year I lived in Denmark, though not exactly in order to spite everyone). But above and beyond all this, there was a simpler, flatter line: home is not where you return, it’s the place you never left.

In Always Coming Home, another SF novel about finding the places we (n)ever left, Ursula K. Le Guin retraces a past from the point of view of a speculative future. This future is one in which the world – in fact, multiple worlds – have failed. Like Eastern Europe, it is a sequence of apocalypses whose relationship can only be discovered through a combination of anthropology and archaeology – one that knows space and its materiality exist only as we have already left them behind; we cannot dig forwards, as it were.

Am I doing the same, now? Am I coming home to find out why I have left? Or did I return from the future to find out I have, in fact, never left?

Towards the end of The City and the City, the main character, Tyador Borlú, gets apprehended by the secret police monitoring – and punishing – instances of trespass (Breach) between two cities, the two worlds. But then he is taken out by one of the Breach – Ashil – and led through the city in a way that allows him to finally see them not as distinct, but as parts of a whole.

Everything I had been unseeing now jostled into sudden close-up. Sound and smell came in: the calls of Besźel; the ringing of its clocktowers; the clattering and old metal percussion of the trams; the chimney smell; the old smells; they came in a tide with the spice and Illitan yells of Ul Qoma, the clatter of a militsya copter, the gunning of German cars. The colours of Ul Qoma light and plastic window displays no longer effaced the ochres and stone of its neighbour, my home.

‘Where are you?’ Ashil said. He spoke so only I could hear.

‘I . . .’

‘Are you in Besźel or Ul Qoma?’

‘. . . Neither. I’m in Breach.’

‘You’re with me here.’

We moved through a crosshatched morning crowd. ‘In Breach. No one knows if they’re seeing you or unseeing you. Don’t creep. You’re not in neither: you’re in both.’

He tapped my chest. ‘Breathe.’

(Loc. 3944)

Breathe.

*Maybe this is where the tendency not to be overtly impressed by the authority of men comes from (or authority in general, given my father was a professor of sociology and I was, at that point, nine years old, and also right).

** Which I also do not benefit from, as I do not eat meat.

*** Some years later, I will understand that this is why the opening lines of the Alexandria Quartet always resonated so much.

**** How I ended up doing a second PhD at Cambridge after all and relocating to England permanently is a different story, one that I part-told here.

Life or business as usual? Lessons of the USS strike

[A shortened version of this blog post was published on the Times Higher Education blog on 14 March under the title ‘USS strike: picket line debates will reenergise scholarship’.]

 

Until recently, Professor Marenbon writes, university strikes in Cambridge were a hardly noticeable affair. Life, he says, went on as usual. The ongoing industrial action that UCU members are engaging in at UK universities has changed all that. Dons, rarely concerned with the affairs of lesser mortals, seem to be up in arms. They are picketing, almost every day, in the wind and the snow; marching; shouting slogans. For Heaven’s sake, some are even dancing. Cambridge, as pointed out on Twitter, has not seen such upheaval since we considered awarding Derrida an honorary degree.

This is possibly the best thing that has happened to UK higher education, at least since the end of the 1990s. Not that there’s much competition: this period, after all, brought us the introduction, then removal, of tuition fee caps; the abolition of maintenance grants; the REF and TEF; and, as its crowning (though short-lived) glory, the appointment of Toby Young to the Office for Students. Yet, for most of this period, academics’ opposition to these reforms conformed to ‘civilised’ ways of protest: writing a book, giving a lecture, publishing a blog post or an article in Times Higher Education, or, at best, complaining on Twitter. While most would agree that British universities have been under threat for decades, concerted effort to counter these reforms – with a few notable exceptions – remained the province of the people Professor Marenbon calls ‘amiable but over-ideological eccentrics’.

This is how we have truly let down our students. Resistance was left to student protests and occupations. Longer-lasting, transgenerational solidarity was all but absent: at the end of the day, professors retreated to their ivory towers, while precarious academics engaged in activism on the side, amid ever-increasing competition and pressure to land a permanent job. Students picked up the tab: not only when it came to tuition fees, used to finance expensive accommodation blocks designed to attract more (tuition-paying) students, but also when it came to the quality of teaching and learning, increasingly delivered by an underpaid, overworked, and precarious labour force.

This is why the charge that teach-outs of dubious quality are replacing lectures comes across as particularly disingenuous. We are told that ‘although students are denied lectures on philosophy, history or mathematics, the union wants them to show up to “teach-outs” on vital topics such as “How UK policy fuels war and repression in the Middle East” and “Neoliberal Capitalism versus Collective Imaginaries”’. Although this is but one snippet of Cambridge UCU’s programme of teach-outs, the choice is illustrative.

The link between history and the UK’s foreign policy in the Middle East strikes me as obvious. Students in philosophy, politics or economics could do worse than a seminar on the development of neoliberal ideology (the event was initially scheduled as part of the Cambridge seminar in political thought). As for mathematics – anybody who, over the past weeks, has had to engage with the details of the actuarial calculations and projections tied to the USS pension scheme has had more than a crash refresher course: I dare say they learned more than they ever hoped they would.

Teach-outs, in this sense, are not a replacement for education “as usual”. They are a way to begin bridging the infamous divide between “town and gown”, both by being held in more open spaces, and by, for instance, discussing how the university’s lucrative development projects are impacting on the regional economy. They are not meant to make up for the shortcomings of higher education: if anything, they render them more visible.

What the strikes have made clear is that academics’ ‘life as usual’ is vice-chancellors’ business as usual. In other words, it is precisely the attitude of studied depoliticisation that has allowed the marketization of higher education to continue. Markets, after all, are presumably ‘apolitical’. Other scholars have expended considerable effort in showing how this assumption has been used to further policies whose results we are now seeing, among other places, in the reform of the pensions system. Rather than repeat their arguments, I would like to end with the words of another philosopher, Hannah Arendt, who understood well the ambiguous relationship between academia and politics:

 

‘Very unwelcome truths have emerged from the universities, and very unwelcome judgments have been handed down from the bench time and again; and these institutions, like other refuges of truth, have remained exposed to all the dangers arising from social and political power. Yet the chances for truth to prevail in public are, of course, greatly improved by the mere existence of such places and by the organization of independent, supposedly disinterested scholars associated with them.

This authentically political significance of the Academe is today easily overlooked because of the prominence of its professional schools and the evolution of its natural science divisions, where, unexpectedly, pure research has yielded so many decisive results that have proved vital to the country at large. No one can possibly gainsay the social and technical usefulness of the universities, but this importance is not political. The historical sciences and the humanities, which are supposed to find out, stand guard over, and interpret factual truth and human documents, are politically of greater relevance.’

In this sense, teach-outs, and industrial action in general, are a way for us to recognise our responsibility to protect the university from the undue incursion of political power, while acknowledging that such responsibility is in itself political. At this moment in history, I can think of no greater service to scholarship than that.

Between legitimation and imagination: epistemic attachment, ontological bias, and thinking about the future

Some swans are…grey (Cambridge, August 2017)

 

A serious line of division runs through my household. It does not concern politics, music, or even sports: it concerns the possibility of large-scale collapse of social and political order, which I consider very likely. Specific scenarios aside for the time being, let’s just say we are talking more human-made climate-change-induced breakdown involving possibly protracted and almost certainly lethal conflict over resources, than ‘giant asteroid wipes out Earth’ or ‘rogue AI takes over and destroys humanity’.

Ontological security or epistemic positioning?

It may be tempting to attribute the tendency towards catastrophic predictions to psychological factors rooted in individual histories. My childhood and adolescence took place alongside the multi-stage collapse of the country once known as the Socialist Federal Republic of Yugoslavia. First came the economic crisis, when the failure of ‘shock therapy’ to boost stalling productivity (surprise!) resulted in massive inflation; then social and political disintegration, as the country descended into a series of violent conflicts whose consequences went far beyond the actual front lines; and then actual physical collapse, as Serbia’s long involvement in wars in the region was brought to a halt by the NATO intervention in 1999, which destroyed most of the country’s infrastructure, including parts of Belgrade, where I was living at the time*. It makes sense to assume this results in quite a different sense of ontological security than, say, the one afforded by the predictability of a middle-class English childhood.

But does predictability actually work against the capacity to make accurate predictions? This may seem not only contradictory but also counterintuitive – any calculation of risk has to take into account not just the likelihood, but also the nature of the source of threat involved, and thus necessarily draws on the assumption of (some degree of) empirical regularity. However, what about events outside of this scope? A recent article by Faulkner, Feduzi and Runde offers a good formalization of this problem (the Black Swans and ‘unknown unknowns’) in the context of the (limited) possibility to imagine different outcomes (see table below). Of course, as Beck noted a while ago, the perception of ‘risk’ (as well as, by extension, any other kind of future-oriented thinking) is profoundly social: it depends on ‘calculative devices‘ and procedures employed by networks and institutions of knowledge production (universities, research institutes, think tanks, and the like), as well as on how they are presented in, for instance, literature and the media.

From: Faulkner, Feduzi and Runde: Unknowns, Black Swans and the risk/uncertainty distinction, Cambridge Journal of Economics 41 (5), August 2017, 1279-1302

 

Unknown unknowns

In The Great Derangement (probably the best book I’ve read in 2017), Amitav Ghosh argues that this can explain, for instance, the surprising absence of literary engagement with the problem of climate change. The problem, he claims, is endemic to Western modernity: a linear vision of history cannot conceive of a problem that exceeds its own scale**. This isn’t the case only with ‘really big problems’ such as economic crises, climate change, or wars: it also applies to specific events such as elections or referendums. Of course, social scientists – especially those qualitatively inclined – tend to emphasise that, at best, we aim to explain events retroactively. Methodological modesty is good (and advisable), but avoiding thinking about the ways in which academic knowledge production is intertwined with the possibility of prediction is futile, for at least two reasons.

One is that, as reflected in the (by now overwrought and overdetermined) crisis of expertise and ‘post-truth’, social researchers increasingly find themselves in situations where they are expected to give authoritative statements about the future direction of events (for instance, about the impact of Brexit). Even if they disavow this form of positioning, the very idea of social science rests on the (no matter how implicit) assumption that at least some mechanisms or classes of objects will exhibit the same characteristics across cases; consequently, the possibility of inference is implied, if not always practised. Secondly, given the scope of the challenges societies face at present, it seems ridiculous not to even attempt to engage with – and, if possible, refine – the capacity to think about how they will develop in the future. While there is quite a bit of research on individual predictive capacity and the ways collective reasoning can correct for cognitive bias, most of these models – given that they are usually based on experiments or simulations – cannot account for the way in which social structures, institutions, and cultures of knowledge production interact with the capacity to theorise, model, and think about the future.

The relationship between social, political, and economic factors, on the one hand, and knowledge (including knowledge about those factors), on the other, has been at the core of my work, including my current PhD. While it may seem minor compared to issues such as wars or revolutions, the future of universities offers a perfect case to study the relationship between epistemic positioning, positionality, and the capacity to make authoritative statements about reality: what Boltanski’s sociology of critique refers to as ‘complex externality’. One of the things it allowed me to realise is that while there is a good tradition of reflecting on positionality (or, in positivist terms, cognitive ‘bias’) in relation to categories such as gender, race, or class, we are still far from successfully theorising something we could call ‘ontological bias’: epistemic attachment to the object of research.

The postdoctoral project I am developing extends this question and aims to understand its implications in the context of generating and disseminating knowledge that can allow us to predict – make more accurate assessments of – the future of complex social phenomena such as global warming or the development of artificial intelligence. This question has, in fact, been informed by my own history, but in a slightly different manner than the one implied by the concept of ontological security.

Legitimation and prediction: the case of former Yugoslavia

The Socialist Federal Republic of Yugoslavia had relatively sophisticated and well-developed networks of social scientists, in which both of my parents were involved***. Yet, of all the philosophers, sociologists, political scientists etc. writing about the future of the Yugoslav federation, only one – to the best of my knowledge – predicted, in eerie detail, the political crisis that would lead to its collapse: Bogdan Denitch, whose Legitimation of a revolution: the Yugoslav case (1976) is, in my opinion, one of the best books about former Yugoslavia ever written.

A Yugoslav-American, Denitch was a professor of sociology at the City University of New York. He was also a family friend, a fact I considered of little significance (having only met him once, when I was four, and my mother and I were spending a part of our summer holiday at his house in Croatia; my only memory of it is being terrified of tortoises roaming freely in the garden), until I began researching the material for my book on education policies and the Yugoslav crisis. In the years that followed (I managed to talk to him again in 2012; he passed away in 2016), I kept coming back to the question: what made Denitch more successful in ‘predicting’ the crisis that would ultimately lead to the dissolution of former Yugoslavia than virtually anyone writing on Yugoslavia at the time?

Denitch had a pretty interesting trajectory. Born in 1929 to Croat Serb parents, he spent his childhood in a series of countries (including Greece and Egypt), following his diplomat father; in 1946, the family emigrated to the United States (the fact that his father was a civil servant in the previous government would have made it impossible for them to continue living in Yugoslavia after the Communist regime, led by Josip Broz Tito, formally took over). There, Denitch (in evident defiance of his upper-middle-class legacy) trained as a factory worker while studying for a degree in sociology at CUNY. He also joined the Democratic Socialist Alliance – one of the American socialist parties – of which he would remain a member (and later functionary) for the rest of his life.

In 1968, Denitch was awarded a major research grant to study Yugoslav elites. The project was not without risks: while Yugoslavia was more open to ‘the West’ than other countries in Eastern Europe, visits by international scholars were strictly monitored. My mother recalls receiving a house visit from an agent of the UDBA, the Yugoslav secret police – not quite the KGB but you get the drift – who tried to elicit the confession that Denitch was indeed a CIA agent, and, in the absence of that, the promise that she would occasionally report on him****.

Despite these minor setbacks, the research continued: Legitimation of a revolution is one of its outcomes. In 1973, Denitch was awarded a PhD by Columbia University and started teaching at CUNY, eventually retiring in 1994. His last book, Ethnic nationalism: the tragic death of Yugoslavia, came out in the same year: a reflection on the conflict that was still going on at the time, and whose architecture he had foreseen with such clarity eighteen years earlier (the book is remarkably bereft of “told-you-so”-isms, so it comes warmly recommended for those wishing to learn more about Yugoslavia’s dissolution).

Did personal history, in this sense, have a bearing on one’s epistemic position and, by extension, on the capacity to predict events? One explanation (prevalent in certain versions of popular intellectual history) would be that Denitch’s position as both a Yugoslav and an American allowed him to escape the ideological traps other scholars were more likely to fall into. Yugoslavs, presumably, would be at pains to prove socialism was functioning; Americans, on the other hand, perhaps egalitarian in theory but certainly suspicious of Communist revolutions in practice, would be looking to prove it wasn’t, at least not as an economic model. Yet this assumption hardly stands up to even the lightest empirical interrogation. At least up until the show trials of the Praxis philosophers, there was a lively critique of Yugoslav socialism within Yugoslavia itself; despite the mandatory coating of jargon, Yugoslav scholars were quite far from being uniformly bright-eyed and bushy-tailed about socialism. Similarly, quite a few American scholars were very much in favour of the Yugoslav model, eager, if anything, to show that market socialism was possible – that is, that it is possible to have a relatively progressive social policy and still be able to afford nice things. Herein, I believe, lies the beginning of the answer as to why neither of these groups was able to predict the type or the scale of the crisis that would eventually lead to the dissolution of former Yugoslavia.

Simply put, both groups of scholars depended on Yugoslavia as a source of legitimation for their work, though for different reasons. For Yugoslav scholars, the ‘exceptionality’ of the Yugoslav model was the source of epistemic legitimacy, particularly in the context of international scientific collaboration: their authority was, in part at least, constructed on their identity and positioning as possessors of ‘local’ knowledge (Bockman and Eyal’s excellent analysis of the transnational roots of neoliberalism makes an analogous point about positioning in the context of the collaboration between ‘Eastern’ and ‘Western’ economists). In addition, many Yugoslav scholars were born and raised in socialism: while some of them did travel to the West, the opportunities were still scarce and many were subject to ideological pre-screening. In this sense, both their professional and their personal identity depended on the continued existence of Yugoslavia as an object; they could imagine different ways in which it could be transformed, but not really that it could be obliterated.

For scholars from the West, on the other hand, Yugoslavia served as a perfect experiment in mixing capitalism and socialism. Those more on the left saw it as a beacon of hope that socialism need not go hand-in-hand with Stalinist-style repression. Those who were more on the right saw it as proof that limited market exchange can function even in command economies, and deduced (correctly) that the promise of supporting failing economies in exchange for access to future consumer markets could be used as a lever to bring the Eastern Bloc in line with the rest of the capitalist world. If no one foresaw the war, it was because it played no role in either of these epistemic constructs.

This is where Denitch’s background would have afforded a distinct advantage. The fact his parents came from a Serb minority in Croatia meant he never lost sight of the salience of ethnicity as a form of political identification, despite the fact socialism glossed over local nationalisms. His Yugoslav upbringing provided him not only with fluency in the language(s), but a degree of shared cultural references that made it easier to participate in local communities, including those composed of intellectuals. On the other hand, his entire professional and political socialization took place in the States: this meant he was attached to Yugoslavia as a case, but not necessarily as an object. Not only was his childhood spent away from the country; the fact his parents had left Yugoslavia after the regime change at the end of World War II meant that, in a way, for him, Yugoslavia-as-object was already dead. Last, but not least, Denitch was a socialist, but one committed to building socialism ‘at home’. This means that his investment in the Yugoslav model of socialism was, if anything, practical rather than principled: in other words, he was interested in its actual functioning, not in demonstrating its successes as a marriage of markets and social justice. This epistemic position, in sum, would have provided the combination needed to imagine the scenario of Yugoslav dissolution: a sufficient degree of attachment to be able to look deeply into a problem and understand its possible transformations; and a sufficient degree of detachment to be able to see that the object of knowledge may not be there forever.

Onwards to the…future?

What can we learn from the story? Balancing between attachment and detachment is, I think, one of the key challenges in any practice of knowing the social world. It’s always been there; it cannot be, in any meaningful way, resolved. But I think it will become more and more important as the objects – or ‘problems’ – we engage with grow in complexity and become increasingly central to the definition of humanity as such. Which means we need to be getting better at it.

 

———————————-

(*) I rarely bring this up as I think it overdramatizes the point – Belgrade was relatively safe, especially compared to other parts of former Yugoslavia, and I had the fortune to never experience the trauma or hardship people in places like Bosnia, Kosovo, or Croatia did.

(**) As Jane Bennett noted in Vibrant Matter, this resonates with Adorno’s notion of non-identity in Negative Dialectics: a concept always exceeds our capacity to know it. We can see object-oriented ontology (e.g. Timothy Morton’s Hyperobjects) as the ontological version of the same argument: the sheer size of the problem acts as a deterrent to grasping it in its entirety.

(***) This bit lends itself easily to the Bourdieusian “aha!” argument – academics breed academics, etc. The picture, however, is a bit more complex – I didn’t grow up with my father and, until about 16, had a very vague idea of what my mother did for a living.

(****) Legend has it my mother showed the agent the door and told him never to call on her again, prompting my grandmother – her mother – to buy funeral attire, assuming her only daughter would soon be thrown into prison and possibly murdered. Luckily, Yugoslavia was not really the Soviet Union, so this did not come to pass.

A fridge of one’s own

A treatise on the education of women, 1740. Museum of European Students, Bologna

 

A woman needs a fridge of her own if she is to write theory. In fact, I’d wager a woman needs a fridge of her own if she is to write pretty much anything, but since what I am writing at the moment is (mostly) theory, let’s assume that it can serve as a metaphor for intellectual labour more broadly.

In her famous injunction to undergraduates at Girton College, Cambridge (the first residential college for women that offered education to degree level), Virginia Woolf stated that a woman needed two things in order to write: a room of her own, and a small independent income (Woolf settled on 500 pounds a year; as this website helpfully informed me, this would be £29,593 in today’s terms). In addition to the room and the income, a woman who wants to write, I want to argue, also needs a fridge. Not a shelf or two in a fridge in a kitchen in a shared house or at the end of the staircase; a proper fridge of her own. Let me explain.

The immateriality of intellect

Woolf’s broader point in A Room of One’s Own is that intellectual freedom and creativity require the absence of material constraints. In and of itself, this argument is not particularly exceptional: attempts to define the nature of intellectual labour have almost unfailingly centred on its rootedness in leisure – skholē – as the opportunity for peaceful contemplation, away from the vagaries of everyday existence. For the ancient Greeks, contemplation was opposed to the political (as in the everyday life of the polis): what we today think of as the ‘private’ was not even a candidate, being the domain of women and slaves, neither of whom was considered a proper citizen. For Marx, it was the opposite of material labour, with its sweat, noise, and capitalist exploitation. But underpinning it all was the private sphere – that amorphous construct that, as feminist scholars pointed out, includes the domestic and affective labour of care, cleaning, cooking, and, yes, the very act of biological reproduction. The capacity to distance oneself from these kinds of concerns thus became the sine qua non of scholarly reflection, particularly in the case of theōria, held to be contemplation in its pure(st) form. After all, to paraphrase Kant, it is difficult to ponder the sublime from too close.

This thread runs from Plato and Aristotle through Marx to Arendt, who made it the gist of her analysis of the distinction between vita activa and vita contemplativa; and onwards to Bourdieu, who zeroed in on ‘scholastic reason’ (raison scolastique) as the source of Homo Academicus’ disposition to project the categories of scholarship – skholē – onto everyday life. I am particularly interested in the social framing of this distinction, given that I think it underpins a lot of contemporary discussions of the role of universities. But regardless of whether we treat it as a virtue, a methodological caveat, or an interesting research problem, detachment from the material persists as the distinctive marker of the academic enterprise.

 

What about today?

So I think we can benefit from thinking about the best way to achieve this absolution from the material for women who are trying to write today. One solution, obviously, would be to outsource the cooking and cleaning to a centralised service – like, for instance, College halls and cafeterias. This way, one would have all the time to write: away with the vile fridge! (It was rather unseemly anyway, poised as it was in the middle of one’s room.) Yet outsourcing domestic labour means we are potentially depriving other people of the opportunity to develop their own modes of contemplation. If we take into account that the majority of global domestic labour is performed by women, perfecting our scholarship would most likely come off the back of another Shakespeare’s (or, for consistency’s sake, let’s say Marx’s) sister. So, let’s keep the fridge, at least for the time being.

But wait, you will say, what about eating out – in restaurants and such? It’s fine you want to do away with outsourced domestic labour, but surely you wouldn’t scrap the entire catering industry! After all, it’s a booming sector of the economy (and we all know economic growth is good), and it employs so many people (often precariously and in not very nice conditions, but we are prone to ignore that during happy hour). Also, to be honest, it’s so nice to have food prepared by other people. After all, isn’t that what Simone de Beauvoir did, sitting, drinking and smoking (and presumably also eating) in cafés all day? This doesn’t necessarily mean we would need to do away with the fridge, but a shelf in a shared one would suffice – just enough to keep a bit of milk, some butter and eggs, fruit, perhaps even a bottle of rosé? Here, however, we face the economic reality of the present. Let’s do a short calculation.

 

£500 a year gets you very far…or not

The £29,593 Woolf proposes as sufficient independent income comes from an inheritance. Those of us who are less fortunate and are entering the field of theory today can hope to obtain one of many scholarships. Mine is currently at £13,900 a year (no tax); ESRC-funded students get a bit more, £14,000. This means we fall well short of today’s equivalent of the £500-a-year sum Woolf suggested to the students at Girton. Starting from £14,000, and assuming that roughly £2,000 annually goes on things such as clothes, books, cosmetics, and ‘incidentals’ – for instance, travel to see one’s family, or medical costs (non-EU students are subject to something called the Immigration Health Surcharge, paid upfront at the point of application for a student visa, which varies between £150 and £200 per year, but doesn’t cover dental treatment, prescriptions, or eye tests – so much for “NHS tourism”) – this leaves us with roughly £1,000 per month. Out of this, accommodation costs anything between £400 and £700, depending on bills, council tax etc. – for a “room of one’s own”, that is, a room in a shared house or college accommodation – that, you guessed it, almost inevitably comes with a shared fridge.

So the money that’s left is supposed to cover eating in cafés, perhaps even an occasional glass of wine (it’s important to socialise with other writers, or just watch the world go by). Assuming we have £450 a month after paying rent and bills, this leaves us with a bit less than £15 per day. That suffices for about one and a half meals daily in most cheap high street eateries, provided you do not eat a lot, do not drink, and never have tea or coffee. Ever. Even at colleges, where food is subsidised, this would barely be enough. Remember: this means you never go out for a drink with friends or to the cinema, never buy presents, never pay for services: in short, it makes for a relatively boring and constrained life. This could make writing, unless you’re Emily Dickinson, somewhat difficult. Luckily, you have the Internet – that is, if it’s included in your bills. And you pray your computer does not break down.
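For the sceptical reader, the back-of-the-envelope arithmetic above can be sketched in a few lines of Python (the rent midpoint is my own assumption; the other figures are the ones quoted in the text):

```python
# Sketch of the student budget arithmetic from the text.
# Figures are the stipend-era estimates quoted above, not current rates.

stipend = 14_000          # annual stipend in GBP (roughly the ESRC rate)
annual_extras = 2_000     # clothes, books, travel, medical 'incidentals'
monthly_rent = 550        # assumed midpoint of the £400-£700 range

monthly_budget = (stipend - annual_extras) / 12   # what's left per month
after_rent = monthly_budget - monthly_rent        # after rent and bills
per_day = after_rent / 30                         # daily food-and-life money

print(f"monthly budget: £{monthly_budget:.0f}")   # £1000
print(f"left after rent: £{after_rent:.0f}")      # £450
print(f"per day: £{per_day:.2f}")                 # £15.00
```

Tweak the rent figure towards £700 and the daily amount drops to £10 – which is, roughly, one meal.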

Well, you can always work, you say. If the money you’re given is not enough to provide the sort of lifestyle you want, go earn more! But there’s a catch. If you are in full-time education, you are only allowed to work part-time. If you are a foreign national, there are additional constraints. This means the amount of money you can earn is usually quite limited. And there are tradeoffs. You know all those part-time jobs that pay a lot, offer stability and future career progression, and that everyone is flocking towards? Neither do I. If you ever wondered where the seemingly inexhaustible supply of cheap labour at universities – sessional lecturers, administrative assistants, event managers, servers etc. – comes from, look around you: more likely than not, it’s hungry graduate students.

 

The poverty of student life

Increasingly, this is not in the Steve Jobs “stay hungry” sense. As I’ve argued recently, “staying hungry” has quite a different tone when, instead of a temporary excursion into relative deprivation (seen as part of the ‘character building’ education is supposed to be about), it reflects the very real threat of struggling to make ends meet well after graduation. Given the state of the economy and graduate debt, that is a threat faced by a growing proportion of young people (and, no surprise, women are much more likely to end up in precarious employment). Of course, you could always argue that many people have it much worse: you are (relatively) young, well educated, and likely have more cultural and social capital than the average person. Surely you can get by. But remember – this isn’t about making it from one day to the next. What you’re trying to do is write. Contemplate. Comprehend the beauty (and, sometimes, ugliness) of the world in its entirety. Not wonder whether you’ll be able to afford the electricity bill.

This is why a woman needs to have her own fridge. If you want access to healthy, cheap food, you need to be able to buy it in greater quantities, so you don’t have to go to the supermarket every other day, and store it at home, so you can prepare it quickly and conveniently, as well as plan ahead. For the record, by healthy I do not mean quinoa waffles, duck eggs and shiitake mushrooms (not that there’s anything wrong with any of these, though I’ve never tried duck eggs). I mean the sort of food that keeps you full whilst not racking up your medical expenses further down the line. For this you need a fridge. Not half a vegetable drawer among opened cans of lager that some bro you happen to share a house with forgot to throw away months ago, but an actual fridge. Of your own. It doesn’t matter if it comes with a full kitchen – you can always share a stove, wait for your turn at the microwave, and cooking (and eating) together can be a very pleasurable way of spending time. But keep your fridge.

 

Emotional labour

But, you will protest, what about women who live with partners? Surely we want to share fridges with our loved ones! Well, good for you, go ahead. But you may want to make sure that it’s not always you remembering to buy the milk, not always you supplying fresh fruit and vegetables, not always you throwing away the food whose use-by date has long expired. That you don’t end up paying half of the household bills but still doing more than half of the work. For, whether we like it or not, research shows that in heterosexual partnerships women still perform a greater portion of domestic labour, not to mention carrying the mental load of designing, organising, and dividing tasks. And yes, this impacts your ability to write. It’s damn difficult to follow a line of thought if you need to stop five times to take the laundry out, empty the bins, close the windows because it has just started raining, pick up the mail that came through the door, and add tea to the shopping list – not even to mention what happens if you have children on top of all this.

So no, a fridge cannot – and will not – solve the problem of gender inequality in academia, let alone gender inequality on a more general level (after all, academics are very, very privileged). What it can do, though, is rebalance the score in the sense of reminding us that cooking, cleaning, and cutting up food are elements of life as much as citing, cross-referencing, and critique. It can begin to destroy, once and for all, the gendered (and classed) assumption that contemplation happens above and beyond the material, and that all reminders of its bodily manifestations – for instance, that we still need to eat whilst thinking – should be, if not abolished entirely, then at least expelled beyond the margins of awareness: to communal kitchens, restaurants, kebab vans, anywhere they do not disturb the sacred space of the intellect. So keep your income, get a room, and put a fridge in it. Then start writing.

 

Zygmunt Bauman and the sociologies of end times

[This post was originally published at the Sociological Review blog’s Special Issue on Zygmunt Bauman, 13 April 2017]

“Morality, as it were, is a functional prerequisite of a world with an in-built finality and irreversibility of choices. Postmodern culture does not know of such a world.”

Zygmunt Bauman, Sociology and postmodernity

Getting reacquainted with Bauman’s 1988 essay “Sociology and postmodernity”, I accidentally misread the first word of this quote as “mortality”. In the context of the writing of this piece, it would be easy to interpret this as a Freudian slip – yet, as slips often do, it betrays a deeper unease. If it is true that morality is a functional prerequisite of a finite world, it is even truer that such a world calls for mortality – the ultimate human experience of irreversibility. In the context of trans- and post-humanism, as well as the growing awareness of the fact that the world, as the place inhabited (and inhabitable) by human beings, can end, what can Bauman teach us about both?

In Sociology and postmodernity, Bauman assumes a position at the crossroads of two historical (social, cultural) periods: modernity and postmodernity. Turning away from the past to look towards the future, he offers thoughts on what a sociology adapted to the study of the postmodern condition would be like. Instead of a “postmodern sociology” as a mimetic representation of (even if a pragmatic response to) postmodernity, he argues for a sociology that attempts to give a comprehensive account of the “aggregate of aspects” that cohere into a new, consumer society: the sociology of postmodernity. This form of account eschews the observation of the new as a deterioration, or aberration, of the old, and instead aims to come to terms with the system whose contours Bauman would go on to develop in his later work: a system characterised by a plurality of possible worlds, and not necessarily a way to reconcile them.

The point in time in which he writes lends itself fortuitously to the argument of the essay. Not only did Legislators and interpreters, in which he reframes intellectuals as translators between different cultural worlds, come out a year earlier; the publication of Sociology and postmodernity briefly precedes 1989, the year that would indeed usher in a wholly new period in the history of Europe, including in Bauman’s native Poland.

On the one hand, he takes the long view back to post-war Europe, built, as it was, on the legacy of the Holocaust as a pathology of modernity, and on two approaches to preventing its repetition – market liberalism and political freedoms in the West, and planned economies and more restrictive political regimes in the Central and Eastern parts of the subcontinent. On the other, he engages with some of the dilemmas for the study of society that the approaching fall of the Berlin Wall and the eventual unification of those two hitherto separated worlds were going to open up. In this sense, Bauman really has the privilege of a double-facing version of Benjamin’s Angel of History. This probably helped him recognize the false dichotomy of consumer freedom and dictatorship over needs, which, as he stated, was quickly becoming the only imaginable alternative to the system – at least as far as the imagination was that of the system itself.

The present point of view is not all too dissimilar from the one in which Bauman was writing. We regularly encounter pronouncements of the end of a whole host of things, among them history, the classical division of labour, standards of objectivity in reporting, nation-states, even – or so we hope – capitalism itself. While some of Bauman’s fears concerning postmodernity may, from the present perspective, seem overstated or even straightforwardly ridiculous, we are inhabiting a world of many posts – post-liberal, post-truth, post-human. Many think that this calls for a rethinking of how sociology can adapt itself to these new conditions: for instance, in a recent issue of the International Sociological Association’s Global Dialogue, Leslie Sklair considers what a new radical sociology, developed in response to the collapse of global capitalism, would be like.

As if sociology and the zeitgeist were involved in some weird pas de deux: changes in any domain of life (technology, political regime, legislation) almost instantaneously trigger calls for, if not the invention of new paradigms and approaches to its study, then at least a serious reconsideration of old ones.

I would like to suggest that one of the sources of the continued appeal of this – which Mike Savage brilliantly summarised as epochal theorising – is not so much the heralding of the new, as the promise that there is an end to the present state of affairs. In order for a new ‘epoch’ to succeed, the old one needs to end. What Bauman warns about in the passage cited at the beginning is that in a world without finality – without death – there can be no morality. In T.S. Eliot’s lines from Burnt Norton: “If all time is eternally present / All time is unredeemable.” What we may read as Bauman’s fear, therefore, is not that worlds as we know them can (and will) end: it is that, whatever name we give to the present condition, it may go on reproducing itself forever. In other words, it is a vision of the future that looks just like the present, only there is more of it.

Which is worse? It is hard to tell. A rarely discussed side of epochal theorising is that it imagines a world in which the social sciences still have a role to play, if nothing else, in providing a theoretical framing or empirically informed running commentary on its demise, and thus offers salvation from the existential anxiety of the present. The ‘ontological turn’ – from object-oriented ontology, to new materialisms, to post-humanism – reflects, in my view, the same tendency. If objects ‘exist’ in the same way as we do, if matter ‘matters’ in the same way (if not to the same degree) in which, for instance, black lives matter, this provides temporary respite from the confines of our choices. Expanding the concept of agency so as to involve non-human actors may seem more complicated as a model of social change, but at least it absolves humans from the unique burden of historical responsibility – including that for the fate of the world.

The human (re)discovery of the world, thus, conveys less a newfound awareness of the importance of the lived environment than the desire to escape the solitude of thinking about the human (as Dawson also notes, all too human) condition. The fear of relativism that the postmodern ‘plurality’ of worlds brought about appears to have been preferable to the possibility that there is, after all, just the one world. If the latter is the case, the only escape from it lies, to borrow from Hamlet, in the undiscovered country from whose bourn no traveller returns: in other words, in death.

This impasse is perhaps felt most strongly in sociology and anthropology, because excursions into other worlds have been both the gist of their method and the foundation of their critical potential (including their self-critique, which focused on how these two elements combine in the construction of epistemic authority). The figure of the traveller to other worlds was more pronounced in the case of anthropology, at least at the time when it developed as the study of exotic societies on the fringe of colonial empires, but sociology is no stranger to visitation either: its others, and their worlds, delineated by the sometimes less tangible boundaries of class, gender, race, or just epistemic privilege. Bauman was among the theorists who recognized the vital importance of this figure in the construction of the foundations of European modernity, and was thus also sensitive to its transformations in the context of postmodernity – exemplified, as he argued, in the contemporary human’s ambiguous position: between “a perfect tourist” and a “vagabond beyond remedy”.

In this sense, the awareness that every journey has an end can inform the practice of social theory in ways that go beyond the need to pronounce new beginnings. Rather than using eulogies in order to produce more of the same thing – more articles, more commentary, more symposia, more academic prestige – perhaps we can see them as an opportunity to reflect on the always-unfinished trajectory of human existence, including our existence as scholars, and the responsibility that it entails. The challenge, in this case, is to resist the attractive prospect of escaping the current condition by ‘exit’ into another period, or another world – postmodern, post-truth, post-human, whatever – and remember that, no matter how many diverse and wonderful entities they may be populated with, these worlds are also human, all too human. This can serve as a reminder that, as Bauman wrote in his famous essay on heroes and victims of postmodernity, “Our life struggles dissolve, on the contrary, in that unbearable lightness of being. We never know for sure when to laugh and when to cry. And there is hardly a moment in life to say without dark premonitions: ‘I have arrived’”.

On ‘Denial’: or, the uncanny similarity between the Holocaust and mansplaining


Last week, I finally got around to seeing Denial. It has many qualities and a few disadvantages – its attempt at hyperrealism feeding into both – but I would like to focus on an aspect most reviews I’ve read so far seem to have missed. In other words: mansplaining.

Brief contextualization. Lest I be accused of equating the Holocaust and mansplaining (I am not – similarity does not denote equivalence), my work deals with issues of expertise, fact, and public intellectualism; I have always found the Irving case interesting, for a variety of reasons (incidentally, I was also at Oxford during the famous event at the Oxford Union). At the same time, like, I suppose, every woman in academia and beyond with more agency than a doormat, I have, over the past year, become embroiled in countless arguments about what mansplaining is, whether it is really so widespread, whether it is done only by men (and what to call it when it’s perpetrated by those who are not men?) and, of course, that pseudo-liberal what-passes-as-an-attempt at outmanoeuvring the issue: whether using the term ‘mansplaining’ blames men as a group and is as such essentialising and oppressive, just like the discourses ‘we’ (feminists conveniently grouped under one umbrella) seek to condemn (otherwise known as a tu quoque argument).

Besides logical flaws, what many of these attacks seem to have in common with the one David Irving launched on Deborah Lipstadt (and Holocaust deniers routinely use) is the focus on evidence: how do we know that mansplaining occurs, and is not just some fabrication by a bunch of conceited females looking to get ahead despite their obvious lack of qualifications? Other uncanny similarities between the arguments of Holocaust deniers and of those who question the existence of mansplaining temporarily aside, one of the indisputable qualities of Denial is that it provides multiple examples of what mansplaining looks like. It is, of course, a film, despite being based on a true story. Rather than being a downside, this allows for a concentrated portrayal of the practice – for those doubting its verisimilitude, I strongly recommend watching the film and deciding for yourselves whether it resembles real-life situations. For those who do not doubt it: voilà, a handy cinematic case to present to those who prefer to plead ignorance as to what mansplaining ‘actually’ entails.

To begin with, the case portrayed in the film is an instance par excellence of mansplaining as a whole: after all, it is about a self-educated (male) historian who sues an academic historian (a woman) because she does not accept his ‘interpretation’ of World War II (namely, that the Holocaust did not happen) and, furthermore, dares to call him out on it. In the case (and the film), he sets out to explain to the (of course, male) judge and the public that Lipstadt (played by Rachel Weisz) is wrong and, furthermore, that her critique has seriously damaged his career (the underlying assumption being that he is entitled to lucrative publishing deals, while she, clearly, has to earn hers – exacerbated by his mockery of the fact that she sells books, whereas his, by contrast, are free). This ‘talking over’ and the attempt to make it all about him (remember, he sues her) are brilliantly cast in the opening, when Irving (played by Timothy Spall) visits Lipstadt’s public talk and openly challenges her in the Q&A, ignoring her repeated refusal to engage with his arguments. Yet, it would be a mistake to locate the trope of mansplaining only in the relation between Irving and Lipstadt. On the contrary – just like the real thing – it is at its most insidious when it comes from those who are, as it were, ‘on our side’.

A good example is the first meeting of the defence team, where Lipstadt is introduced to the people working with her legal counsel, the famous Anthony Julius (Andrew Scott). There is a single woman on Julius’ team: Laura (Caren Pistorius), who, we are told, is a paralegal. Despite it being her first case, it seems she has developed a viable strategy: or at least so we are told by her boss, who, after announcing Laura’s brilliant contribution to the case, continues to talk over her – that is, to explain her thoughts without giving her an opportunity to explain them herself. In this sense, what at first seems like an act of mentoring support – passing the baton and crediting a junior staff member – becomes a classic act in which a man takes it upon himself to interpret the professional intervention of a female colleague, appropriating it in the process.

Cases of professional mansplaining are abundant throughout the film: in multiple scenes lawyers explain the Holocaust, as well as the concept of denial, to Lipstadt despite her meek protests that she “has actually written a book about it”. Obvious irony aside, this serves as a potent reminder that women have to invoke professional credentials not to be recognized as experts, but simply to be recognized as equally valid participants in debate. By contrast, when it comes to the only difference in qualifications in the film that plays against Lipstadt – knowledge of the British legal system – Weisz’s character conveniently remains a mixture of ignorance and naïveté couched in Americanism. One would be forgiven for assuming that long-term involvement in a libel case, especially one that carries so much emotional and professional weight, would have prompted a university professor to get acquainted with at least the basic rules of the legal system in which the case was processed, but then, of course, that would have stripped the male characters of the opportunity to shine the light of their knowledge against her supposed ignorance.

Of course, emotional involvement is, in the film, presented as a clear disadvantage when it comes to the case. While Lipstadt first assumes she will be allowed to testify, and then repeatedly asks to be, her legal team insists she would be too emotional a witness. The assumption that having an emotional reaction (even one that is quite expected – it is, after all, the Holocaust we are talking about) and a cold, hard approach to ‘facts’ are mutually exclusive is played out succinctly in the scenes that take place at Auschwitz. While Lipstadt, clearly shaken (as anyone, Jewish or not, is bound to be when standing at the site of such a potent example of mass slaughter), asks the party to show respect for the victims, the head barrister Richard Rampton (Tom Wilkinson) is focused on calmly gathering evidence. The value of this only becomes obvious in the courtroom, where he delivers his coup de grâce, revealing that his calm pacing around the perimeter of Auschwitz II-Birkenau (which makes him arrive late and upsets everyone, Lipstadt in particular) was actually measuring the distance between the SS barracks and the gas chambers, allowing him to disprove Irving’s assertion that the gas chambers were built as air raid shelters, and thus tilt the whole case in favour of the defence.

The mansplaining triumph, however, happens even before this Sherlockian turn, in the scene in which Rampton visits Lipstadt in her hotel room (uninvited, unannounced) in order to, yet again, convince her that she should not testify or engage with Irving in any form. After he gently (patronisingly) persuades her that “What feels best isn’t necessarily what works best” (!), she, emotionally moved, agrees to “pass her conscience” to him – that is, to a man. By doing this, she abandons not only her own voice, but also the possibility of speaking for Holocaust survivors – the one survivor who does appear as a character in the film being, poignantly, also female. In Lipstadt’s concession that silence is better because it “leads to victory”, it is not difficult to read the paradoxical (pseudo)pragmatic assertion that openly challenging male privilege works, in fact, against gender equality, because it provokes a counterreaction. Initially protesting her own silencing, Lipstadt comes to accept what her character in the script dubs “self-denial” as the only way to beat those who deny the Holocaust.

Self-denial: for instance, denying yourself food for fear of getting ‘fat’ (and thus unattractive for the male gaze); denying yourself fun for fear of being labeled easy or promiscuous (and thus undesirable as a long-term partner); denying yourself time alone for fear of being seen as selfish or uncaring (and thus, clearly, unfit for a relationship). Silence: for instance, letting men speak first for fear of being seen as pushy (and thus too challenging); for instance, not speaking up when other women are oppressed, for fear of being seen as too confrontational (and thus, of course, difficult); for instance, not reporting sexual harassment, for fear of retribution, shame, isolation (self-explanatory). In celebrating ‘self-denial’, the film, then, patently reinscribes the stereotype of the patient, silent female.

Obviously, there is value in refusing to engage with outrageous liars; equally, there are issues that should remain beyond discussion – whether the Holocaust happened being one of them. Yet, selective silencing masquerading as strategy – note that Lipstadt is not allowed to speak (not even to the media), while Rampton communicates his contempt for Irving by not looking at him (thus denying him the ‘honour’ of the male gaze) – too often serves to reproduce the structural inequalities that can persist even under a legal system that purports to be egalitarian.

Most interestingly, the fact that a film that is manifestly about mansplaining manages to reproduce quite a few mansplaining tropes (and, I would argue, not always in a self-referential or ironic manner) serves as a poignant reminder of how deeply the ‘splaining complex is embedded not only in politics or academia, but also in cultural representations. This is something we need to remain acutely aware of in the age of ‘post-truth’ or ‘post-facts’. If resistance to lying politicians and media is going to take the form of the (re)assertion of one, indisputable truth, and the concomitant legitimation of those who claim to know it – strangely enough, most often white, privileged men – then we’d better think of alternatives, and quickly.

What after Brexit? We don’t know, and if we did, we wouldn’t dare say

[This post originally appeared on the Sociological Review blog, Sunday 3rd July, 2016]

In dark times
Will there also be singing?
Yes, there will be singing
About the dark times.

– Bertolt Brecht

Sociologists are notoriously bad at prediction. The collapse of the Soviet Union is a good example – not only did no one (or almost no one) predict it would happen, it also challenged social theory’s dearly-held assumptions about the world order and the ‘nature’ of both socialism and capitalism. When the next big ‘extraneous’ shocks to the Western world – 9/11 and the 2008 economic crisis – hit, we were almost as unprepared: save for a few isolated voices, no one foresaw either the events or the full scale of their consequences.

The victory of the Leave campaign and Britain’s likely exit from the European Union present a similar challenge. Of course, in this case, everyone knew it might happen, but there are surprisingly few ideas of what the consequences will be – not on the short-term political level, where the scenarios seem pretty clear; but in terms of longer-term societal impact – either on the macro- or micro-sociological level.

Of course, anyone but the direst of positivists will be quick to point out that sociology does not predict events – at best, it can aim to explain them retroactively. Public intellectuals have already offered explanations for the referendum result, ranging from the exacerbation of xenophobia due to austerity, to the lack of awareness of what the EU does. However, as Will Davies’ more in-depth analysis suggests, how these come together is far from obvious. While it is important to work on understanding them, the fact that we are at a point of intensified morphogenesis – or multiple critical junctures – means we cannot stand aside and wait until they unfold.

Methodological debates temporarily aside, I want to argue that one of the things that prevent us from making (informed) predictions is that we’re afraid of what the future might hold. The progressive ethos that permeates the discipline can make it difficult to think of scenarios predicated on a different worldview. A similar bias kept social scientists from realizing that countries seen as examples of real socialism – like the Soviet Union, and particularly the former Yugoslavia – could ever fall apart, especially in a violent manner. The starry-eyed assumption that exit from the European Union could be a portent of a new era of progressive politics in the UK is a case in point. As much as I would like to see it happen, we need to seriously consider other possibilities – or, perhaps, the possibility that what the future has in store is beyond our darkest dreams. In the past years, there has been a resurgence of thinking about utopias as critical alternatives to neoliberalism. Alongside this, we need to actively start thinking about dystopias – not as a way of succumbing to despair, but as a way of using the sociological imagination to understand both the societal causes of the trends we’re observing – nationalism, racism, xenophobia, and so on – and our own fear of them.

Clearly, a strong argument against making long-term predictions is the reputational risk – to ourselves and the discipline – this involves. If the failure of Marx’s prediction of the inevitability of capitalism’s collapse is still occasionally brought up as a critique of Marxism, offering longer-term forecasts in a context where the social sciences are increasingly held accountable to the public (i.e. policymakers) rightfully seems tricky. But this is where the sociological community has a role to play. Instead of bemoaning the glory of bygone days, we can create spaces from which to consider possible scenarios – even if some of them are bleak. In the final instance, to borrow from Henshel: the future cannot be predicted, but futures can be invented.

Jana Bacevic is a PhD researcher in the Department of Sociology at the University of Cambridge. She tweets at @jana_bacevic.

Education – cure or symptom?

[This post originally appeared on the website of REKOM, the initiative for the establishment of a reconciliation commission for former Yugoslavia].

When speaking of the processes of facing the past and reconciliation within the context of violent conflict, education is often accorded a major role. Educational practices and discourses have the ability to reproduce or widen existing social inequalities, or even to create new divisions. The introduction of textbooks which have painted a “purified” picture of a nation’s participation in and responsibility for the war crimes perpetrated during the wars in the 1990s, or the abolition of educational programmes and classes taught in minority languages, are just some of the examples found in the former Yugoslavia. Such moves are usually linked with a repressive politics that existed before, during and sometimes after the conflict itself.

Because of that, reconciliation programmes are often aimed at achieving formal equality within institutions or an equal representation of differing views in public discourses. Such an approach is based on the idea that a change of the public paradigm is the necessary first step in coming to terms with the past. In this particular case, the process of reconciliation is being led by the political and social elites which influence the shaping of public opinion. Similar to the “trickle-down theory” in economics, the assumption is that a change in the official narrative through the institutions, including those in the educational field, will, in time, bring about a change in public awareness – that is, lead the rest of the population to face its traumatic past.

Although the influence of formal discourses cannot be neglected, it is important that we understand that the causes and consequences of conflict, and thus the prosecution of those responsible, usually depend on a whole array of social and economic factors. It is highly unlikely that critical narratives examining the past will find fertile ground in the educational institutions of divided and isolated societies. In this respect, textbooks are just the metaphorical tip of the iceberg. It bears repeating that all educational institutions in Bosnia and Herzegovina, from elementary schools to universities, are ethnically segregated. The situation is similar in Kosovo, where institutional segregation is virtually complete – just like in the nineties, there are in practice two parallel systems in existence. The universities in Macedonia also reflect its constitutional make-up, based on the division of political power between its two largest ethnic groups. Even in more ethnically homogenous communities, such as those found in parts of Serbia or Croatia, the presence of religious education in school curricula – a subject which, in its present format, segregates students according to their faith – stands as a lasting symbol of the impact of identity-based politics on the education system.

The institutionalization of divisions rooted in the legacy of the conflict fought in the former Yugoslavia does not end with education, but pervades other relationships and activities as well, such as employment, freedom of movement, family structure and the creation of informal social networks. It goes without saying that the political parties in all the successor states are, by and large, made up of those who have profited in some way from the breakup of Yugoslavia. The transition from socialist self-governance to neoliberal capitalism has served to further degrade the stability and independence of social institutions. Such a context fosters political ideologies such as chauvinism and nationalism, and breeds fear of all that is different. What we must therefore ask ourselves is not just how to change the content and the paradigm of education in the former Yugoslavia, but also who profits from it staying the way it is.

These questions require critical analysis, not just of the responsibility for the crimes perpetrated during the conflict in the former Yugoslavia, but also of the economic and political legacy of its breakup. This is a huge challenge, which implies dialogue between the different parts of society in each successor-state. Educational institutions, universities and science institutes in particular, can play a potentially major role in establishing such a dialogue. This implies, first and foremost, an agreement on what its rules and goals are – which Habermas considered a crucial element in the development of the public sphere. For as long as there is no such agreement in place, deliberations on contemporary history will remain fragmented along the lines of ideological affiliation or political belief. Education based on such interpretations of the past thus continues to serve as an instrument of the proliferation of the same (or at least similar) divisions which shaped the dynamics of the conflict following the breakup of the former Yugoslavia, rather than as a motor of change.

This, of course, does not mean that every change in education requires the whole social structure to be changed beforehand, but it does mean that these two elements go hand in hand. Although this change is very likely to be gradual, it is far more important to ensure that it is permanent. In the end, the educational narratives we are dealing with might brush up against the past, but they concern the future.

Jana Bacevic works on social theory and the relationships between knowledge (and education) and political agency. She is presently writing her PhD in sociology at the University of Cambridge, Great Britain, and has a PhD in anthropology from the University of Belgrade. She has worked as a Marie Curie Fellow at the University of Aarhus and taught at the Central European University in Budapest and Singidunum University in Belgrade. Her book “From Class to Identity: Politics of Education Reforms in Former Yugoslavia” was published in 2014 by Central European University Press.

Higher education and politics in the Balkans

In this entry of the thematic week on crisis, Jana Bacevic from the Department of Public Policy, Central European University (Budapest) examines higher education in the context of ethnic and religious divisions in recent Balkan history.

In situations of crisis – whether economic, environmental, or humanitarian – higher education is hardly the first thing to come to mind. Aid and development packages tend to focus on primary education, essential for teaching reading, writing and arithmetic, as well as successful socialization in peer groups, and, in some cases, on secondary – usually vocational – education, supposed to enable people to work both during and in the immediate aftermath of the crisis. However, slowly but steadily, higher education is beginning to occupy a more prominent place in contexts of crisis. Why is this the case?

Critics would say higher education is a luxury, and that focus on higher education is hardly anything but empty rhetoric aimed at rallying support for the agendas of politicians or trade unions. However, there are many reasons why higher education should not be ignored, even in times of crisis. Issues and policies related to higher education hardly ever stay confined to the university campus, or even to the boundaries of nation-states, whether new or old.

Access to higher education is directly linked to access to work, income, and, to some extent, social and political participation. In this sense, who can access higher education, how, and under which conditions are questions that have explicit political consequences for human and minority rights, social stratification and (in)equality, and the overall quality of life. Higher education institutions do not only reflect the dominant ethos of a society; they also create and reproduce it. Politicians and policymakers know this, and this is why higher education can become such a politically charged issue.

The recent history of higher education in the successor states of former Yugoslavia provides many examples of the interplay between higher education and political dynamics. Early in the conflict, two universities in Bosnia and Herzegovina were divided between ethnic groups. The Serbian staff and students of the University of Sarajevo founded the separate University of East Sarajevo in 1992. The University of Mostar was split between the Croatian part (University of Mostar, or “Sveučilište u Mostaru”) and the Muslim part (University of Mostar “Džemal Bijedić”). In Kosovo, the University of Prishtina was at the very center of political contestation between the two biggest ethnic groups, Albanians and Serbs. Following a series of Kosovo Albanian demonstrations at the end of the 1980s, the Serbian authorities forbade the university to accept any more Albanian students. The result was a complete split of the academic sphere into two domains – the “official”, Serbian one, and the “parallel”, Albanian one, which existed outside of the institutional frameworks.

After the NATO intervention in 1999, the Serbian students and staff fled to the northern part of the province, predominantly controlled by the central Serbian government, re-establishing the university as the “University of Prishtina temporarily located in Kosovska Mitrovica”. Meanwhile, Albanian students and staff returned to the premises of the university in Prishtina, developing a new system under close supervision of the international administration. Just like in Bosnia, the configuration of higher education today reflects the deep ethnic and social cleavages that are the legacy of the conflict.

Higher education can become a subject of political contestation even in the absence of a large-scale armed conflict. For instance, one of the issues that precipitated the conflict between ethnic Albanians and Macedonian police in the Former Yugoslav Republic of Macedonia in 2001 was the demand of ethnic Albanian parties for a separate university in their own language. Following the de facto consociational arrangement provided by the terms of the Ohrid Framework Agreement peace treaty, the previously private Tetovo University was given public status in 2004. However, the same town was already home to the Southeast European University, founded in 2001 by the international community (primarily the OSCE) in order to work on post-conflict development and foster the integration of ethnic Albanian and ethnic Macedonian youth. Currently, the two universities coexist, teaching similar programmes and even sharing staff, although they differ in their approach to the use of languages, as well as in the composition of the student body.

A similar story can be told about Novi Pazar, the administrative center of Sandžak, a multiethnic region of Serbia with a high proportion of Bosniak Muslims. The private International University of Novi Pazar was founded by a local Muslim religious leader in 2002, with support from the government in Belgrade, which, at the time, thought it would be a good solution for the integration of Bosniak Muslims within the framework of the state. Two years later, however, after a change of government and political climate, the state founded a new university, named the State University of Novi Pazar, withdrawing support from the International University. The two universities continue to exist side by side, teaching similar programmes and, in theory, competing for the same population of students. Their rivalry reflects and reproduces the political, social and, not least of all, ethnic cleavages in Sandžak.

Universities in the Western Balkans are just some of the examples in which the links between higher education and social divisions can be seen most clearly. However, they are neither isolated nor unique: conflicts can persist and occur across and outside of ethnic and religious lines, sometimes teeming below the surface even in societies that, from the outside, appear peaceful and stable. This is why higher education should not only be reactive, responding to cleavages and conflicts once they become visible, but rather proactive, revealing and working to abolish the multiple and often hidden structures of power that reproduce inequalities. On the one hand, this can be done through policies that seek to ensure equal access to and representation in higher education institutions. On the other, it can also mean engagement in research and activism aimed at raising awareness of the mechanisms through which inequalities and injustice are perpetuated. This latter mission, however, requires that higher education institutions turn a critical eye towards their own policies and practices, and examine the ways in which they are – perhaps unwittingly – reproducing the societal divisions that, in times of crisis, can easily evolve into open conflicts. Frequently, this is the hardest task of all.

—–

Jana Bacevic holds a PhD (2008) in Social Anthropology from the University of Belgrade. Previously she taught at the University of Belgrade and Singidunum University and worked as a higher education expert on a number of projects aimed at developing education in the post-conflict societies of the Western Balkans. Her research interests lie at the intersection of sociology, anthropology, politics and the philosophy of knowledge, and her book, “From Class to Identity: Politics of Education Reforms in Former Yugoslavia”, is being published by CEU Press in 2013.