did you feel it?

Our Techno Jouissance

How Could Intelligent and Affectively Orientated Technologies Affect the Brain?

Ben Burtenshaw

September 2, 2015, artist contribution

This essay (part of the Open! Co-Op Academy project “did you feel it?”) is concerned with how affective forces move through our technological encounters, and with how technology itself can alter both these forces and their perception within our brain. It is not so much about exchanges between people as about situations where other, non-human intelligences come into the fray. How and where do these intelligences manifest, and what psychology do they bring about? In the latter part of the essay, I’ll compare some examples with existing theories of affective labour in order to give an idea of what could come of these forces in the future. Affective labour has become central to consumer capitalism, seen in the prevalence of roles like that of the call centre worker, and so related theories can show how capital already functions affectively.

Film still, Ex Machina, 2015.

To give us a bearing on the concerns in question, it is necessary to define what aspect of technology we’re talking about. In the film Ex Machina (2015), the character Caleb is challenged with discerning whether the film’s “robot”, Ava, is intelligent. He draws on philosopher Frank Jackson’s thought experiment, Mary in the black and white room:

Mary is a brilliant scientist who is, for whatever reason, forced to investigate the world from a black and white room via a black and white television monitor. She specializes in the neurophysiology of vision and acquires, let us suppose, all the physical information there is to obtain about what goes on when we see ripe tomatoes, or the sky, and use terms like “red”, “blue”, and so on. She discovers, for example, just which wavelength combinations from the sky stimulate the retina, and exactly how this produces via the central nervous system the contraction of the vocal cords and expulsion of air from the lungs that results in the uttering of the sentence “The sky is blue”. ... What will happen when Mary is released from her black and white room or is given a color television monitor? Will she learn anything or not?
—Frank Jackson, 1982

In response, Caleb asks: “but does she know what colour feels like?” This distorting perspective of knowledge, intelligence and feeling, sitting somewhere on the edge of Mary’s door, is the mixed intelligence I wish to address. It could be called the technological threshold,1 or the space between us and technology, which we mediate in our interaction with it.

Caleb’s question is an effort, within Ex Machina, to debunk the myth that brains and computers are essentially the same. French philosopher Catherine Malabou proposes that this analogy was held up in the fifties as a way of advancing early developments in artificial intelligence, but is now a hindrance (Catherine Malabou, 2008, 34). The analogy rests on the idea of the program: thinking is taken to be a kind of calculating or programming, so that if you can get a computer to calculate enough, it will be as if it were thinking. Jackson’s thought experiment and Caleb’s question highlight the errors in this analogy; there is a fundamental difference in the way a human and a machine think. That difference sits at the very edge of human thought. It is what philosopher Daniel Dennett calls “qualia”, “an unfamiliar term for something that could not be more familiar to each of us: the ways things seem to us” (Daniel Dennett, 1985). Qualia are the archetypal argument against materialist accounts of consciousness:

The sensation of color cannot be accounted for by the physicist's objective picture of light-waves. Could the physiologist account for it, if he had fuller knowledge than he has of the processes in the retina and the nervous processes set up by them in the optical nerve bundles and in the brain? I do not think so.
—Erwin Schrödinger, 1944

Qualia are that thin little edge of our consciousness, those things that feel idiosyncratic, personal or unexplainable. The concern of this essay is how this edge is acted upon when pressed up against technology. This is where plasticity comes in. Plasticity is a term used in neuroscience, and by Malabou, to refer to changes in the brain’s synapses and pathways brought about by external factors: environment, behaviour, physical trauma and many more. So how can technology cause plasticity?

Drive, 2011, www.youtube.com/watch?v=i5ufgkJ-uVE.

In the film Drive (2011), starring Ryan Gosling and Carey Mulligan, Gosling’s character, the unnamed “driver”, is a manifestation of a machinic convergence, or a personification of a technological threshold. He is unlike the more established figure of the cyborg, such as Arnold Schwarzenegger’s Terminator, which is literally some mix of man and machine. Instead, the driver’s machinification takes place in his mind: he has the cold and ruthless mentality of the very car he drives. It is as if the car has left an imprint on his brain through repeated affective interactions with it, an artificial plasticity. This could be a real concern for the advancement of affective technology, as already touched on by Hannah Arendt in The Human Condition:

If it should turn out to be true that knowledge (in the modern sense of know-how) and thought have parted company for good, then we would indeed become the helpless slaves, not so much of our machines as of our know-how, thoughtless creatures at the mercy of every gadget which is technically possible, no matter how murderous it is.
—Hannah Arendt, 1958

This fearful comment by Arendt invites some obvious and immediate comparisons with contemporary society, but within the scope of this essay the most resonant element is the term “our know-how”. This is where the practical knowledge or skill needed to use a device acts as a kind of technological imprint. In essence, this is the same process that takes place in the driver’s mind, where the technological threshold is produced by social, emotional, spiritual and political factors that we mediate through technology. As these relationships become more complex, via various intelligent technologies and intense interactions, it is “our know-how” that is at stake (Arendt 1958). Arendt’s notion takes this complexity into account by turning from the hardware of things, devices, robots and artificial intelligence to their psychological dimension, or the imprint of this threshold. Arendt’s “know-how” is, then, the psychology within us that this hardware leaves behind. This imprint is a kind of murky spirit or, in the driver’s case, the personality of a car manifest in his character,2 brought about by his affective relationships with loved ones and his personal social conditions.

The imprint could be understood as the outcome of a consciousness pressing up against the technological threshold, through the affective relations embedded into technology. According to philosopher Brian Massumi, when “you affect something, you are opening yourself up to being affected in turn”, and it is not possible to have a singular affective direction (Brian Massumi, 2015, 103). The question is then: What happens when we enter into these affective relations with technology?

We should at this stage clear up our understanding of affective forces. In psychological terms, affect is the process of feeling something through the body.3 Massumi elaborates on this physicality, saying that affect occurs within the half-second before the rational decisions of the brain play a part. He asks: “What happens during the missing half second?”

[T]he half-second is missed not because it is empty, but because it is overfull, in excess of the actually performed action and of its ascribed meaning. Will and consciousness are subtractive. They are limitative, derived functions which reduce a complexity too rich to be functionally expressed. It should be noted in particular that during the mysterious half-second, what we think of as “higher” functions, such as volition, are apparently being performed by autonomic, bodily reactions occurring in the brain but outside consciousness, and between brain and finger, but prior to action and expression.
—Brian Massumi, 1995, 29

By asserting that functions such as volition can happen almost autonomously and outside of consciousness, Massumi takes functions that humans pride themselves on away from conscious will. Affect is then the “feeling” of this missed physicality, or the feeling of feeling.

It is into this space that I would like to slide our assessment of technology and affective forces. In explicit terms, through algorithmic data mining and abstraction, feelings like these, as well as those autonomic bodily actions, are used by technology companies to advertise, research, evaluate and trade. This is presented in more psychologically simplistic terms than Ex Machina or Drive, but one can still imagine how sustained interaction with these algorithms works affectively. What happens when these actions become a more insurgent and radical force within that missed half-second? What happens when these algorithms and intelligent technologies know and inform our “goals”?

The Nike+ Running app is among the most popular fitness applications for smartphones. Users enter their height and weight and begin to run whilst carrying their device; using the phone’s GPS and an optional heart rate monitor, the app records the user’s biometric data. The user can then select – or is prompted to select – a main goal: for example, to run 10 kilometres in under 45 minutes, or to run a marathon. The software then configures a training programme over the coming weeks, using an audio interface whilst running, dietary and motivational prompts whilst not running, and social competition to motivate the user to complete his or her goal.

The application uses a combination of factual and motivational prompts. These inform users of how far and how long they’ve run and, interestingly, how much further and longer they should run. This creates a blurring in which the runner sits at the centre of fact and speculation. Swapping back and forth between the two increases the effectiveness of the affective language on the runner. In the tiring state that comes with any exercise, it is easy to imagine what this swapping does in that half-second: it blurs the boundary between real and imposed horizons. The interface’s voice itself reeks of the perpetual forward march of capital and the goal-orientated society we’re a part of.
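
To make this fact-and-speculation swapping concrete, here is a minimal, hypothetical sketch in Python of how an audio interface might alternate between reporting what a runner has done and projecting what he or she should still do. Every name and value in it is an illustrative assumption of mine; it is not Nike’s actual implementation.

    # A minimal, hypothetical sketch of prompt-swapping in a running app.
    # None of these names or values come from Nike's software; they are
    # illustrative assumptions only.
    from dataclasses import dataclass

    @dataclass
    class RunState:
        distance_km: float   # distance covered so far
        elapsed_min: float   # time elapsed so far
        goal_km: float       # distance goal selected by the runner
        goal_min: float      # target time for that goal

    def factual_prompt(state: RunState) -> str:
        """Report what has actually happened: the 'fact' half of the blur."""
        return f"You have run {state.distance_km:.1f} km in {state.elapsed_min:.0f} minutes."

    def motivational_prompt(state: RunState) -> str:
        """Project a horizon not yet reached: the 'speculation' half."""
        remaining = max(state.goal_km - state.distance_km, 0.0)
        return f"Only {remaining:.1f} km to go. Hold your pace to finish in {state.goal_min:.0f} minutes."

    def next_prompt(state: RunState, tick: int) -> str:
        """Alternate fact and speculation on successive audio cues."""
        return factual_prompt(state) if tick % 2 == 0 else motivational_prompt(state)

    if __name__ == "__main__":
        state = RunState(distance_km=6.4, elapsed_min=31.0, goal_km=10.0, goal_min=45.0)
        for tick in range(4):
            print(next_prompt(state, tick))

The point of the sketch is simply that the same biometric data feeds both the factual voice and the speculative one; the blurring is built into the interface itself.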

For this essay, though, the Nike app emphasises technology and affect in physical proximity. It shows technology serving as a site where our fixations, cravings and desires are mixed together, not just by human will but by a certain technological will as well. This is how the threshold becomes less about the a-to-b communication of affect and more about a murky conglomerate that our minds are trying to deal with.

An important issue here is figuring out what this technological threshold looks and feels like, and how we play a part in it both physically and socially.

Apple Watch, 2014, www.youtube.com/watch?v=y-waTi8BPdk.

As we have noted, among the major mechanisms operating in this threshold are affective forces. They are more complex than human feelings; they are what bind people together, such as sorrow, hunger or passion. Ultimately, they rely on some form of relationship to the body and, in turn, feed on its carnal energies. Affective forces are now the coalface of advancement for the technologies we use, and can be considered the next unconquered ground. I would like to compare these technologies to the ways affective forces have already manifested themselves in labour practice.

Call centre, BBC, 2014, www.youtube.com/watch?v=Isn0uhVa5-s.

The call centre worker is an archetypal affective labourer. Affective labour is work that is intended to produce or modify emotional experience in people: service with a smile. The call centre worker’s job is to be considerate in order to sell things to people or to handle their complaints. Such workers are often treated badly and have to absorb people’s hatred, yet respond in a polite and soothing manner. Like the Fordist labourer who must offer up his or her physical body to the factory, the affective labourer must offer up his or her feelings. The affective labourer, in many guises, has become a common symbol and face of capitalism.

Affective labour is relevant here because it exemplifies a more established effect of the machine on brain plasticity. Marx in the nineteenth century described a mix of dead and living labour, and although the concern here relates to social manifestations of affective forces, we can use labour to understand how social interactions will progress. Let’s look at Griff’s4 dismay at the monotony of daily life, and at how such labourers hold jobs that will eventually be replaced by machines. In order to perform this task, and to interact with “dead labour” for prolonged periods of time, a part of the worker must clearly die, or at least stop acting as if he or she were living and emotional; hence Griff’s eventual emotional outburst, which leaves him unemployed. There are many other affective labour roles in today’s society, and Griff’s struggle is pretty common, but will this same effect also occur socially through interactions with more intelligent technologies?

Ultimately no, is my guess. Why would a person pay for, and volunteer to enter into, this type of relationship? But, interestingly, voluntarism is becoming a more and more integral component of society. Take David Cameron’s “big society”: basically the idea that we should all contribute by volunteering in our local communities, which in practice became a way to get citizens to do for free what the state should be doing. The big society, like most post-Thatcherite politics, is a liberalised form of conservatism. Traditional Tory politics involves reducing the commitment of the state whilst increasing the responsibility of the now-unpaid worker. This is a common application of the free market onto the social sphere as labour; in other words, it transforms us into labourers by appealing to morality: “Don’t you want to help out?” So maybe there is a chance for the political right to harness voluntarism and convince us to go along with the “big” plan. But how – and why – would technology be involved?

To complicate slightly how and why we engage with these technologies, and to speculate on how this could manifest itself, let’s mix the social and labour once again. I’d like to propose that we view the affective joy we take from technologies like the Apple Watch as “jouissance”, a French term used by philosophers (including Jean-François Lyotard) to describe a certain kind of joy that one craves but that can also be harmful.

[T]he English unemployed did not have to become workers to survive, they … enjoyed the hysterical, masochistic, whatever exhaustion it was of hanging on in the mines, in the foundries, in the factories, in hell, they enjoyed it, enjoyed the mad destruction of their organic body which was indeed imposed upon them, they enjoyed the decomposition of their personal identity, the identity that the peasant tradition had constructed for them, enjoyed the dissolutions of their families and villages, and enjoyed the new monstrous anonymity of the suburbs and the pubs in morning and evening.
—Jean-Francois Lyotard, 1991

However, I don’t agree with Lyotard’s view of the English working class here, and I’d argue that there is a much longer and more nuanced process of subjugation: a process which stretches back, yes, to peasant traditions, but also to a much stauncher class divide. This divide was then psychologically entrenched by the First and Second World Wars, whereas many other European countries had by this time begun to shed their class systems. This was then met by working-class support for the construction of the National Health Service and other post-war politics, which were subsequently smashed by Margaret Thatcher’s neo-conservative project; this brings us to the moment Lyotard is referring to. In essence, the British proletariat was at the centre of an economic whirligig that eventually produced labourers like Griff. I’m trying here to elaborate slightly on the longer process that Lyotard is referring to, in order to see whether a process of jouissance might have worked affectively. In turn, this helps us to imagine how a jouissance could exist in the techno-social context of companies like Apple and Nike.

We could say that technology’s advancement within this affective arena, and towards the goals of capital, will result in the transposal of the fickle relationships Griff encounters in the call centre onto our social relations. Once we end up with these fickle relationships, it is unlikely that we will descend into disarray, take to the streets and demand our meaningful relationships back. We will probably end up craving these fickle relationships instead. Not unlike the driver and Rambo, who are trained to be killing machines, and not unlike Griff, who has become a trained answering machine, will we eventually end up being simply trained feeling machines? Will we then just revel in the decomposition of our personal identities and transform ourselves into affective labourers for good, just for the sake of a semblance of horizon and connection? This is not necessarily inherent to the technology itself; it lies in our minds, or our “know-how”, and it occurs through the imprinting of a technological threshold. It is an imprint caused by the inherent two-way nature of affective forces, and it results in the techno-jouissance that we crave. Like Lyotard’s hypothetical working class, we are left craving our own destruction.

References

  • Apple, Apple Watch – Introducing Apple Watch, 2014, www.youtube.com, accessed 10 October 2014.
  • Arendt, H., The Human Condition, 2nd ed., Chicago: University of Chicago Press, 1958.
  • BBC, The Call Centre, 2014, www.youtube.com, accessed 2 May 2015.
  • Brassier, R., “Prometheanism and its Critics”, in: R. Mackay and A. Avanessian (eds.), #ACCELERATE: The Accelerationist Reader, Falmouth, UK: Urbanomic, 2014, 467–488.
  • Dennett, D. C., Quining Qualia, Medford (Massachusetts): Tufts University, 1985.
  • Drive, film directed by Nicolas Winding Refn, US: FilmDistrict, 2011.
  • Ex Machina, film directed by Alex Garland, UK: Film4, 2015.
  • First Blood, film directed by Ted Kotcheff, US: Anabasis N.V., 1982.
  • Hardt, M. and Negri, A., Empire, Cambridge (Massachusetts): Harvard University Press, 2001.
  • Jackson, F., “Epiphenomenal Qualia”, Philosophical Quarterly, vol. 32 (1982), 127–136.
  • Lyotard, J.-F., The Inhuman: Reflections on Time, Oxford: Basil Blackwell Publishers, 1991.
  • Malabou, C., What Should We Do with Our Brain?, New York: Fordham University Press, 2008.
  • Massumi, B., “The Autonomy of Affect”, in: Parables for the Virtual, Durham (North Carolina): Duke University Press, 23–45, 1995.
  • Massumi, B., The Power at the End of the Economy, Durham (North Carolina): Duke University Press, 2015.
  • Nike, Nike+ Running (version 1.6.2), 2015, play.google.com, accessed 2 May 2015.
  • Schrödinger, E., What Is Life? The Physical Aspect of the Living Cell, 1944; 2001 ed., Cambridge: Cambridge University Press.

1. Threshold is a key term in Brian Massumi’s theories of affect. He uses it to describe the two-way relationship inherent in affect, but also a proverbial threshold that, in being affected, one irrevocably steps over. “When you affect something, you are opening yourself up to being affected in turn, and in a slightly different way than you might have been the moment before. You have made a transition, however slight. You have stepped over a threshold. Affect is this passing of a threshold, seen from the point of view of the change in capacity” (Massumi, 2015).

2. In cinema this sort of imprinted character is used often, with Rambo being an archetypal example. He was reprogrammed as a killing machine by the US government and so knew nothing but killing (First Blood, 1982). Drive, however, supplies a less primitive example that doesn’t rely on state force.

3. The study of affect began with Baruch Spinoza, who in 1677 defined affects as states of mind and body, such as pleasure, joy, pain, hunger or desire: emotional states that are inherently bound to the body itself. Although Spinoza’s notion of affect is relatively crude compared to that of a more contemporary thinker of affect like Brian Massumi, Spinoza offers a direct way of understanding how affect is connected to the body.

4. A figure from the 2014 BBC documentary television programme The Call Centre, available online at www.youtube.com (accessed 2 May 2015).

Ben Burtenshaw graduated from the Dutch Art Institute in 2015. His work as an artist is concerned with how art can mediate a perceived technological divide; in other words, with how the ‘truths’ of science are understood and utilised culturally.
