Beyond Privacy

Margins of Freedom

Privacy and the Politics of Labour and Information

Armin Medosch

April 23, 2010

Media artist, writer and curator Armin Medosch traces the development of the meaning of the term ‘freedom’ and of the idea of privacy that goes with it. The solution to the current crisis concerning privacy reaches beyond finding a new balance between private and public. According to Medosch, solutions should be sought in the realm of the digital commons, where freedom is not seen as something to be achieved on one’s own by accumulating possessions, but as something created collectively by sharing knowledge.

‘Privacy is the claim of individuals, groups or institutions to determine for themselves when, how and to what extent information about them is communicated to others.’1 If we accept this definition, it is only too obvious how little control we have over information about ourselves. The gathering of personalized information is not an involuntary by-product of technology but a key component of the way the ‘information society’ works. The rationale for information gathering stems partly from the ‘need’ of modern societies to have enough knowledge about themselves to keep functioning; but this involves further ‘needs’, such as to control labour, shape consumption and create a ‘database state’. The foundational myths of the information age have inscribed themselves into the developmental path of information and communication technologies (ict). While a desire for automated surveillance has long existed, it is now matched by an amplified capacity to actually carry it out.

A number of national and international campaign groups such as FoeBuD e.V. in Germany, quintessenz.at in Austria, the eff and aclu in the usa, and the European umbrella organization edri are fighting the erosion of privacy. Some of them organize the annual ‘Big Brother Awards’ (bba), at which the worst anti-privacy measures are ‘honoured’. In the uk, where the bba was invented, the physical object awarded is a statue of a military boot stamping on a head. But the jackboot is not an image that people living in liberal democracies associate with their reality. To warn against an outdated critique of totalitarianism is not to claim that liberal democracies produce no totalitarian techniques. This text tries to support effective strategies for counter-surveillance by developing a richer heuristic model, connecting the historic function of privacy in liberal democracies with the overall technopolitical dynamics that fostered its rise then and foster its decline today.2

Privacy in a Free Democracy

Privacy is an important category for the political-philosophical framework of liberalism3 and has a constitutive function within the legal framework of liberal democracy by expressing the idea of the protection of individual freedom and autonomy from unjust intrusions or regulations of the state.4 Out of the intimate as a nucleus of the private sphere, the public sphere was created by bourgeois citizen journalists, argues Habermas.5 ‘The public sphere of civil society… ultimately came to assert itself as the only legitimate source of [the] law.’6 Habermas acknowledges that this political function of the public sphere could gain valency only once ‘commodity exchange and social labour became largely emancipated from governmental directives’. The market, Habermas concludes, was ‘the social precondition for a “developed” bourgeois public sphere’.7

E.P. Thompson’s account of the making of the English working class shows that the reading public was not restricted to the bourgeoisie.8 Inspired by the French Revolution, ‘English Jacobins’ met in taverns and private houses, bookshops and cafes to read revolutionary literature and demand political reforms. These ‘plebeian radicals’ placed high value on self-education, egalitarianism, rational criticism of religious and political institutions, a conscious republicanism and a strong internationalism.9 The ruling class reacted with the suspension of habeas corpus and a series of repressive laws such as the Seditious Meetings and Combination Acts. As a result, the ‘plebeian radicals’ were driven leftwards and underground,10 so that they failed to create stronger ties with those parts of the bourgeoisie who, under different conditions (no war with France, for example), might have sided with them. The early working class prefigured many aspects of the working-class ideology after 1830, which held in high regard ‘the rights of the press, of speech, of meeting and of personal liberty’, writes Thompson, dismissive of ‘the notion to be found in some late “Marxist” interpretation’ that these values were inherited from ‘bourgeois individualism’.11

The establishment of the bourgeoisie as a privileged legal subject was based on legislation that enshrined into law the suppression of the English working class, argues Saskia Sassen.12 Habermas’s concept of the reasoning public is an idealization that needs to be called into question. Maybe the public sphere does not necessarily develop out of the intimacy of the private sphere but rather out of a political process of the shaping of class consciousness, whether between members of the bourgeoisie or a very diverse group of artisans, craftsmen and -women and labourers.

The particular conditions set by the early defeat of the English working class had a determining influence on the path along which technology developed out of antagonistic class relationships. A specific version of technological progress under capitalist conditions was set in motion, which sought direct control of workers at the site of production and the displacement of skilled human labour by machines. ‘It is a result of the division of labour in manufacture that the worker is brought face to face with the intellectual potentialities of the material process of production as the property of another and as a power that rules over him,’ wrote Karl Marx,13 capturing a basic tendency that is still at work and has only intensified.

Armand Mattelart14 argues that an information-age-before-the-name started in France with Condorcet’s conception of statistics as a ‘social physics’ at the time of the French Revolution. Enlightenment philosophers made mathematical thinking the yardstick for ‘judging the quality of citizens and the values of universalism’. From Condorcet via the British tax system during the Napoleonic wars, the development of statistics leads in the course of the nineteenth century to an ‘insurance society’15 in which the profitability of businesses and the success of governments depend on the ability to apply probabilistic ‘technologies’ to the prediction and management of the future.

The philosopher and historian of science Simon Schaffer sees a link between the ‘growing system of social surveillance in Great Britain in the early 19th century and the emerging mechanisation of natural philosophies of mind’.16 According to Schaffer, the ‘politics of intelligence’ of the time located ‘intelligence’ in machinery and its conception, while at the same time the unity of manual and mental labour was broken. A key protagonist in this ideological battle was Charles Babbage, the designer of the ‘difference engine’ and the ‘analytical engine’. Babbage was inspired by Gaspard de Prony’s application of the principle of the division of labour to the task of converting old measurements into the new uniform decimal system. Babbage’s ‘dream’ was to implement such a division of labour in his calculating machines. The displacement of human mental labour by a machine was instantly connected with the analogy of artificial ‘intelligence’ by the circle around Babbage. This ‘vision’ was developed alongside an analogy between the internal organization of Babbage’s mechanical calculators and the view of the mechanized factory as a Benthamite Panopticon. Babbage and other ‘factory tourists’ – middle-class intellectuals who travelled to the new factory districts in the north of England – gave accounts ‘of the factory as a transparent and rational system designed to demolish traditional and customary networks of skill and artisan culture’, reports Schaffer. Not only did the new factories make artisans unemployed, but the artisans’ contribution to the development of new technologies was played down to legitimize the existing class structure. The Babbage principle states: ‘That the master manufacturer by dividing the work to be executed into different processes, each requiring different degrees of skill or of force, can purchase exactly that precise quantity of both which is necessary for each process.’17
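
How literally the division of labour could be built into machinery is easiest to see in the method of finite differences that the difference engine mechanized. The short sketch below is added here purely as an illustration and is not drawn from Schaffer or Babbage: once the initial differences of a polynomial have been set up, every further value of the table is produced by addition alone, the de-skilled, repetitive step that de Prony had assigned to his least trained human computers and that Babbage handed over to gears.

```python
# A minimal sketch (illustration only): the method of finite differences.
# Setting up the initial differences is the 'skilled' work; every further
# table value is then produced by additions alone.

def difference_table(poly, n_values):
    """Tabulate a polynomial (coefficients given lowest degree first)
    at x = 0, 1, 2, ... using only repeated addition after the set-up."""
    degree = len(poly) - 1

    # Set-up: evaluate the polynomial at the first few points and derive
    # the initial column of finite differences.
    first = [sum(c * x**i for i, c in enumerate(poly)) for x in range(degree + 1)]
    diffs = []
    row = first[:]
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    # 'Cranking the engine': each new value needs nothing but additions.
    values = []
    for _ in range(n_values):
        values.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# x^2 + x + 41, a classic demonstration polynomial, tabulated for x = 0..7.
print(difference_table([41, 1, 1], 8))  # [41, 43, 47, 53, 61, 71, 83, 97]
```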

This legacy contributes to the blueprint of the factory as well as the calculation engine, according to Schaffer. The early nineteenth-century ‘politics of intelligence’ can be understood as the forerunner of the project of artificial intelligence (ai) developed by the pioneers of the computer age, Turing, Shannon, Von Neumann and Wiener.18

Harry Braverman’s critique of Taylorism exposes the key principles that shaped the emergence of ‘modern management’. Claiming Babbage as a direct forerunner of F.W. Taylor,19 Braverman argues that the ‘absolute necessity’ to control each step of the labour process and its mode of execution makes necessary the creation of a monopoly of knowledge about the work process.20 Management assumes ‘the burden of gathering together all of the traditional knowledge which in the past has been possessed by the workmen and then of classifying, tabulating and reducing this knowledge to rules, laws, and formulae . . .’21 This logic also requires that ‘every activity in production have its several parallel activities in the management center’.22 Parallel to the flow of things, a flow of paper comes into existence, created by the new professional class of middle managers who are busy with the gathering of data and the planning, organization and supervision of production.23 Their work is subjected to the same kind of careful fragmentation, designed by top management to keep the strings of control tightly in their hands.24 The ‘flow of paper’ created by the parallel work of planning has meanwhile been transformed into a flow of information: the accumulated ‘intelligence’ of management encoded in software.

The introduction of mass production brought such increased levels of material flows, argues Beniger,25 that it triggered a ‘crisis of control’ by the mid-nineteenth century. The crisis is resolved through the combination of a number of innovations such as the development of modern management and modern accounting and the introduction of modern media such as the telegraph, telephone and typewriter. Together, they enable the creation of large-scale bureaucracy, resulting in the particular form of organization embodied in the ‘modern corporation’. There are strong co-dependencies in those technoeconomic ‘revolutions’. Railroads and the telegraph grow across the North American continent literally ‘together’. The first companies to develop modern management techniques are themselves ‘networks’: railroad, telegraph and telephone networks.26 The control revolution drives capitalism’s hunger for ‘information’ and may provide a non-military, or at least pre-military, explanation for the need to invent the computer.

From Fordism to Post-Fordism

When Fordism became the leading technological paradigm after the Second World War, it depended on certain macroeconomic stabilization factors, which resulted in the requirement to control not only the production process but also the markets.27 For the corporations, predicting and influencing future levels of consumption became a key part of their activity. In the early twentieth century a number of ‘mass feedback’ techniques were developed, such as market research, the Gallup poll, opinion surveys, indices of retail sales and Nielsen’s radio rating.28 New sociological schools started empirical research on ‘the effects of media on receivers and the constant evolution of knowledge, behaviour, attitudes, emotions, opinions and actions’. This research was not purely academic but carried out in response to practical objectives.29 ‘The sponsors of those studies were concerned about the effects of government information campaigns, advertisement campaigns and army propaganda during wartime.’30 The measurement of audiences with a view to regulating their behaviour as consumers and voters became the basis of what Brian Holmes calls Nielsenism, an interpretation of society as a cybernetic system with informational flows as control loops.31 The notions of ‘information’, ‘feedback’ and ‘systems’ serve as intermediaries for a number of different processes which all depend on the gathering of ‘information’ about the social properties of individuals and groups.

By the end of the 1960s Fordism enters a crisis resulting from the rigidities of the system, successful imitation by competitors and student and worker protest. From within the old technoeconomic paradigm a new paradigm based on microprocessors, telecommunications and information unfolds.32 Concomitant with those shifts and transformations is the emergence of an advanced version of a more complex cybernetic system of control and seduction. More than ever, the integration of feedback circuits into larger control systems relies on predictive algorithms, to paraphrase Brian Holmes. This upgraded paradigm of cybernetic control is no longer based on narrow functionalist and behaviourist ideas of ‘manipulation’. Instead, it relies on more indirect, more internalized, more capillary forms of power and self-control. In the new postindustrial societies, the ‘major professional preoccupation is pre-emptively shaping the consciousness of the consumer’.33 The conditions of the networked society, the restructuring of management hierarchies, more decentralization, increased autonomy of workers in production and more individualism and freedom in society in general all point towards a greater margin of autonomy. The rise of financial markets, however, strengthens the capacity for the centralization of capital and power, making excessive use of informational tools for risk management. The atomized individuals are allowed to dance more freely as long as central power functions are not affected – or are even better served – by that increased margin of freedom.

In informational capitalism, the same technologies that appear to be fun and a vehicle for self-realization at the front-end have an entirely different dimension at the back-end. At the front-end, the aesthetics of the commodity34 makes seductive promises about the use-value of goods. It is in the nature of informational capitalism to emphasize the front-end while hiding the back-end function. The relationship between front-end and back-end is, in technical terms, that between client and server, the two connected by the metaphor of the ‘interface’. The interface can be a web page for e-commerce or an e-government platform, or a cashier’s desk in a bank or a retail store.

On the web, the ‘empowerment’ of the user on Web 2.0 platforms has been emphasized by many authors. Those platforms, however, are based on centralized server infrastructures, entirely under the control of the company hosting those social interactions. When it comes to harnessing the accumulation of knowledge, the server back-end is the privileged site. The techniques developed during the first decades of the twentieth century, summarized under ‘mass feedback’, have been greatly enhanced through digitalization and the ready availability of user data in server log-files and on Internet exchanges. The automated analysis of data flows passing through networked information structures creates the new knowledge of power. At the front-end this promises greater use-value: Amazon started it by suggesting new books; Facebook automatically suggests new friends. At the server side, ever more precise knowledge allows the targeting of individuals and their social networks based on data mining and ‘profiling’. The user profiles, maps of individuals and their networked relationships, become tradable commodities themselves.
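
The double character of such log data can be made concrete with a toy example. The sketch below assumes a deliberately simplified logic and does not describe Amazon’s or Facebook’s actual systems: one and the same purchase log yields, at the front-end, a ‘customers who bought this also bought’ recommendation and, at the back-end, a map of users linked by shared behaviour.

```python
# A minimal sketch (assumed, heavily simplified logic; not any real platform's
# system): the same purchase log serves the front-end as a recommendation and
# the back-end as a profile of who is linked to whom.

from collections import Counter, defaultdict
from itertools import combinations

purchase_log = [
    ("alice", "1984"), ("alice", "Brave New World"),
    ("bob", "1984"), ("bob", "We"),
    ("carol", "1984"), ("carol", "Brave New World"), ("carol", "We"),
]

# Group the log into per-user baskets.
baskets = defaultdict(set)
for user, item in purchase_log:
    baskets[user].add(item)

# Front-end by-product: 'customers who bought X also bought Y'.
co_bought = Counter()
for items in baskets.values():
    for a, b in combinations(sorted(items), 2):
        co_bought[(a, b)] += 1

def recommend(item):
    scores = Counter()
    for (a, b), n in co_bought.items():
        if item == a:
            scores[b] += n
        elif item == b:
            scores[a] += n
    return scores.most_common()

print(recommend("1984"))  # [('Brave New World', 2), ('We', 2)]

# Back-end by-product: users linked by shared behaviour -- a small map of
# 'individuals and their networked relationships', tradable in itself.
profile_links = {(u, v): len(baskets[u] & baskets[v])
                 for u, v in combinations(sorted(baskets), 2)
                 if baskets[u] & baskets[v]}
print(profile_links)  # {('alice', 'bob'): 1, ('alice', 'carol'): 2, ('bob', 'carol'): 2}
```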

Shared Interests

With the increased pervasiveness of icts, ever more areas of society have a dual existence as both virtual and real: analogue space is connected to and interwoven with an electronic space registering real-time information. The system of Just-In-Time production (jit) is a key component of economic globalization and depends on tight control at the intersection of the virtual and the real. So-called ‘logistics’ or ‘supply chain management’ (scm) stretches over continents and involves sophisticated technologies such as rfid tags to manage the flow of raw materials, manufactured parts and end products. Those many components are linked in such a way that ‘it can be argued that jit production is responsible for the change in capitalist production from a push economy to a pull economy’, writes Brian Ashton.35 That means that when a customer takes a can of baked beans from a shelf at Tesco’s, the information is transmitted to all those along the supply chain and the process to replace the item is put in motion. According to Ashton, workers in the logistics industries are ‘bearing the brunt of the competitive pressures in those global supply chains’, while their privacy is also compromised by new laws and regulations in the wake of 9/11. The International Ship and Port Facility Security Code enforced the building of visible and invisible security walls around ports. The police and security services have been given new rights to carry out checks on dock workers and to share information with foreign intelligence agencies.
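
A small sketch can make the ‘pull’ logic concrete. The tiers and quantities below are invented for illustration and describe no actual retailer’s or scm vendor’s system: a single sale at the shelf sends a replace-one-item signal backwards through every tier of the chain, so that each tier holds almost no stock and every movement is registered the moment it happens.

```python
# A minimal sketch (hypothetical tiers and quantities): the 'pull' logic of
# just-in-time replenishment. One sale at the shelf propagates a
# replace-one-item signal back along the whole chain.

class Tier:
    def __init__(self, name, upstream=None):
        self.name = name
        self.upstream = upstream
        self.stock = 1  # minimal buffer per tier

    def pull(self, item):
        self.stock -= 1
        print(f"{self.name}: 1 x {item} taken, signalling upstream for replacement")
        if self.upstream:
            self.upstream.pull(item)   # demand travels backwards
        else:
            print(f"{self.name}: producing 1 x {item}")
        self.stock += 1                # replenished just in time

# Chain: manufacturer -> distribution centre -> supermarket shelf.
manufacturer = Tier("Manufacturer")
warehouse = Tier("Distribution centre", upstream=manufacturer)
shelf = Tier("Supermarket shelf", upstream=warehouse)

# A customer takes one can of baked beans; every tier is notified at once.
shelf.pull("can of baked beans")
```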

The example of the logistics industry shows converging interests of the state and corporations in putting workers under automated surveillance. The use of software with certain ‘decision making support functions’ at the front-end or ‘user interface’ of businesses subjects both workers and consumers to the same surveillance logic. Jubilant stories in trade journals praise the benefits of intrusive new technologies, ‘workforce management software… such as click2staff’.36 The software matches activity logs with customer statistics and produces automated recommendations for the allocation of staff according to ‘overtime adherence’ and ‘salary adherence’ policies.37 Products such as Verint Witness Actionable Solutions go one step further, promising to deliver ‘actionable intelligence’38 and to ‘capture customer interactions in their entirety, selectively, on demand, or randomly’.
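
The logic behind such recommendations is, at its core, rather banal, which is part of what makes it so pervasive. The sketch below is a hypothetical reconstruction of the behaviour described in the trade press, not code from click2staff or Verint, and the service rate and hours policy are invented numbers: transaction logs become an hourly demand forecast, and staff are allocated against it only up to the limit set by the ‘adherence’ policy.

```python
# A minimal sketch (hypothetical logic and invented numbers): activity logs and
# customer statistics are turned into staffing recommendations, capped by an
# 'adherence' policy.

CUSTOMERS_PER_TELLER_HOUR = 12   # assumed service rate per member of staff
WEEKLY_HOURS_BUDGET = 38         # 'overtime adherence': never schedule beyond this

# Hourly demand forecast, derived from past transaction logs.
forecast = {"Mon 09-10": 60, "Mon 12-13": 130, "Mon 16-17": 85}

def staffing_recommendation(forecast, hours_budget):
    plan, hours_used = {}, 0
    for slot, customers in forecast.items():
        needed = -(-customers // CUSTOMERS_PER_TELLER_HOUR)   # ceiling division
        allowed = min(needed, hours_budget - hours_used)      # enforce the policy
        plan[slot] = allowed
        hours_used += allowed
    return plan, hours_used

plan, used = staffing_recommendation(forecast, WEEKLY_HOURS_BUDGET)
print(plan)   # {'Mon 09-10': 5, 'Mon 12-13': 11, 'Mon 16-17': 8}
print(f"{used} staff-hours scheduled")
```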

Verint is an industry leader in surveillance services, working with ‘law enforcement, national security, intelligence, and government agencies’. Their catalogue of services39 is not so different from that of competitors such as Nokia Siemens Networks, who promise to ‘integrate data from many sources’ such as ‘data retention systems’, ‘Internet addresses merged with geographical information systems’, ‘traffic control points’, ‘credit card transactions’ and ‘dna analysis database’, to give just a few examples from a much longer list. The collation of data from such a diverse range of sources would be illegal in most European countries,40 which highlights how the convergence of state and business interests in monitoring critical hubs of the network infrastructure deeply compromises privacy.

The European Data Retention Directive of 200641 mandates that all suppliers of telecommunications services keep the log-files of all communications of their users – not the actual content, but the ‘who’, ‘when’ and ‘where’ type of meta-information – and that ‘legal authorities’ be granted automated access to it. Meta-information is actually much more useful for data mining than the ‘noise’ of content. The Austrian journalist Erich Möchel42 is one among a number of investigative journalists who have uncovered the long trail of secret backroom dealings which opened up a plethora of surveillance capacities at the business end of the net. For years, equipment manufacturers such as Siemens have been actively involved in working groups of the European Telecommunications Standards Institute (etsi) which occupy themselves with defining the data handover interface for Lawful Interception. In other words, backdoors are systematically being built into equipment such as mobile phone switches and Internet routers, so that hardware filtering devices can sift through Internet traffic at speeds of 10 gigabits per second and more. In eu-funded research projects,43 search engines are to be developed that combine all those data to automatically recognize the ‘abnormal behaviour’ of ‘mobile objects’.
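
Why the ‘who, when and where’ is so much more valuable than content becomes obvious as soon as one tries to mine it. The sketch below works on a handful of invented call records and shows no real retention system or handover interface: without reading a word of any conversation, the retained meta-information already yields a contact graph, the best-connected node within it and a crude movement profile.

```python
# A minimal sketch (invented records; no real retention system or vendor
# interface is shown): what retained 'who, when, where' data already reveals.

from collections import Counter
from datetime import datetime

# (caller, callee, timestamp, cell-id) -- the kind of meta-information
# providers are obliged to keep.
records = [
    ("A", "B", "2010-03-01 09:05", "cell-17"),
    ("A", "C", "2010-03-01 09:20", "cell-17"),
    ("B", "C", "2010-03-02 21:40", "cell-03"),
    ("A", "B", "2010-03-03 09:10", "cell-17"),
    ("C", "A", "2010-03-03 18:02", "cell-22"),
]

# Who talks to whom, and how often: the contact graph.
links = Counter(frozenset((caller, callee)) for caller, callee, *_ in records)
print({tuple(sorted(pair)): n for pair, n in links.items()})
# {('A', 'B'): 2, ('A', 'C'): 2, ('B', 'C'): 1}

# Who is the hub of the network.
degree = Counter()
for caller, callee, *_ in records:
    degree[caller] += 1
    degree[callee] += 1
print(degree.most_common(1))  # [('A', 4)]

# Where subscriber A habitually is in the morning: a crude movement profile.
morning_cells = {cell for caller, _, ts, cell in records
                 if caller == "A" and datetime.strptime(ts, "%Y-%m-%d %H:%M").hour < 12}
print(morning_cells)  # {'cell-17'}
```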

As Saskia Sassen has noted, recent decades have seen a ‘reconstruction of the divide’ between the public and the private sphere, ‘partly through the policies of deregulation, privatization and marketization’.44 Sassen argues that globalization strengthens the power of the executive branches of the state while it weakens the power of the legislative and therefore of democratic control. The privatization or deregulation of state tasks and responsibilities to private companies creates a move towards ‘a privatized executive vis-à-vis the people and the other parts of government along with an erosion of citizens’ privacy’.45 The other side of the coin is that the executive grants itself ever more secrecy over its own decision-making. We can ask, with Sassen, what potential exists to bring those tendencies to a tipping point at which they can be reversed.

Digital Commons

This text has shown the usefulness of ict for monopolizing knowledge and control in the hands of management and the executive branch of government. Some of the social forces shaping the path of development of technologies have been described. The systemic character of surveillance and dataveillance techniques at the workplace and in relation to consumers has been demonstrated. The automated detection of ‘abnormal behaviour’ binds together the data flows on the net with physical, spatial reality.

For all those reasons together, the problem is not simply to rebalance the private-public divide, but to find a more comprehensive answer to the current crisis of the information society. In the current transition, the digital commons opens a different path for economic and technological development. It should not be seen as a ready solution but rather as a process that triggers other corresponding changes. Having originated in the Free Software movement in the 1980s, the digital commons has meanwhile found widespread support in the arts, culture, scientific publishing and research. As a new layer in societies, growing from inside the most advanced sectors of cognitive capitalism, the digital commons offers new mechanisms for cooperation and free association. For instance, if people work out of self-motivation rather than coercion, a big motivation for technically mediated control falls away. The digital commons reaches beyond the notion of software, information or informational cultural commodities. It is a new way of doing things rather than a thing. It allows new alliances to be forged between digital commoners, knowledge workers, garage experimentalists, organic farmers, environmental activists and social movements. The digital commons is built on the recognition that freedom is not something best attained individually through the possession of property, but collectively through the sharing of knowledge. But this proposal is necessarily incomplete, as the digital commons still faces many obstacles and challenges. Its further prospering is not a foregone conclusion, and its existence still owes much to patterns associated with the old paradigm. The key point, however, is that only a shift of such paradigmatic dimensions will get us off the hook of the surveillance society.

1. Alan F. Westin, Privacy and Freedom (New York: Atheneum, 1967), 7, quoted in Beate Rössler, Der Wert des Privaten (Frankfurt am Main: Suhrkamp, 2001), 22.

2. An effort to understand this ‘overall dynamics’ is made through the collaborative research project Technopolitics developed jointly by Brian Holmes, the author and others on www.thenextlayer.org.

3. This conception of liberalism was formulated during the Great English Revolution in the seventeenth century and philosophically in the work of John Locke. For a critique, cf. C.B. Macpherson, The Political Theory of Possessive Individualism: Hobbes to Locke (Oxford: Oxford University Press, 1962/2009).

4. Rössler, Der Wert, op. cit. (note 1), 27.

5. Jürgen Habermas, The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society (Studies in Contemporary German Social Thought) (Cambridge, MA: MIT Press, 1989).

6. Ibid., 54.

7. Ibid., 73-74.

8. E.P. Thompson, The Making of the English Working Class (Harmondsworth: Penguin, 1975).

9. Ibid., 199-201.

10. Ibid., 200.

11. Ibid., 805.

12. Saskia Sassen, Territory, Authority, Rights: From Medieval to Global Assemblages (Princeton, NJ: Princeton University Press, 2006), 110 et passim, in particular footnote 66.

13. Karl Marx, Capital, Volume I (Harmondsworth: Penguin, 1976), 482.

14. Armand Mattelart, The Information Society: An Introduction (2001), 5.

15. Ian Hacking, The Taming of Chance (Cambridge: Cambridge University Press, 1990).

16. Simon Schaffer, ‘Babbage’s Intelligence’ (2007), online: www.imaginaryfutures.net, no pagination.

17. Charles Babbage, On the Economy of Machinery and Manufactures (1832), quoted from Project Gutenberg: www.gutenberg.org.

18. Simon Schaffer, OK Computer (2007), online: www.imaginaryfutures.net.

19. Harry Braverman, Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century (New York: Monthly Review Press, 1975), 89.

20. Ibid., 119-120.

21. F.W. Taylor, The Principles of Scientific Management (New York: Harper & Brothers, 1911), 111, quoted in Braverman, Labor, op. cit. (note 19), 112.

22. Braverman, Labor, op. cit. (note 19), 125.

23. Ibid., 126.

24. Ibid., 127.

25. James R. Beniger, The Control Revolution: Technological and Economic Origins of the Information Society (Cambridge, MA: Harvard University Press, 1986).

26. Alfred D. Chandler, The Visible Hand: The Managerial Revolution in American Business (Cambridge, MA: Belknap Press, 1977).

27. Michael J. Piore and Charles F. Sabel, The Second Industrial Divide: Possibilities for Prosperity (New York: Basic Books, 1984).

28. Beniger, The Control, op. cit. (note 25), 20.

29. Armand and Michèle Mattelart, Theories of Communication: A Short Introduction (London: Sage Publications, 1998), 28.

30. Ibid.

31. Brian Holmes, Future Map or: How the Cyborgs Learned to Stop Worrying and Love Surveillance (2007), online: brianholmes.wordpress.com

32. Carlota Perez, Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages (Cheltenham: Edward Elgar, 2002).

33. Holmes, Future Map, op. cit. (note 31).

34. Wolfgang Fritz Haug, Critique of Commodity Aesthetics: Appearance, Sexuality, and Advertising in Capitalist Society (Minneapolis: University of Minnesota Press, 1986).

35. Brian Ashton, ‘Logistics – Factory without walls’, Mute Magazine (2006), online: www.metamute.org, (no pagination).

36. ‘Banks Start to Embrace Workforce Technology’, Banktech.com, 8 July 2002.

37. Cf. Bank of America Sucks, 10 January 2009, online: www.bankofamericasucks.com.

38. cominfosys.com.

39. cominfosys.com.

40. ‘Das Siemens-Monster und die Legalität’ [The Siemens Monster and Legality], Futurezone, 3 April 2008.

41. en.wikipedia.org.

42. Many of the examples in this section are based on Möchel’s research, published on the website www.quintessenz.at.

43. www.indect-project.eu.

44. Sassen, Territory, op. cit. (note 12), 184-185.

45. Ibid., 184.

Armin Medosch is a researcher in digital arts and network culture, based in London and Vienna. His latest projects include the exhibition Waves and the collaborative research platform Thenextlayer (www.thenextlayer.org).