Towards an Evil Media Studies

(for The Spam Book, Jussi Parikka and Tony Sampson eds., forthcoming, Hampton Press, New Jersey)

Matthew Fuller, Andrew Goffey

Evil media studies is not a discipline, nor is it the description of a category of particularly unpleasant media objects. It is a manner of working with a set of informal practices and bodies of knowledge, characterised as stratagems, which pervade contemporary networked media and which straddle the distinction between the work of theory and of practice.

Evil media studies deliberately courts the accusation of anachronism so as both to counter and to enhance the often tacit deception and trickery within the precincts of both theory and practice.

stratagem one: bypass representation
The basic strategy is neither to denounce nor to advocate, but rather to create a problem of a different order to that of representation and then to follow through practically what it entails. Whilst it is quite plausible to analyse developments in digital media in terms of a problematic of representation, with its associated postulates about meaning, truth, falsity and so on, a problematic of the accomplishment of representation is badly adapted to an understanding of the increasingly infrastructural nature of communications in a world of digital media. Whilst networked media may well be shaped by cultural forces, they have a materiality which is refractory to meaning and to symbolism. At the same time, digital media work largely through the formal logic of programmed hardware and software, that is, as something which more closely approximates the order of language. Language here becomes object, in a number of senses: objectified by a range of practices which submit communication processes to the quantificational procedures of programming; invested as a crucial factor in the economy; and an element in the purely objective order of things in themselves, escaping from the complementarity of subject and object and the range of processes we normally think of as mediating between the two.

stratagem two: exploit anachronisms
We use the word ‘evil’ here to help us get a grip on contemporary media practices of trickery, deception and manipulation. The shift to this register must be understood in the context of a desire to escape the order of critique and the postulates of representation so obviously at work in the way thinking is made available about the media more generally. To talk of an evil media studies is to draw attention to a range and style of practices which are badly understood when explicitly or implicitly measured against the yardstick of autonomous rationality and the ideal of knowledge. Indeed, an evil media studies has immense capacity for productive use. As Jonathan Crary has argued, “that human subjects have determinate psycho-physiological capacities and functions that might be susceptible to technological management, has been the underpinning of institutional strategies and practices (regardless of the relative effectiveness of such strategies) for over a hundred years, even as it must be disavowed by critics of those same institutions”i. The fact of the matter is, as Crary points out, that a vast amount of time and effort is spent on studies devoted to looking at the ways in which the experience of media subjects can be operated on. The point here is not whether such studies – frequently behaviourist in inspiration, frequently located in the field of psychology – are scientific or not. The point is that, like the famous study by Stanley Milgramii, they point very directly towards techniques and practices which are efficacious even if they don’t lead to, or ultimately derive from, scientific knowledge.

This given, it is important to talk about whether things work, not about whether or not they are right. Isabelle Stengers and Philippe Pignarre have recently spoken of the sorcery of capitalism, a sorcery which implies that practices maligned by the ascendancy of critical rationality, such as hypnosis, be treated far more seriously. In the therapeutic use of hypnosis, what is significant is not the ways in which the powers of suggestion can encourage patients to imagine events that didn’t happen (although this might be an outcome) but the way in which patients are initiated into a specific form of reality – which may help to cure them, but may not. What occurs is a “‘production of reality’ which the hypnotist conjures up, in a precarious and ambiguous manner, without being able to explain or justify his or her ‘power’ in the matter”iii. Unlike the outmoded model of media spectacle, which simply proffered an image of a ‘hidden’ or occulted reality, hypnotic suggestion – a fact long known to the inventors of PR – is one of a number of means that are directly productive of a reality. Taking advantage of such mechanisms calls for the delicate negotiation of a different position to that commonly adopted in media studies. For those professionally or even incidentally embedded in media, to say that we are manipulated, that trickery and deception are effectively exercised on a regular basis, is not to say that people cannot or do not think, but it would be to further deceive and manipulate ourselves to think that rational subjects are not outstripped by events.

stratagem three: stimulate malignancy
To talk of evil is also to insist on an ontological dimension of the reality to which the order of communication belongs: the non-sense of something that cannot be exchanged for meaning, which is infinitely tangential to representation (but is not necessarily ‘repressed’). It is in this sense that Jean Baudrillard talks about a ‘principle’ of evil and can argue that “in every process of domination and conflict is forged a secret complicity, and in every process of consensus and balance, a secret antagonism.”iv If there is thus an agonism inherent in every form, this is in the sense that the form fights against the transpiring of its secret alterity. An example often repeated by Baudrillard is the cruel irony that the more media represent events, the more certain it is that, following an inexorable spiral of semantic inflation, those events will disappear, only to curve back in on themselves and replace reality. More simply, one can admire the way in which the hyper-sophisticated technology of the war machine of a global superpower reverts, on contact with any form of friction, into a terroristic, technological primitivism. And these are perhaps only the most obvious manifestations of this ‘principle’. To put it another way, evil is a good name for the strategies of the object, for what things do in themselves without bothering to pass through the subjective demand for meaning. If secrecy is inherent to this agonism, this is perhaps because it is a process without subject, a machination, a process which depends on its imperceptibility and which must for that very reason surprise us, fox us or outwit usv. As such, this strategy secretly reverts from malignancy to innocencevi.

stratagem four: machine the commonplace
For a number of recent commentators, language and communication are now absolutely central components of the economyvii. Long considered a vehicle for the communication of ideas – and thus an element of the superstructure separate from the economic base – language and communication more generally should, these writers contend, instead be considered part of the infrastructure. This shift in the place assigned to communication in the economy opens up a new range of issues to consider and casts new light on the changing nature of work in the contemporary economy. From the perspective of the restricted practical analysis being sketched out here, the general claim, if not its specific details, suggests some unlikely antecedents for contemporary practices in digital media.

Recent attempts to rethink the changing shape of work in the contemporary economy – and the shifts in political subjectivity such changes imply – have taken a curious inspiration from Aristotelian rhetoric and the principle of performativity which this embodies. For the Italian theorist Paolo Virno, contemporary political subjectivity involves a sort of principle of virtuosity. Each one of us, Virno contends, “is, and has always been, a virtuoso, a performing artist, at times mediocre or awkward, but, in any event, a virtuoso. In fact, the fundamental model of virtuosity, the experience which is the base of the concept, is the activity of the speaker”viii. If Virno’s analysis provides an interesting way to refigure the understanding of the link between labour and language, it is perhaps also true to say that it only goes so far in exploring the paradigmatic ways in which media and communication practices exemplify the changing nature of modern production. For Virno, to be a producer today, to be a virtuoso, involves working on and with commonplaces, the finite, fluctuating stock of points around which language as performance coheres and the skeletal forms of intelligence which these embody. If media then become paradigmatic of the mutations which have occurred in the labour-capital relationship, this is because they too work on commonplaces. In digital media, the rudimentary set of operators utilised in SQL, the relational database query language, to analyse data might be described as a series of machinic commonplaces (=, !=, <, >, <=, >=, and so on). A general intellect characterised by a set of ‘generic logical-linguistic forms’ in this way becomes central to contemporary production, provided that we accord no automatic privilege to natural language and provided also that we recognise that the instantiating of languages within media technology necessarily marks a zone in which language becomes inseparable from the senselessness of an object without a subject.
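To make the point concrete, here is a minimal sketch – table, columns and values are all hypothetical – of these machinic commonplaces at work:

```python
import sqlite3

# A minimal, hypothetical sketch: the generic relational operators
# (=, !=, <, >, <=, >=) as machinic commonplaces. The table and its
# columns are invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, age INTEGER, region TEXT)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?, ?)",
    [("ann", 25, "uk"), ("bob", 40, "fr"), ("cy", 17, "uk")],
)

# The commonplace operators compose into a 'virtuoso' selection: no
# meaning is consulted, only generic logical-linguistic forms.
rows = conn.execute(
    "SELECT name FROM users WHERE age >= 18 AND region != 'fr'"
).fetchall()
print(rows)  # [('ann',)]
```

Nothing in the query ‘understands’ age or region; the selection is a pure composition of commonplace operators over structured data.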

Virno’s approach, like that of Maurizio Lazzarato and Christian Marazzi, has enormous merits, not the least of which is to pinpoint some of the rudimentary forms of intelligence (i.e. those relational terms – equal to, not equal to, more than, less than, etcetera – which we characterise in terms of commonplaces) which inform machine processes. However, as a way of understanding media, this approach is insufficient. The first indicator of why this is the case results from the fact that in Aristotle’s work, a point not lost on Hannah Arendtix, much of the argument about language is dictated by the need to rout the sophists, the consummate yet paradoxical masters of the secret antagonism of communicative form. Indeed, the machination of the consensus thought to be tacitly presupposed in all communicative action (by the likes of Habermas) is accomplished by Aristotle precisely by excluding a range of communicative techniques previously the stock in trade of sophistry. Whether we think of communicative action as definitively separated from instrumental activity (as Habermas does) or as inseparable from it, as Virno does, is immaterial from the moment we understand that consensual communication and co-operation have the function of excluding, and thus of distorting our understanding of, practices which are not necessarily rational in form. So, starting with sophistry is a way to open up the study of media forms as a response to the rationalist disavowal of manipulation and mind control, a disavowal which needs to be surpassed by a truly useful, and hence evil, media studies.


stratagem five: make the accidental the essential

In Ancient Greece, the sophists were consummate exploiters of the faults, disturbances and idiosyncrasies of language, its non-sense. Installing themselves within the cracks of language, the fissures which open up where one word could mean many things, where two different words could sound exactly alike, where sense and reference were confused, the sophists – sometimes humorously and playfully, sometimes with apparently more sinister demagogical intent – exploited the ‘semiurgical’ quality of language and the seething cauldron of affective charge it contained to make and remake our relations to the world. For this, history shows, they were vilified, slandered and excluded from the community of normal human users of language. Philosophy and the right (thinking) use of reason was the prime agent in this historical expulsion. By the ingenious invention of principles such as that of non-contradiction, and of entities such as rhetoric to absorb the excesses of language, philosophy not only created strong normative principles for communication arguably operating on a transcendental basis (recently rehabilitated by Jürgen Habermas and Karl-Otto Apel), it also created a perception of language and of logic in which faults, glitches and bugs started to be seen simply as accidents, trivial anomalies easily removed by means of the better internal policing of language. Short of being a two-headed monster or a plant of some sort, you could not possibly say one thing and mean two. The norms of reason precluded this: transparency should be the elimination of agonism, not its secret accumulation. But as the sophists knew and practised, double-speak was something which politicians did all the time, more or less knowingly, more or less well. Twenty-five centuries later, with the advent of deconstruction and other approaches, we discover that in fact double-speak is the ‘repressed’, disavowed norm of reasonx.

stratagem six: recurse stratagems
A study of media that does not shy away from possibilities such as mind control should be elaborated as a series of stratagems. Why? Because agreement and co-operation, the rational assent of the reader, are outcomes, not presuppositions. A consequential study of mind control should therefore be recursive and apply to itself. In any case, the stratagematic approach gives us something to do: the autonomy of code, its independence from human interference, is not incompatible with the existence of the strategically marshalled multitude of agents who bring it into being. A stratagematic approach to arguments was proposed in the nineteenth century by the pessimistic German philosopher Arthur Schopenhauer in his short text The Art of Always Being Right. Schopenhauer’s little text is a practical manual in the tradition of Machiavelli’s The Prince and Baltasar Gracián’s The Art of Worldly Wisdom. All three of these texts are non-naturalistic, practical guides to the operations of power and the manipulation, deceit and other forms of linguistic enhancement required to exercise it effectively. Schopenhauer’s text is a distant inheritor of the opportunist charlatanism of the sophists and exercises a similar effect: suspension of the right/wrong, true/false, good/evil oppositions as a priori guidelines for winning arguments. Consequently, it focuses on the strategies of persuasion which emerge out of the fissures of argumentative performance.

But if such a study borrows Schopenhauer’s stratagematic approach, it doesn’t share his exclusive focus on the dialectical situation of dialogic interaction, or on natural language. The vast majority of communications processes which take place in contemporary media are not of this type. Indeed, the vast majority of agents in a digitally networked world are not even human and do not operate using natural language. But the processes of message exchange are still a part of the proper operations of power, and that is what we are interested in.

stratagem seven: the rapture of capture
A useful term for trying to understand what is going on in the world of digital communications is capture. We live in a ‘world of captures’, a world wherein power – as Foucault and others before him have shown – operates not primarily by repressing, suppressing or oppressing (although sometimes it involves the active celebration of all of these qualities) but by inciting, seducing, producing, and even creating. Capture operates most commonly, and indeed most economically, by imposing slight deviations of force, by scarcely perceptible inflections of agency. Language is both more than and less than language. The suggestions of the hypnotist redirect unconscious affect; a word (‘education, education, education’) or a slogan (‘the axis of evil’) acts as an attractor. Capture makes sense of the feeling we have that the social today is a more or less clumsily designed open prison, that we don’t need to be locked away to feel trapped, that we don’t need to have committed a crime in order to sense ourselves permanently judged, submitted, even through the knowledge we understood might make us free, to an abominable, stultifying, stupefying faculty for the routinisation of life. Capture equally provides a way of characterising what happens in the relationship between humans and machines, formal and natural languages, affect and technology. Stratagems are event handlers: they trap agency.

stratagem eight: sophisticating machinery
From a somewhat different point of view, the media theorist Friedrich Kittler has hypothesised an adventurous analogy between the Lacanian unconscious and the computer which might help us start to understand how these techniques of capture work across platforms (those based on natural language, those based on machine language). Applying the Lacanian dictum that for there to be a world of the symbolic (i.e. culture), something must function in the real independently of any subjectivity (there would be no way of symbolising it otherwise), Kittler argues that the operation of computer hardware on the basis of the oscillations of silicon crystal chips demonstrates that the famous notion of the unconscious as the discourse of the other is equivalently a discourse of the circuit. In the world of the symbolic, “information circulates as the presence/absence of absence/presence”. In the real, in the hardware of the computer, this is the flip-flopping of gates according to simple voltage differences. The exploitation of the potentials of silicon allows Lacan/Kittler to draw together the discourse of the unconscious and the operations of the circuit and so better to develop a literal understanding of technologies of power. Let’s not get too bogged down in this. The point to be made here is a simple one. The presence/absence of absence/presence which is at work in the basic operations of computer hardware points towards the systematisation of a regime of signs which, according to structural psychoanalysis, figures desire or affect as an elementarily coded phenomenon. Lacan for one felt that all the figures of speech codified as rhetoric provided an excellent means for understanding the operations of the unconscious. In practical terms, this implies that our machines speak (through) us, rather than the other way around, a point Kittler/Lacan makes very succinctly: we are today “to a greater extent than [we] could ever imagine, the subjects of all types of gadgets, from the microscope to ‘radio-television’”xi. When people find it surprising to be addressed by a machine, we should note that this surprise is perhaps apt: the machines are usually busy enough communicating with each other.
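The ‘flip-flopping of gates’ can be taken quite literally. Below is a minimal sketch – in Python, purely for illustration – of the elementary flip-flop, an SR latch built from two cross-coupled NOR gates, in which presence/absence (1/0) is held as a stable state:

```python
def nor(a: int, b: int) -> int:
    """A NOR gate: output is present (1) only when both inputs are absent (0)."""
    return int(not (a or b))

def sr_latch(s: int, r: int, q: int = 0, qn: int = 1):
    """An SR latch from two cross-coupled NOR gates: the feedback loop
    is iterated a few times until the circuit settles into a stable state."""
    for _ in range(4):
        q, qn = nor(r, qn), nor(s, q)
    return q, qn

print(sr_latch(s=1, r=0))             # set:   (1, 0) - presence of absence
print(sr_latch(s=0, r=1))             # reset: (0, 1) - absence of presence
print(sr_latch(s=0, r=0, q=1, qn=0))  # hold:  (1, 0) - the circuit remembers
```

Once set or reset, the latch holds its state even when both inputs return to 0: the circuit ‘remembers’, which is all that the discourse of the circuit requires.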

These comparisons point for us towards a ‘technicity’ of sophistry and its operations on the quasi-autonomous workings of affect in both natural and formal language. Regrettably, Kittler’s approach to the ‘technics’ of discourse, in its determinedly inflexible parsing of the instruction stack of history, offers no way out: the unconscious workings of the hardware circuit are always already overcoded, captured by the binary logic of the digital signifier, a signifier which gains its effect of power by the way in which Kittler absolutises a particular set of scientific discourses and profits from their tendency to drift into the power game of exclusion and dismissal. Unsurprisingly perhaps, in a repetition of the classic gesture of reductionism, for Kittler software – and with it programming – becomes an illusion, a simulation concealing the truth, which is the desire of, or for, the machine.

If we were automatically subjects of the machines which speak us, there would be little point in trying to elaborate an analysis of the stratagems operative within digital communications. In fact, it would be difficult to understand why such strategies exist. This problem can be avoided by substituting the aleatory chaos of discursive and material concrescence for the necessities discovered in technoscience: the latter, paradoxically, are made to emerge from an ensemble of practices as realities in their own right. This paradox has been explored in science studies by the likes of Bruno Latour and Isabelle Stengers, for whom it is precisely the construction of reality through contingent networks of actors, human and non-human, which endows reality with its autonomy. As Stengers puts it (speaking of the neutrino), “it becomes all the more ‘in itself’ the actor of innumerable events in which we seek the principles of matter, as it starts to exist ‘for us’, the ingredient of practices, of apparatuses and of ever more innumerable possibilities”xii.

stratagem nine: what is good for natural language is good for formal language
The problem we are dealing with here is not simply an abstract philosophical issue. It has immediate purchase in fields of knowledge which tie in directly to our communicational infrastructure and the many kinds of work which sustain it. For the computer scientist Marvin Minsky, common sense reasoning, in comparison with that of formal logic, was unavoidably buggy. Bugs, which he glossed as ‘ineffective or destructive thought processes’, were those faults that had to be avoided precisely because they were so unproductive and ‘unreliable for practical purposes’xiii. Minsky’s work is suggestive of the extent to which the need to police language, a process inaugurated over twenty-five centuries ago in the long march of critical rationality to world domination, has migrated into the fields of software development, computing technology and cognitive science. Today, however, rather than philosophy, it is formal logic (and, for Minsky, artificial intelligence, a certain image of thought) which somewhat problematically defines the parameters of what constitutes healthy ‘productive’ reasoning and suppresses or represses the affective bugs which make no contribution to the economy of rational communication. But Minsky’s application of Plato needs a sophistic plug-in. If glitches, bugs, faults and fissures are unavoidable (because even formal systems are incomplete), then technological norms, the constant injunction to optimise and the unreasonable exactness of the formal logic necessary to the programming of software are themselves generative of aberrant movements, movements which exploit the idiosyncrasies of language both formal and natural. Incipit the viral.

stratagem ten: know your data
Not all forms of capture work in quite so blatant a fashion (not that such techniques necessarily lose any applicability for being so blatantly dumb), nor are they all quite so apparently anomalous. In terms of the production of communication, the policing of language that has historically been accomplished by specific norms of rationality, and the institutions in which they are staged and advanced, is today accomplished more and more frequently by specific technological apparatuses. This is to say, by algorithms and, a matter of equal importance, by the way that these can only operate on the basis of their links with commonplace data structures. Algorithms without data structures are useless. This goes as much for relations between software governed by APIs (Application Programming Interfaces: typically a library of classes allowing a programmer to write one piece of software which interacts with another) as it does for those between software and the components figured as users. The possibility of abstracting useful knowledge from the end user of a website, for example, is dependent upon the extent to which data is structured. Effective demagoguery depends on knowing one’s audience. For the sophisticated machine, the virtuoso performance depends on knowing one’s data.

We might think of the consequent processes of imposing structure on data as one of recoding. The simple fact of designing a web page using fields linked by an appropriate form of technology (PHP, Perl, ASP.Net) to a database is an incredibly simple way to accomplish this process. Simply by entering information in separate fields, the user facilitates the tractability of that information to data classification, mining and other beneficial processes. Outside of the visible regime which forms generate, imposing data validation on user input accomplishes slight, micrological shifts within the semiotic order of language: the transformation of a quirk, a momentary stutterance, into an error, the state of a referent to be verified. In the ergonomic rationale of the studies of experts in Human-Computer Interaction, such blips are generally to be smoothed away and the fissure that opens up, the distinction between one linguistic regime and another, papered over. This can work in a number of ways. The user completes a form on a website. The developer of the site has written a bit of JavaScript which, sitting on the client machine, is executed before the data in the form is sent back to the server for processing. That bit of JavaScript would probably do something quite innocuous like capitalise initials or the first letters of proper names (tough luck, bell hooks, ee cummings). A ‘web service’ might be invoked to return a risk assessment on your post code (you’re being judged). When the data from the form is returned to the server, a whole range of ‘business rules’ might be applied to your data. From being the putative ‘subject’ of enunciation who input the information in the first place, the user is now situated in relation to a number of machine (encoded) statements.
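The recoding in question is trivially easy to sketch. The original would run as client-side JavaScript; here, purely for illustration and with hypothetical field names and rules, is the same logic in Python:

```python
# A minimal sketch of the 'innocuous' recoding described above.
# Field names and rules are hypothetical; the original would run
# as JavaScript on the client before the form is posted.
def capitalise_initials(name: str) -> str:
    """Normalisation: tough luck, bell hooks, ee cummings."""
    return " ".join(w[:1].upper() + w[1:] for w in name.split())

def validate_postcode(postcode: str) -> str:
    """Data validation: a momentary stutter becomes an error,
    the state of a referent to be verified."""
    cleaned = postcode.strip().upper()
    if not cleaned:
        raise ValueError("postcode is required")
    return cleaned

form = {"name": "bell hooks", "postcode": "se14 6nw"}
record = {
    "name": capitalise_initials(form["name"]),        # 'Bell Hooks'
    "postcode": validate_postcode(form["postcode"]),  # 'SE14 6NW'
}
# From putative subject of enunciation to a set of machine statements:
print(record)
```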

The inattention that frequently assails the individual end-user is equally applicable at a trans-individual level. You could call it forgetfulness, you could call it habituation, it doesn’t really matter: specific techniques of capture benefit from a sort of pseudo-continuity with the techniques and practices they replace or displace, which makes it easier to miss the yawning gaps which separate them. The shift from IPv4 to IPv6 illustrates this marvellously: increasing the size of IP addresses from 32 to 128 bits creates a qualitative discontinuity in the way in which TCP/IP networks can operate: the extra address space available makes it possible to discriminate between different types of traffic at the transport layer of a network and relativise the ‘end to end’ principle hitherto characteristic of the way in which the TCP/IP protocol operatesxiv.
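The scale of that discontinuity is easy to verify with a little arithmetic – a minimal sketch, assuming nothing beyond the address lengths themselves:

```python
# IPv4 addresses are 32 bits long, IPv6 addresses 128 bits: the
# discontinuity is quantitative enough to become qualitative.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128

print(f"IPv4: {ipv4_space:,} addresses")         # 4,294,967,296
print(f"IPv6: {ipv6_space:.3e} addresses")       # ~3.403e+38
print(f"ratio: {ipv6_space // ipv4_space:.3e}")  # ~7.923e+28 times larger
```

Room enough, in other words, not merely for more addresses but for the address space itself to be partitioned into discriminable classes of traffic.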

stratagem eleven: liberate determinism
A useful and highly generic stratagem has long been known to computer programmers working with the core tools of software development – parsers, compilers and so on. Computer programmers and formal logicians have long recognised the existence of two kinds of abstract machines – deterministic finite automata (DFA) and non-deterministic finite automata (NFA). These logical machines are transition diagrams – abstract expressions for all the different possible moves which can be made from a given initial state to some set of terminal states. These machines function as recognisers in the sense that they define the range of acceptable inputs or valid expressions for any given system or language by testing whether those inputs give rise to an acceptable final statexv.

More specifically, a DFA is a logical, or abstract, machine that with a given set of instructions and a particular input will always react in the same way by going through a fixed set of states. An NFA, by contrast, is one that, faced with the same input, may respond differently, may go through more than one possible next state. The problem faced is how to convert NFAs into DFAs. How, that is, to have an NFA stop repressing its inner DFA. An elementary exercise in computing science, this can be done by bundling the non-determined points of choice into composite states of a determined machine: each state of the DFA corresponds to a set of possible states of the NFA.
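A minimal sketch of this elementary exercise – the classic subset construction, with epsilon-moves omitted for brevity – shows the DFA’s states bundling the NFA’s points of choice:

```python
from collections import deque

def nfa_to_dfa(delta, start, accepting):
    """Classic subset construction (epsilon-moves omitted for brevity).
    delta maps (nfa_state, symbol) -> set of possible next nfa_states.
    Each DFA state is a frozenset of NFA states: the non-determined
    points of choice, bundled into the states of a determined machine."""
    symbols = {sym for (_, sym) in delta}
    start_set = frozenset([start])
    dfa_delta, seen, queue = {}, {start_set}, deque([start_set])
    while queue:
        current = queue.popleft()
        for sym in symbols:
            nxt = frozenset(t for s in current for t in delta.get((s, sym), ()))
            if nxt:
                dfa_delta[(current, sym)] = nxt
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    dfa_accepting = {state for state in seen if state & accepting}
    return dfa_delta, start_set, dfa_accepting

# An NFA over {a, b} for strings ending in 'ab': in state 0, reading
# an 'a', the machine may stay put or commit - a non-determined choice.
delta = {(0, "a"): {0, 1}, (0, "b"): {0}, (1, "b"): {2}}
print(nfa_to_dfa(delta, start=0, accepting={2}))
```

The resulting DFA never hesitates: every hesitation of the NFA has been folded into the composition of its states.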

Users emerge as individuated clusters of specimen characteristics within a complex network of social relations and computational supplements offering them tools and augmentation networks. Through systems such as blogs, social display sites or groupware they are able to make their thoughts readily available, sharable and codified as: favourites, groups, users, networks and extended networks, blurbs, metatags, forms, fields, Resource Description Framework entries, lists, search algorithms, ranking systems, user names, and systems for managing images, background tracks, media files, feeds, aggregators, links, friends, clip art libraries and other entities. Aggregating more choice layers into deterministic paths makes such complexity manageable and friendly. Civilization advances by extending the number of important operations which we can perform without thinking about them.

The most significant fraction of the blogs, wikis or guestbooks opened in what is described as a newly participatory web cease to receive new entries after a short period. Of these, a majority leave the facility of commenting open. It is a simple matter to write a program that automatically adds comments, including URL links, to these sites, as the sketch below suggests. These comments help in two ways. Firstly, they generate linkage to a site that is registered by search engines, allowing it to move up in a ranking system. Secondly, they allow users the chance to find new and valuable services as they freely roam, participating in the infosphere with alacrity.
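A minimal sketch of such a program follows; every address, endpoint and field name is hypothetical (the reserved domains example.org and example.com stand in), and real comment forms vary:

```python
import urllib.parse
import urllib.request

# A minimal, hypothetical sketch of the 'simple matter' described above.
# Targets, endpoint paths and field names are invented; real blogs differ
# and most now deploy countermeasures (CAPTCHAs, nofollow links, filters).
abandoned_blogs = ["http://example.org/blog/17/comment"]
payload = {
    "author": "A Friendly Reader",
    "comment": "Great post! More valuable services at http://example.com",
    "url": "http://example.com",  # the link that search engines register
}

for target in abandoned_blogs:
    data = urllib.parse.urlencode(payload).encode("utf-8")
    try:
        with urllib.request.urlopen(target, data=data, timeout=10) as resp:
            print(target, resp.status)
    except OSError as err:
        print(target, "failed:", err)
```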

Social networking services assist such processes because they allow users to describe and determine themselves by factors such as demographic categories that they can share with other users. On such sites, temporary accounts generated to match specific demographic indicators or combinations of them can be used to send repressed information to those that may find it interesting.

Repetition of such messages makes them untimely, allowing the user the possibility of stepping outside of the frames allotted to them. Thousands of pointers to a casino, pharmacy or adult entertainment site, appended to a blog that was never more than a temporary whim, are also ways of keeping the Internet alive. This is not only because links are inherently meaningful. As Warhol knew, repetition, taking something forward in time, is the strongest means of changing it, and in doing so affirming the capacity for change in users. That labour-saving commentary on such sites also points people towards their means of change is part of their pleasure.

It would not be amiss, then, to suggest that the various tools of textual analysis – word frequency, co-occurrence, predictive input – that have become so much a part of the vocabulary of today’s ‘switched on’ culture might usefully couple with the ease of automatic generation of personality within social networks to enable bots to carry out most of the work. DFA could also mean Designed Fraternity Algorithm.
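The rudiments of such textual analysis fit in a few lines. A minimal sketch, with an invented corpus of harvested posts:

```python
from collections import Counter
from itertools import combinations

# A minimal sketch with an invented corpus: word frequency and
# co-occurrence, the rudimentary textual analysis out of which a
# plausible-seeming automated 'personality' can be stitched together.
posts = [
    "great post thanks for sharing",
    "thanks for this great service",
    "great service many thanks",
]
frequency = Counter(word for post in posts for word in post.split())
cooccurrence = Counter(
    pair
    for post in posts
    for pair in combinations(sorted(set(post.split())), 2)
)
print(frequency.most_common(3))     # the commonplaces of the corpus
print(cooccurrence.most_common(3))  # words that plausibly travel together
```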

stratagem twelve: inattention economy
The end user has only finite resources for attention. She will slip up sooner or later. Maybe she has Repetitive Strain Injury (RSI) or her keyboard has been badly designed. A keen interest in the many points at which fatigue, overwork and stress make her inattentive is invaluable. In an attention economy, where the premium is placed on capturing the eye, the ear, the imagination, the time of individuals, it is in the lapses of vigilant, conscious rationality that the real gains are to be made. The sheer proliferation of web sites, coupled with the propensity for discipline to generate its own indiscipline, creates the possibility of capitalising upon inattentiveness.

As the Internet started its first phase of massification in the nineteen-nineties, domain squatters took the strategy of buying thousands of domain names, especially those likely to be wanted by well known companies. These were then sold at a steep mark-up or, later, as the trade became partially regulated, legally force-purchased. Visiting the URL would result simply in an ‘under construction’ notice. No use was made of the actual visit. The financial gain was in the warehousing of tracts of lexical space: buy domain names and hold onto them until someone wants to pay more, possibly much more, than what you paid for them. Today, domain squatting does not simply mean occupying a space defined solely by registered ownership of a sequence of alphanumeric characters; it also means putting these sites to work.

In his project ‘DNvorscher’, the artist Peter Luining has made a useful initial map of the use of domain names. Amassing world wide web domain names has over time become a more technically, economically and culturally sophisticated operation, in which fake search engines, spyware, search-engine spamming and manipulation are deployed at both their crudest and their most refined levels. Visitors to a site maintained by a domain name investor might arrive there because they typed in a ‘typo’ version of the name of a popular site, misspelling it by a letter or two, or, equally, the name of a popular site with the final Top Level Domain part (.org, .com, .co.uk, .info, .int) changed for another. In the lexicon of the World Wide Web, such typos are the homonyms and synonyms, the words that allow a user to pass over into another dimension of reference. Mistypings of site names, phrases that would otherwise be consigned to oblivion, are rescued for special functionality. All errors remain valuable, and deictics recuperates the propensity to paraglossia inherent in the twitching of hands and crude sensors that is called typing.
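The mechanics of this lexical warehousing are easily sketched. In the following minimal example the reserved name example.com stands in for any popular site, and the generated variants are purely illustrative:

```python
import string

# A minimal sketch of typo-space: one-letter substitutions, omissions
# and Top Level Domain swaps around a popular name. The name used here
# is the reserved example.com; variants are illustrative only.
def typo_variants(name, tld=".com"):
    variants = set()
    # One-letter substitutions: exanple.com, wxample.com, ...
    for i, ch in enumerate(name):
        for letter in string.ascii_lowercase:
            if letter != ch:
                variants.add(name[:i] + letter + name[i + 1:] + tld)
    # One-letter omissions: xample.com, exmple.com, ...
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:] + tld)
    # Top Level Domain swaps: example.org, example.info, ...
    for other in (".org", ".net", ".info", ".co.uk"):
        variants.add(name + other)
    return variants

lexical_space = typo_variants("example")
print(len(lexical_space), sorted(lexical_space)[:5])
```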

An alternative stratagem is to exploit the transience of websites: the name of a site whose original registration has lapsed and has subsequently been bought up by a domain name trader might now be assigned to a site that aggregates requests for thousands of such names. Such a site simply prints the requested URL as its title or headline, accompanied by a generic image or slogan. Underneath, the happy user will usually find links to thousands of sites divided by category. The best that the Internet has to offer is there: casinos, pornography, online retailing and search engines. As well as directories, other genres of site are used, such as dating services. These sites use IP address data to determine user location in order to funnel ‘local’ information, such as photos and member-data for eager dates in the user’s home town. Handily, from anywhere in the world, only the given location changes and a user is able to receive the same pictures of the same wet and ready partners at any point in the grid. When clicked, such sites link to providers of other services, largely providers of visual and video material. What the sites linked to have in common is that they all pay the owners of these generic link aggregator sites a fixed amount for any click-through that is generated.

stratagem thirteen: brains beyond language
The previous stratagem illustrated the rather obvious point about proliferating network culture: the massive predominance of capital in the repurposing of digital technologies. In a sophisticated world, this self-evidence can itself occlude the real stakes of network practices. Whilst Lyotard the Cynic suggests that “all phrase universes and their linkages are or can be subordinated to the sole finality of capital,”xvi a far more realistic approach is offered in the most developed theory of contemporary advertising. Affect is one parameter of the matrix by which it can be known; domination, running the gamut from awe to shock, another.

Recent interest in media theory in the domain of affect has worked well to reduce an ability to engage with technicity, its relation to language, and their mutual interlacing with politicsxvii. In order to reinstate the materiality of the body, it has proceeded to make such workings invisible and even to directly efface them in favour of the unmanageable shock of dissonance or novelty parsed directly into the nervous system. Such work senses the speech of violence not as speech operating by multiple registers and compositional dynamics of phrasing but as a discomfiting assault, a feeling of sparkliness in a refreshed cerebellum. Whether it runs away in horror or gushes sublime, what is important is the willing constraint of the registers it opens itself up to: it homogenises. Such work, quite welcomely, were it to achieve any kind of hegemony, would leave an evil media theory far less to do.

Whilst with general broadcast or print advertising it is never clear if there is a direct effect, a crucial innovation of online advertising was its ability to apply sharpened metrics to users. Under such a regime, advertisers only paid for actual clicks linking from the acquiring site to their own, for the completion of forms or other inherently quantifiable sequences of actions. Increasingly, advertisers are also being billed for less tangible but still numerically knowable results, such as ambient exposure of users to advertisers’ symbology, data and content. As with display advertising in traditional media, simply having users know you are there is a valuable occupation of territory and one that must be maintained. But the emphasis on affect raises the stakes. If we are to require relentless investment in the form of love and respect, the brain must also be used: cogitational hooks sink deepest into its abundantly soft tissue.

Affect promises a ‘secret’ route into the user at a low level. It is, however, not yet fully diagrammed and worked out as a probabilistically determined aspect of a media spend. What is required is a means for coupling the new primacy of affect with the rigour of analysis and the diagrammatic reproducibility of technology. Capital as such, in Lyotard’s sense, simply becomes a stopping point, merely a temporary device of mediation before the opportunity presented by a more substantial means of integration.

stratagem fourteen: keep your stratagem secret as long as possible
Viral marketing is symptomatic of a shift in this regard. Part of the appeal of viral marketing in the perpetually downsizing, perpetually rationalising corporate world is that it shifts the burden of marketing labour onto the consumer. As one industry white paper has it, the low intensity, informal networks of relationships between people, incarnated for example in an email address book, do all the work of promoting an application, ideally without anybody realising that there is a corporate strategy at work, or at the very least not caring. The user is simply a node for the passing on of a segment of experience. However, much as viral marketing points towards the efficacy of the circulation of anonymous affect, the possibilities that this practice opens up are compromised by the endgame of appropriation. In this respect, viral marketing is an imperfect crime, because the identity of the criminal needs to be circulated along with the act itself. By pushing marketing into the realm of experiential communication, by attempting thereby to become part of the flow of material affect, virals move ever further away from strictly coded messages into the uncertain realm of pervasive communication. Yet to overcome the reasoned resistance of subjects to their inscription within a designer socius, crude attempts must be made to keep the marketing stratagem imperceptible, a requirement that runs strictly counter to the very principle of branding as such. At the limit though, viral marketing simply isn’t viral enough: it draws back just at the point where what it could do would become a pure set of means without endsxviii.

stratagem fifteen: take care of the symbols, the sense will follow

Attempts to model natural languages using computers have not, it is true to say, been entirely successful. Experts have generally considered that it is the incurably semantic quality of natural language that poses the principal obstacle to developing convincing models of language – that, and the way that meaning is generally highly context-specific. In the world of digital media, it is argued, the development of the Semantic Web, some versions of which, it is imagined, will allow for infinite chains of association and for relay from one subjectival perspective to another, would ostensibly go some way to resolving the apparent stupidity of a form of communication that works on a ‘purely’ syntactic basis. Yet it is not clear that a closer approximation to the way that humans think and act will make digital communications processes any more intelligent – this is the anthropocentric conceit of a good deal of artificial intelligence research. It is not in their resemblance to humans that computers are intelligent. In a world in which the human is an adjunct to the machine, it would be preferable either for humans to learn to imitate machines, or for machines to bypass humans altogether. Bots, spiders and other relatively simple web-based programs are exemplary in this regard. Harvesting data from websites is a matter of using and then stripping off the markup language by which web pages are rendered in order to retrieve the data of interest, and of returning this to a database, ready for mining. At this point, semantics is largely irrelevant.
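A minimal sketch of such a harvester, using Python’s standard-library HTML parser on an invented page (a real spider would fetch its pages over HTTP):

```python
from html.parser import HTMLParser

# A minimal sketch of harvesting: strip off the markup by which a page
# is rendered, keep the data of interest, ready for a database. The
# page below is invented; a real spider would fetch it over HTTP.
class Harvester(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.rows.append(text)

page = "<html><body><h1>Prices</h1><p>widget: 9.99</p><p>gadget: 4.50</p></body></html>"
harvester = Harvester()
harvester.feed(page)
print(harvester.rows)  # ['Prices', 'widget: 9.99', 'gadget: 4.50']
```

Semantics never enters into it: the markup is discarded and the rows go straight to the database.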

stratagem sixteen: the creativity of matter
It is not insignificant that the persistent intractability of user-interfaces to the user’s presumed autonomous powers of thought so frequently ends in acts of material violence. Studies of anger management frequently report the tendency of computer users to attack their machines at moments of system unavailability. For Jean-Francois Lyotard, the slippage between one phrase regime and another, such as that which often – but not always – occurs when the user produces statements parsed as input, can result in a differend. Differends arise, Lyotard argues, because there is no common regime into which all phrases are translatable without remainder. In other words, they testify to the fissures in language, its cracks, faults and disturbances. It is, he says, “the unstable state and instant of language wherein something which must be able to be put into phrases cannot yet be”xix. Information, in the computing science sense of the term, on Lyotard’s account would belong to a cognitive regime – it is always a matter of verifying the state of the referent. The treatment of enunciations as input not only implies a delicate shift in the processing of language, it also, as the breakdown of the semiotic flow from human to machine shows, produces affect. Whilst not necessarily perceived, a differend can become manifest in the feeling that something must be put into words but cannot be.

Of course, it is a mistake to think that material violence is only the end result of the persistent translation of everything into data, or an outcome of blockages in the process of the circulation of signs. The breaking down of the machine – and of the sleek, personalised simulation of total control it engenders – in the intermittent irruption of explosive affect is symptomatic of the insistence of brute force as an elementary quality of the materiality of media as such. Technoscientific positivism produces an enforced materialisation of cognitive processes that seeks to localise ‘thinking’ in the ‘stuff’ of the brain. But it also translates into an extensive experimentation with the physical aspects of media technologies as such. In this respect material violence not only manifests itself in the fissures within language through which affect bubbles up. Material violence can itself be actively employed for its productive value within media forms, demonstrating something of a continuum in evil media from the semiotic to the physical.

For the actual study of psychology, at least within the constraints of working timescales, the stuff of consciousness remains ‘insoluble’. For operational purposes, however, the question of ‘stuff’ remains of limited interest. There are certain obvious long-term advantages in being able to trace the activity of the brain with increasing fineness and in the developing ability to match such mapping with coupled stimuli. Equally, the developing understanding of metabolic, developmental, neural and ecological traits and inter-relations provides promising grounds for new methods. However, pragmatism also requires that we move on with achievements in the field. Media studies has historically involved a strand with a strong emphasis on the understanding of the materiality of media.xx Unlike the current standing of the knowledge of the brain, this is something that can already be technically known and incorporated into the body of our work. Where such work becomes most promising of new applications is in the finding of new capacities in media systems which are blocked by their normalised use within economies of consumption and the circulation of signs. Non-representational uses of media systems, designed to effect a direct and non-mediated engagement with the target user, are often to be found where the constraints and mediocratising effects of the market are least hegemonic. One of the areas benefiting most strongly from such freedom is defence.

Whilst the area of the military most closely concerned with the effects of media – units engaged in Psy-Ops operations on home and enemy-embedded populations – has often been laughably crude, other areas of military development of media systems may provide some promise. Psy-Ops by Western forces is renowned for often acting with reverse intentionality: it is assumed that the more dumbness and crudity exhibited in attempts to cajole, bully, inform and seduce enemy-embedded populations, the more effective it is. Leaflets dropped by plane, or information formatted and delivered by television stations aimed primarily at home audiences and secondarily at ‘leakage’ viewers, work not from any finesse but simply because of the horror inspired at the thought of the dim-wittedness and crudity of those who strategised and implemented such media. The sought-after effect is to inspire in target users the imagination of what physical actions might be undertaken by such senders.

If we can imagine the continuum stretching from the purely semiotic to the purely material use of media systems, Psy-Ops stands largely at the former end. Violence done to the capacity of the imagination inspires an understanding of the real physical violence that can be drawn to the target-user by non-compliance. The greater the semiotic debasement exhibited in Psy-Ops, the less, by means of their own cogitational work, the need for physical intervention.

A non-representational theory of media would allow us to understand the effectiveness of systems such as sonic weapons, microwave weapons, and the physical end of the techniques of infowar. What is particularly interesting is the military capacity to develop new capacities for becoming in media systems. As an example, the standard understanding of a ‘loudspeaker’ producing sonic waves has historically been constrained by the semiotic end of the continuum. Given the liberation of forces from such constraints allowed for by the military, we find here that new avenues for sound are opened up in their direct interaction with human and nonhuman bodies. Flat panel speakers are a relatively recent technology in which dynamic surfaces are agitated to produce audio waveforms. This technology is currently being developed by weapons companies as a cladding surface for submarine vessels. If the waveform pumped out by the speakers can be generated at sufficient scale, it can act both as a sound-dampening technology and as a means of repelling attacks by torpedo. As with contemporary musical aid ventures, sound acts directly to save lives. But more importantly, recognising the material effectiveness of media, without constraint to merely semiotic registers or the interminable compulsion to communicate, allows media themselves to become fully expressive.

Further Exercises
There is perhaps as little chance of providing a definitive catalogue of evil media strategies as there is of coming to a well-regulated distinction between good and evil. Cunning intelligence has, since Ancient Greece, slipped into the interstices of publicly sanctioned knowledge, requiring an equivalently wily intelligence to decipher it. For Nietzsche, the breakdown of any self-evidently discernible distinction between good and evil was precisely the province occupied by sophistry: another good reason to take inspiration from these maligned outsiders of Western intellectual history. The indiscernibility and secret antagonism of good and evil is not a cause for lamentation or reproach: indeed, requiring as it does that we rethink our approach to media outside of the (largely tacit) morality of representationxxi, it allows us to explore digital or networked media forms without the categorical distinction between theory and practicexxii.

Of course it is not just the theory/practice distinction that finds itself challenged within digital media. Distinctions between material and mental, between work and leisure, between the accidental and the necessary are equally challenged. If there is anything approaching a theoretical claim to be advanced here, it perhaps concerns what recent theories of work have called the new revolutions of capitalism: the novel types of political subjectivity which emerge from such analyses need to consider the wisdom of passing over into these paradoxical strategies of the object.

NOTES

i See Jonathan Crary, Suspensions of Perception (Cambridge MA: MIT Press, 2001) p.72
ii See Stanley Milgram Obedience to Authority: An Experimental View (New York NY: HarperCollins, 2004)
iii See Leon Chertok and Isabelle Stengers A Critique of Psychoanalytic Reason: Hypnosis as a Science Problem from Lavoisier to Lacan (Stanford CA: Stanford University Press, 1992) p.164; Philippe Pignarre and Isabelle Stengers La sorcellerie capitaliste (Paris: Editions La Découverte, 2005).
iv Jean Baudrillard, The Intelligence of Evil or the Lucidity Pact (London: Berg, 2005) p.163
v Although this is not the place to engage in such discussions, our invocation of Baudrillard here is, somewhat paradoxically, in line with a certain realist strand of thinking, which is to say, a strand of thinking which bypasses the critical, anthropocentric prejudice of a kind of thinking which says that any talk of reality necessarily passes through a human subject. See Graham Harman, Tool-being. Heidegger and the Metaphysics of Objects (Chicago IL: Open Court, 2002).
vi Writing of the ‘principle of evil’, Baudrillard comments “to analyse contemporary systems in their catastrophic form, in their failures, their aporia, but also in the manner in which they succeed too well and lose themselves in the delirium of their own functioning is to make the theorem and the equation of the accursed share spring up everywhere, it is to verify its indestructible symbolic power everywhere”. Jean Baudrillard La transparence du mal (Paris: Galilée, 1990) p.112
vii Most notably Paolo Virno, Christian Marazzi and Maurizio Lazzarato. See Paolo Virno A Grammar of the Multitude (New York NY: Semiotext(e), 2003), Christian Marazzi La place des chaussettes (Paris: Editions de l’éclat, 1997), Maurizio Lazzarato Les révolutions du capitalisme (Paris: Les empêcheurs de penser en rond, 2004)
viii Virno A Grammar of the Multitude, 55
ix Hannah Arendt Between Past and Future (New York NY: Viking Press, 1961. Revised edition, 1968) most clearly displays this sense of the importance of sophistry. See the commentary on Arendt in Barbara Cassin L’effet sophistique (Paris: Gallimard, 1995)
x A whole range of references would support this claim. Obviously Derrida’s work exemplifies this discovery. But so, more prosaically, does that of Bruno Latour and Isabelle Stengers, both of whom make approving winks to sophistry. The ‘ethical’ issues raised by the invention of the principle of non-contradiction by Aristotle have been explored notably in Cassin L’effet sophistique and by Giorgio Agamben Remnants of Auschwitz: The Witness and the Archive (New York NY: Zone, 2002).
xi Jacques Lacan, quoted in Friedrich Kittler Literature, Media, Information Systems (London: Routledge, 1997) p.143
xii We take the term ‘concrescence’ from Alfred North Whitehead. See Alfred North Whitehead Process and Reality. An Essay in Cosmology (London: Free Press, 1979). See Bruno Latour Petite réflexion sur le culte moderne des dieux faitiches (Paris: Les empêcheurs de penser en rond, 1996) for a discussion of the process by which things are ‘made’ to be autonomous, and the discussion in Isabelle Stengers Cosmopolitiques (Paris: La Découverte, 1997) p.29. Latour’s work can be understood in a stratagematic sense in his discussion of the process of fact writing in Science in Action.
xiii See Marvin Minsky ‘Jokes and their Relation to the Cognitive Unconscious’ http://www.ai.mit.edu/people/minsky/papers/jokes.cognitive.txt (1981)
xiv Data is piloted through the networks that the Internet is made up of by means of the characteristic addresses required for any machine to be ‘visible’ in a network. An IPv4 address resolves into a sequence of four numbers, each between 0 and 255 – 255 being the largest number expressible in a string of 8 bits (binary digits, or 1s and 0s). An address ‘space’ made up of 128 bits is obviously considerably larger than one made up of 32 bits. The extra address space not only allows for many more addresses (think about how many more telephones could be used at any one time if you doubled the basic length of a phone number). It could equally allow for the fine discrimination of types of traffic (as if, rather than adding more phones to a telephone network, you used the longer number as a way of differentiating between different types of phone user).
xv An excellent discussion may be found in Aho, Sethi and Ullman Compilers: Principles, Techniques and Tools (Boston MA: Addison Wesley, 1986).
xvi See §236–264 in Jean-Francois Lyotard The Differend (Minneapolis MN: University of Minnesota Press, 1989) pp.171–181
xvii Massumi’s essay on the ‘autonomy’ of affect is a good starting point for this work. See Brian Massumi Parables for the Virtual (Durham NC: Duke University Press, 2002). A concern with affect equally marks the work of Lazzarato, Virno, Berardi and the like. For work which more closely follows the agenda of subordinating other faculties to that of affect, see Mark Hansen’s New Philosophy for New Media (Cambridge MA: MIT Press, 2004).
xviii The notion of means without ends comes from the Italian philosopher Giorgio Agamben. ‘Means without ends’ simply involve the communication of communicability: “properly speaking it has nothing to say, because what it shows is the being-in-language of the human as pure mediality”. Giorgio Agamben Means without Ends: Notes on Politics (Minneapolis MN: University of Minnesota Press, 2000) p.70
xix Lyotard, The Differend p.13
xx Media studies’ attention to materiality can be found in a number of currents in the field, including the work of Elizabeth L. Eisenstein, Marshall McLuhan, Raymond Williams, Friedrich Kittler, N. Katherine Hayles, etc.
xxi The classic development of this argument can be found in Deleuze’s Difference and Repetition, specifically in the chapter on ‘The Image of Thought’. See Gilles Deleuze, Difference and Repetition (London: Athlone, 1994)
xxii The notion of the ‘autonomy’ of reason which shapes the theory/practice distinction ties in to the supposed a priori affinity of rationality and the good.

BIOS

Matthew Fuller is David Gee Reader in Digital Media at the Centre for Cultural Studies, Goldsmiths College, University of London. He is the author of a number of books including Media Ecologies: Materialist Energies in Art and Technoculture and Behind the Blip: Essays on the Culture of Software. His research for this paper and others was supported by the Fonds voor Beeldende Kunst, Vormgeving en Bouwkunst of the Netherlands.

Andrew Goffey is Senior Lecturer in Media, Culture and Communication at Middlesex University, London, and writes in the space between philosophy, science and culture.

Bibliography
Agamben, Giorgio. Means without Ends: Notes on Politics. Minneapolis MN: University of Minnesota Press, 2000.
Agamben, Giorgio. Remnants of Auschwitz: The Witness and the Archive. New York NY: Zone, 2002.
Aho, Alfred V., Ravi Sethi, and Jeffrey D. Ullman. Compilers: Principles, Techniques and Tools. Boston MA: Addison Wesley, 1986.
Arendt, Hannah. Between Past and Future. New York NY: Viking Press, 1961. Revised edition, 1968.
Baudrillard, Jean. La transparence du mal. Paris: Galilée, 1990.
Baudrillard, Jean. The Intelligence of Evil or the Lucidity Pact. London: Berg, 2005.
Cassin, Barbara. L’effet sophistique. Paris: Gallimard, 1995.
Chertok, Leon and Isabelle Stengers. A Critique of Psychoanalytic Reason: Hypnosis as a Science Problem from Lavoisier to Lacan. Stanford CA: Stanford University Press, 1992.
Crary, Jonathan. Suspensions of Perception. Cambridge MA: MIT Press, 2001.
Deleuze, Gilles. Difference and Repetition. London: Athlone, 1994.
Gracián, Baltasar. The Art of Worldly Wisdom. London: Macmillan, 1892.
Hansen, Mark. New Philosophy for New Media. Cambridge, MA: MIT Press, 2004.
Harman, Graham. Tool-being. Heidegger and the Metaphysics of Objects. Chicago IL: Open Court, 2002.
Kittler, Friedrich. Literature, Media, Information Systems. London: Routledge, 1997.
Latour, Bruno. Petite réflexion sur le culte moderne des dieux faitiches. Paris: Les empêcheurs de penser en rond, 1996.
Latour, Bruno. Science in Action. Cambridge MA: Harvard University Press, 1987.
Lazzarato, Maurizio. Les révolutions du capitalisme. Paris: Les empêcheurs de penser en rond, 2004.
Lyotard, Jean-Francois. The Differend. Minneapolis MN: University of Minnesota Press, 1989.
Machiavelli, Niccolò. The Prince. London: Penguin, 2003.
Marazzi, Christian. La place des chaussettes. Paris: Editions de l’éclat, 1997.
Massumi, Brian. Parables for the Virtual. Durham NC: Duke University Press, 2002.
Milgram, Stanley. Obedience to Authority: An Experimental View. New York NY: HarperCollins, 2004.
Minsky, Marvin. ‘Jokes and their Relation to the Cognitive Unconscious’ http://www.ai.mit.edu/people/minsky/papers/jokes.cognitive.txt (accessed December 10, 2006).
Pignarre, Philippe and Isabelle Stengers. La sorcellerie capitaliste. Paris: Editions La Découverte, 2005.
Schopenhauer, Arthur. The Art of Always Being Right. London: Gibson, 2004.
Stengers, Isabelle. Cosmopolitiques. Paris: La Découverte, 1997.
Virno, Paolo. A Grammar of the Multitude. New York NY: Semiotext(e), 2003.
Whitehead, Alfred North. Process and Reality. An Essay in Cosmology. London: Free Press, 1979.