Do You Believe in What You See?

“We might usefully turn our attention to... the ways in which new media might alter the conditions of identity performance (Meyrowitz, 1985). Standards of authenticity should not be seen as absolute, but are situationally negotiated and sustained.”

Hine (2000)

Personal experiences of social online spaces can, no matter how haphazard, unique or random they may seem, be the result of highly deliberate and manipulative processes by the designers of those spaces. This is, of course, also the case in many architectural and social spaces in the physical world (e.g. Norman 2002, Architecture for Humanity 2009), and the manipulation may be subtle but influential in much the same way (e.g. boyd 2008). Fire escape signs, the choice of chairs or the placement of doors can all influence our relationships in the physical world and, consciously or not, we are used to interpreting the meanings of such manipulations. We are as unlikely to shout in the reading room of a library as we are to work on an embroidery project at a football stadium or change a nappy in a nightclub.

(Open Architecture Network 2009)

In online social spaces the digital décor and behaviour cues are often far less visible and can be incredibly subtle. Automated algorithmic interventions, by which I mean interruptive or manipulative interactions between the user and the programming of a given site, may not only offer similar cues for the use of a space (with demands to comment, “like”, “friend”, “share”, etc.) but may also be personalised, so that each person receives a different set of cues, encouragements or discouragements in the same space. For example, the pricing of airline tickets on a carrier's website may be based on the number of times the site has been browsed combined with previous user behaviour, whilst Facebook's suggestions to users will be based on the actions (or inactivity) of connected users.
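The kind of personalisation described above can be sketched in a few lines of code. The following Python is a purely hypothetical illustration of how two visitors to the same fare page might be quoted different prices; the function name, the pricing rule and the numbers are my own assumptions for the sake of example, not any real carrier's algorithm:

```python
# Hypothetical sketch of a personalised pricing cue. The rule and its
# parameters are illustrative assumptions only.

def quoted_price(base_fare: float, views: int, bought_before: bool) -> float:
    """Return a fare personalised to one visitor's browsing history."""
    price = base_fare
    # Repeated browsing of the same route nudges the quote upwards,
    # signalling urgency or scarcity to the hesitant buyer (capped at 5 views).
    price *= 1 + 0.02 * min(views, 5)
    # A returning customer might instead be shown a small loyalty discount.
    if bought_before:
        price *= 0.95
    return round(price, 2)

# Two visitors looking at the same flight see different prices.
print(quoted_price(100.0, views=0, bought_before=False))  # → 100.0
print(quoted_price(100.0, views=4, bought_before=True))   # → 102.6
```

The point of the sketch is simply that the "décor" each visitor sees is computed from their own recorded behaviour, so no two visitors need ever encounter the same space.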

Screen shot of a Facebook “suggestions” panel.

Wesch (2007) characterises the growth of social media in the context of user-created content through the connected phrases “The Machine is Using Us” and “The Machine is Us”, but I think it is interesting to question the notion of “the machine” in such relatively utopian visions of the internet and social media. In theory the machine may be an aggregation of the various technological machines that make up the network, the internet, the cloud, cyberspace, etc. In practice, however, the sites that exist in this space are operated by a huge variety of bodies, from those funded by government (e.g. sites on the .gov.uk, .gov and .ac.uk or .edu domains) and those operated by not-for-profit organisations (e.g. the Linux Foundation, the Wikimedia Foundation, sites on the .org domain) to privately held companies (e.g. Facebook) and publicly listed companies (e.g. Newscorp (owner of MySpace), Google, etc.). Although social spaces allow communities to form and communicate, they do so by returning useful data and/or direct income to their owners: the machine that is us may therefore be not only algorithmic but also corporate in nature. Many conflicting agendas are at play, but it is often a commercial imperative that drives culture-forming designs and manipulations, rather than the singular military agenda for blending human with machine envisioned by Haraway (2000) in A Cyborg Manifesto.