Who's the Boss?

In many of the short digital ethnographies produced for this Digital Cultures course (Bayne and Ross 2009), the role of powerful and/or influential users and of hierarchical groups was noted, despite most of the spaces examined being ones that theoretically support the formation of flattened social structures. This contrast between possibility and practice, also discussed by Hand (2008) with reference to the views of Poster (2006), may reflect existing societal structures and behaviours in the physical world but, I would argue, there are also online social site design imperatives driving the formation of these online groups and cultures. Turnley (2005) discusses the important role of critical awareness in web authoring, and Melucci (1999, quoted in Hand 2008) specifically highlights the importance of control and codification of digital information flows in the current connected era: “...the possibility of exerting power shifts from the contents of communication and social exchanges to the formal structures, to the codes that organise the flow of information.”

Thus, regardless of what you do in an online social space, it is the owners and/or operators of that site who retain the ability to change, influence and shape your experience and that of your peers, colleagues, students, etc. The provision (or not) of functionality, the frequency and type of emails sent or adverts served by a site, the availability of portable data formats (e.g. RSS and Atom feeds, or data exposed through APIs), etc. will reflect the power and agenda of site owners more than community desire (though both may often be satisfied at once).
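To make the data-portability point concrete, the sketch below shows how little code a user needs in order to carry their own content away from a site once a standard Atom feed is provided; the feed URL is hypothetical and the code is a minimal illustration rather than any particular site's API.

    # Minimal sketch: reading entries from a (hypothetical) Atom feed
    # using only the Python standard library. If the site owner withdraws
    # the feed, this portability disappears with it.
    import urllib.request
    import xml.etree.ElementTree as ET

    ATOM_NS = "{http://www.w3.org/2005/Atom}"
    FEED_URL = "https://example.org/user/posts.atom"  # hypothetical feed URL

    def fetch_entries(url):
        """Return (title, link) pairs for each entry in an Atom feed."""
        with urllib.request.urlopen(url) as response:
            tree = ET.parse(response)
        entries = []
        for entry in tree.getroot().iter(ATOM_NS + "entry"):
            title = entry.findtext(ATOM_NS + "title", default="(untitled)")
            link = entry.find(ATOM_NS + "link")
            href = link.get("href") if link is not None else None
            entries.append((title, href))
        return entries

    if __name__ == "__main__":
        for title, href in fetch_entries(FEED_URL):
            print(title, "->", href)

The essential point stands regardless of the code: whether such a feed exists at all is the owner's decision, not the community's.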

In spaces like YouTube, all users are equally able to upload and access content, with YouTube itself (and ultimately its owner, Google) policing activity by holding each individual user to account for their own videos, comments, behaviour, etc. This appears to be a flat, democratised structure of participation, but the technology behind the site conceals a privileged role for key contributors (YouTube 2009b) and rights holders (YouTube 2009a), maintained through automated screening of videos, so that, to paraphrase Orwell (1945), all participants are equal, but some participants are more equal than others.
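The screening mechanism can be sketched in outline, though the real implementation is proprietary: rights holders register reference fingerprints in advance, and every upload is checked against them before ordinary users see it. The sketch below is a deliberate simplification, assuming a hash of the raw bytes stands in for a robust audio/video fingerprint; all names are illustrative.

    # Deliberately simplified sketch of automated upload screening.
    # Real systems use robust audio/video fingerprints; a byte hash
    # stands in for that here, and all names are illustrative.
    import hashlib

    def fingerprint(video_bytes):
        """Stand-in fingerprint: a hash of the upload's raw bytes."""
        return hashlib.sha256(video_bytes).hexdigest()

    # Fingerprints registered in advance by rights holders (illustrative).
    REFERENCE_FINGERPRINTS = {
        fingerprint(b"claimed film"): "Example Rights Holder Ltd",
    }

    def screen_upload(video_bytes):
        """Gate an upload before ordinary users ever see it."""
        owner = REFERENCE_FINGERPRINTS.get(fingerprint(video_bytes))
        if owner is not None:
            return "flagged: matches content registered by " + owner
        return "published"

    print(screen_upload(b"claimed film"))  # flagged before publication
    print(screen_upload(b"home video"))    # published

The asymmetry lies in who populates the reference set: rights holders act before publication, while ordinary users can only report after it.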

Comment systems such as the BBC's Have Your Say, the Guardian's Comment is Free, or the Mail's article-level comments (e.g. comments on “Facebook Rage as social networking sites fuel jealousy and stalking online”) also display a filtered view of postings. In some regards this is a utopian scenario: spam is caught, offensive or flame comments can be removed or edited, and mature discussion is encouraged. However, such intervention also imposes a filtering that is neither present nor possible in physical contexts, and it does not address or confront the perceived social problems that removed comments may simply reflect. This freedom-of-speech dilemma is at the heart of the manipulation found in many social spaces. Whilst there is western derision for state filtering of the internet in China (Johnson 2009), there is at least a degree of public support for relatively invasive measures to tackle piracy, terrorism-related activities, and, most notably, threats to the safety of children online. The acceptance of such perceived threats influences policy making, site design (including interventions such as CAPTCHAs, moderation, and censorship of controversial words) and differentiated freedoms of expression (Gies 2008).
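A minimal sketch of the pre-moderation described above, assuming a simple blocklist and a held-for-review queue; the word list and decision labels are illustrative, and real systems layer spam heuristics, CAPTCHAs and human moderators on top.

    # Minimal pre-moderation sketch: comments matching a blocklist are
    # held for human review rather than published immediately.
    BLOCKED_WORDS = {"spamword", "slur"}  # illustrative entries

    def moderate(comment):
        """Return 'published' or 'held' for a submitted comment."""
        words = set(comment.lower().split())
        if words & BLOCKED_WORDS:
            return "held"  # queued for a moderator; may never appear
        return "published"

    print(moderate("A thoughtful contribution"))     # published
    print(moderate("Buy now spamword cheap deals"))  # held

Even this toy version shows the asymmetry the paragraph describes: readers see only what is published, never the queue of held comments, so the filtering itself is invisible.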