Annotated Bibliography

Methods/Perspectives/Theories

Anderson, B. (2006). Imagined communities: Reflections on the origin and spread of nationalism. London: Verso.

Anderson’s position is that nationality and nationalism are “cultural artefacts” (p. 4) created at the end of the eighteenth century. Rather than treating capital-N Nationalism as an ideology like liberalism or fascism, Anderson argues the word belongs with terms like kinship or religion (p. 5). He defines a nation as “an imagined political community—and imagined as both inherently limited and sovereign” (p. 6). It is imagined because members will never know most of their fellow members; it is limited because, almost by definition, a nation has boundaries that do not include everyone in the world (p. 7). It is a community because it is “always conceived as a deep, horizontal comradeship” (p. 7). It is sovereign because the concept emerged in an age when revolution was eroding the legitimacy of divinely ordained dynastic rule, so nations came to dream of being free (p. 7). Nationalism is even aligned with religiosity: soldiers are buried and honored in the name of the nation (p. 10). The cultural systems that preceded nationalism were the religious community (p. 12) and the dynastic realm (p. 19); the “nation” did not simply replace these communities, but their decline made the idea of the nation possible (p. 22). One reason for the popularity of the nation is capitalism: book publishing was one of the world’s first capitalist markets (p. 37), and writing became a way to coalesce populations to action once it was delivered to people in vernacular languages (p. 40). Print language led to nationalism in three ways: 1) it created a unified people through a singular language and outlined an audience, which led to the idea of a nation (p. 44); 2) it fixed language into particular forms; and 3) it created a language of power for those whose speech was closer to the printed form (p. 45). Overall, beyond print, Anderson shows that nationalism grew out of a long historical process (“Contemporary nationalism is the heir to two centuries of historic change” (p. 159)); only because of particular events did the nation become something people imagine exists.

Ball, K., Haggerty, K., & Lyon, D. (2012). Routledge Handbook of Surveillance Studies. Oxon: Routledge.

This book is divided into four parts: 1) Understanding surveillance; 2) Surveillance as sorting; 3) Surveillance contexts; and 4) Limiting surveillance. In the first section, Elmer notes that many people point to Foucault for the ideas of the panopticon, discipline, and control; however, Bentham originated the Panopticon (and thought of it as freeing), and Deleuze really outlined the society of control. Bogard discusses simulation and the panopticon without walls; Koskela discusses gender and the female subject of the gaze; Wellner discusses history; Rule discusses privacy (the definition is contested, and the competing ideas show only Wittgenstein’s family resemblances) and the belief in a legitimized ability to gather and store information; Browne discusses race and mug shots; McGrath discusses performing surveillance and the Hollywood obsession; Andrejevic discusses ubiquitous surveillance and the seemingly inescapable collection, recording, storage, and sorting of personal information (p. 92), from supermarket cards to criminal records (p. 94); Kammerer discusses literature, film, and television (p. 99); and Smith discusses those who work in surveillance (p. 107). The second section, Surveillance as sorting, includes: Gandy on remote sensing (p. 125); Turow and Draper on advertising (p. 133); Kroener and Neyland on technologies (p. 141); Sa’id on colonialism (p. 151); Jenkins on social sorting (p. 159); Hayes on the surveillance-industrial complex (p. 167); and van der Ploeg on the body as data (p. 176). In the third section, Surveillance contexts, Adey covers border control (p. 193); Fussey and Coaffee cover urban spaces (p. 201); Ruppert covers the census (p. 209); Donaldson covers non-humans (p. 217); Taylor covers schools (p. 225); Haggerty covers policing and crime (p. 235); McCahill covers the media (p. 244); Norris covers CCTV (p. 251); Arteaga Botello covers violence in Latin America (p. 259); Wilson covers the military (p. 269); Bigo covers democracy (p. 277); Monahan covers terrorism (p. 285); Gates covers globalism and homeland security (p. 292); Sewell covers employees (p. 303); Webster covers public administration (p. 313); Pridmore covers consumers (p. 321); Murakami Wood covers globalization (p. 333); Bruno covers Web 2.0 (p. 343); and Steeves covers young people (p. 352). In the fourth section, Limiting surveillance, Stoddart addresses ethics (p. 369); Raab addresses regulation (p. 377); Kerr and Barrigar address privacy, identity, and anonymity (p. 386); Regan addresses institutional arrangements on security (p. 397); Gilliom and Monahan address everyday resistance (p. 405); Bennett addresses privacy advocates and advocacy (p. 412); and Abu-Laban addresses politics, including civil liberties, human rights, and ethics (p. 420).

Barnard-Wills, D. (2012). Surveillance and identity: Discourse, subjectivity and the state. Burlington: Ashgate.

This book offers a discourse analysis of surveillance and begins from the premise that “surveillance is a contested social practice” (p. 2). Drawing on Laclau and Mouffe’s discourse theory, it points out that surveillance is rooted in discourses of its necessity, legitimacy, and appropriateness, and that it did not simply emerge from one place in government. Identity, too, is constituted by discourse. The book is guided by the following questions: what discourses of surveillance are identifiable in the contemporary United Kingdom? How is the nature of the problem of governance defined in these discourses? What roles or subject positions are made available by discourses of surveillance? And how is the idea of individual identity articulated within contemporary discourses of surveillance? (pp. 59–60). Barnard-Wills concludes that UK discourse often privileges three subject positions: the individual, the illegitimate, and the vulnerable (p. 166). The individual is the subject that has rights; the illegitimate is the contrast to that subject, generally framed as an enemy rather than an adversary: the immigrant, the criminal, the terrorist; and the vulnerable may be a wealthy person susceptible to identity theft or someone in a socially disadvantaged position such as the homeless, poor, or disabled (pp. 166-7). The surveillant identity is also privileged, often articulated from the vulnerable position. Ultimately the author finds that what surveillance research defines as surveillance is not what popular or governmental discourse defines as surveillance; it is therefore important to look at discourse when researching surveillance. Surveillance is often justified and discussed through risk aversion, where “identities become the basis for risk assessment” and, as Nikolas Rose (1999) states, identity becomes a password (p. 178). 
The conclusions help inform discussions of governmentality and political and social theories of identity. The implications of the findings concern normalization, limiting surveillance, the limits of dystopia, data protection requirements as empty signifiers, the accuracy and effectiveness of surveillance, and the question of human versus machine truth (p. 172). Ultimately, Barnard-Wills concludes that in governmentality, identity is often fixed by others, which limits one’s control over who one is, and any attempt to resist surveillance is made problematic by normalized surveillance procedures (p. 190).

Bauman, Z. & Lyon, D. (2013). Liquid Surveillance. Cambridge: Polity Press.

Liquid surveillance conceptualizes surveillance as flows of information. Surveillance is fluid because it has a quickly changing frame of reference and seeps into our lives in unexpected ways. This draws on Deleuze’s idea of the society of control, in which surveillance functions like rhizomatic, creeping weeds (what Haggerty and Ericson call the surveillant assemblage) (p. 3). There is no longer a fixed, brick-and-mortar panopticon; there are fragmented, mobile technologies that can log and track us from anywhere. Surveillance tracks not only what we are doing but what we might do, locally and globally (p. 2). One main result of new technology is the ability to hold power and surveil from afar (p. 76): power now exists dispersed globally, while politics remain local. Surveillance is accomplished not just by governments but also by police agencies, private corporations, and social media (p. 7). Often the data double created through liquid surveillance is trusted more than the person themselves, and those who do the sorting claim to be dealing only with data and therefore to be morally neutral (p. 8). The book discusses social media, theories associated with the panopticon (ban-opticon, synopticon, etc.), consumerism, and surveillance within these areas. It addresses how distance allows individuals to be placed under categorical suspicion based on falsely assumed neutral data. It ends with discussions of ethics and agency and of the ability to question the technologies and practices in which surveillance is caught up.

Cohen, S. (1985). The master patterns. In Visions of social control (pp. 13-39). Cambridge: Polity Press.

Cohen begins by outlining two shifts in the control of deviance in Western industrial societies: one, more transparent, took place at the end of the eighteenth and beginning of the nineteenth centuries; the other is opaque and is occurring currently. Cohen walks through multiple historical theories of the prison in the late 18th and early 19th centuries, including Foucault’s. He settles on a left-functionalist conclusion, stating, “All talk of success and failure totally misses the point: the prison invents the delinquent; it cannot ‘fail’ because, like all punishment, it ‘is not intended to eliminate offences, but rather to distinguish them, to distribute them, to use them” (p. 27). In the 1960s there was a push to deinstitutionalize, intended to reduce the number of prisoners; although the movement took place, it was not successful. Cohen spends the rest of the book exploring the past in order to examine the present criminal justice system.

Ericson, R. V., & Haggerty, K. D. (1997). The risk society. In Policing the risk society (pp. 81-130). Toronto: University of Toronto Press.

This section discusses the use of policing and surveillance to mitigate risk. Risks are discursive constructions created by institutions, and risk is institutionalized when society accepts those classifications. To manage and reduce risks, institutions offer guarantees such as commitments or warranties, assurances of stability, and the predictability of routinized services (p. 85); this supposedly makes an institution more trustworthy. However, trust and risk are both abstract concepts subject to outside forces. As noted via Heidegger (1974), the desire to make risk management more predictable and knowable has created practices and technologies which have further increased risks. According to Beck (1992), in a risk society people are no longer concerned with distributing goods but instead focus on distributing and managing “bads” – thus, “bads” become “goods.” Risk interprets problems not as chance but as errors or faults that could have been prevented with more knowledge of the risks. Fear and anxiety sustain the idea that risk can be minimized with a good counter-plan. This discourse sees risk as repetitive and calculable; technical solutions are assumed to preempt problems. This turns life into what Max Weber described as “deliberate, systematic, calculable, impersonal, instrumental, exact, quantitative, rule-governed, predictable, methodical, purposeful, sober, scrupulous, efficacious, intelligible and consistent” (p. 87). Risk is a constitutive rhetoric that calls its objects into being and “involves classifications of difference, classifications that not only provide a focus on what is present but also a sense of what is absent and might be” (p. 87). Risk brings an imagined future into the present. Because risks are merely probable, they are “subject to moral, political, and aesthetic knowledges” (p. 90), since these factors shape solutions to only possible problems.
People are sorted into categories of risk, and these categories become transmitters of biopower (the power of biography), which, through liberal government, controls people via the expectations of the categories (Did one live up to them? [http://hereandnow.wbur.org/2015/01/15/wes-moore-the-work]). These categories help with risk management, and surveillance is the method that creates them. Ericson and Haggerty highlight the drawbacks of making decisions based on statistics. They quote Hacking, who notes that in a risk society, decisions “can’t be conducted without decisions couched in terms of probabilities. [And] by covering opinion with veneer of objectivity, we replace judgment with computation” (p. 115). Computers structure these “truths” and assist in categorization based on statistics and probabilities. Law becomes a force that aids risk management (p. 116). Risk societies value privacy because it is constructed as a space where one can avoid risk calculations. Tocqueville (1840) observed that more bureaucratic surveillance is needed as citizens become more privatized (p. 117). Ericson and Haggerty point out that privacy ultimately becomes a spiral: the more people seek privacy, the more intrusions into privacy occur so that people can be “trusted”; the more surveillance is used to create trust, the more skeptical people become about trust; and the more people distrust, the more surveillance is presumed necessary. Overall, the risk society is fragmented, because it transcends institutional and professional boundaries and because it concerns the past while dealing with the future (p. 118); unequal, because people are categorized in ways that limit or grant access (e.g., the literature major denied a credit card, p. 120); and it expels morality, because statistical norms are sometimes reimagined as what should be done rather than as mere statistical results (e.g., law enforcement may focus on alcohol as THE reason for crashes when it is really just one).

Foucault, M. (1977). Discipline and punish: The birth of the prison. (A. Sheridan, Trans.). New York: Vintage.

[Foucault worked with prisons in France and was influenced by a 1972 trip to Attica, where he was impressed by the smoothness of the management.] Foucault saw the prison as a symbol of the society in which it exists. Prisons changed the nature and form of punishment from one that emphasized corporal punishment of the body to one that is less brutal but still debilitating, controlling prisoners’ lives through daily rituals. The modern disciplinary society began in the late 18th century and set the example for other institutions such as the school, the hospital, and the factory. Routinized control is symbolic of the disciplinary society, and surveillance of day-to-day activities has replaced more physical methods of control. [Some critics suggest that rather than all institutions being modeled after the prison, it is more accurate to say institutions sought the same ideal of discipline/dominance.] The prison legitimizes control of those both inside and outside the prison system: after prisoners are released they are still monitored, so the offender is never finished with the sentence, and all of society is watched. Foucault views the prison as watching offenders at all times [although some note this was not true even in his time]. Prisoners remain part of society; there are no outlaws. Law is really just a tool of class power and social control. Foucault argues that discipline is maintained through micro-power (as opposed to national super-power), through the ‘gaze’ and constant visibility, and he uses Bentham’s Panopticon as the metaphor for this visibility. The individual becomes a “subject” through this visibility, though visibility is not always actual: subjects are often ruled merely by the threat or possibility of being seen. Discipline is also a form of knowledge, and knowledge is power; knowledge also produces the knowledge expert, which further inscribes power. 
Critics say Foucault focuses on penological studies while ignoring sociology. Some also call the book speculative and genre-defying; i.e., it is not quite history, since it makes grand sweeping claims with few specifics to back them up [similar to criticisms of Jared Diamond].

Gee, J. P. (2013). An introduction to discourse analysis: Theory and method. London: Routledge.

This book is based on the idea that language only has meaning in social practice. According to Gee, discourse analysis is “the study of language in use” (p. 8). Discourse analysis matters because “[l]ooking closely at the structure of language as it is being used can help us uncover different ways of saying things, doing things and being things in the world” (p. 9). There are at least two types of discourse analysis, descriptive and critical; to see the difference, one can ask why we would want to see how language is being used. According to Gee, descriptive approaches “answer this question by saying that their goal is to describe how language works in order to understand it, just as the goal of the physicist is to describe how the physical world works in order to understand it,” while those he calls “critical” “answer this question differently. Their goal is not just to describe how language works or even to offer deep explanations, though they do want to do this. They also want to speak to and, perhaps, intervene in, institutional, social, or political issues, problems, and controversies in the world. They want to apply their work to the world in some fashion” (p. 9). Gee, however, takes his own critical look at critical discourse analysis and argues that “all discourse analysis needs to be critical, not because discourse analysts are or need to be political, but because language itself is, as we have discussed above, political” (p. 9). For the book’s purposes, Gee is “interested in a method that can do two things beyond description: a) illuminate and provide us with evidence for our theory of the domain, a theory that helps to explain how and why language works the way it does when it is put into action; and b) contribute, in terms of understanding and intervention, to important issues and problems in some area that interests and motivates us as global citizens” (p. 12).

Gilliom, J., & Monahan, T. (2013). SuperVision: An introduction to the surveillance society. Chicago: University of Chicago Press.

According to the authors, the main point of the book is “to show diversity and ubiquity of new technologies of surveillance in everyday life” (p. 117). The book’s chapters look at cell phones (p. 11), loyalty cards (p. 27), online behavior (p. 47), schools (p. 72), employment (p. 89), and the financial and social costs of surveillance (p. 108). The book concludes that boundaries of time and distance are evaporating as information becomes accessible in the cloud at any time (p. 127), and it maintains ten big ideas (p. 128): 1) Current surveillance metaphors like Big Brother, the panopticon, and a “right to privacy” are limited and need to expand for a greater understanding of surveillance: Big Brother is a totalitarian state; privacy is not as simple as just being left alone; and the panopticon only gives the idea of one observer watching a prisoner in an isolated cell (pp. 128-9). 2) Surveillance isn’t always bad, but there is an imbalance of power built into it (p. 130). 3) The idea of ‘big government’ is limited; there are lots of corporate powers at play too (p. 132). 4) Surveillance is more than just a watcher; it makes (data) versions of us that will be judged, and it also alters our behavior (p. 133). 5) Fear and desire work closely in surveillance: people think surveillance can keep us safe, and it helps us feel less alone when we are part of an online community, but it is built on our fears (pp. 134-5). 6) Surveillance alters our understanding of space and time; technologies compress and recall information in many places (p. 136). 7) People do try to resist surveillance, and not all resistance should be categorized as cheating; a real dialogue should be opened (p. 137). 8) Surveillance systems might appear new, but they have emerged from existing social patterns, including race/class/gender (p. 138). 9) Not all surveillance works or produces the desired results; technological determinism also needs evaluation (p. 140). 
10) Because of our belief in science, we have a hunger for more and more information to explain our world, and this will just lead to more surveillance (p. 140).

Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4), 605-622.

This article develops Gilles Deleuze and Félix Guattari's concepts into the idea of the "surveillant assemblage." According to Haggerty and Ericson, Orwell's notion of Big Brother and Foucault's notion of the panopticon are both limited in their understanding of surveillance because each assumes a hierarchical model in which surveillance comes from a position of authority (the state in Orwell's case; in Foucault's, a higher power as watcher that causes self-regulation). The surveillant assemblage breaks down hierarchical, central power and shows how multiple players supply flows of information that are then compiled to form a "data double." This data double becomes an alternative identity, one the subject may never know. The body ("auditory, scent, chemical visual, ultraviolet" (p. 611)) is a primary data-gathering site, and the centers of calculation are thus "forensic laboratories, statistical institutions, police stations, financial institutions, and corporate and military headquarters" (p. 613).

Haggerty, K. D., & Samatas, M. (Eds.). (2010). Part I: Theorizing surveillance and democracy. In Surveillance and democracy (pp. 17-88). New York: Routledge.

This section comprises four chapters. The first, by Deborah G. Johnson and Kent Wayland, is “Surveillance and Transparency as Sociotechnical Systems of Accountability” (p. 19). The authors use a framework of co-construction/mutual constitution from science and technology studies (STS), which asserts that technology constitutes, and is constituted by, society [akin to critiques of technological determinism?]. It holds that technology is part of a sociotechnical system composed of “artifacts, social practices, social arrangements, and meanings” (p. 17).

Lyon, D. (2007). Surveillance studies: An overview. Malden: Polity Press.

Lyon addresses the fundamentals of surveillance in this book. He begins with the idea of assemblage and the way different coordinates are drawn together: time, place, and data from the body such as DNA, fingerprints, iris scans, and images, along with financial information like salary or ID numbers, emails, telephone conversations, and text messages (p. 18). These coordinates are assembled to create a data double (p. 4). Surveillance is a power display (p. 23) and is technology-driven (p. 21); it is often justified by ideas of social control, governing, crime control (p. 21), or national security (p. 46), and it is often gendered due to the perceived "maleness" of technology and the idea of the gaze (p. 22). Heuristics for measuring sites of surveillance are rationalization, technology, sorting, knowledgeability, and urgency (pp. 26-7), and specific sites of surveillance are the military (p. 27), the state (p. 30), employers (p. 33), law enforcement (p. 36), and corporations (p. 40). Lyon traces the roots of surveillance study to workplace management in Michel Foucault and Elia Zureik (p. 48); the military in Christopher Dandeker and Max Weber (p. 49); and criminal justice in Cohen, Marx, and Durkheim (p. 49). He also explores modern ideas of surveillance as an outgrowth of capitalist enterprise, bureaucratic organization, the nation-state, a machine-like "technologic," and the development of new kinds of solidarity (p. 51), as well as postmodern, William Staples-inspired ideas of "technology-based, body-objectifying, everyday, universal kinds of surveillance" (p. 51). Surveillance can also be understood in three historical modes: pre-modern surveillance was face-to-face; modern surveillance is rationalized, "using accounting methods and file-based coordination" (p. 75); and postmodern methods are "digitally mediated, based on behavioral and biometric traits, future-oriented, micro (body) and macro-(globe) oriented, and tend to be exclusionary" (p. 75). 
It is important to understand Staples's (2000) contribution that surveillance is "systematic, methodical, and automatic" in that it is algorithmic and preemptive (pp. 88-89). It also matters how people are classified on the basis of this data, which should be a moral decision (p. 73). People often resist or disagree with their data double (p. 90), and their identities do not always match the way they are identified (p. 91). Identity is another central component of the book; quoting Jenkins, Lyon states, "[T]he capacity of organizations to identify people - authoritatively or powerfully, as individuals or collectively - and to make those identifications 'stick' through the allocation of rewards and penalties should not be underestimated" (p. 124). Being classified through a data double one does or does not identify with has consequences. For instance, since surveillance is often used for social sorting (p. 101), some people are classified as potential offenders (p. 105). Gary T. Marx noted that those classified as possible threats may not be threats at all, yet they remain under "categorical suspicion" because of how they are categorized (p. 106). This marks a shift away from Foucault's disciplinary society and toward Deleuze's "society of control" (p. 107), because surveillance is no longer conducted only by agents of authority (law enforcement, government, etc.); more agents across society are watching and keeping track. Regulation matters because "personal information may be vulnerable to forms of fraud or other misuse" (p. 119). Data then becomes the basis of judgment, and Lyon quotes Bauman (1998): "[T]he database is an instrument of selection, separation, and exclusion" (p. 134). Some push against the "dialectic of control" (p. 163) through negotiation (p. 165) and resistance (p. 166), organizing privacy and anti-surveillance movements (p. 169) and legislation (p. 171). 
Privacy is one frame often applied to resistance, encompassing bodily, communication, information, territorial, and image privacies (pp. 174-5). Some feel that if you have nothing to hide you have nothing to fear (p. 176), but the accumulation of massive amounts of personal data and the handling of that data matter even before the data is categorized. Beyond this, there are issues of access to information and transparency (p. 181), especially given social sorting, profiling, and discrimination (p. 182), and the question of "who's watching the watchers?" (p. 186). Algorithms often sort people into categories through automation (p. 184), and "algorithmic surveillance using networked databases has become a crucial component of governance" (p. 188). Ultimately, the presumption of innocence may be overtaken by categorical suspicion, which may in turn affect the way rewards and punishments are allocated (p. 191). Overall, Lyon concludes there should be a public debate about surveillance that includes five things. First, 9/11 has been used as a discursive construct to justify surveillance; knowing this can help us examine trends already in place before that time (p. 195). Second, the war on terror has locked business and government into mutually productive agreements, and the pursuit of "security" that limits liberties should be examined (p. 195). Third, the justification of surveillance and data gathering for "national security purposes" should be examined (p. 196). Fourth, the nature of privacy agreements should be examined (p. 196). Finally, privacy and data protection laws should be examined continually (p. 196). The whole discussion should lead people to explore the psychological, political, social, cultural, and demographic dimensions of privacy, surveillance, and social sorting (p. 197).

Marwick, A. E. (2012). The public domain: Social surveillance in everyday life. Surveillance & Society, 9(4), 378-393.

Marwick distinguishes traditional forms of surveillance (the idea that big states or corporations, asymmetrical in power to the watched, are doing the watching, p. 380) from social surveillance, in which people use social media to watch each other and adjust their behavior accordingly (p. 381). According to Marwick, “Social surveillance is the use of Web 2.0 sites like Twitter, Facebook and Foursquare to see what friends, family, and acquaintances are “up to” (Joinson 2008; Tokunaga 2011)” (p. 378). Others have “conceptualized this shift as lateral surveillance (Andrejevic 2005), participatory surveillance (Albrechtslund 2008), social searching (Lampe, Ellison, and Steinfield 2006)” (p. 379); Marwick uses the term social surveillance. Knowing that others are watching alters what people do. She states, “People monitor their digital actions with an audience in mind, often tailoring social media content to particular individuals (Gershon 2010; Trottier 2011)” (p. 379).

Nadesan, M.H. (2008). Governmentality, biopower, and everyday life. Florence, KY: Routledge.

Governmentality helps explain how society’s problems and people are arranged in relation to specific frameworks of problems solving and power – frameworks which can lead to marginalization, discipline, invisibility, and fear (p. 5).  Governmentatily is how governments (as shaped by flows of information from different spaces and theoretical underpinnings) come together and form a situational assemblage that governs a population. An analysis of governments (she uses the term government interchangeably p. 13) produces  three visible fields: the state, the market, and the population (p. 13). Depending on the ideology of government (liberalism, neoliberalism conservatism, neoconservatism) in relation to a specific time period,  governmentality will be carried out differently.  Spaces of control are through health information (chapter 4); the mind and brain (chapter 5); and more over displays of control such as state sovereignty though events like war (chapter 6). Chapter 5’s discussion on the history of isolating criminals and those with mental illness is important for a study of illness and community sustainability. The 19th century institutionalization moved   to mental hygiene involving community psychiatry (p. 151), and soon the dangers of the external and internal became something that could be normalized and regulated instead of being contained or excised (p. 152). This leads to thinking mental health can be known and for the most part regulated through expert knowledge (p. 154) and personal control (be it by drugs (which is also useful for the economy) or personal control), and it runs the risk of assuming problems are caused by personal failure rather than social factors (p. 180). Agamben (2000) shows how liberal democracy can contain social and geographical exclusions, and thus, problems of “racist and elitist ideas” can find themselves in biopolitical readings of the biology of human behavior and emotion. 
Contemporary neoliberal governments are being marginalized by the market model of government, which has privatized formerly government-centric responsibilities. However, the risk involved in entering these markets remains, and this is where government regulations resonate (p. 183). Overall, governments, due to their differing underlying ideological principles, govern in many different ways. Power within these paradigms can take forms such as biopolitical or sovereign control. These controls place individuals in categories which can ultimately create marginalized, socially disadvantaged groups that are kept marginalized by policies and a cycle of control.

Pink, S. (2001). Doing visual ethnography: Images, media, and representation in research. London: Sage.

Pink talks about using visual material in ethnographic research and seeks to examine its methodological and theoretical underpinnings. Historically, the objectivity of the image has been challenged (p. 7). Photography [like Big Data] has been described as something that changes the way we understand and see things (p. 13). Hammersley and Atkinson (1995) describe ethnography as participating, overtly or covertly, in someone’s life over a period of time to watch and listen to what is said, as well as asking questions and collecting any other related data (p. 18). Pink finds this limited, though, because it reduces the role of the ethnographer, and ethnography is not just about data collection; Pink instead calls ethnography a methodology in itself. Rather than being a how-to, it is a process of creating and representing knowledge that doesn’t claim objective truth but does provide a version of what is happening (p. 18). Ethnographers should be aware of the way gender, age, race, ethnicity, and class situate and influence the research they do and produce (p. 20). The image can only be translated into meaning, but even the photo is just an interpretation of reality (p. 24). Images can be understood through society’s shared norms (p. 27), and often photographs are sorted into genres based on society’s assumptions (p. 28). Pink discusses Banks’s (1997) three activities involved in visual research: 1) “studying society by producing images”; 2) “studying images for information about society”; and 3) collaborating with actors while making visuals (p. 30). It is not always appropriate to take photos when doing research (p. 33), and ethnographers should understand (ethically – p. 37) how their photographs are viewed and can influence the groups they are studying (p. 34). Existing approaches to ethics in ethnography are: 1) covert research and informed consent – some question taking photos without others knowing and question whether people really understand consent (p. 40); 2) permission / the right to photograph at events – who can actually provide permission to take photos/video (p. 41); 3) harm to informants – different cultures respond to visuals differently, and some may be emotionally stressed due to feelings of lack of control (p. 42); 4) harm, representations, and publishing permission – moving from personal collections to publication presents issues, and people should be aware of the larger purposes of studies (p. 43); 5) exploitation and giving something back – those leading the studies often benefit more than the participants, raising the question of whether there is anything a researcher could do in return (p. 44), although the researcher may not be going “back” to anywhere (p. 45); 6) ownership of research materials – since production may often involve both the researcher and participant, ownership may be in question, which also informs what can be done with the research results; written agreements may be beneficial here (p. 46). Photography was initially thought of as objective (p. 49). Now people see photos as interpretations, and each person can see a photo differently (p. 52). One photo is not necessarily more ethnographic than another (p. 52). Often the ways photos are taken vary from genre to genre and culture to culture (p. 55). Classic photography methods for research are: 1) photos as recording devices / photographic survey – an attempt “to represent physical environments, objects, events or performances” in photos; it should be remembered that these are just aspects of culture, not the whole culture itself with fixed meanings (p. 58); 2) participatory and collaborative photography – ethnographers engage with the informant [surveillance-y] and their culture to create photos (pp. 58-9), and often one must work around the needs of the informants (p. 60).
3) Interviewing with images – some ethnographers may share photos with informants to see what meanings informants impart on the images (photo-elicitation) (p. 68). 4) Displays and exhibitions – viewing photos with others – this involves going with informants to see how they talk about these collections of photos (p. 74). 5) Absent photos – what information has been hidden or thrown away? (p. 75). Producing video is different from producing photos (p. 78). There have been divisions between objective footage and creative footage. Objective footage is supposedly unedited and unmanipulated, while creative footage is more cinematographic and aimed at bringing stories to audiences, edited and narrated with a purpose (p. 78). This, however, makes creative works seem non-ethnographic because it suggests ‘real’ ethnographies should be scientifically objective. Criticisms of that view are: 1) it is impossible to have undisturbed video footage because people in a video are always people in a video; 2) ethnographic information isn’t always observable facts and is often produced in conversation, etc.; and 3) ‘ethnographicness’ is contextual and doesn’t always rely on intentions – the viewer gets to make the determination (p. 79). Overall, photos are not just supplements to text and are valuable for providing meaning (p. 115). Academic meanings are put on what could be considered ambiguous photos and video (p. 94). These meanings are also arbitrary and are influenced by whatever framework someone is working in (p. 94). Pink notes that analysis occurs from the beginning of a problem until the end of the report (p. 95). Photo/video analysis can involve comparing meaning made in the field to academic meanings made out of the field (p. 95). Meaning is supposedly made when the visual is translated into words (p. 96). To do this, Pink suggests first looking at content and context (p. 97).
Realist approaches suggest analysis of context and content should be reliable and complete, while reflexive perspectives say there is no “complete” and everything is just a set of relationships, so contexts should be monitored (p. 97). Not all images can be taken home (p. 101), so it is up to the ethnographer to remember them, possibly using field notes and diaries to do so (p. 103). There are differing ways to organize and classify ethnographic images (p. 104). Classification systems have been criticized as objectification systems imposed on the weak by the powerful [sorting]. For instance, old photos of criminals were sorted into the ‘criminal type,’ which discriminated against those who looked the type (Sekula, 1989) (p. 104). Collier and Collier suggest that sequential order must be maintained in organization – which essentially argues there is only one way to order images (p. 105). Pink thinks that while it is important to know the sequential order, meaning may be made in better ways when the orders are rearranged (p. 106). Others suggest thematic organization – although this may be tricky since categories often overlap (p. 107). Organizing video footage is different from organizing photos (p. 110). Pink criticizes transcribing videos into written words, as there are many elements going on, such as visuals and audio (p. 111). Scrutiny of the video might suggest links between video and other ethnographic methods (p. 111), as these are the places where meaning might be made (pp. 112-3). Electronic archives might allow future possibilities (p. 114). “Writing up” the research involves situating the research and creating relationships among texts and images (p. 115). A review of main points: 1) ethnographic images are not just supplements to texts and do not just illustrate points made in text (pp. 115-6); 2) meaning is made in multiple places: readers and viewers have their own agency to create meaning, and local and academic knowledge also have a place in analysis (p. 117); 3) there can be multiple interpretations of the same data (i.e., academic v. local) (p. 117). It is accepted that ethnographies are not “truths” but “representations” (p. 121). Ethnographers engage in certain ways to convince their audiences (p. 122). They mostly write in present tense (p. 123) to show that the informant and researcher were present at the same time, and photography functions similarly. The present tense also provides the “slice of life” which positions the text/images as objects (p. 123) and realist expressions (p. 124). Some argue that captions may provide meaning to content, but this should be avoided and pictures should speak for themselves (p. 125). Realist approaches use images to illustrate written points (p. 126). Uncaptioned images can assist the viewer in creating their own meaning and discourage captioned information from being taken at face value (p. 127). The photograph can be seen as a record rather than an interpretation (p. 128). Expressive photography invites questioning to arouse curiosity by doing things such as acknowledging the constructedness of photos (p. 128). Pink goes through experimental presentations of ethnographic research (p. 128). Ethnographic films are useful for teaching and broadcasting (p. 138). Pink’s last chapter is about electronic hypermedia texts, which seems very outdated but does offer a window into a technological past (p. 155). This deals with mediated publications (p. 158), digital manipulations (p. 160), non-linear presentation (p. 163), hyperlinked information (p. 164), the ability to relate multiple narratives (p. 166), presentations of “unfinished” texts (p. 167), an interactive audience (p. 168), different ways to construct meaning (p. 170), and copyright (p. 172). [Takeaways: Those looking at SNS for the purpose of surveillance are involved in digital ethnography and therefore need to be aware of larger cultural practices. Surveillance is ethnography without the ethical considerations.
This book did seem to stick to traditional views of ethnography as traveling elsewhere or visiting different cultures rather than, say, looking at digital realms.]

Richards, N. M. (2013). The dangers of surveillance. Harvard Law Review, 126(7), 1934-1965.

According to Richards, “our society lacks an understanding of why (and when) government surveillance is harmful” (p. 1935). According to Lyon, surveillance is “routine attention to personal details for purposes of influence, management, protection or direction,” and it is “focused on learning information about individuals. Second, surveillance is systematic; it is intentional rather than random or arbitrary. Third, surveillance is routine — a part of the ordinary administrative apparatus that characterizes modern societies” (p. 1937). Both autocratic (p. 1937) and democratic (p. 1938) governments participate in surveillance, and so do private companies (i.e., the Internet is only free because our data is surveilled for advertisement purposes) (p. 1938). Big data is unique not just in scope and “the amount of personal information that can be processed, but because of the ways data in one area can be linked to other areas and analyzed to produce new inferences and findings” (p. 1939). Some may not think of private actors as being capable of surveillance because of the Big Brother/panopticon metaphors, but “in a postmodern age of ‘liquid surveillance,’ the two phenomena are deeply intertwined. Government and nongovernment surveillance support each other in a complex manner that is often impossible to disentangle” (p. 1940). Thus, “while government regulation might be one way to limit or shape the growth of the data industry in socially beneficial ways, governments also have an interest in making privately collected data amenable to public-sector surveillance” (p. 1941).

Saldana, J. (2011). Fundamentals of Qualitative Research: Understanding Qualitative Research. New York: Oxford University Press.

Saldana walks through the different methods of qualitative research and shows how each can be carried out.

Surette, R. (1992). Mass media, crime, and criminal justice: An introduction. In Media, Crime, and Criminal Justice: Images and Realities (pp. 1-20). Belmont: Wadsworth Publishing Company.

The central question of the book is, “how have mass media changed the reality, and people’s perceptions, of crime and criminal justice?” (p. 2). The central assumption is that media’s coverage of and impact on crime and justice can be seen by 1) looking at the social construction of reality and 2) looking at the nature of media organizations. Studying media is important because media is not a neutral outsider providing just the news; it pervades too much and too far into society to be. Its information goes so far that it influences people’s views. Surette states, “People use knowledge obtained from the media to construct a picture of the world, an image of reality on which they base their actions” (p. 2). As evidence, in 1988 the Canadian Sentencing Commission said that the public bases its view of sentencing “on a data-base that does not reflect reality” (p. 5). This process is called the “social construction of reality” and is “particularly important in the realm of crime, justice, and the media” (p. 2). In this view, “people create reality—the world as they believe it exists—based on their individual knowledge and on social interactions with other people” (p. 4). Overall, “A society’s ideas of criminality and social justice reflect its values concerning humanity, social relationships, free will, and political ideologies” (p. 5); mass media is often described as the most common source of information on justice and crime (Graber says as much as 95%). Surette looks at two criminal justice system (CJS) theories: due process and crime control. He also looks at the mass media through two lenses: front-stage and backstage behavior. These types of behavior occur either before the public or between intimates, and the media can choose which version to portray. The first mass media was print media like newspapers, books, and magazines (p. 10).
There is also electronic media like radio, film, and television, which emerged in the 1950s and overtook print media in popularity (p. 10). One difference between the two was that print media focused on front-stage activities, but electronic media moved to capture backstage areas and has brought its content to a wider audience. This broadening audience collapses once-segregated social institutions (p. 13). This has caused the media to be seen as both the cause of and solution to problems of crime. This leads Surette to conclude that the legitimacy of the CJS “will come to be questioned more and more as more of its daily backstage operations are exposed through the efforts of the media” (p. 13). One example of the media showing both front- and backstage behavior is cameras in the courtroom (p. 14). Barber (1987) poses the concern that “the audio-visual element may only enhance the dramatic appeal” (p. 17), and the book provides a table of arguments for and against cameras in court.

Staples, W. G. (2000). Everyday surveillance: Vigilance and visibility in postmodern life. 2nd ed. Lanham: Rowman & Littlefield.

Staples’ book is about social control, and he frames his studies of surveillance as rejecting the “highly coordinated, state-driven, Big Brother monopoly over the practice of watching people…[focusing instead] on the microtechniques of surveillance and social control that target and treat the body as an object to be watched, assessed, and manipulated” (p. ix). The characteristics of postmodern social control are meticulous rituals of power: 1) They are “systematic, methodical, and automatic in operation” (p. 4): video cameras and databases, able to be accessed by computers. Techniques are also “increasingly technology-based…and sometimes anonymously applied, and they usually generate a permanent record as evidence” (p. 11). 2) These meticulous rituals of power control the body in two ways: a) “continuously, anonymously, and automatically” (p. 5), and b) “the ability of organizations to monitor, judge, or even regulate our actions and behaviors through our bodies is significantly enhanced” (p. 5); our bodies are objects that contain information to be analyzed and judged. “Many new techniques target and treat the body as an object that can be watched, assessed, and manipulated” (p. 11). 3) Instead of an older model of control in which we lock up deviants, there is a push to control “deviants” without first locking them up, through things like community corrections, regulatory welfare, and other social services (p. 6). “The new techniques are often local, operating in our everyday lives” (p. 11). 4) Not only are “deviants” under the gaze of control; “innocent until proven guilty” now seems a cliché. Data generated through surveillance produces “types” that are at “risk” for behavior, moving punishment from the individual deviant to the overall “type.” “Local or not, they manage to bring wide-ranging populations, not just the official ‘deviant,’ under scrutiny” (p. 11).

Warnick, B. (2007). Rhetoric Online: Persuasion and Politics on the World Wide Web. New York: Peter Lang.

Warnick claims that “web based affordances offer a number of advantages for public discourse that are unavailable in mass media” (p. 6), and she begins the book by commenting on four perceived crises in the public sphere: 1) the decline of the nation-state as the central source of power (p. 3); 2) the growth of big media corporations through buyouts and takeovers (p. 3); 3) constraints on political discourse through media practices and campaign behaviors (p. 4); and 4) attention to scandal in politics (p. 5). She then shows that more rhetorical analysis of online discourse is needed: how are online messages persuading in the public sphere? The second chapter discusses the difficulties of studying online communication. For one, many pages are designed by a corporation or an unknown author; since rhetorical study concerns authorial intentions, this analysis becomes problematic online. However, just because the medium changes does not mean all rhetorical analysis principles must change too; in addition to new techniques, some old ones can be adopted as well.

Wise, J. Macgregor. (2005). Assemblage. In Charles J. Stivale (Ed.), Gilles Deleuze: Key concepts (pp. 77-87). Montreal: McGill-Queen's University Press.

Gilles Deleuze developed the idea of assemblage. An assemblage is not the finished product of arrangement; rather, it is the process of that arrangement or organization. It is not a set of prearranged parts, but it is not a random collection either. It is the arrangement of a group of parts that have a relation due to what they make up as a sum rather than due to the parts themselves. Thus, we don’t know what something is until we find out what it does. The term can be related to archaeology, which finds separate bones that ultimately make up a whole. Assemblages create territories by claiming space, and these claims are ongoing arrangements of movements, speeds, and flows which can be dismantled to form other assemblages. Assemblages are also discourses and semiotic systems which are embedded in the practices of those assembling and viewing the assemblage. In order to understand technological assemblages, it is helpful to compare three concepts of the human–technology connection: 1) the received view, which sees humans and technology as separate entities and makes technology an agent; this can lead to technological determinism, in which technologies control humans, or social determinism, which implies that humans control technology; 2) the contextual view, which sees technology as part of a larger context – technologies are culturally imbued and exist in social context and can be extracted from this context, though this idea still sees technology and humans as different agents; and 3) the view of articulation, which sees technology as just articulations (connections) which can be put together and assembled uniquely for a desired purpose [interchangeable parts?] in particular contexts and practices, and which invites questions of critical theory. Assemblage relates to the idea of articulation but differs in three ways: 1) assemblages are not just “things, practices and signs” but also “qualities, affects, speeds and densities” (p. 84).
2) Assemblages operate through “flows of agency rather than specific practices of power.” 3) “whereas articulation emphasizes the contingent connections and relations among and between elements, assemblage is also about their territorialization and expression as well as their elements and relations” (p. 84). Ultimately, assemblages can group themselves into larger systems of assemblages, like cultures or ages, which “may express a broader set of functions or principles” (p. 85); these would be considered abstract machines. For Deleuze, a larger control society is emerging due to technologies which allow us to be controlled not by brick-and-mortar institutions and walls but rather by flows of assemblages from all our locations (schools, employment, purchases, etc.). Resistance to these assemblages should always be a focus.

Wise, J.M. (2002). Mapping the culture of control: Seeing through The Truman Show. Television & New Media, 3(1), 29-47.

This article explores the cultural implications of Deleuze’s culture of control through the lens of The Truman Show (TS). Wise maps this through 1) “the rise and dominance of a regime of surveillance and control” (different from Foucault’s surveillance and discipline); 2) “the explosion of product placement and the branding of everyday life”; and 3) “the trust that plucky individualism will always triumph over the first two” (p. 29).

Communities

Barak, G. (1994). Media, society, and criminology. In G. Barak (Ed.), Media, process, and the social construction of crime: Studies in newsmaking criminology (pp. 3-45). New York: Garland Publishing.

American conceptions of crime, the criminal justice system, victims, and offenders are introduced through the selection of stories and images the news media chooses to display. People like the idea of good versus bad, and our media fascination first with cowboys and now with cops and robbers helps support this idea. Stakeholders include journalists, sources, and audiences. A formula for the study of media and crime is “PERCEPTION OF CRIME = MEDIA x (CULTURE + POLITICAL ECONOMY) OVER TIME” (p. 6). Many news agencies purport to “tell it like it is.” Despite the diversity of channels, however, the news appears to tell the same stories, which are not diverse, avoid controversy, and do not really “conform with reality” (p. 10). Minorities (by race, class, and gender) are underrepresented in “good” news stories and disproportionately represented in “bad” news stories. There are examples, though, of the news media leading change, such as support for homosexuals in the military, anti-Vietnam-War discussions legitimating resistance, and Gulf War pushback (pp. 12-16). Barak brings up several lenses through which to look at the media, such as constitutive criminology and newsmaking criminology. On a spectrum of worst, bad, and good crime news, the worst crime news features live footage and dramatic recreations; these are especially dangerous because they lack commentary from criminologists who can put the stories in context. They instead present the world as out of control and at the mercy of others (p. 23). Bad crime news is less subtle than the worst news. These are stories such as sexual assault coverage that depicts stereotyped versions of victims in TV-appropriate language; terrorism coverage that ignores larger uses of state-sponsored force; and drug coverage that focuses on the moral aspects of drugs more than on other factors such as the economy and inequality.
The good news, on the other hand, is that research shows not all stories are covered in the same ways, and not every story emphasizes race, class, and gender as others do (p. 30). Overall, the absence of certain types of crime is as important as the inclusion of others (p. 33), and it is important to take into account how the mass media can influence society’s view of crime.
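Barak’s formula can also be set in notation. The grouping below follows the quoted text; reading “over time” as a longitudinal qualifier rather than as literal division is an assumption on my part:

```latex
\text{Perception of Crime} \;=\; \text{Media} \times \bigl(\text{Culture} + \text{Political Economy}\bigr) \quad \text{(over time)}
```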

Ericson, R. V. (Ed.). (1995). Introduction. In Crime and the media: Cultures, communications and effects (pp. xi-xxx). Brookfield: Dartmouth.

According to Ericson, crime and the media often work together; for instance, there is news about crime, and TV and movies are often directly based on or feature stories about crime. The stories are rooted in “an omnipresent public discourse about disorder and decline” (p. xi) and are often used to support the need for more security. The media also construct “demons and enemies” and show what “respectable fears” are. Bourdieu (1986) found that crime and the media are a means of “articulating moral sensibilities, tastes and distinctions.” Often the media itself is targeted as the reason for moral decline. The overall theme is that “‘crime’ is something that is ‘made’ according to the institutional classifications and communication formats,” and different platforms have been described as helping these declines in different ways (p. xiv). There are five crimes of the mass media when dealing with stories of crime: 1) fun (media entertains but doesn’t educate – “they dull the mind, induce laziness, foster political alienation and produce cultural dopes,” p. xii). Regarding fun, three conclusions the book makes are: 1) the format in which a trial is displayed (i.e., televised) is important, and “television mediation, driven by the need to be fun, has a number of potentially negative effects on the legal processes,” such as invasion of privacy, prejudiced trials, and “public opinion verdicts” (Drucker, p. xv); 2) moral problematics (individual moral character, community moral character, political morality, and organizational morality) and formatting make the public enjoy stories of crime, with higher-quality outlets letting the audience make the moral judgment and lower-quality outlets telling the audience what to think – exploring mass media stories on crime ultimately functions as a routinized, ritualistic practice like the morning shower (Katz, p. xv); and 3) news is a source of order and security.
People seek out specific news stations not because of their information per se but because of familiarity and routine, and the presence of news in public places provides ontological security. News ultimately “gives them a sense of controlling their individual lives through familiar, usual and taken-for-granted formats” (Snow, pp. xv-xvi). 2) The second crime is folly (the media are organized in a way that produces distorted knowledge – i.e., journalism is biased). Regarding folly, the conclusion is that “the reality of news is embedded in the nature and type of social relations that develop between journalists and their sources, and in the politics of knowledge that emerges on each specific newsbeat” (p. xvi). 3) The third crime is fear (crime stories induce fear, which leads to “distrust, social distance, privatized and individualized lifestyles, and lack of community”). Regarding fear, because drama has become a habitual experience, crime stories often need to be dramatized in order to mesh with pop culture. Often the violence is presented as shocking and new rather than as a historical constant, and police often emerge as the heroes. Ultimately, three fears emerge: 1) “the dramatization of violence and other social problems is seen as the primary locus of popular fear,” positioning people to be shocked into action (p. xix); 2) the persistence of seeing these stories may put people into dread, dismay, consternation, and trepidation (pp. xix-xx); and 3) people fear the medium itself – people fear that TV is part of the problem of crime and that TV itself is ruining culture (p. xx). 4) The fourth crime is fake (stories simulate reality so that everything seems fake, leading to “lack of context or history, and a decline in hierarchical order”). Regarding fake, the dramatization of crime has turned crime into a spectacle, and whether wholly dramatized or televised for news, people are skeptical of the reality of crime representations.
5) The fifth crime is fetish (“the mass media commodify crime prevention and security products in the same way that they commodify toothpaste or a pop star, to the point of fetish,” which results in safety becoming part of consumer style like any other product). Regarding fetish, since most broadcasting organizations are companies after profit, they are looking for an audience. Thus, the crime that they relate has become a commodity itself. They have also turned safety into a commodity that needs to be purchased, and they push people to listen to “experts” who can provide knowledge (p. xxvi).

Ericson, R. V., Baranek, P.M., & Chan, J.B.L. (1996). Law and justice. In Representing order: Crime, law and justice in the news media (pp. 284-338). Toronto: University of Toronto Press.

This chapter explores how legal discourse and justice are cultural tools, open to interpretations that structure and constrain our imagination and interpretation of “criminal” acts. Legal discourse produces the words that indicate the appropriate way to speak about crime and its solutions (p. 285). News further relays these structures. According to the authors, “The news emphasis on control, law, and justice is central to news sources, who view the news first and foremost as a vehicle for helping them to enact and control their organizational environment” (p. 284). Control is indicated by stories of arrests, fined corporations, dismissed officials, or enacted laws; many times the search for justice and authority is the central focus of these stories, and stories are related through institutionally sanctioned discourse. “News involves control through the routine selection and classification procedures of journalists and their sources, through the influence news has on sources, and through the way in which news articulates and influences public opinion about knowledge/power relations in society” (p. 286). “News presents images of control institutions that are perceived as real and therefore real in their consequences” (p. 287). The way crimes and social problems are presented through institutions can lead to erasure and is important because “this conception of power in turn is arguably the principal means by which the distribution of power in a society is made acceptable, legitimated as moral authority” (p. 288). In this study, law enforcement was most often depicted as the agent of control (p. 308), with politics second (p. 315). Much of the time, government agencies themselves were the target of control (p. 317).

Finn, J. (2009). Capturing the criminal image: From mug shot to surveillance society. Minneapolis: University of Minnesota Press.

Finn talks about all the ways photography plays into surveillance: the mug shot, the fingerprint, DNA analysis, databases, and border security. His theoretical framework is Latour's idea of inscription, showing that meaning making is often done at many points along the way, including the moments when biometrics become our identifiers. Finn starts by walking through the history of the mug shot; it used to be the positivist belief of those like Galton and Bertillon that the body could be measured and recorded to identify deviancy, and that there were criminal types. The mug shot became part of identification information so that mobile criminals could be identified wherever they went. This subsequently gave way to the fingerprint, and DNA has since further added to the biometric profile. This information is ultimately stored in databases. Large databases bring not only criminals into the gaze; they are bringing the yet-to-be-guilty into the gaze as well. Those who are most suspected, due to social understandings of what is dangerous, fall under categorical suspicion (p. 121). These databases can be searched and referenced at places such as borders. Ultimately, photographs help make the criminal visible.

Gee, J.P. (1990). Social linguistics and literacies: Ideology in discourses. Bristol: The Falmer Press.

This book argues that ideology is present in any theory, especially in language use, and that discourse analysts are the experts best positioned to make clear the ideologies built into language use (p. 24). It is an ethical imperative to break down these ideologies (p. 22). Literacies are social phenomena. It is a literacy myth that literacy is just learned skills; if the goal is only to learn skills, then this teaches people to reiterate information but not to think. New literacy studies criticize the divide between oral and written words. Also, there are multiple literacies. We assume many cultural things in our use of language; for instance, when we think of the word bachelor, we don’t think of a priest or a gay man; we think of a heterosexual male. We inevitably allow social institutions to do some of our work for us (p. 86). Gee uses everyday conversations to show that our ordinary utterances are sociocultural performances. The subsystems of discourse meaning include cohesion, discourse organization, contextual systems, and thematic organization. Primary discourses are socioculturally determined ways of thinking based on our relationships with intimates, and secondary discourses are socioculturally determined ways of thinking based on our relationships with institutions beyond the family. Both involve values and ways of thinking which supersede either of these groups. Gee situates language in Discourses, which are “socially accepted association[s] among ways of using language, of thinking, feeling, believing, valuing, and of acting that can be used to identify oneself as a member of a socially meaningful group or ‘social network’, or to signal (that one is playing) a socially meaningful ‘role’” (p. 143). Discourses are ideological, resistant to internal criticism, defined by standpoints taken in relation to other Discourses, concerned with certain viewpoints and objects, and related to the distribution of social power (p. 144). Discourses are sites of competing beliefs.
Literacy is “full mastery” of secondary discourses (p. 155). Primary discourses always affect secondary discourses. Schools read students through their primary discourses and often judge those with more diverse primary discourses as less valuable. The school, however, even as it tries to colonize diverse primary discourses, opens itself up to being affected by them. The media help control variations of Standard English dialects (p. 13). Theories “are a set of generalizations about an area (in this case, language and language acquisition) in terms of which descriptions of phenomena in that area can be couched and explanations can be offered. Theories, in this sense, ground beliefs and claim to know things” (p. 15). Overall, Discourses shape a certain sort of person in ways of seeing and communicating with the world (p. 192). Each Discourse has expectations of its members, and one must not forget that we are performing certain expectations. Discourses privilege those that master them (p. 192). Knowledge of the ideological underpinnings of Discourse is power, and we must use our power to stop harming others and to protect others from being harmed (p. 192).

Jewkes, Y. (2004). Crime and the Surveillance Culture. In Media & Crime: Key Approaches to Criminology (pp. 171-198). Thousand Oaks: Sage Publications.

According to Jewkes (2004), CCTVs are at the forefront of a surveillance society (p. 172), and there has been a “disappearance of disappearance” (Haggerty and Ericson, 2000: 619). Jewkes uses the Panopticon as a “motif” uniting disciplines to begin her discussion of surveillance, stating that it is basically a way to talk about the watching of some by others (p. 174). Technological advances, though, have expanded beyond closed environments and extend to society as a whole. For instance, CCTVs are everywhere. One major drawback, though, is that, as Haggerty and Ericson (p. 618) note, they are a mile wide but an inch deep, because they don’t work without networks of other associated information compiled in a database (p. 175). The amassing of chunks of information into a larger whole is a surveillant assemblage. Foucault’s notion of a carceral society involves the creep of prison-like aspects into regular public life. Assemblages then help create more systems of discipline and domination (p. 177). Foucault noticed that crime used to be treated on the body, but now the mind is the target of crime prevention strategies (p. 178). One strategy is controlling the human body with biopower through photographs, fingerprints, bodily fluids, DNA, and retina scans. There is also technological surveillance through things like electronic monitoring. Another strategy is governmentality, which has multiple aims. One is to develop new strategies for crime control that single out people who don’t belong and preemptively exclude them. It moves “to render populations quantifiable through identification, classification and differentiation” (p. 180).
Jewkes states, “Rather than attempting to tolerate, understand and rehabilitate the different and the dangerous, there has been an ideological shift towards the less expensive and simpler task of displacing them from particular locations and from opportunities to obtain goods and services; of restricting mobility and behavior; and of managing them rather than changing them” (p. 180). According to Stenson (2001: 22), this shift disproportionately singles out poor whites and minorities who are “segregated in ghettoized spaces that function as ‘human garbage dumps where survival, excitement and success and opportunities for entrepreneurship depend increasingly on involvement in illegal economies’” (p. 180). The consequences of increasing surveillance are a heightened sense of danger and fear. Fear is a tactic that doesn’t need technology to monitor; when people feel fear, they discipline themselves (p. 185). Those in the mainstream tend to demonize others (pp. 181-2) to create fear. Often the solution to fear is more surveillance, which really only nets more fear (p. 182). Surveillance also blurs the line between public and private space. There is an argument as to what is appropriate to monitor nowadays: should everyone be subject to surveillance, or should we focus on the terrorist, the criminal, the drug dealer, etc. (p. 185)? One of the largest drivers of surveillance expansion is profit (p. 185). While Foucault talked about the soul being a target of surveillance, some, like Bauman (1992), argue that surveillance is now less about discipline and repression and more about categorizing wealth and consumption patterns into customer profiles used to seduce the desirables into the market economy. This profiling supposedly creates Gandy’s panoptic sort (“a situation in which individuals are continuously identified, assessed and sorted into hierarchies which are used to manage and control their access to goods and services”, p. 
186) because it predicts future consumption and puts populations at a (dis)advantage for things like credit and movement across borders (p. 186). In this view, the shopping mall (a history of which Jewkes provides, p. 187), a ‘cathedral of capitalism’ monitored by safe, friendly cameras, becomes the model for society, emphasizing sanitation (p. 189) and power and privilege, where customers either fit in or don’t (p. 187). The many can watch the few in the synopticon, as reality shows demonstrate. The obsession in movies and literature with watching shows a more basic yearning for seeing into the lives of others (p. 191). Trust has been replaced with surveillance technologies that can scientifically monitor and compute probabilities (p. 192).  

Jewkes, Y. (2004). Theorizing Media and Crime. In Media & Crime: Key Approaches to Criminology (pp. 1-34). Thousand Oaks: Sage Publications.

The idea that mass media can cause deviant behavior (i.e., TV and video games cause people to be violent) has roots in mass society theory and behaviorism (p. 5), the idea that mass media injected ideas into people whose responses would then be measurable and quantifiable. By the 1960s, researchers turned away from behaviorism (p. 13) and towards theories based on a more sociological approach (drawing on Chicago School sociology) such as strain theory and anomie (p. 14). A third theory is Marxism and the critical criminology/’dominant ideology’ approach (p. 16), which sees the ruling class using mass media to control populations; crime is just a label the ruling class places on activities it defines as criminal (“labeling theory”) (p. 17). A fourth approach is pluralism (p. 21), which sees the audience as more skeptical and mass media as more positive, offering intellectual freedom. This viewpoint is often held by journalists and policy-makers. Realists question what the actual consequences of mass media are and what people do with media (p. 24), and postmodernism sees the media as shifting to the role of entertainment, whose sole purpose is to satisfy the audience (p. 26).

Klofas, J.M., & Porter, J.L. (2011). Corrections and sustainable communities: The impact on local populations. Journal of the Institute of Justice and International Studies, 11, 117-28.

Klofas and Porter address the inclusion of crime and safety in definitions of sustainable communities (p. 117). Ideas from social science were largely omitted from sustainability at first, but now things like “[i]ncome distribution, democracy, and human rights have become part of a legitimate discourse on sustainability” (p. 117). The authors quote Raco (2007), who suggests that sustainability “implies that community justice is embodied in a community that provides needed space and opportunities for its Residents” (p. 118). The authors point out statistics on how many people are or have been in prison, and conclude that “[i]n terms of criminology and criminal justice, attention to sustainability would certainly mean focusing on the implications of crime and crime policy for the health of geographically and demographically defined communities” (p. 119).

Lankshear, C., & Knobel, M. (Eds.). (2008). Digital literacies: Concepts, policies and practices. New York: Peter Lang.

This book focuses on broadening digital literacy to incorporate the trend toward the term digital literacies. The subtitle, “Concepts, policies and practices,” speaks to the purpose of the book. Moving to “literacies” is important because of 1) the many versions of digital ‘literacy’, 2) the sociocultural nature of literacies, and 3) the benefits of incorporating literacies and the implications for learning (p. 2). First, there are many definitions of digital literacy (pp. 2-14). Chapter 1 traces the history and lineage of digital literacy. Bawden traces the origins of the concept to Paul Gilster in 1997 (p. 18) but shows that Gilster was also influenced by information literacy and computer literacy; there have been many lists of literacy competencies. One basic interpretation of digital literacies is functional literacy, the focus of chapter 2. This idea centers on the ability to use technologies, specifically involving the internet. Chapter 3 discusses information literacy and being able to navigate through information flows to “find, evaluate, and accept or reject information” (p. 50). Chapter 4 discusses what students need to know in an educational context in order to be literate. Chapter 5 discusses policies in the EU designed with the idea that citizens need to be digitally literate and competent in the learning economy (p. 101). Chapter 6 discusses Norway’s digital curriculum (p. 119) and its emphasis on the importance of using and being competent with digital tools. Chapter 7 explores the “digital society” and the baggage this term carries, such as 1) technological determinism, 2) the ability to blame social problems on technology, and 3) the implication that technology comes out of nowhere and changes the world (p. 151). The chapter also discusses several components considered digital literacies, which make up “literacies,” and it explores levels of comprehension. 
Chapter 8 deals more with building on literacies to move from knowing to doing activities such as remix and production. Chapter 9 deals with digital literacies in the workplace (p. 203), and chapter 10 deals with digital literacies of online shopping (p. 227). Chapter 11 talks about digital literacy in online social networking (p. 249). Chapter 12 talks about legal discourse and remix (p. 279). Overall, the book breaks down and discusses what puts the literacy in digital literacies for different authors, in multiple places, in multiple environments, across the world.

Lewis, T. (2006). Critical surveillance literacy. Cultural Studies Critical Methodologies 6(2), 263-81.

Lewis mixes the idea of critical literacy with studying surveillance. Lewis specifically focuses on schools and surveillance and asserts that school surveillance is the “articulation of domestic militarism” and “substitutes compassion for suspicion and punishment” (p. 264).

Raco, M. (2007). Securing sustainable communities: Citizenship, safety and sustainability in the new urban planning. European Urban and Regional Studies 14(4), 305-20.

Raco addresses the mixed messages provided by government about sustainable communities. He states, “On the one hand, they promote community balance, mix and diversity as a vehicle for the creation of more functional and less crime-ridden places. On the other hand, they simultaneously identify diversity as a threat to community safety” (p. 305).

Rheingold, H. (2012). Net Smart : How to Thrive Online. Cambridge, MA, USA: MIT Press.

Rheingold’s book addresses how to be literate and use digital media smartly. Rheingold addresses five literacies: 1) the literacy of controlling attention; 2) the literacy of filtering (out crap) information; 3) the literacy of participation; 4) the literacy of collaboration; and 5) the literacy of network savvy. First, for the literacy of controlling attention, Rheingold advocates harnessing one’s attention and practicing mindfulness, which he describes as a process of metacognition (p. 69). Metacognition involves a mental awareness of thoughts, and he says it involves metacognitive knowledge (what individuals know about themselves and others); metacognitive regulation (the regulation of cognition and learning experiences through a set of activities that help people control their learning); and metacognitive experiences (which deal with the current, ongoing cognitive endeavor) (p. 66). The overall rule of this principle is “pay attention to your intention” (p. 77). This element is reflective (p. 145). Second, to filter through the massive amounts of available information and filter out the crap: 1) ask who the author is (p. 79); 2) note design, but don’t rely on it (p. 79); 3) check who else has posted the same stories or ideas (CNN? BBC?) (p. 80); 4) examine the credentials and the publishers (p. 82); 5) understand how search engines work (p. 87); 6) have a well-organized workspace (p. 104); and 7) don’t trust algorithms alone to identify the truth (p. 106). This idea is analytic (p. 145). Third, to understand the literacy of participation, you have to understand why participation through things like blogs, social media, or other online avenues is important (p. 133). This idea builds off Jenkins’ thoughts on participatory culture (PC): you don’t just consume culture, you also participate in it to build it (p. 114). 
In PC, things like reading, tagging, and commenting are low-threshold forms of collective intelligence, and things like leading, moderating, and collaborating are high-engagement forms of collaborative intelligence (p. 120). One thing that can be learned in participatory culture is making connections, and Rheingold states, “Making connections is a learnable skill that is amply rewarded by networked publics” (p. 122). Many times, bloggers become filters of links to help their followers make connections. Curators also accumulate material (p. 126). It is important to understand who profits from your material (p. 134) and how one looks to oneself and others (p. 138). This element is deliberate (p. 145). Fourth, collaboration helps one gather more, and better, information than one could obtain alone. To understand the literacy of collaboration, you need to understand virtual communities. This means: 1) know the territory you are getting into by understanding the difference between a community and a network; 2) understand the community you are entering, which is a form of netiquette (p. 163), and assume goodwill; 3) “…jump in where you can add value” (p. 164); and 4) “Reciprocate when someone does you a favor or shows a courtesy” (p. 165). Fifth, to understand the literacy of networks, one has to understand that “Networks have structures, and structures influence the way individuals and networks behave” (p. 191). In group-centered societies, many of one’s contacts know each other; in network-centric societies, connections are not likely to know each other (p. 193). Nodes are the people connected by the networks. A bridge is a person that links two networks (p. 204). Hubs have many links (p. 207). Networked individualism allows people to make networks beyond their immediate geography (p. 211). 
Social capital consists of “networks of trust and norms of reciprocity that enable the farmers in this group to get things done together that they might not have been able to do otherwise” (p. 217). Ultimately, one should engage with others and grow one’s networks to be better informed, with a wider group of resources to pull from. Overall, in order to be an informed citizen, one needs to watch what those in power are saying in the public sphere (p. 240) and use remix to buck the prevailing thoughts (p. 242).

Selber, S. (2004). Multiliteracies for a digital age. Carbondale: Southern Illinois University Press.

Selber unpacks and explains previous understandings of computer literacies and offers alternative strategies for understanding the idea. Often the idea of computer literacy is a decontextualized understanding of the mechanics of computers, limited by not including the "social, political and economic contexts" (p. 20) and resultant implications of the technology. The book claims to take a "postcritical stance," which means, as drawn from Stanley Aronowitz, that computers are going to be used in educational contexts and teachers should look for ways to use computers both to align with and to challenge professional values; and, drawing from Patricia Sullivan and James Porter, Selber emphasizes that computer literacy means developing a "critical consciousness of its position" (p. 8). Ultimately, technology is not a neutral (pp. 11, 23), "self-determining agent" (p. 8); it can be used to promote social change (despite the fact that it also reproduces "dominant cultural values," p. 12). In order to reimagine computer literacies, Selber encourages a multiliteracy approach that uses different kinds of computer literacies; he uses functional, critical, and rhetorical literacies to do this (p. 24). In the case of computers, Selber describes functional literacy as imagining computers as tools, critical literacy as computers as cultural artifacts, and rhetorical literacy as computers as hypertextual media; this can also be understood as students as users, questioners, and producers of technology (p. 25). Functional literacy has been described as being "reduced to a simple nuts-and-bolts matter" of just mastering techniques (p. 32), neutral and decontextualized out of the social sphere it exists in. It is not neutral, because it is shaped by market forces (p. 39) and needs a skilled-enough user to operate it (p. 40). In order to work through the reductiveness of functionalism, Selber encourages using five parameters (p. 
45): educational goals (students can use computers for educational goals, p. 45), social conventions (students must decode the social space's expectations, p. 51), specialized discourses (a student must understand the appropriate language of a community, including the cultural privileges embedded in it, p. 55), management activities (a student needs to understand how to manage the large amounts of information computers amass, p. 61), and technological impasses (students' inabilities to traverse writing or communication problems, p. 67, such as computer anxiety and issues of race, class, and gender, p. 68). The lens of critical literacy encourages students to understand and question the politics of computers (p. 75) and is set against the idea of constructivism (that students learn based on their previous knowledge, p. 76). Critical literacy challenges the status quo, does not accept computers as a neutral technology, and attempts to socially and politically reconstruct the practices computers are caught up in (p. 81). In order to illustrate this, Selber refers to computers as artifacts (p. 86). By doing this, computers take on a more archeological tone, highlighting the psychological (p. 90), physical production, material, and social backdrop (p. 92) aspects of computers and turning students "into the critical role of questioner" (p. 95). Finally, rhetorical literacy "concerns the design and evaluation of online environments" (p. 182) and the idea that students "should become producers and not just users of computer-based environments" (p. 140); rhetorical literacy can help craft "texts" of social action. Rhetorical literacy concerns praxis; it integrates the functional and critical literacies into action and embodiment (p. 145). It concerns persuasion (all acts are persuasive, p. 
149), deliberation (deliberating which actions need to be taken over others), reflection (a conscious, critical understanding of actions), and social action (actions taken are not solely technical but are also social) (p. 147). Computers also provide hypertextual media arenas that disrupt a traditional start-to-finish understanding of static text (p. 169), but the way these elements are put together is important (as can be seen in, say, a site map), and as Dennis Wood states, map making is never innocent (p. 181). Rhetorical literacy should ultimately allow students to be "agents of positive change" and "reflective producers of computer technologies" (p. 134). The book concludes by contextualizing Selber's recommendations in examples and recommending that with "pedagogical, curricular, departmental, and institutional" (p. 233) support, English departments can incorporate various literacies in the digital realm.

Street, B. V. (1984). Literacy in theory and practice. Cambridge: Cambridge UP.

Street discusses the differences between autonomous and ideological models of literacy. Street criticizes what he calls "the autonomous model," which is based on research from scholars such as Hildyard and Olson, Goody, and, to some extent, Lyons. Hildyard and Olson privilege Western education systems, which teach away from real-life situations (p. 35) and reinforce their own academic style (p. 40). They conclude that literacy is neutral (p. 30) and universal (p. 38), and that literacy and orality are vastly different (p. 42). Goody considers writing a "technology of the intellect" (p. 44), which reinforces technological determinism. He concludes that "if some societies are more 'scientific' and 'logical' than others, it is not on account of the nature of their thought processes but because their acquisition of literacy has released these capacities" (p. 49). Lyons contradicts himself by presenting literacy as objective, universal, and asocial while admitting that it is also culture-dependent (p. 84). Street criticizes them all as ethnocentric and academic-centered (p. 70). Instead, Street recommends the ideological model, which centers on a socially constructed understanding of literacy. Rather than an autonomous model, Street says that "attention to the 'interpersonal,' socially-conditioned aspects of literacy is central to understanding the nature of that practice" (p. 43). Literacy is "always embedded in some social form" like letter writing, styles, or academic texts (p. 43), and technology is not neutral -- it too is socially constructed (p. 65). There is no "'core meaning' of an utterance but rather there are always alternative models" (p. 85) (see the Derrida reference, p. 101), open to socially constructed interpretations. There is no great divide between "objective" written text and "'context-dependent', oral, lay language" (p. 94). 
Literacy is thus a "social process, in which particular socially constructed technologies are used within particular institutional frameworks for specific social purposes" (p. 97). Additionally, Street references Graff, who concludes that "the structures of demands, needs and uses for literacy, and thereby the definitions of it, vary according to context" (p. 109). Street concludes his theory section by quoting Shirley Brice Heath: "Literacy events must...be interpreted in relation to the larger sociocultural patterns which they may exemplify or reflect" (p. 125). The second and third sections of the book discuss Street's fieldwork in Iran in the 1970s. Street addresses Maktab literacy, which helps him conclude that "'literacies' acquired in different contexts may be quite different, or conversely, may have similarities at levels that have not been recognized" (p. 154); commercial literacy, which helps him conclude that "literacy is, in fact, a socialisation [sic] process rather than a technical process" (p. 180); Unesco literacy campaigns, which reinforce for Street the importance of adopting an ideological model of literacy (p. 212); and literacy campaigns in the UK and USA, which lead Street to conclude that lack of 'academic' literacy will not hurt intellect (p. 225) and that, politically, if literacy is constructed as a problem, people should be able to pursue more personally applicable uses of literacy, taught by people who understand literacy ideologically (pp. 226-8).

Street, B. (1999). The meanings of literacy. In Daniel A. Wagner, Richard L. Venezky, and Brian V. Street (Eds.), Literacy: An international handbook (pp. 34-40). Boulder: Westview Press.

Street outlines discussions about literacy. Street first takes on the "autonomous model," which treats literacy as a technical skill that exists outside of social context (pp. 34-35). Goody and Watt call it autonomous because it is supposedly different from oral communication and exists on its own, without the time and place of the utterance needing consideration. Many criticize this approach as ethnocentric, privileging the "modern" over the traditional society (p. 36). Street calls the alternatives "social approaches to literacy." This understanding positions literacy as embedded in social practice. Because it is a social practice, "it varies with social context and is not the same, uniform thing in each case" (p. 37), and it involves contested meanings and struggles over the definition and control of the literacy agenda. Social approaches to literacy often expand to multiple literacies, which helps to explain conceptually that there is no single literacy which "is the same everywhere and simply needs transplanting to new environments" (p. 37). Additionally, the term "literacies" has expanded to mean more than just communication and has moved into domains of social life such as "computing, politics, and so on" (p. 37); this, however, pulls on the autonomous idea of literacy, implying that being "literate" is an exterior skill set or "a set of competencies of skills" (p. 37). New Literacy Studies has approached "literacies" as meaning a "set of practices around literacy, whether with computers, visual media, or traditional print, as a complex of these domains that varies with context and social meanings, so that the emphasis is not so much on the medium as on the practices" (p. 38). "Literacy practices" are the intersection of cultural practices and reading and/or writing in particular contexts; there are many practices, multiple cultures, and no one idea of literacy (p. 38). The same practices can be viewed in different ways by any stakeholder.

Surette, R. (1992). The media’s influence on attitudes and beliefs about crime and justice. In Media, crime, and criminal justice: Images and realities (pp. 79-106). Belmont, CA: Wadsworth Publishing Company.

Surette theorizes that news and entertainment function as a long, continuously running public relations campaign (p. 80). He asserts that people live in two worlds, the real and the media-created: one is known through experience, the other through the decisions of editors and producers. Much of the time (Surette cites Garger’s 95%), people get their knowledge about crime from the media rather than from direct experience. The chapter uses three lenses and hypotheses to look at criminal justice and media: public information campaigns, public agendas, and public attitudes. First, Surette discusses public information campaigns. Although research shows that calculated campaigns to sway public opinion were not very effective, the unplanned, repetitive, and pervasive qualities of mass media mean it may have “unplanned effects on attitudes, particularly in the area of crime and justice” (p. 87). These effects would probably favor crime control attitudes as opposed to due process, because media place more emphasis on crime control policies. Second, in regards to setting agendas, older studies have shown that “the media, by emphasizing or ignoring topics, may influence the list of issues that are important to the public – what the public thinks about, rather than what the public thinks” (p. 87). Specifically for crime and justice, media have “been credited with raising the public’s fear of being victimized to disproportionate levels and hence giving crime a disproportionately high ranking on the public agenda” (p. 88). While “proportion” in coverage is debated, some assert that more coverage of crime could block other problems, like hunger, from the agenda. Due to conflicting research results, however, this lens of public agenda is limited and often considered socially insignificant. 
Third, regarding public attitudes, Surette points to a study by Gerbner that looked at television watchers and found that even sophisticated viewers include fictional media representations in their worldviews (p. 90). Other studies have since shown that whether news is local or distant matters. Also, social background does mitigate the adoption of information. For instance, a Doob and MacDonald study showed that for those who experience crime directly, media tend to have less of an impact. Surette concludes that media, in essence, play a part in the ecology of public policy (p. 99); they are both messengers and actors. Also, television is more related to fear of crime, whereas print relates to knowledge about crime and the adoption of crime prevention. Additionally, some individuals are more susceptible than others to media messages about crime. Overall, the relationship between media and attitudes about crime and justice depends on three things: the medium discussed, the medium’s style of presentation, and the “experiences, predispositions, and immediate community of the consumer” (p. 96). While the data show a lack of consistent findings on media, crime, and justice, they do seem to suggest a correlation in perceptions of the world and community (p. 102).

Interactive Media and Technology

Aas, K. F., Gundhus, H.O., & Lomell, H.M. (Eds). (2008). Technologies of Insecurity: The Surveillance of Everyday Life. Taylor & Francis.

The book's main purpose is to navigate between those who bemoan the addition of technologies to surveillance and those who act as if technology will save society from insecurity (p. 3), and to push past the straitjacketing of surveillance metaphors like Big Brother and the panopticon. Foucault discusses four themes of technology (meaning, for Foucault, a "broad social matrix of action"): technologies 1) of production (which let us produce things); 2) of sign systems (which permit us to make signs and symbols for things); 3) of power (which determine the conduct and control the behavior of others); and 4) of the self (which allow individuals to make decisions about themselves) (p. 4). Technologies of the self and of domination are often brought together under the idea of governmentality. According to Lyon, it needs to be remembered that "surveillance not only constrains but also enables actions" (p. 6). To carry out these themes, the book has five sections. Part one is (In)Security and Terror. Chapter 1, by Neyland, discusses terror and everyday objects (as opposed to people, the dominant focus of study) (p. 21). We often turn objects into things to fear (e.g., a letter could be a bomb, p. 22). Neyland draws on science and technology studies (STS) and actor-network theory and concludes that objects are at the heart of governance and accountability (p. 24), and that the failure to consider the ordinariness of objects obscures the assistance a more critical look at objects could provide. Chapter 2 is Lyon's discussion of identification practices and national ID cards (p. 42), tracing their history through uses for colonial control (p. 47), crime management (p. 49), and war (p. 52). Ultimately, national ID cards often bring together mobility, entitlement, and exchange (p. 55), and through a registry, database, and assemblage they often privilege the rich and socially sort others in the name of security (p. 56).
Part two is (In)secure Spaces, starting with Klauser's chapter 3 discussion of FIFA's 2006 World Cup in Germany (p. 61), where he covers the urbanization of surveillance, public viewing sites, the globalization of surveillance, and the technologies used at events such as the World Cup (and the corporations making those technologies); he concludes that space matters for both operations and globalization. Jones' chapter 4 applies Lessig's regulatory schema to checkpoint security, be it at the border, the airport, schools, or anywhere gateways are monitored (p. 81). This raises questions of inclusion/exclusion and involves human, bureaucratic, and technological stakeholders in social sorting (p. 99). Part three is (In)secure Visibilities, and it contains Nellis' chapter 5 work on tracking offenders by satellite (p. 105), which involves not just recording an offender's presence at a location (p. 109) but the offender's movement through space, and delves into themes of mobility, ubiquity, and the acceptance of these practices in America and Europe. These scenarios would not have been possible without a historical context and a move toward normalizing surveillance. Douglas Smith's chapter 6 discusses the roles present in surveillance and argues that surveillance workers (here, CCTV employees) are relegated to watchers, but should also be seen as simultaneously empowered and disempowered workers in connection with technology (p. 125). The chapter ultimately reminds us that technological determinism ignores the social, human underpinnings of surveillance. Koskela's chapter 7 on amateur photography (p. 147) discusses "hijacking surveillance," where the public gets involved in surveillance practices. This is the fourth phase on a scale of resisting surveillance (the first three being 1) passive acceptance; 2) a critical approach; and 3) counter-surveillance). Old notions of surveillance saw organized watching by the state, but surveillance is now decentralized and out of sight (p. 149) (with the "technological turn" pushing in this direction, p. 150). This means that "the historical structure of political institutions of the 'authorities' and 'the public' is fading. When there is no difference between the controllers and the controlled, all politics and ethics need to be rethought. The democratic idea of representational authority is breaking down" (p. 163). Part four is (In)secure Virtualities and contains Jewkes' chapter 8 work on technologies in prisons, in particular inmate use of the internet (p. 171). Jewkes concludes that inmate internet use is framed as a security matter, which "makes the policy of denial intelligible, but it obscures the reality, which is that contemporary penal philosophy, segregation, separation and silence remain the severest penalties" (p. 187). Yar's chapter 9 work on private policing (p. 189) opens with a discussion of a discourse of insecurity and of society's desire, in response to reports about dangerous things, to keep safe and to securitize aspects of life. Yar concludes that two factors have contributed to calls for increased private interest in securitization: 1) a "greater economic dependence on information communication has resulted in an imperative to secure such systems" (p. 200); and 2) political sensitivities "have inspired efforts to secure critical information infrastructures against private attacks" (p. 200). Part five is (In)secure Rights and contains Goold's chapter 10 discussion of technology and institutional trust (p. 207). He starts by examining both cultural and institutional theories and concludes that, although the point is speculative, research has focused on immediate reactions to surveillance technology, and failing to see how trust operates in democratic societies ignores a large area of study (p. 217). Yttri Dahl's chapter 11 discusses lawyers and DNA evidence (p. 219).
He discusses how DNA is included in legal procedures, arguing for caution when incorporating new techniques into legal matters. Although the use of scientific and technological data is often not scrutinized, sometimes the insecurities behind the techniques need to be brought into the open (p. 235). [See Finn's work on Latour's ANT and inscription]. Finally, Halvorsen's chapter 12 discusses torture, terror, and rights (p. 238), especially in a post-9/11 context. The work looks critically at Huntington's often-evoked idea of a "clash between civilisations" as opposed to more social-scientific/humanities approaches, and the author challenges researchers to show that Huntington's idea is false (p. 255). The epilogue, by Zedner, focuses on the persistence of insecurity despite security technology (p. 257). Readers are reminded 1) to realize that society has shaped these technologies, that technologies do not just pop out of nowhere (p. 259), and that technology is a social construction (p. 267); 2) that there are limits to technology (p. 260); 3) that technologies help shape identities (p. 263); 4) that they are not always a cure-all (p. 265); and 5) that there may be gaps between stated aims and underlying purposes (p. 268). Ultimately, a micro-analysis of how specific technologies work, of the interaction between human and machine, and of the interaction between machine and society (cultural, political, and structural) needs to be undertaken (p. 269).

Baym, N.K. (2010). Personal Connections in the Digital Age. Cambridge, UK: Polity.

Baym's book examines mediated communication. She highlights that this is not necessarily a new conversation; there is a long history of doubting that technology facilitates "real" communication (e.g., pp. 27, 35). Baym discusses how media influence connections (as opposed to face-to-face contact) in terms of interactivity (participatory), temporal structure (e.g., synchronous/asynchronous), ability to transmit social cues, storage/replicability, reach (audience), and mobility (portability). Two paradigms of technology Baym explores are technological determinism, which implies that technology controls people rather than people controlling technology, and the social construction of technology (SCOT), which explains technology as the product of social processes (p. 39); i.e., designers make development choices based on social forces. In digitally mediated communication, participants tend to recreate the social norms of non-mediated worlds (pp. 71, 81), and there is no standard digital language across all platforms. According to Baym, "Any instance of digital language use depends on the technology, the purpose of the interaction, the norms of the group, the communication style of the speaker's social groups offline, and the idiosyncrasies of individuals" (p. 65). Just because people interact online doesn't make them a community, but Baym found five qualities of online groups that overlap with definitions of community. These qualities are space (which can include a place online, p. 76); practice (unconscious, routinized behaviors, p. 77); shared resources and support (roughly, the ability to get help, such as emotional, esteem, and informational support, pp. 82-4); shared identity (a sense of shared space, practices, and support, p. 86); and interpersonal relationships (one-on-one relationships, p. 89). One main issue people worry about in online communication is identity.
According to Baym, "Digital media seem to separate selves from bodies, leading to disembodied identities that exist only in actions and words" (p. 105). Ultimately, the internet has simply provided another means for forming and maintaining relationships. Although mediated communication may offer fewer social cues, there are ways to replicate some of them (like emoticons, p. 103), and relationships can exist. Additionally, because social norms are replicated in mediated spaces, people are not necessarily more dishonest than in face-to-face contact. A person's mediated relationships may ultimately depend on one's social nature (p. 133) and on understanding embodiment and old and new media (p. 155).

Baym, N., & boyd, d. (2012). Socially mediated publicness: An introduction. Journal of Broadcasting & Electronic Media, 56(3), 320-329.

Baym and boyd discuss the public nature of social media. One important idea in this article is that social media's architecture and affordances shape and complicate the nature of the public. The authors highlight Dayan's (2001) definition that ""audiences" are aggregates produced through measurement and surveillance, while "publics" actively direct attention" (p. 322). Social media are not a one-way model like, say, television, so the lines between the two begin to blur. Social media audiences are now more visible (because they are more real to people, although some could argue the face-to-face audience is the real audience), and Rosen (2006) goes further, describing the social media audience collectively as "the people formerly known as the audience" (p. 322). With a more real audience, "people may imagine they are addressing the people who most often comment on their messages, their supervisors at work, or they may not be considering recipients at all" (p. 323). Goffman (1959) argued that we define ourselves in relation to others, so defining ourselves on social media is not new. But the time and space dimensions of social media may make it hard to know who is where and when (p. 323). Awareness of a public on social media changes how people write. The authors state that people "become more aware of themselves relative to visible and imagined audience and more aware of the larger public to which they belong and which they seek to create" (p. 325). The difference between mass media and social media is the participatory nature of social media; networked architectures are cheaper to produce and therefore lower the barriers to entry (p. 326).

boyd, d., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society,(15)5, 662-79.

boyd and Crawford discuss Big Data (BD). BD creates three classes of people: those who create data (purposefully or by leaving footprints), those who collect it, and those who analyze it (p. 675). They start by problematizing the term itself. They state, "Big Data is, in many ways, a poor term. As Manovich (2011) observes, it has been used in the sciences to refer to data sets large enough to require supercomputers, but what once required such machines can now be analyzed on desktop computers with standard software…Big Data is less about data that is big than it is about a capacity to search, aggregate, and cross-reference large data sets" (p. 663). BD has not just changed computation; it has changed the way we do things. They state, "Big Data not only refers to very large data sets and the tools and procedures used to manipulate and analyze them, but also to a computational turn in thought and research (Burkholder 1992). Just as Ford changed the way we made cars – and then transformed work itself – Big Data has emerged a system of knowledge that is already changing the objects of knowledge, while also having the power to inform how we understand human networks and community" (p. 665). They quote Latour's reminder that if you change the tools, you change the social theories that go with them. In the larger scheme of things, BD causes a "profound change at the levels of epistemology and ethics. Big Data reframes key questions about the constitution of knowledge, the processes of research, how we should engage with information, and the nature and the categorization of reality" (p. 665). The authors also state that BD results from the interplay of "(1) Technology: maximizing computation power and algorithmic accuracy to gather, analyze, link, and compare large data sets. (2) Analysis: drawing on large data sets to identify patterns in order to make economic, social, technical, and legal claims.
(3) Mythology: the widespread belief that large data sets offer a higher form of intelligence and knowledge that can generate insights that were previously impossible, with the aura of truth, objectivity, and accuracy" (p. 663). Anderson (1988) traces the beginning of big data back to the US Census Bureau's punch-card machine in 1890 (p. 664). Lessig (1999) said that "social systems are regulated by four forces: market, law, social norms, and architecture – or, in the case of technology, code" (p. 664). The authors add that BD information and numbers don't speak for themselves (p. 666); there are other social understandings beneath the raw data. Although social scientists can be the ones drawing conclusions from the data, "there remains a mistaken belief that qualitative researchers are in the business of interpreting stories and quantitative researchers are in the business of producing facts. In this way, Big Data risks re-inscribing established divisions in the long running debates about scientific method and the legitimacy of social science and humanistic inquiry" (p. 667); there is only a false notion of objectivity. They state, "Interpretation is at the center of data analysis" (p. 668). Regardless of how objective numbers seem, the authors remind us that "all researchers are interpreters of data…A model may be mathematically sound, an experiment may seem valid, but as soon as a researcher seeks to understand what it means, the process of interpretation has begun" (p. 667). The research isn't always correct either: "Large data sets from Internet sources are often unreliable, prone to outages and losses, and these errors and gaps are magnified when multiple data sets are used together" (p. 668). The word apophenia means "seeing patterns where none actually exist, simply because enormous quantities of data can offer connections that radiate in all directions" (p. 668). BD and whole data are not the same.
The ideas of the firehose, the garden hose (10% of public tweets), and the spritzer help visualize how much data is given to some researchers (p. 669), and these images show how access to data is limited for some due to privacy (p. 669). BD provides two types of social networks: articulated networks and behavioral networks (p. 671). boyd and Crawford state, "Articulated networks are those that result from people specifying their contacts through technical mechanisms like email or cell phone address books, instant messaging buddy lists, 'Friends' lists on social network sites, and 'Follower' lists on other social media genres… Behavioral networks are derived from communication patterns, cell coordinates, and social media interactions (Onnela et al. 2007; Meiss et al. 2008)" (p. 671). People mistakenly measure tie strength by frequency or public articulation, but the coworker-versus-spouse example shows that you can spend more time with someone (a coworker) yet be less close to them than to a spouse (p. 671). The authors remind us that just because information is out there doesn't mean it is ethical to use it; many people would object to their information being used for purposes other than those they intended. Another concern with BD is access: the ones who actually get the most complete data are the social networking companies themselves (or universities), and among outside groups only those who can afford to pay for that information are able to use it and to train individuals to work with this type of data (pp. 673-4). This places limited lenses on the data, and the authors remind us that "[m]ost researchers who have computational skills at the present moment are male and, as feminist historians and philosophers of science have demonstrated, who is asking the questions determines which questions are asked (Harding 2010; Forsythe 2001)" (p. 675). Overall, BD "creates a new kind of digital divide: the Big Data rich and the Big Data poor" (p. 674), and many argue that private companies are the ones best able to analyze the data, which signals who is an insider and who is an outsider.
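The articulated/behavioral distinction above can be illustrated with a minimal sketch; the names and interaction counts here are invented for illustration and do not come from the article. An articulated network comes from declared contact lists, while a behavioral network is inferred from interaction records, and the two can rank ties differently:

```python
# Hypothetical sketch of boyd and Crawford's two network types.

# Articulated network: ties a person has explicitly declared
# (e.g., a "Friends" list on a social network site).
articulated = {
    "ana": {"spouse", "coworker", "old_classmate"},
}

# Behavioral network: ties inferred from interaction records
# (e.g., messages exchanged per week).
interactions = {
    ("ana", "coworker"): 120,   # high frequency, but a weaker tie
    ("ana", "spouse"): 15,      # lower frequency, yet the closest tie
    ("ana", "old_classmate"): 0,
}

def behavioral_ties(person, counts, threshold=1):
    """People this person actually interacts with at or above a threshold."""
    return {other for (who, other), n in counts.items()
            if who == person and n >= threshold}

print(sorted(articulated["ana"]))
print(sorted(behavioral_ties("ana", interactions)))
```

The declared list keeps the dormant classmate tie that the behavioral network drops, and the coworker dominates by frequency even though the spouse is the closer tie, echoing the authors' warning that frequency is a poor proxy for tie strength.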

boyd, d., & Ellison, N. (2008). Social network sites: Definition, history, and scholarship. Journal of Computer-Mediated Communication, 13, 210-230.

boyd and Ellison discuss social media scholarship and define social network sites as "web-based services that allow individuals to (1) construct a public or semi-public profile within a bounded system, (2) articulate a list of other users with whom they share a connection, and (3) view and traverse their lists of connections and those made by others within the system" (p. 211). Although the terms are often used interchangeably, a social network site differs from a social networking site in that a networking site emphasizes relationship initiation rather than relationship maintenance. Research suggests that people mostly use SNSs to maintain already established relationships (p. 221), but the bulk of previous scholarship has "focused on impression management and friendship performance, networks and network structure, online/offline connections, and privacy issues" (p. 219). According to boyd and Ellison, profile components include data like age, city, hobbies, and other information about the user, profile photos, and ways to leave messages (p. 213). Underscoring social network sites (where profiles are "typed into being," as per Sundén, 2003) (p. 211) is the visible profile with its list of friends, and this "public display of connections" is a crucial component of SNSs (p. 213). According to boyd and Ellison, "Friends provide context by offering users an imagined audience to guide behavioral norms" (p. 220). The actions people take on these sites can be turned into data and gathered for consumption (p. 220). A potential concern about the information divulged on SNSs is privacy, including the ability to reconstruct a user's profile information for purposes other than those intended, by actors such as police (p. 222) and identity thieves.

Coll, S. (2014). Power, knowledge, and the subjects of privacy: Understanding privacy as the ally of surveillance. Information, Communication & Society 17(10), 1250–1263. http://dx.doi.org/10.1080/1369118X.2014.918636

This article sees privacy not as an alternative to surveillance but as its "partner-in-crime." The article studies consumer loyalty cards in the framework of biopower (p. 1250). Coll uses Bozovic's discussion of the Panopticon to look at privacy in a different way: in the panopticon, there was a line on the floor beyond which prisoners couldn't be seen, but invisibility amounted to visibility; the only place a prisoner could be was behind the line, so he was effectively visible. Coll also discusses Big Brother and concludes that "the more clearly privacy is defined, the more it can be subject to control" (p. 1251). Also, "privacy should be reinforced as a collective value rather than being seen only as an individual resource, the idea of an 'invasion of privacy' has actually become too limited to account for what turned out to be a worrying and recurring issue of modern time" (p. 1251). Privacy involves the protection of data (p. 1251), and the area of privacy that deals with data on an individual is called informational privacy (p. 1252). Often, privacy is the weapon of choice for combating surveillance. According to Coll, "Most often, privacy is seen as an informational bubble surrounding individuals that must be protected against external and undesired intrusions from the state, private companies, or even other persons motivated by their curiosity" (p. 1252). The origins of privacy date to around the early eighteenth century, and privacy was "exclusively upper-class privilege until 1960s" (p. 1252). The opening up of privacy to the middle class enabled the sexual revolution. (Foucault first developed the idea of biopower around sexuality; biopower is "any type of power that directly targets the body and intends to take control of it") (p. 1252). Important to this discussion is that "The main project and discourse of data protection laws is to educate users to protect their own privacy, at least in the informational context.
In other words, biopower is producing subjects owning a privacy, feeling concerned about it and willing to protect it" (p. 1253). Experts like to say that you have privacy, that you need to protect it, and that they can tell you how (p. 1253). In Coll's study of loyalty cards, one main finding is that the information articulated in data protection policies is inadequate and subjective from the consumer's standpoint. Objective privacy is the normative definition of privacy; subjective privacy is the definition given by a stakeholder; and privacy as a lived experience is what is actually experienced in life (p. 1258). These three definitions show that privacy is "a terrain of power struggle" (p. 1258). The informational self-determination principle "holds that every person should serve as a proactive actor of his or her own privacy (using the right of access to the data, its modification, or its destruction)" (p. 1258). In this sense, privacy itself becomes just another microtechnology of power in which one must discipline and monitor oneself to conform to society's norms [and maintain the ability to participate in capitalism]. Thus, privacy becomes a partner-in-crime to surveillance; "any scientific discourse about the subject reinforces its subjection" (p. 1258). Simitis (1987) called controlling one's own data "chimerical," and the idea of transparency is a fiction; gathered data can be used for countless new data sets (like psychological and sociological profiles), and "control" of data privacy seems unattainable. Overall, Coll concludes that data privacy gives people a false sense of ease about the spread of information; self-monitoring and the emphasis on the individual and (Foucault's) care of the self are really a ruse that keeps people embedded in capitalism (p. 1259).
Individualized ideas of privacy become like ideas of freedom, but "[o]nly a conception of privacy oriented in terms of a collective good can possibly balance measures meant to serve these overwhelming interests" (p. 1260). Coll calls on Regan (1995) and says "privacy should not only aim to protect the individual, but also the society and its democratic values…Privacy and its definition must urgently be understood as a struggle of power between promoters of a model of informational capitalism based on surveillance of citizens and consumers, and those who would prefer to promote privacy as a common good that could lead society to more democracy and freedom" (p. 1261).

Donath, J., & boyd, d. (2004). Public displays of connection. BT Technology Journal, 22(4), 71-82.

Donath and boyd (2004) discuss the importance of linked connections specifically to social networking sites (see the "network" versus "networking" distinction in boyd and Ellison). Donath and boyd describe making connections as the main point of social networking sites (p. 77), and these sites are defined as "on-line environments in which people create a self-descriptive profile and then make links to other people they know on the site, creating a network of personal connections" (p. 72). The authors note that people mostly use their real names and photos, and that their list of connections is also present on the site. These links tend to share four main features: 1) they are mutual (if A has B as a connection, B will have A as a connection); 2) they are public (others can see them, depending on site specifications); 3) they are not nuanced (a connection one barely knows may appear next to a relative known all one's life); and 4) they are decontextualized (one can't be selective about which parts of one's network are shared with whom) (p. 72). One way to think about these connections is through signalling theory, which "describes the relationship between a signal and the underlying quality it represents" (p. 72). Signals deemed honest are often those with a high cost in quality and time, and with the ability to punish deception. This helps show that a person is legitimized by their connections: one's reputation is at stake, one can't easily maintain a dishonest profile over time, and one's linked associates may call out a lie in front of a public audience of connections (p. 73). The authors conclude, "In theory, the public display of connections found on networking sites should ensure honest self-presentation because one's connections are linked to one's profile; they have both seen it and, implicitly, sanctioned it" (p. 74).
For instance, if someone used fake information or accomplishments, the network would presumably call the person out, unless 1) the network connections are fake; 2) the connections do not know the subject at all, or well enough to discern the truth; or 3) the connections are real and know the subject but don't care, whether because of the nature of the site (e.g., its humorous tone) or simply out of indifference (pp. 74-5). Participants in these spaces should understand the value of this information (p. 77) [e.g., to law enforcement, criminals, marketers]. These sites revolve around different foci (a term Feld used to talk about situations, interests, and individuals), which could be things like mutual acquaintances (p. 77). This may cause problems, though, because in life we may be able to keep our networks separate, but online they come together, and "[w]hen people from different contexts in one's life meet, it is possible that the different facets of one's life will be revealed to each other" (p. 78). Connection characteristics include "the context in which they formed, the frequency of contact, [and] the closeness of the relationship" (p. 79). The authors speculate that networking sites may function mostly to form and maintain weak ties (p. 80) (which may be a good thing, considering Baym, 2010: "Weak ties are also those most likely to provide us with the bridging capital" (p. 125)). Ultimately, bridging seemingly disparate groups may be strategically important and may extend and broaden our social worlds (p. 81).
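The "mutual" feature of displayed links can be sketched as a simple symmetry check over a set of directed connections. This is only an illustration of the property Donath and boyd describe; the names and the representation are invented here, not drawn from the article:

```python
# Hypothetical sketch: a link set is "mutual" in Donath and boyd's
# sense if every displayed connection A -> B is matched by B -> A.

def is_mutual(links):
    """Return True if the set of directed links is symmetric."""
    return all((b, a) in links for (a, b) in links)

# A fully reciprocated network of connections (invented names).
reciprocated = {
    ("alice", "bob"), ("bob", "alice"),
    ("alice", "carol"), ("carol", "alice"),
}

print(is_mutual(reciprocated))          # True: every link is returned
print(is_mutual({("alice", "dave")}))   # False: a one-way link
```

In practice, SNSs enforce this symmetry at friendship-confirmation time rather than by checking afterward; the sketch just makes the requirement explicit.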

Fuchs, C., Boersma, K., Albrechtslund, A., Sandoval, M. (Eds.) (2012). Internet and surveillance: The challenges of web 2.0 and social media. London: Routledge.

Part 1 of the book provides the theoretical framework of surveillance. Chapter 2 is Fuchs' critique of the political economy. He distinguishes between neutral concepts of surveillance, which make the following assumptions: 1) surveillance can be positive; 2) surveillance is both enabling and constraining; 3) surveillance is fundamental to society; 4) surveillance is necessary for organization; and 5) any systematic gathering of information is surveillance, and negative concepts of surveillance, which see surveillance as something inherently bad. Fuchs sees the neutral description as problematic for several reasons, such as that it normalizes surveillance and lumps positive and negative elements together. Ultimately, Fuchs argues for less capitalistic surveillance and a more socialistic space, as in the creative commons (p. 67). Chapter 3 is Andrejevic's discussion of data mining, which explores "the relationship between surveillance and economic exploitation" (p. 72). The chapter discusses predictive analytics and concludes that although the components of exploitation (unpaid surplus labor, coercion, and alienation) are interconnected, they vary depending on the power relationship between those who control communication resources (pp. 86-7). Chapter 4, 'Key features of social media surveillance' by Trottier and Lyon, discusses Trottier's concept of the five basic functions of Facebook and concludes that managing privacy is often left to the user and that, although there has been some analysis of social media, it is ultimately a new area of study that needs more attention (p. 104). Chapter 5 is Hill's discussion of Lyotard and the (in)humanity of internet surveillance, which takes up Poster's (1990) criticism that Foucault did not address new forms of surveillance.
Using Lyotard's 1979 work The Postmodern Condition and The Inhuman (1987), Hill points out two ideas: 1) "performativity" as the operating principle of "techno-scientific capitalism" (defined as "the combined force of technological and scientific R & D and advanced capitalism," p. 107); and 2) hostage-taking, which, according to Sim (1996), is "the pressure to conform to prescribed modes of behavior" (p. 108). Both are meaningful to Hill because they explain how everything seems to need to be translated into abstract forms of information in order to be efficient, and how social conditioning forces us into prescribed positions (p. 108). For the internet, this means that personal information is rendered inhuman, turned into computerized data used for monitoring and capitalist gain. The final chapter in part 1 is Allmer's chapter 6, which frames surveillance as an economic issue that is not just about technology, individuals, economics, or politics by themselves but is a societal problem encompassing all of these (p. 141). Part 2 opens with chapter 7, Sandoval's work on consumer surveillance, which asks how surveillance is involved in gathering information on web 2.0 sites to whittle the masses down to particular consumers, and how the owners of these sites use the data. The author examines the privacy policies of specific sites and concludes that website privacy statements essentially let companies use users' information for corporate capital gain (p. 165). In chapter 8, Arditi addresses the music industry specifically, showing how the "Big Four" watched p2p sharing and eventually maintained their reign over music and market dominance (p. 183). In chapter 9, Albrechtslund discusses socializing, sharing, and online communities, aiming to answer three main questions: 1) what is online sharing? 2) what does location mean for online sharing? and 3) how can the popularity of this practice be understood?
The author concludes that online sharing is "a practice of exchanging more or less personal information" (p. 188); location in online spaces means using GPS to show others where one is located (p. 192); and this practice can be understood as part of a larger social practice of mapping (p. 195). Ultimately, this study looks at why people voluntarily self-surveil (which can be further explored) (p. 196). In chapter 10, Szekely talks about IT professionals and concludes that 1) IT professionals are pivotal in projects handling personal data, and they care about the privacy of others (while being critical of the mass public's understanding of personal data protection) (p. 215); 2) IT professionals would speak out with concerns but would eventually do what they are told; 3) different countries' IT workers have varying views on data protection; and 4) personal attitudes of IT workers only marginally affect what they do. IT workers in this chapter's study showed they were neither for nor against surveillance (p. 216), and more research should be done in this area. Chapter 11, Christensen and Jansson's "Fields, territories, and bridges: networked communities and mediated surveillance in transnational social space," talks about the disciplinary discourses of surveillance and community making and looks at the example of making communities in transnational spaces (such as Turkish nationals in Sweden). The chapter draws on Wittel's (2001) concept of "networked sociality" and Bourdieu's theory of social fields. It concludes that "the ease, speed, and ephemerality of mediated proximity and boundary maintenance conceal both the temporal (for example the persistent longevity of data) and the spatial (for example the presence of data in multiple digital locales) aspects of surveillance and systems of sorting" (pp. 235-6).
In chapter 12, Wayland, Armengol, and Johnson talk about transparency; often, transparency is thought to be the antidote to data mining practices (p. 239). The chapter discusses campaign finance disclosure law and ultimately concludes, especially with campaign financing, that the idea of transparency as seeing things as they are is a myth (p. 251); transparency is really just a house of mirrors (p. 244 and p. 253). In chapter 13, Taddicken addresses 'Privacy, surveillance and self-disclosure in the social web,' looking at social groups and at how privacy affects a user's actions. The chapter examines various ideas about privacy (such as the outdated "right to be left alone") and concludes that users have to straddle the "privacy paradox," in which they try to maintain privacy while still sharing information (p. 268). In chapter 14, Weber talks about privacy and the internet. This chapter goes through the changing understanding of privacy, laws surrounding privacy, and transparency; it calls for global agreements and more control for individuals over their own data (p. 289). And in chapter 15, Boersma offers a summary of the internet and surveillance: because the internet has brought new means of surveillance, people need to be aware and critical of the procedures and ultimately protected from abuses of power (p. 305). Critics of the book claim it is Marxist propaganda that casts the corporation (more than the government) as the agent of surveillance.

Gates, K. A. (2011). Our biometric future: Facial recognition technology and the culture of surveillance. New York: New York UP.

Gates takes on biometric surveillance through the lens of facial recognition technology (FRT), and the goal of her book is to examine the social construction of FRTs. Many, like US officials after 9/11, assert that FRT can help prevent terrorist attacks. This, however, is technostalgia; like all technologies, FRTs are cultural forms embodying the hopes, dreams, and desires of the creators and nations producing them. Surveillance systems do more than their technical capabilities allow; society's belief in their sophistication may exceed their actual technical abilities, and this belief itself leads to forms of social control (p. 6). The book focuses on the politics behind certain technologies, which carry out both technical operations and a narrative of control (p. 7). Given the technological limitations and the fact that society influences both the technology and the recognition of the face, it is hard to think of FRT as a neutral, objective, and all-powerful technology. Gates first takes readers through a history of FRT. Society has long been attempting to "securitize identity" (coined by Nikolas Rose and meaning "the intensification of practices at a proliferation of sites") (p. 27). One of the more modern pushes for identity assurance was in the banking industry (pp. 37-42). People think FRTs and other biometric surveillance technologies can irrefutably identify individuals (p. 42). FRT also moved into the workplace, consumer tracking, and network control (p. 43). Criminal justice agencies got involved with FRT as well, in order to monitor jails and manage their photographs (p. 54). FRT became a popular tool because, as opposed to other biometric information, the face fit into what people were already comfortable with: using faces for identification (i.e., driver's licenses) (p. 47). Further, the hope was that integrating it into surveillance cameras would make it less invasive than ever, and more technology could be developed to sort that data into usable information.
Welfare reform, the war on drugs, and immigration became areas where FRT was used (p. 56). In chapter 2, Gates talks about the power that surveillance has afforded police and questions the legitimacy of that power (p. 64). Most views of crime are police-centered and argue that police are the antidote to crime, rather than other social interventions such as more education or jobs; William Bratton reiterated this (p. 67). However, some question this and object to more policing as the antidote to safer communities; CCTV, as run by police, can become a symbol and a function of a mistrusted police department some consider to have too much control (p. 68). CCTV can be cumbersome to review, however, so "smart CCTV" was tried in Ybor City, FL; these technologies aimed to use FRT to streamline analysis of footage. This would supposedly remove anonymity and assign an identity to everyone caught in the gaze (p. 84). This idea is acceptable to many because one assumes that FRT targets "only specific, dangerous identities" and "gloss[es] over the more troubling politics of inclusion and exclusion that informed the redesign of urban space and its accompanying technological infrastructures" (p. 91). Proponents justify using FRT by saying we are no longer in a Cold War era where we know our enemies and defense spending is justifiable; FRT helps make terrorists "identifiable" (p. 98); it supposedly could have stopped a 9/11 terrorist (p. 101). Databases become important, then (p. 102), because the FRT has to recognize who is a "threat." One trope people use is "the face of terror" (p. 106). Mug shot databases and their interface designs on sites such as the FBI's and America's Most Wanted present a symbol of our fight against the other (p. 116). In addition to law enforcement, commercial agencies also use FRTs (p. 125). [It is interesting to remember that FRT is used in Facebook, but the interface does not give the user access to controlling that tool.]
Society is expected to be tech-savvy (p. 128) and to accept FRTs for things like security (p. 130) and personal photo management (p. 136). Cameras are early technologies which started to put people in a panopticon (p. 136; p. 158). Commercial applications of FRT, like that in Facebook, work to normalize FRT and make it appear less threatening. Automated facial expression analysis (AFEA) software often overlooks social dimensions (p. 155; p. 162). For instance, "Humans interpret one another's faces in highly imperfect ways based on social conventions that vary culturally and historically" (p. 164). There are also difficulties in coding expressions; most often they are coded with sign-based and judgment-based approaches (p. 168). Overall, Gates concludes, "Any system for representing the face tells us something about the society and historical moment that produces it" (p. 193), and claims about the universality of facial expressions are problematic. Additionally, neoliberal governments equate free government with free markets (p. 196), and there is a push to control more from afar; FRT helps perpetuate the argument that more advanced surveillance is the key to a safer society (p. 195).

Gillespie, T. (n.d.). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies. Retrieved from <http://www.tarletongillespie.org/essays/Gillespie - The Relevance of Algorithms.pdf>.

Gillespie talks about the importance of algorithms. Computers are essentially algorithmic machines, and human knowledge is increasingly being subjected to algorithmic logic. Public relevance algorithms produce and certify knowledge, and they have six dimensions of political relevance: 1) patterns of inclusion; 2) cycles of anticipation; 3) evaluation of relevance; 4) promise of objectivity; 5) entanglement with practice; and 6) production of calculated publics. Algorithms are not neutral, however, and there are "warm human and institutional choices that lie behind these cold mechanisms." Algorithms are both a scientific instrument and a communication technology like broadcasting and publishing. Algorithms are meaningless, though, without being paired with a database; the two are intertwined. The collection policies associated with databases and the practices involved are important to examine. The algorithm is "designed to be and prized" for being automatic and able to be "triggered without any regular human intervention or oversight." Algorithms are also designed to place things into categories, and "Categorization remains vitally important to database design and management." Because of this, the "sociological implications of the database has largely been overlooked." Returning to the six dimensions of political relevance: 1) databases are designed to include/exclude. Algorithms are created to recall only certain information and are therefore "invisibly exclusionary." [Cf. Burke's terministic screens.] Some also resist inclusion by choice, using code such as robots.txt, which prevents search engines from listing a page or site. Also, programs such as YouTube algorithmically demote content such as suggestive videos.
So, whether filtering occurs through newspapers or algorithms, there is some level of filter which establishes "standards of viable debate, legitimacy, and decorum." 2) There are cycles of anticipation, which mean "sites hope to anticipate the user at the moment the algorithm is called upon," which requires information about the user the algorithm has already obtained and knowledge of users "estimated to be statistically and demographically like them" (Stalder and Mayer's (2009) "second index"). This causes providers to "take advantage of the increasingly participatory ethos [see digital rhetoric] of the web," where users voluntarily, and often happily, give up information about themselves. The information that is gathered is, according to Solove (2004), a "digital dossier"; according to Cheney-Lippold (2011), an "algorithmic identity"; or, in Balka's (2011) terms, "shadow bodies." In order to better predict the future, providers also try to understand users' habits through human-computer interaction (HCI) studies. The data in databases can be used by the "providers who amass this data, third party industries who gather and purchase user data as a commodity for them, and those who traffic in user data for other reasons (that is credit card companies)," and these stakeholders often have stronger voices because of this informational power but face debate about consumer rights and safeguards. 3) Evaluation of relevance means that algorithms filter through available information to determine what is most relevant for the current criteria. Search algorithms that used to be based on the frequency of search terms appearing in indexed web pages now use "contextual information about the sites and their hosts, consider how often the site is linked to by others and in what way, and enlist natural language processing techniques to better 'understand' both the query and the resources [they] might return in response." Google supposedly looks at 200 signals for each query.
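The older keyword-frequency approach that Gillespie contrasts with today's multi-signal ranking can be sketched minimally (a toy illustration, not any search engine's actual code; the pages and query are invented):

```python
# Toy sketch of frequency-based relevance ranking: score each indexed
# page by how often the query terms appear in it. Modern engines layer
# hundreds of additional signals (links, context, user behavior) on top.
def rank_by_term_frequency(query, pages):
    terms = query.lower().split()

    def score(text):
        words = text.lower().split()
        return sum(words.count(t) for t in terms)

    # Pages with the highest term counts come first
    return sorted(pages, key=lambda p: score(pages[p]), reverse=True)

pages = {
    "a": "surveillance cameras and surveillance databases",
    "b": "a history of cameras",
    "c": "gardening tips",
}
print(rank_by_term_frequency("surveillance cameras", pages))  # ['a', 'b', 'c']
```

Even this toy version makes Gillespie's point visible: the ranking embeds choices (how to tokenize, how to weight repeated terms) that are invisible to the user.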
Relevancy is "a fluid and loaded judgment," though, and open to interpretation: "there is no independent metric for what actually are the most relevant search results for any given query, engineers must decide what results look 'right' and tweak their algorithm to attain that result, or make changes based on evidence from their users, treating quick clicks and no follow-up searches as an approximation, not of relevance exactly, but of satisfaction." There really are no unbiased algorithms. There are three ways to see bias: 1) through criteria (an examination of the actual criteria an algorithm uses, although this is often hidden for competitive purposes); 2) through commercial aims (the economic and cultural contexts and their embedded aims, i.e., targeted ads and the intertwined nature of social and commercial space); and 3) through epistemological premises (a "question of whether the philosophical presumptions about relevant knowledge on which the algorithm is founded matters," i.e., there are tendencies to privilege information that is already popular, English-speaking sites, and commercial information providers). Overall, most algorithms are treated as unproblematic. However, one algorithm is often built on many others, and each assumption along the way inscribes pieces of meaning onto the end result [Latour's black-box and inscription]. 4) Promise of objectivity: many think algorithms are "stabilizers of trust, practical and symbolic assurances that their evaluations are fair and accurate, free from subjectivity, error, or attempted influence." But "no information service can be completely hands-off," and algorithms are designed for logonomic control, which adapts to the practices of users and is influenced by social life, material design, and economic obligations. Algorithms must be validated both discursively and technically: the algorithm has to technically produce the results and discursively be respected and trusted in society.
The perceived impartiality is one of the main carriers of trust: "the performance of algorithmic objectivity has become fundamental to the maintenance of the tools as legitimate brokers of relevant knowledge." According to Morozov (2011), Google asserts its objectivity, but its seeming "algorithmic neutrality" betrays its function as "the world's most important information gatekeeper." Objectivity is discursive, though, and "while the algorithm itself may seem to possess an aura of technological neutrality, or to embody populist, meritocratic ideals, how it comes to appear that way depends not just on its design but also on the mundane realities of news cycles, press releases, tech blogs, fan discussion, user rebellion, and the machinations of their competitors." There is commercial value in appearing to be objective. 5) Entanglement with practice: "If users fail or refuse to fit that tool into their practices, to make it meaningful, that algorithm will fail." Users believe in the relevancy of an algorithm and adapt their behaviors to work with it. For instance, Flickr users might adapt their behavior to create pictures that are more algorithmically favored; hashtags are used to be more visible; teens on Facebook may include unrelated brand names in status updates in order to become more privileged in feeds. These practices are described through three ideas: 1) backstage access (who is aware of the algorithmic criteria: corporations, advertisers, third-party developers?); 2) domestication (users make algorithms their own after they go public, i.e., iPhone users swapping ways to make Siri better); and 3) knowledge logic (studying how the changes made by the code habituate users and shift their worldviews to accommodate the logic and presumptions of the algorithms).
6) Production of calculated publics: this is the ability to call an audience into existence (i.e., constitutive rhetoric), to assemble and structure a public, and the emergence of that public, based on digital technology. Algorithms put us in filter bubbles where we are members of communities that already think like us. Examples are Google searches, Amazon's recommended suggestions, Facebook audiences of friends, friends of friends, etc. These publics may not exist unless called into being by the algorithm. Algorithms can also define a nation [Anderson] by aggregating what "Americans" like and reveal larger patterns that people did not know existed [i.e., digital humanities?]. They may also be inappropriate: a Grindr advertisement suggested a related app would be a sex offender finder. Overall, since divisions of labor are often necessary, algorithms can help handle information. However, they should be seen "not just [as] codes with consequences, but as the latest, socially constructed and institutionally managed mechanism for assuring public acumen: a new knowledge logic." It is an editorial logic which "depends on the proceduralized choices of a machine, designed by human operators to automate some proxy of human judgment or unearth patterns across collected social traces." Social processes make these algorithms legitimate; they are designed to work automatically, their code is deliberately unpublished, and they work at an almost unimaginable informational scale.
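The robots.txt opt-out Gillespie mentions, through which site owners resist inclusion in search indexes, can be illustrated with Python's standard-library parser (the rules and URLs here are invented for illustration):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt asking all crawlers to skip /private/
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved search engine consults these rules before indexing a page
print(parser.can_fetch("*", "http://example.com/private/diary.html"))  # False
print(parser.can_fetch("*", "http://example.com/public/index.html"))   # True
```

Note that the protocol is purely advisory: exclusion depends on crawlers choosing to honor it, which underscores Gillespie's point that inclusion and exclusion are governed by institutional choices rather than by the technology alone.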

Graham, S. & Wood, D. (2003). Digitizing surveillance: Categorization, space, inequality. Critical Social Policy 23(2), 227-48.

Religion and cultural norms once enforced surveillance, but bureaucracy, management, and policing have replaced them in the capitalist, modern state. Graham and Wood look at 1) the nature of surveillance. The authors state, "For Gary Marx (1988), this 'new surveillance' is characterized by 'the use of technical means to extract or create personal data…taken from individuals or contexts' (Marx, 2002: 12)" (p. 228); Jones (2001) calls this digital rule. This is significant because 1) it "enables monitoring, prioritization and judgment to occur across widening geographical distances and with little time delay" (p. 228), and 2) "It allows the active sorting, identification, prioritization and tracking of bodies, behaviours and characteristics of subject populations on a continuous, real-time basis" (p. 228). It is thus a shift towards the automatic and a shift in "the power, intensity and scope of surveillance" (p. 228). According to Graham and Wood, "A characteristic of digital surveillance technologies is their extreme flexibility and ambivalence" (p. 229). They are contradictory because they are designed both to exclude based on automated judgment and to help "overcome social barriers and processes of marginalization" (p. 229). Digital encoding compresses information to a minimum baseline of binary code (whereas analogue encoding seeks an accurate representation). Panopticism is defined as "the tendency towards a disciplinary state based on direct surveillance" (p. 230).

Humphreys, L. (2011). Who’s watching whom? A study of interactive technology and surveillance. Journal of Communication 61, 575–595.

Privacy and surveillance are treated as opposites. Humphreys uses Westin (2003) to define privacy as "the ability to control what information about oneself is available to others" (p. 576). He uses Lyon (2001) to define surveillance as "any collecting or processing of personal data, whether identifiable or not, for the purposes of influencing or managing those whose data have been gathered" (p. 2) (p. 576). Humphreys brings out that the power imbalance is important to understanding surveillance, and "Asymmetry is an important differentiating factor between monitoring or watching and surveillance (Andrejevic, 2006)" (p. 576). Poster (1990) and Gandy (1989) bring out that technology is incorporated into surveillance, and "Gandy suggests that information technology and the growth of databases create an asymmetrical monitoring of behavior. Drawing on Bentham's concept of the panopticon (Foucault, 1977), Gandy (1993) demonstrates how information technology facilitates the surveillance by an unseen corporate and bureaucratic observer who can not only commodify the personal information of those observed, but also use such information to inform practices of social control and discrimination. Such information technology 'involves the collection, processing, and sharing of information about individuals and groups that is generated through their daily lives as citizens, employees, and consumers and is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy' (Gandy, 1993, p. 3)" (p. 576). The author concludes that "it allows those who control the databases (often corporate or government entities) to know more about individuals than they may know about themselves…[and] [t]his information can also be commoditized and potentially used for discriminatory activities" (p. 577).
Information is often given freely, though, and "people willingly share their personal information because they derive some sort of benefit from these interactive, information-based services" (p. 577). Humphreys describes three kinds of surveillance: voluntary panopticon, lateral surveillance, and self-surveillance. "Voluntary panopticon refers to the voluntary submission to corporate surveillance," or what Whitaker (1999) calls the "participatory panopticon" (p. 577). This type of surveillance is voluntary and is part of a consumer society, and information is often volunteered for consumption purposes. "Lateral surveillance is the asymmetrical, nontransparent monitoring of citizens by one another (Andrejevic, 2006)" (p. 577). This involves the Internet and participatory culture; people can watch others without them even knowing. The final type of surveillance discussed is self-surveillance: "Meyrowitz (2007) defines self-surveillance as 'the ways in which people record themselves (or invite others to do so) for potential replaying in other times and places' (p. 1)" (p. 577). For instance, people record themselves with camera phones. Humphreys goes through a case study of the app Dodgeball through these three lenses. For the voluntary panopticon, he quotes Andrejevic (2007) saying that using interactive media is an "invitation to participate in one's own manipulation by providing increasingly detailed information about personal preferences, activities and background to those who would use the knowledge to manage consumption" (p. 242) (pp. 583-4). For lateral surveillance and Dodgeball, "network members monitored the communication and behavior of other network members (i.e., their friends)" (p. 585), and Andrejevic states, "Interpersonal interaction always contains an element of mutual monitoring, but the deployment of interactive networked communication technology allows individuals to avail themselves of the forms of asymmetrical, nontransparent information gathering modeled by commercial and state surveillance practices" (Andrejevic, 2006, p. 398) (p. 585). Dodgeball monitoring could become asymmetrical "(a) because not all people broadcasted personal information at the same rate and (b) because users did not always know when or if people access the information they broadcast over Dodgeball" (p. 587). For self-surveillance, Dodgeball let users record behavior to use at another time (p. 587). Humphreys concludes that "Consistent with privacy research (Gandy, 1989; Stone, Gueutal, Gardner, & McClure, 1983), as long as people felt they were in control of their personal information they were unconcerned about their privacy" (p. 590).

Kim, M. (2004). Surveillance technology, privacy and social control. International Sociology 19(2), 193-213.

Kim looks at control moving from reactive measures to technologies that are supposedly proactive (p. 193). Kim goes through the history of privacy (p. 194) and ultimately addresses "the dynamic relationship between surveillance technology and social control" (p. 194).

Jansson, A., & Christensen, M. (Eds.). (2014). Media, surveillance and identity: Social perspectives. New York: Peter Lang.

This book is a collection of three sections: perceptions, practices, and politics. The book grounds itself in social perspectives because 1) social processes influence surveillance practices; 2) surveillance promises to help keep societies safe, which produces sustainable identities for the self and society; and 3) power is negotiated between different groups in society (pp. 5-7). The first section, "Perceptions," addresses "how social actors regard, copy and comply with surveillance" (p. 7) and starts with Christensen's essay about complicit surveillance and visible media geographies (p. 15). This chapter draws on fieldwork in Sweden and shows how "an increased dependence on the media, commodification of communication, and a geopolitics of fear" places us not merely in a space monitored by surveillance; "we are situated in it" (p. 8); the places where we are matter. The second chapter is Zurawski's chapter on consumer culture and surveillance (p. 32). He asserts consumer surveillance is the most prevalent form of surveillance, and he argues the panopticon or surveillant assemblage are not as useful for consumer surveillance due to the interdependence of consumption, control, and social formation; we often willingly give up information. [Although we often theorize surveillance as a more powerless relationship; see Haggerty and Samatas's idea of surveillance/accountability for locations of power.] The third chapter is Allmer, Fuchs, Kreilinger, and Sevignani's discussion of social networking sites and surveillance (p. 49). This chapter looks at targeted advertisements in social networks and calls for more protections against corporate surveillance. Its three main theoretical foundations are privacy, surveillance, and digital labor. The chapter takes a socialist approach to privacy and asserts that people have the right to know what is happening to their data. The chapter ends with practical ways to subvert surveillance.
The fourth chapter is Lyon's discussion of an emerging surveillance culture (p. 71), and it shows that surveillance is not just top-down watching; it has become "ordinary, natural and normal" (p. 9), an everyday experience, and people are often compliant (or complaining) "data subjects." Two factors encourage and normalize surveillance: the fear factor and the fun factor. Fear motivates people to securitize everything due to perceived threats, and fun motivates people to participate in social media (see Albrechtslund's participatory surveillance and Andrejevic's lateral surveillance) or be involved with reality programs (p. 75). Technology ultimately hides many of the mechanics of surveillance (see ubiquitous computing or ambient intelligence). The second section, "Practices," addresses how surveillance practices intersect with social practices, and it starts with Andrejevic's chapter about debt in the digital economy (p. 91), beginning with the idea that TV advertisements were "payment" for "free" TV programs. Debt is a way we can be surveilled. He discusses how we accept and agree to free services on the internet in exchange for targeted ads and surveillance, and he calls for more analysis of the power relations structuring these agreements (p. 9). The sixth chapter is Humphreys's chapter on mobile social networks from the user's perspective (p. 109). It explores the perspectives of mobile network users (in this case, Dodgeball users), who believe they have more control over their information when really they are subject to three forms of surveillance: voluntary panopticon, lateral surveillance, and self-surveillance (p. 9). The seventh chapter, by Germann Molz, is about collaboration, technologies, and the sharing economy (defined as "the social and economic phenomenon of technologically mediated peer-to-peer exchanges") (p. 129).
She looks at how online reputation spaces (such as airbnb.com) play into a larger "'surveillancization' of social relations" (p. 10) and how surveillance is normalized (p. 139). The eighth chapter is about social and material approaches to interveillance and transmedia technologies (p. 145). This chapter looks at Swedish examples, concludes that "the managing of private information and the judgment of other people's online behavior operate through taken-for-granted registers of 'common sense,'" and calls for more "socio-materially embedded, negotiated" process evaluations of media (p. 10). It also discusses the morality associated with surveillance. The final section, "Politics," "presents contextualized analyses of how monitoring practices sometimes resonate with dominant discourses of civic responsibilities, while at the same time having the potential to feed into more expressive forms of (identity) politics" (p. 8). It starts with Barnard-Wills' chapter on online privacy politics (p. 165). This chapter examines the politics of online privacy through hegemony and identity (p. 10); discourses often position our identities, so online, we have to be aware of the positioning power of hegemonic discourse. The tenth chapter is Makinen and Koskela's take on surveillance as a reality game (p. 183). They look at how uses of surveillance change and how people resist these uses and changes, often in a playful manner (p. 10), and examine how surveillance has been explained through metaphor, especially games like hide-and-seek or cat-and-mouse (p. 197). The eleventh chapter is Jacobs's discussion of sexualized bodies and the Chinese internet (p. 201). This chapter shows how some use their bodies in a sexual way online to counter the strict rules set out by the state; it links the gaze of surveillance and identity with porn culture and sex.
The twelfth and final chapter is Burkart and Andersson Schwarz's discussion of post-privacy ideology (p. 218). Post-privacy "advocates the abandonment of privacy activism and personal privacy hygiene based on the conviction that digital privacy is both untenable and socially unrewarded" (p. 11). The authors try to create a manageable way to approach privacy in a post-privacy culture.

Jenkins, H. (2006). Convergence culture: Where old and new media collide. New York: New York University Press.

This book talks about convergence culture, where content flows across multiple media platforms (p. 2), and participatory culture, where people take media into their own hands (p. 17) and users are not just consumers of media but active participants (p. 3). The first chapter discusses knowledge communities and a Survivor spoiler group. Collective intelligence, as opposed to Peter Walsh's expert paradigm, assumes that each person in the group has something to contribute, rather than only the experts having knowledge (p. 53). Knowledge communities are the opposite of expert-driven groups and have "voluntary, temporary, and tactical affiliations" (p. 57). They are bottom-up and collaborative, and they evaluate information collectively (p. 58).

Lessig, L. (2008). Remix. New York: Penguin Press.

Lessig tackles interactive technology and participatory culture. Lessig provides a history of creators being influenced by the work of others and a history of the copyright regime that currently restricts society. Lessig describes two types of culture: read/write (RW), meaning someone can both read a file and make changes to it, and read/only (RO), meaning a user can only read a file (p. 28). While copyright tries to hold on to possession, technology allows for rewriting. People take the freedom to quote others for granted when writing (p. 53), but doing the same with music and film is treated as inappropriate (p. 55). Remix is the act of taking something already made and reworking it in new ways. RO cultures profess to be the experts. RW culture's future depends in some ways on new legislation (p. 108). Economies can be described as either sharing or commercial economies. Parts of the internet are caught up in commerce (i.e., Amazon), and many times the internet is part of a sharing economy where people contribute to larger projects for the good of others (i.e., Wikipedia) (p. 157). There are also hybrid economies. A hybrid economy is "either a commercial entity that aims to leverage value from a sharing economy, or it is a sharing economy that builds a commercial entity to better support its sharing aims" (p. 178). The hybrid spaces on the internet can be explained as 1) community spaces (p. 186), 2) collaboration spaces (p. 196), and 3) community (p. 213). Hybrid economies can help decriminalize youth because money-making entities can share their content with users, who can produce something that allows them to feel creative but can be used by corporations for capitalization (p. 249). Reforming laws can make this happen by clearing the title (p. 260) and decriminalizing the copy (p. 266) and file sharing (p. 271). Overall, though, changing the way we think about remix is also needed (p. 274); changing laws requires a group of people that understand why the laws need to be changed (not people who see remix as pirating).

Lyon, D. (2009). Identifying citizens: ID cards as surveillance. Malden: Polity.

This book examines the idea of creating national identity cards. Lyon questions the system of a national ID and the state’s authority to say you are who the state says you are (p. 11), since we are who we are in relationship to others (p. 13). Our identities are then used to include or exclude us from experiences (p. 17). The first chapter discusses demanding documents, giving a history of the ID card from colonialism and crime identification to war. The second chapter discusses sorting systems and how the database becomes a central player in sorting the public by their IDs to either include or exclude them (p. 41). The third chapter moves beyond the idea of “state” control to what Lyon terms the card cartel: the intersection of government and corporate desires to work together to create a system (cards, databases, algorithms) of identification. The fourth chapter is about stretched screens, which involves the larger breadth of data and screening. Lyon mentions a border example where “Processes are handled largely in databases using search engines and sometimes data-mining techniques made visible to officials on computer screens” (p. 85). This also involves the idea of liquidity and the traversing of time and space, as people are abstracted from their data (p. 97). Also, papers are not requested only at borders; the border travels, and papers are needed in many places (p. 90). Interoperability (defined as “the means whereby diverse systems and organizations work together” (p. 98)) has also been called for, especially since 9/11; there are claims that better-linked databases can stop problems before they start (p. 97). The fifth chapter, body badges, deals with biometrics and the databases through which biometrics are recorded and accessed. Lyon ends by saying that we are now governing by identity, and the consequences of biometrics are not always fair (p. 129).
The final chapter functions as a conclusion by focusing on cyber-citizens and how identity becomes digitized and socially sorted. It also discusses how freedom is often equated with consumer choice (p. 140), and our goals become those of consumers. Lyon concludes that a new national ID would imply six things: 1) ID is remote (p. 143); 2) ID systems are interoperable (p. 144); 3) ID systems are categorical (p. 144); 4) these categories can be conflated easily (p. 145); 5) “New ID systems depend on bodily and behavioral traits” (p. 145); and 6) “new ID systems are exclusionary,” which means that although identity management seems neutral, this is not true and often the “Other” is singled out (p. 147). IDs may also “be considered as a stand-in for the kinds of political identities that are important in the twenty-first century” (p. 152). Ultimately, Lyon raises questions about national IDs; the ability to construct someone’s identity from biometric, consumer, or other information; the relationship between government and corporations when creating identity schemes; the databases that hold all this information; and the ability to look it up anytime, anywhere.

Markham, A. & Buchanan, E. (2012). Ethical decision-making and Internet research: Recommendations from the AoIR Ethics Working Committee (Version 2.0). Retrieved from http://aoir.org/reports/ethics2.pdf

This is a technical document outlining ethics in online research. The first version was released in 2002, and the second came out in 2012. The document first defines important terms, the first being “Internet.” The authors state, “The term ‘Internet’ originally described a network of computers that made possible the decentralized transmission of information. Now, the term serves as an umbrella for innumerable technologies, devices, capacities, uses, and social spaces” (p. 3).

Mann, S., Nolan, J. & Wellman, B. (2003). Sousveillance: Inventing and using wearable computing devices for data collection in surveillance environments. Surveillance & Society 1(3), 331-55.

The authors discuss sousveillance in a surveillance society. They note that “the notion of a ‘surveillance society’ where every facet of our private life is monitored and recorded has sounded abstract, paranoid or far-fetched to some people” (p. 332). They explain that surveillance is “organizations observing people” (p. 332), while sousveillance is watching from below (p. 332), whereby people can gather information about their surveillance to neutralize it (p. 333). Mann (1998) developed the concept of reflectionism to describe the “philosophy and procedures of using technology to mirror and confront bureaucratic organizations” (p. 334). Sousveillance inverts the gaze (p. 336) and tries to neutralize surveillance by watching the watchers (p. 333). Inherent to surveillance is an imbalance/asymmetry of power (p. 334). Although ubiquitous surveillance seems new, it shares the logic of the panopticon: one was only ever under the possibility of surveillance and did not know whether one was being watched (p. 334). Knowledge of possible surveillance is enough to get people to obey authority (p. 335).

Morville, P. (2005). Ambient findability. Sebastopol: O'Reilly.

Morville’s book discusses finding things on the internet; ambient findability, in particular, concerns finding anything at any time. On the web, reading is not linear; people want to find whatever they need from wherever they are. Wayfinding tools are not new; Kevin Lynch (1960) described five elements of wayfinding in the modern city: paths (streets, railroads, and other regularly traveled routes); edges (walls, shores, and other places separating two areas); districts (sections of the city with a defining character, e.g., uptown); nodes (intersections, squares, corners, subway stations, or any place that serves as a reference or transition); and landmarks (buildings, mountains, signs, or any object that serves as a reference point) (p. 27). There are more than just physical spaces to navigate, though. Teilhard de Chardin popularized the idea of the noosphere, which connotes all the interacting minds on earth. While it can be imagined as a figure of speech or metaphor, the metaphor can be applied to the web: our interaction with the web shapes us, and we shape the web (p. 42). According to Mooers’ Law, information will be used in proportion to how easy it is to obtain (p. 44).

Rainie, H. & Wellman, B. (2012). Networked: The new social operating system. Cambridge: MIT Press.

Rainie and Wellman discuss the convergence, or triple revolution, of social networks, the internet, and mobile. These three factors combine to give people more access to online spaces at all times, facilitating their networks. Networks differ from groups because groups are more stable. According to the authors, “In networked societies, boundaries are more permeable, interactions are with diverse others, connections shift between multiple networks, and hierarchies tend to be flatter and more recursive” (p. 37). Advances in government, technology, bandwidth, radio spectrum, flat-rate internet providers, storage, and apps helped bolster the internet (pp. 62-4). Four cultures have shaped the internet: 1) techno-elites (designing the networks); 2) hackers (programmers upgrading the internet) and virtual communitarians (social forms, processes, and uses); 3) entrepreneurs (those making money on the future); and 4) participants (users who create and share material online) (p. 79). All these changes have altered people’s time and space constraints and flows (p. 99; p. 102; p. 108). Networks can consist of families and friends (p. 147) and work activities (p. 171). Network creators and participants are people who have taken advantage of participatory culture (p. 197). Information also needs to be networked (p. 223). This information is also subject to being watched (surveillance (p. 236), coveillance (p. 238), and sousveillance (p. 240)). Thriving as a networked individual involves being nice to others and welcoming new and varied technologies (p. 263); using technology to connect to a wider audience (p. 264); being active, varying your reliance on networks, reaching into new networks on occasion, and developing large and diverse networks (p. 265); independently cultivating networks (p. 266); monitoring your image (p. 267); segmenting your identity (p. 268); functioning effectively in different contexts (p. 269); being trusted (p. 270); managing boundaries (p. 271); and being aware of audiences (p. 272).
The book encourages seven forms of literacy: graphic literacy, navigation literacy, context and connections literacy, multitasking literacy, skepticism literacy, ethical literacy, and networking literacy (pp. 272-4). It ends by projecting what may happen in the future (p. 276).

Walker Rettberg, J. (2014). Seeing ourselves through technology: How we use selfies, blogs and wearable devices to see and shape ourselves. Retrieved from http://jilltxt.net/books/

According to Walker Rettberg, “This book explores the ways in which we represent ourselves today through digital technologies” (p. 2), and “strands of self-representation intertwine in digital media in three distinct modes: visual, written and quantitative” (p. 3). Being concerned with oneself isn’t new; people have historically represented themselves through writing (autobiographies, diaries, essays, and blogs) (pp. 3-7), portraiture (from monks’ small self-portrait doodles to painted portraits, photographs, and selfies) (pp. 7-9), and self-quantification. On self-quantification, she writes, “[i]f the mode of the diary is narrative, then the modes of quantitative self-representation are numbers, lists, maps and graphs” (p. 9). Historical examples exist, like Ben Franklin’s tracking of 13 virtues, and medieval literature goes back even farther (p. 10). Tracking once meant pen and pencil; now there are spreadsheets, GPS, and wearable devices (pp. 9-11). We can now use devices to see ourselves and record what we see (p. 11). She admonishes, “Social media is about communication with others, but we should be equally aware of how we use social media to reflect upon ourselves” (p. 12). Creating and sharing selfies are forms of self-reflection and self-creation (p. 12). Because of the earlier scarcity of images (owing to limits on the technology to take, upload, or share them, e.g., poor bandwidth), Walker Rettberg says, “We imagined that the Internet was disembodied, anonymous and virtual” (p. 12). While earlier sites on the Internet were about words and conversations, newer sites like Instagram and Pinterest are about images (p. 13). Not participating used to be looked down on (i.e., lurking); now posting photos is participating [Trottier shows that FB stalking isn’t really looked down on] (p. 13), and when we look at photos we are participating as the audience. G. Thomas Couser (2012) uses the word “autopathography” as a term for collections of personal communication about going through an illness (p. 14). The author provides an example of misreading the purpose of Twitter: “An example of the mismatch between seeing a stream of tweets as text rather than as self-expression can be seen in the frequent condemnation of people who tweet or blog or in other ways share stories of illness or hardship in social media” (p. 14).

Richards, N.M., & King, J.H. (2014). Big data ethics. Wake Forest Law Review, 49(2), 393-432.

The authors state, “The scale of the Big Data Revolution is such that all kinds of human activities and decisions are beginning to be influenced by big data predictions, including dating, shopping, medicine, education, voting, law enforcement, terrorism prevention, and cybersecurity” (p. 393). Big Data is changing society, and what we allow to be normalized now will set precedent for future decisions. In technical terms, Big Data (BD) has been defined as “data that exceeds the processing capacity of conventional database systems” or "[w]here the data volume, acquisition velocity, or data representation limits the ability to perform effective analysis using traditional relational approaches or requires the use of significant horizontal scaling for efficient processing” (p. 394). Richards and King prefer to focus on the social aspects of Big Data and state, “Mayer-Schönberger and Cukier define big data as referring "to things one can do at a large scale that cannot be done at a smaller one, to extract new insights or create new forms of value, in ways that change markets, organizations, the relationship between citizens and governments, and more” (p. 394). The authors are actually reluctant to use the term Big Data at all and think it should be called data analytics or data science. Nontransparent data collection allows corporations to benefit at the expense of identity (p. 395). Overall, the article argues that BD “is producing increased powers of institutional awareness and power that require the development of Big Data Ethics” (p. 395), ethics that would concern “privacy, confidentiality, transparency, identity, and free choice” (p. 395). Richards and King argue that privacy should not be thought of as just an individual responsibility; it should be built into institutions. The authors make four claims.
First, “Understanding privacy rules as merely the ability to keep information secret severely handicaps our ability to comprehend and shape our digital revolution. What has failed is not privacy but what Daniel Solove has termed "Privacy Self-Management," the idea that it is possible or desirable for every individual to monitor and manage a shifting collection of privacy settings of which they may only be dimly aware. We argue that "privacy" in today's information economy should be better understood as encompassing information rules that manage the appropriate flows of information in ethical ways” (pp. 395-6).

Trottier, D. (2012). Interpersonal surveillance on social media. Canadian Journal of Communication, 37, 319-32.

Trottier looks at social media sites and shows that social media “renders users visible to one another in a way that warrants a care of the virtual self” (p. 319), so that users are concerned about what they post and what other users post about them. Trottier uses Lyon’s (2001) understanding of surveillance as that which “refers to the covert, sustained, and targeted collection of information, often about an individual or group of individuals” (p. 320). It is also more than just collecting information; “it relies on mediated relations, profiling, and asymmetrical relations of visibility” (p. 320). Users of SNSs see surveilling each other as a violation, but it has also been normalized (Murakami Wood & Webster, 2009). Brighenti (2010) uses the term intervisibility, as users are visible to each other and the watching is mutual. However, there are also breaks of unintended exposure, as “[o]ne function ‘creeps’ into another, and information ‘leaks’ to new contexts (Lyon, 2001)” (p. 320).

Trottier, D. (2012). Social media as surveillance: Rethinking visibility in a converging world. Abingdon: Ashgate Publishing Ltd.

Trottier considers social media sites a dwelling, since many people live their lives on them (p. 1). The book opens with a history of social media. He states, “Social media is best understood as a series of practices surrounding the authoring of personal information, creation of interpersonal networks and the development of coordinated activities… A sociological perspective highlights four dilemmas brought on by the rapid adoption of social networking technology. These dilemmas correspond to four distinct phenomena: individual usage, institutions that attempt to manage these individuals, marketers that are seeking new ways to harness (or ‘monetize’) personal information and police as well as other investigators who are turning to social media” (p. 7). Chapter 3 looks at students and Facebook and how their usage or “dwelling” shifted when more of the public joined or when they left school. Chapter 4 looks at institutions and professionals who are “overwhelmed by the amount of information and visibility brought on by Facebook” (p. 155). Chapter 5 discusses businesses’ ambivalence toward being involved with Facebook. Chapter 6 deals with policing, looking at officers using social media both for investigations and for social activities. Overall, Trottier concludes that social media has spread into other social contexts, and this spread has raised concerns about visibility (p. 155). The book explains Facebook through the lens of mutual augmentation and states that “this research considers four kinds of surveillance: individuals watching over one another, institutions watching over a key population, businesses watching over their market and investigators watching over populations” (p. 156). This grows as more eyes are on Facebook. Trottier continues, “All four practices are augmented by Facebook’s exponential growth. The social media service is rapidly approaching one billion users. Thus, more users are joining the site to watch over peers, customers, markets and brands. With every additional set of eyes affixed to Facebook, any content already on the site has a larger audience. Moreover, that increased audience is situated in a greater variety of social contexts, starting with Facebook’s growth out of the postsecondary sector. In addition these users all augment each other’s visibility by uploading content that implicates each other” (p. 157). According to Trottier (2012), five key features of social media highlight a shift in the collection of personal information on the Internet and illustrate the growing liquidity (Lyon, 2010) of surveillance: 1) collaborative identity construction (p. 158); 2) lateral ties provide unique surveillance opportunities (p. 161); 3) social ties are a kind of content (p. 162); 4) interfaces and their contents are always changing (p. 163); and 5) social media content is easily re-contextualized (p. 165). Each of these elements comments on understandings of privacy. Often, people see privacy as a private/public binary (p. 167). In another perspective, “legal scholars like Nissenbaum propose a contextual (2009) understanding of privacy…. Multi-contextual services need to develop privacy settings that are robust enough to maintain contextual boundaries… While respondents value privacy, they compromise their own because of competing or conflicting values. They may choose to expose private information for the sake of achieving publicity” (pp. 167-8). Privacy for users is often framed as an individual responsibility (p. 170).