In Weapons of Math Destruction, Cathy O'Neil discusses the philosophy behind algorithms and their pitfalls. The European Parliament's research service defines an algorithm as a procedure used to solve problems (Castelluccia & Le Metayer 2019). In her book, O'Neil analyzes algorithms through three lenses, opacity, damages, and scale, and discusses "algorithmic decision making," which she explains as electronically analyzing big data and having the machine or program use that analysis to make decisions (O'Neil 2016). Understanding how algorithmic decision making is used is important for determining whether an algorithm is biased, whether it was produced to cause harm, and ultimately whether it is the best it can be.
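To make this definition concrete, here is a minimal, purely hypothetical sketch in Python of a program that takes in data and returns a decision with no human judgment in the loop; the worker identifiers, ratings, and cutoff are invented for illustration and do not come from any of the sources discussed here.

```python
# A toy illustration of algorithmic decision making: the program analyzes
# data and makes the decision itself. All names and numbers are hypothetical.

worker_ratings = {"W1": 4.8, "W2": 3.9, "W3": 4.2}

def assign_extra_shift(ratings, cutoff=4.0):
    """Return the workers the rule would schedule for an extra shift."""
    return [worker for worker, rating in ratings.items() if rating >= cutoff]

print(assign_extra_shift(worker_ratings))  # ['W1', 'W3']
```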
Algorithms exert a clear influence within workplace surveillance; this influence has been dubbed "algorithmic management." According to Data & Society, algorithmic management is "a diverse set of technological tools and techniques that structure the conditions of work and remotely manage workforces" (Mateescu & Nguyen 2019). Algorithmic management systems share several primary features, including:
Digital surveillance and data collection of workers
"Real-time responsiveness to data that informs management decisions
AI or machine learning decision making
Performance evaluations shifted to metric or ratings systems
Penalties to incentivize workers' behaviors
Algorithmic management can be seen in real-life scenarios. According to Lohr, within the restaurant industry, "every waiter, every ticket, every dish and drink" are catalogued and analyzed for "patterns that might suggest employee theft" (Lohr 2014).
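The kind of pattern analysis Lohr describes can be pictured with a minimal sketch. The example below is hypothetical rather than the software Lohr reports on: it computes each server's rate of voided tickets and flags anyone far above the group average, the sort of metric-driven flagging listed above. All names, ticket data, and the threshold are invented.

```python
from statistics import mean, stdev

# Hypothetical ticket log: (server, ticket_total, was_voided).
# Invented data; this is not any vendor's actual theft-detection software.
tickets = [
    ("Ana", 42.50, False), ("Ana", 18.00, False), ("Ana", 27.25, True),
    ("Ben", 31.00, False), ("Ben", 55.75, False), ("Ben", 12.50, False),
    ("Cal", 22.00, True),  ("Cal", 40.00, True),  ("Cal", 19.75, True),
]

# Per-server void rate: the kind of metric an algorithmic manager might track.
counts = {}
for server, _total, voided in tickets:
    seen, voids = counts.get(server, (0, 0))
    counts[server] = (seen + 1, voids + int(voided))

void_rates = {s: voids / seen for s, (seen, voids) in counts.items()}

# Flag anyone more than one standard deviation above the mean void rate.
avg, spread = mean(void_rates.values()), stdev(void_rates.values())
for server, rate in void_rates.items():
    if rate > avg + spread:
        print(f"{server}: void rate {rate:.0%} flagged for review")
```

Real systems are far more elaborate, but the basic move, reducing workers to a metric and acting on the outliers, is the same.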
This website will discuss the algorithmic decision making of workplace surveillance through the framework of Cathy O'Neil's Weapons of Math Destruction. Discussion will cover the pervasiveness of workplace surveillance, its potential problems, and the reasoning, or lack thereof, behind it.
O'Neil discusses the concept of transparency, or opacity, in algorithmic decision making. Opacity concerns whether the people affected by an algorithm know that it is affecting them, how it works, and what its effects are. To O'Neil, an algorithm is "opaque" when the people it affects do not understand what the algorithmic decision making actually entails; it cannot be "seen through" and is therefore incomprehensible (O'Neil 2016).
In America, companies generally must let employees know that they are being monitored. Where it becomes more opaque is that they do not have to explain how or why the employees are being monitored (Solon 2017).
There are two main ways in which employers are allowed to breach the privacy of their employees through workplace surveillance: through false reasoning and confusion, and through deception and lack of consent.
The main reason given to employees for workplace surveillance is that it will increase worker productivity (Shell 2018). This reason, even if employers genuinely believe it, does not reflect reality.
As stated previously, companies have no legal obligation to inform their employees about how they are being monitored, only that they are. This allows monitoring software to be installed on employer-issued devices such as laptops and phones, which can then track personal messages in applications such as WhatsApp (Solon 2017).
With these issues of transparency, one must wonder what the actual intention of workplace surveillance is. While increasing productivity is not a lie, there is more to the story. According to Data & Society, surveillance is also often about control and order in the workplace. Even when its purpose is not explained, surveillance is often highly visible, signaling to workers that they are being watched and should act accordingly. Likewise, because of its increasing availability and ease, workplace surveillance is becoming "an acceptable practice for disciplinary and general, pro-efficiency purposes" (Rosenblat et al. 2014).
In the end, surveillance boils down to the fact that employers watch their employees because they can (Shell 2018).
In Weapons of Math Destruction, O'Neil discusses the concept of "damages" and its relation to weapons of math destruction. Damages are exactly what they sound like: the term refers to algorithms that actively harm their users or target population (O'Neil 2016).
Doctorow refers to the rise of online employee surveillance as "robo-Taylorism" (2019). Taylorism is a century-old workplace management philosophy that once prevailed but has since been critiqued and largely discarded. Frederick Taylor's central idea was "scientific management," a strategy in which a stopwatch was used to time workers and a worker's every movement was catalogued. Originally framed as a way to protect workers, scientific management soon became a tool for bosses to oppress them, and workers came to hate it. Experts like Peter Cappelli, director of the Center for Human Resources at the Wharton School of the University of Pennsylvania, suggest that this is how electronic surveillance is being used now (Lohr 2014).
There are a multitude of potential problems that come with workplace surveillance. These include a lack of privacy, decreased productivity, serving as a band-aid solution to deeper problems within the workforce, and unintended effects and consequences.
There are rising privacy concerns over the "specter of unchecked surveillance in the workplace" (Lohr 2014). There have been, and still are, instances in which employees' rights to privacy were violated. An employee at an undisclosed IT services company sent a private chat, an instant or direct message, to a friend at work, writing that he was worried he had potentially revealed his sexual identity to his manager and that this could affect his career. Wiretap, a piece of surveillance technology, detected the message and, without the man's consent, forwarded it to a senior executive at his company (Solon 2017). While the executive was able to defuse the situation, the incident still calls into question whether the surveillance was advantageous or simply a breach of the employee's privacy.
Jason Edward Harrington worked as a TSA agent at O'Hare International Airport in Chicago for six years. While he worked there, his higher-ups recorded his and his co-workers' every move. Although the cameras were put in place for the safety of passengers' belongings, Harrington claims that managers instead combed through the footage searching for the slightest infractions, eventually causing him to resign (Shell 2018). In Harrington's and the TSA's case, the surveillance was misused, a problem that raises privacy issues but also calls the larger culture of work into question.
Not only are people often recorded without knowing the true reason, there is also concern over who truly owns the resulting data. When surveillance is outsourced to third-party companies, the legal ownership of the footage can become murky. "Ownership issues can affect the power dynamics of work-related surveillance in ways that cut across all socio-economic classes" (Rosenblat et al. 2014).
Although claims of increased productivity are the driving force behind surveillance tech, there is evidence that these claims may be false. According to British anthropologists Michael Fischer and Sally Applin, workplace surveillance erodes workers' sense of agency. This in turn increases workplace stress, promotes worker alienation, and lowers job satisfaction (Shell 2018).
In his essay "In Praise of Electronically Monitoring Employees," Andrew McAfee, an MIT researcher, details a workplace surveillance experiment he conducted alongside colleagues. McAfee and his colleagues monitored waitstaff at 392 casual restaurants using theft-detection software. The installation of the software correlated with decreased employee theft and increased tips and revenue, which McAfee labeled a "win-win." What McAfee failed to take into account, though, is that the surveillance itself may not have caused these benefits; the gains may have come from the increased attention paid to workers and their environment. Anteby, a researcher, notes that "It's possible that almost any change—even changing the lighting—would have prompted a similar increase in productivity." It is also possible that observed employees felt pressured to push customers to order more, a practice that is not necessarily good for business in the long run, since few people enjoy feeling pressured to overconsume (Shell 2018).
Instead of addressing the actual problems causing a lack of productivity, McAfee and his team focused on monitoring the employees of these establishments. While this ultimately had a positive effect on productivity during the study, the effects on the workers' mental health and on their productivity in the years to come are unknown.
Workplace surveillance can often have unintended consequences. Whether these count as damages is subjective, but they can certainly create problems for the people doing the monitoring.
In Britain during the 1970s and 80s, at the time of the Yorkshire Ripper murders of sex workers, British police installed security cameras in areas with heavy sex-work traffic in the hope of deterring sex workers from entering those areas. Instead, the sex workers used the surveillance to their advantage, staying within sight of the cameras to make sure the cars they entered were recorded (Rosenblat et al. 2014).
Surveillance of this kind creates "rigid technological control over standardized work activities." This control can provoke workers into forms of resistance that would not arise if they were not being monitored (Rosenblat et al. 2014).
Scale, according to O'Neil, refers to the overall reach of an algorithm, meaning how many people the algorithm affects. Mathematical models with a large target base are at high risk of becoming weapons of math destruction: something that affects more people, more often, is more likely to become a weapon of math destruction than something more niche (O'Neil 2016).
Workplace surveillance originally consisted of little more than CCTV, phone monitoring, and email monitoring, but it has grown to include "keeping track of web-browsing patterns, text messages, screenshots, keystrokes, social media posts, private messaging apps like WhatsApp and even face-to-face interactions with co-workers" (Solon 2017).
Workplace surveillance is not a new phenomenon. What is new, though, is the use of algorithms and information technologies to monitor employees. Electronic surveillance of employees has been growing rapidly across all sectors of work, particularly in the United States (Shell 2018). While surveillance tech providers have often set their sights on the financial sector because of the nature of its work, "they are increasingly selling their tech to a broader range of companies to monitor staff productivity, data leaks and Human Resources violations" (Solon 2017).
The scale of workplace surveillance widens every day.
From the information above, one can see that workplace surveillance has a long way to go. With its lack of transparency, its privacy concerns, and its overarching hold on employees, workplace surveillance is a force to be reckoned with.
Instead of stricter workplace surveillance with no real regard for employees, a shift in workplace culture is necessary. I do not want to propose a solution in which workplace surveillance is merely balanced between security and privacy; I believe that would be a band-aid over the true problem. Distrust between employer and employee will not be solved through surveillance; it will only be amplified. Workplace culture must change both socially and structurally for the intended results of surveillance, increased productivity and worker safety, to actually materialize.
人民日报 [People's Daily]. (2019). AI In Manufacturing [GIF]. Retrieved from http://wap.art.ifeng.com/?app=system&controller=artmobile&action=content&contentid=3481511
Ball, K. S. (2001). Situating workplace surveillance: Ethics and computer-based performance monitoring. Ethics and Information Technology, 3. Retrieved from https://doi.org/10.1023/A:1012291900363
Blockchain [GIF]. Retrieved from https://medium.com/@Paralism/the-development-of-parallel-blockchain-ab0d43980c65
Castelluccia, C. & Le Metayer, D. (2019). Understanding algorithmic decision-making: Opportunities and challenges. European Parliamentary Research Service. Retrieved from https://www.europarl.europa.eu/RegData/etudes/STUD/2019/624261/EPRS_STU%282019%29624261_EN.pdf
Clawson, D., & Clawson, M. A. (2017). IT Is Watching: Workplace Surveillance and Worker Resistance.
New Labor Forum, 26(2). Retrieved from https://doi.org/10.1177/1095796017699811
Doctorow, C. (2019). From prisons to factories to offices: the spread of workplace surveillance and monitoring tech. Retrieved from https://boingboing.net/2019/03/06/robo-taylorism.html
Graham, C. (2014). Data Monitoring & Employee Privacy [Image]. Retrieved from https://technologyadvice.com/blog/human-resources/data-monitoring-and-employee-privacy/
Griffiths, T. [TEDx Talks]. (2017, August 1). The Computer Science of Human Decision Making | Tom Griffiths | TEDxSydney [Video File]. Retrieved from https://www.youtube.com/watch?v=lOhL-XUQPFE
Lohr, S. (2014). Unblinking Eyes Track Employees. Retrieved from https://www.nytimes.com/2014/06/22/technology/workplace-surveillance-sees-good-and-bad.html
Mateescu, M. & Nguyen, A. (2019). Algorithmic Management in the Workplace. Data & Society Research
Institute. Retrieved from https://datasociety.net/wp-content/uploads/2019/02/DS_Algorithmic_Management_Explainer.pdf
Mohammad, M. (2015). IT Surveillance and Social Implications in the Workplace. Proceedings of the 2015 ACM SIGMIS Conference on Computers and People Research, SIGMIS-CPR '15. Retrieved from https://doi.org/10.1145/2751957.2751959
Nandi, S. (2019). Micro-Management: The Benefits of Employee Monitoring [Image]. Retrieved from https://technofaq.org/posts/2019/01/micro-management-the-benefits-of-employee-monitoring/
O'Neil, C. (2016). Weapons of Math Destruction: How Big Data increases inequality and threatens democracy. Broadway Books.
Pico, P. [KPBS]. (2013, April 25). Privacy Rights vs. Employee Tracking [Video File]. Retrieved from https://www.youtube.com/watch?v=U-mqdz-20Xs
Rosenblat, A., Kneese, T., & boyd, d. (2014). Workplace Surveillance. Data & Society Research Institute. Retrieved from https://www.datasociety.net/pubs/fow/WorkplaceSurveillance.pdf
Schneier, B. (2014). NSA robots are 'collecting' your data, too, and they're getting away with it. The Guardian. Retrieved from https://www.theguardian.com/commentisfree/2014/feb/27/nsa-robots-algorithm-surveillance-bruce-schneier
Shell, E. R. (2018). The Employer-Surveillance State. Retrieved from https://www.theatlantic.com/business/archive/2018/10/employee-surveillance/568159/
Solon, O. (2017). Big Brother isn't just watching: workplace surveillance can track your every move. Retrieved from https://www.theguardian.com/world/2017/nov/06/workplace-surveillance-big-brother-technology
Watkins Allen, M., Coopman, S. J., Hart, J. L., & Walker, K. L. (2007). Workplace Surveillance and Managing Privacy Boundaries. Management Communication Quarterly, 21(2). Retrieved from https://doi.org/10.1177/0893318907306033
Young, K.C. (2018). Monitoring An Employee's Workplace Computer [Image]. Retrieved from https://www.kcyatlaw.ca/workplace-computer-privacy-monitoring/