Dark patterns, as described by Brignull (2011), are practices web and interaction designers use as they fight for our attention in online environments. These patterns, which are used by all major companies and advertisers, have led to "a system of unintended consequences that has gotten out of control" (Harris, 2017). These unfortunate consequences have come about because these companies are no longer simply programming; they are "growing intelligence that [they] don't truly understand" (Tufekci, 2017). The consequences for users include (but are not limited to) data hacking, privacy infringement, monetary losses and/or commitments, and unwanted location tracking.
"We might not like to admit it but deception is deeply entwined with life on this planet" (Brignull, 2011) and is found everywhere, including the web. Without knowledge of these dark patterns employed to persuade us, take advantage of us, and steal our data, we are unknowingly allowing our data and attention to be sold to the highest bidder (Tufekci, 2017).
The Belgian interaction design studio Bagaar created an online game called User Inyerface to represent these dark patterns. The game shows many of these practices in action, and although they are presented in a humorous, exaggerated way, these are the same tactics used "to control us and manipulate us in novel, hidden, subtle, unexpected ways" (Tufekci, 2017). My reflection on my experience playing this game follows below.
The first page you see upon entering the URL gives you a hint as to what exactly you are getting yourself into with this game: manipulation and frustration. It also tells you to fill out the form as fast as possible... why would there be a time limit? What purpose would that serve? In my opinion, we are all so used (and trained) to simply following directions and instructions, no questions asked, that we eventually forget our instinct to ask why. As Harris (2017) states, "there's a hidden goal driving the direction of all technology- the race for our attention". We need to think about the motives and intentions of companies and advertisers, all of whom are either trying to sell us something or trying to collect our data to sell to someone else. As Tufekci (2017) warns, "when a platform says its products or content are free, that simply means we are the product that's being sold"!
A common way this game manipulated my responses was through its use of trick questions: questions that "trick you into giving an answer you didn't intend" (Brignull, 2022).
One example was on the initial page, which asks you to check a box labelled "I do not accept the terms"; in reality, in order to move on, you must leave this box unchecked (which is tricky, as we are used to checking boxes to show our acceptance). This is also an example of a dark pattern using our Pavlovian response against us (The Nerdwriter, 2018).
Also, why does the game offer a choice of title and gender if it's not really going to let me choose as I see fit? It may as well just ask our gender and move on. This makes it a trick question, because there is only one correct answer.
Another example of a trick question is the one asking users to select three interests. The problem lies in the choices: they are inappropriate, unexpected, and irrelevant. Many aren't even interests, just objects (e.g., tires, windows, faucets, drywall), and some have more than one meaning (e.g., balls, dough, polo).
Misdirection played a huge part in this game's ability to manipulate my attention by "focus[ing my] attention on one thing in order to distract [my] attention from another" (Brignull, 2022).
One example from the game was the chat box that opened at the bottom right of the screen. This box got in the way as I was answering questions, but when I tried to minimize it, the only option was to raise it... Argh! The green button titled 'Send', the one that usually sends the message written in the chat, had a subtitle reading 'to bottom', which finally made me aware of how to get the chat box out of the way. This was a great distraction tactic: I was so caught up in getting rid of the chat that when the timer came up again, warning me that 'time is ticking', I got anxious and even more eager to rush through the questions.
Another misdirection tactic that was rampant in this game was to have different (incorrect) words underlined, highlighted, bolded, enlarged, or placed in boxes, making it very confusing to know which ones to choose. On the welcome page (shown in blue), I tried clicking the underlined, the highlighted, and the enlarged word before finally finding the correct one.
Colour was also used to misdirect my attention and manipulate my responses throughout the game. I was repeatedly drawn to the large green or blue buttons, attempting first to click on them, only to realize they were incorrect.
Another way the game succeeded in manipulating my responses was by "trick[ing me] into publicly sharing more information about [my]self than [I] really intended to" (Brignull, 2022). While playing this game, I was asked for a lot of data that I wasn't comfortable sharing, such as my interests, my picture, my birthday, my address, and my full name and gender. This represents a very real dilemma for me in online environments: should I provide my personal information to receive the free gift, lesson plan, or subscription? But at what cost? How will my info be used? The disturbing fact is that the data I share will most likely be collected by a data broker who will then sell it to companies and advertisers (Tufekci, 2017). Even more disturbing is the fact that "the industry is currently not well regulated and it is very difficult to opt out of having your data brokered" (Brignull, 2022). In other words, we may no longer have a choice!
What made this game most frustrating was having to decipher what was really being asked in the questions and requests. The game also manipulated my responses by making the instructions unclear, difficult to understand, and hard to follow. As Brignull (2022) points out, when dark patterns are utilized, "confusing language is often also used" to aid deception.
For example, the pictures it had us select to ensure we were human were confusing because they all met the criteria described in the question, meaning they all had to be selected. This was a problem because in my mind I was thinking it can't be ALL the pictures, so I second-guessed every single one! Another problem with these pictures was that the items we were to select were homonyms (e.g., light, bow, glasses) and therefore had more than one meaning. Increasing the confusion, pictures of all the different variations of each word appeared among the choices. It wasn't until my second try that I realized light also meant light in weight, meaning I should have included the feathers in my choices!
In one of the screenshots (pictured in blue), I show the password requirements for this game... they are very odd and unexpected (and I didn't even know what a Cyrillic character was)!
Another weird (and stressful) feature was the timer that kept popping up every so often, warning you to hurry up! It caused me a lot of anxiety, and I have to say it did work in motivating me to answer the questions more quickly.
The first step towards change is educating ourselves and our society to start seeing what's happening right in front of us, as "our best defence against the dark patterns is to be aware of them and shame the companies who utilize them" (The Nerdwriter, 2018). The Hall of Shame that Brignull (2022) shares on his site is a great start to this advocacy journey.
We cannot sit idly by and allow companies to "tak[e] our agency to spend our attention and live the lives we want" (Harris, 2017). We need to take back control of our lives, starting with the recognition and identification of the dark patterns and persuasion techniques being used on us daily. We cannot allow ourselves to be run by notifications and pop-ups that bring to mind things we weren't thinking of and didn't want to think about (Harris, 2017).
The next step is to demand transparency in the algorithms used by companies and advertisers, as "it's not too much to ask that that language be comprehensible and honest" (The Nerdwriter, 2018). Once honesty and transparency become legal requirements, we can then focus on "build[ing] A.I. that supports us in our human goals and is constrained by our human values" (Tufekci, 2017).
Those who disagree and choose to live in denial are simply driving home the fact that "sometimes the world's most pressing problems... are the ones right underneath our noses" (Harris, 2017). It is especially important for today's digital natives to be taught how to protect themselves, their data, and their privacy in online environments. We need to remember that in order "to reap the benefits from our prodigious technology, we must face its prodigious menace- open-eyed and now!" (Tufekci, 2017).
References:
Brignull, H. (2011, November 1). Dark patterns: Deception vs. honesty in UI design. A List Apart, 338. https://alistapart.com/article/dark-patterns-deception-vs-honesty-in-ui-design/
Brignull, H. (2022). Types of deceptive design. Deceptive Design. https://www.deceptive.design/types
Harris, T. (2017). How a handful of tech companies control billions of minds every day [Video]. TED. https://www.ted.com/talks/tristan_harris_the_manipulative_tricks_tech_companies_use_to_capture_your_attention?language=en
The Nerdwriter [Nerdwriter1]. (2018, March 28). How dark patterns trick you online [Video]. YouTube. https://www.youtube.com/watch?v=kxkrdLI6e6M
Tufekci, Z. (2017). We're building a dystopia just to make people click on ads [Video]. TED. https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click_on_ads?language=en