These past two weeks have been a challenge because I committed to identifying some of the positive outcomes of algorithms affecting the way we communicate. I honestly view the situation in a primarily negative light, so this endeavor was difficult. Even so, the consequences of algorithms have the potential to do good.
The media and its algorithms are capable of spreading joy and wholesome ideas, but the real question is how people's communication skills can grow for the better.
1. Algorithms promote unity and opportunities for users to form communities.
Algorithms are tailored to share content with a user that encourages them to connect with people who share their beliefs, interests, and so on. This can even go as far as alleviating depression and its outcomes. Interestingly, a recent study found "that the sense of shared identity, trust, informational support, and emotional support have positive effects on depression." Joshua Treadman, one of my interviewees, agreed, saying that he is able to relax and find comfort in some of his online communities.
2. The ability to have difficult conversations more openly.
As history has shown, society has often sheltered certain people and demographics from information that should arguably be shared. Examples include suppressing honest feelings and struggles, hiding scandals such as family members with disabilities or female relatives being raped, and outright ignoring problems like child labor and known illegal activity. Online, little stays secret; everyone is in everyone's business. People are now more inclined to speak out because information spreads widely. Some may argue this is a negative impact, but it can cut both ways depending on how often and in what quantities users consume that information.
3. Mindfulness is more common in face-to-face conversations.
After people face the consequences of their actions, they are more likely to change their behavior to avoid those consequences next time. The same goes for users interacting with algorithm-driven platforms. This is not always the case, but several of the people I had the privilege of interviewing described it this way. Through trial and error in their media usage, people come to understand the weight of their statements. Unfortunately, this is often how people learn best. They now think before they speak in real-world conversations, just as they would in a chatroom, a live stream, or other modes of virtual interaction.
I have found that many of the benefits of algorithms on real-time communication depend on whether the user interacts with them heavily or in moderation. Moderation typically leads to the positive influences we all seek. Even so, most people fall short of that ideal, which is why we experience the negative side effects more regularly. Algorithms can often be helpful, but we must keep in mind that they are also designed to keep users active, much like a trap. As long as people are aware of the dangers, they can access the positive influences and avoid the negative ones.
https://www.frontiersin.org/journals/public-health/articles/10.3389/fpubh.2020.581088/full
https://www.govinfo.gov/content/pkg/CHRG-118shrg58143/pdf/CHRG-118shrg58143.pdf
https://www.tandfonline.com/doi/abs/10.1080/714041708
Includes references from several people who were interviewed for the purposes of this study.
Joshua Treadman