Social media platforms
Social media platforms are not neutral: their governance and structure shape how and what content is shared on them. Research has shown that a range of ‘platform mechanisms’ increasingly influences the way stories are told, information is circulated and connections are made online. Three aspects of how social media platforms work are especially important for health-focused content and conversations: privacy, moderation, and algorithmic recommendation.
It has long been known that social media use has led to changes in privacy practices. These changes tend to challenge existing societal norms about sharing and being seen, ultimately reshaping individuals’ expectations and practices of privacy. Yet little is known about how platforms’ privacy settings, and users’ awareness of them, influence which personal narratives (e.g., harrowing, heartening) are more likely to be posted online in relation to genetic conditions.
Moderation policies play a significant role in shaping user practices. Relying on human reviewers and automated software, social media companies moderate user-generated content to remove offensive and illegal material and to shape their environments to attract new users, advertisers and partners. However, moderation is often experienced at an intimate level. For instance, the women whose breastfeeding photos were removed from Facebook in 2007, or who fought in 2015 for their post-mastectomy selfies to be allowed on the platform, felt personally attacked by the platform’s policy on nudity and obscene content. In both cases, the controversy prompted those involved to join forces and protest.
While moderation determines what can be posted on a platform, algorithmic recommendation determines what can be seen on it; that is, it determines the relevance, and thus the visibility, of both content and users. Algorithmic recommendation systems follow data patterns, matching users and content on the basis of engagement metrics: the more a user (and users similar to them) shows interest in a type of content, the more they will be exposed to similar content. Recommendation systems operate backstage, making it difficult for users even to see how their content is being curated.
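The engagement-matching logic described above can be illustrated with a minimal sketch. This is a toy model, not any platform's actual system: the topic labels, post IDs, and the simple "count past engagements per topic" scoring rule are all hypothetical simplifications of the feedback loop in which more interest in a type of content leads to more exposure to it.

```python
from collections import Counter

def recommend(engagement_history, candidate_posts, k=2):
    """Rank candidate posts by how often the user engaged with each topic.

    engagement_history: list of topic labels the user has interacted with.
    candidate_posts: list of (post_id, topic) pairs available to show.
    Returns the k post IDs whose topics match the user's past engagement best.
    """
    topic_counts = Counter(engagement_history)
    # More past engagement with a topic => higher score => more exposure.
    # Counter returns 0 for unseen topics, so unfamiliar content sinks.
    ranked = sorted(candidate_posts,
                    key=lambda post: topic_counts[post[1]],
                    reverse=True)
    return [post_id for post_id, _topic in ranked[:k]]

# Hypothetical user who engaged twice with genetics content, once with recipes.
history = ["genetics", "genetics", "recipes"]
posts = [("a", "sports"), ("b", "genetics"), ("c", "recipes")]
print(recommend(history, posts))  # ['b', 'c']
```

Even this toy version shows why the curation is hard for users to see: the ranking happens entirely server-side, and the user is only shown the output list, never the scores or the history that produced it.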
Do you want to read more? You can start where we started.
On social media as platforms
Helmond, A., 2015. The platformization of the web: Making web data platform ready. Social Media + Society, 1(2), 2056305115603080.
Van Dijck, J., Poell, T. and De Waal, M., 2018. The platform society: Public values in a connective world. Oxford University Press.
On privacy
Draper, N.A. and Turow, J., 2019. The corporate cultivation of digital resignation. New media & society, 21(8), pp.1824-1839.
Hargittai, E. and Marwick, A., 2016. “What can I really do?” Explaining the privacy paradox with online apathy. International journal of communication, 10, p.21.
On moderation
Gillespie, T., 2018. Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
On algorithms
Bucher, T., 2017. The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), pp.30-44.
Gran, A.B., Booth, P. and Bucher, T., 2021. To be or not to be algorithm aware: a question of a new digital divide?. Information, Communication & Society, 24(12), pp.1779-1796.
Willson, M., 2017. Algorithms (and the) everyday. Information, Communication & Society, 20(1), pp.137-150.