Increasingly, sex educators have ventured online to educate and inform people about sex and sexuality. Yet their work is hampered by social media platforms that use algorithmic systems to remove sex-related content and accounts or reduce their visibility. Online sex educators therefore try to understand and master the algorithms that govern their work, as doing so may determine their continued existence on the platform. This thesis adopts the concept of the algorithmic imaginary, as outlined by Bucher (2017), to research how online sex educators experience and perceive algorithmic content regulation on Instagram. It sets a methodological precedent by using phenomenological in-depth interviews to study users’ experiences with algorithms. It finds that online sex educators have many encounters with algorithmic outputs as their content and accounts are shadowbanned and removed, which has fueled their algorithmic imaginary. This thesis adds to the research on the algorithmic imaginary. It reconfirms that strange or odd algorithmic outputs grasp users’ attention and lead them to evaluate the algorithms’ workings. Yet this research expands the idea of what can be considered a strange encounter: algorithmic content regulation itself can be a strange encounter when users, like sex educators, feel wrongfully categorized as harmful while their only intent is to help and educate people. These strange encounters are a powerful instigator for users to reconsider how algorithms are used to exert particular belief systems, and they make users believe that Instagram neither understands nor appreciates sex education. Additionally, this research provides new insights into the phenomenological study of algorithms. Bucher (2017) proposed the phenomenological approach as a new way of studying algorithms through the experiences of the users affected by them.
This research finds that it is difficult to speak of a unified experience when approaching algorithms phenomenologically, as algorithms produce a personalized output for every user. Yet it argues that the essence of the experience with algorithms is its dynamic and volatile character. The ever-changing and inconsistent nature of algorithms causes feelings of insecurity and frustration among online sex educators. Users can never pinpoint the exact nature of the algorithm, which makes it easy for platforms to undermine their claims about unfair and biased regulation practices. Herein resides the power of the algorithm, and with it the power imbalance between Instagram and its users. Lastly, this research provides a new understanding of how users perceive algorithms. Online sex educators do not view algorithms as neutral technical systems that function in isolation from their social climate; rather, they regard algorithms as another tool in the hands of powerful institutions that repress their efforts to empower people in their sexualities. There are many reports of sex workers, LGBTQIA+ people, BIPOC, fat people, disabled people, and others who experience similar forms of algorithmic content regulation on social media platforms. Unpacking these experiences therefore helps us better understand the circumstances these content creators face and the ways in which social media platforms influence public debate.

dr. Joao Fernando Ferreira Goncalves
hdl.handle.net/2105/71566
Digitalisation, Surveillance & Societies
Erasmus School of History, Culture and Communication

Van der Plaat, K. M. (2023, August). Instagram hates sex, the world hates sex. Digitalisation, Surveillance & Societies. Retrieved from http://hdl.handle.net/2105/71566