Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyome

Instagram influencers of marginalized identities and subjectivities, for example those who are plus-sized or people of color, often express that their content is moderated more heavily and will sometimes place blame on “the algorithm” for their feelings of discrimination. Though biases online a...


Bibliographic Details
Main Author: Marissa Willcox
Format: Article
Language:English
Published: Frontiers Media S.A. 2025-01-01
Series:Frontiers in Communication
Subjects:
Online Access:https://www.frontiersin.org/articles/10.3389/fcomm.2024.1385869/full
_version_ 1841554669282787328
author Marissa Willcox
author_facet Marissa Willcox
author_sort Marissa Willcox
collection DOAJ
description Instagram influencers of marginalized identities and subjectivities, for example those who are plus-sized or people of color, often express that their content is moderated more heavily and will sometimes place blame on “the algorithm” for their feelings of discrimination. Though biases online are reflective of discrimination in society at large, these biases are co-constituted through algorithmic and human processes, and the entanglement of these processes in enacting discriminatory content removals should be taken seriously. These influencers, who are more likely to have their content removed, have to learn how to play “the algorithm game” to remain visible, creating a conflicting discussion around agentic flows that dictate not only their Instagram use but, more broadly, how creators might feel about their bodies in relation to societal standards of “acceptability.” In this paper, I present the #IWantToSeeNyome campaign as a case study that contextualizes some of the experiences of marginalized influencers who feel content moderation affects their attachments to their content. Through a lens of algorithmic agency, I think through the contrasting alignments between freedom of expression and normative representation of bodies in public space. The Instagram assemblage of content moderation presents a lens with which to view this issue and highlights the contrast between content making, user agency, and the ways more-than-human processes can affect human feelings about bodies and where they do and do not belong.
format Article
id doaj-art-e11f0fb957b54972b352e5050ad4d07b
institution Kabale University
issn 2297-900X
language English
publishDate 2025-01-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Communication
spelling doaj-art-e11f0fb957b54972b352e5050ad4d07b2025-01-08T12:17:07ZengFrontiers Media S.A.Frontiers in Communication2297-900X2025-01-01910.3389/fcomm.2024.13858691385869Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyomeMarissa WillcoxInstagram influencers of marginalized identities and subjectivities, for example those who are plus-sized or people of color, often express that their content is moderated more heavily and will sometimes place blame on “the algorithm” for their feelings of discrimination. Though biases online are reflective of discrimination in society at large, these biases are co-constituted through algorithmic and human processes, and the entanglement of these processes in enacting discriminatory content removals should be taken seriously. These influencers, who are more likely to have their content removed, have to learn how to play “the algorithm game” to remain visible, creating a conflicting discussion around agentic flows that dictate not only their Instagram use but, more broadly, how creators might feel about their bodies in relation to societal standards of “acceptability.” In this paper, I present the #IWantToSeeNyome campaign as a case study that contextualizes some of the experiences of marginalized influencers who feel content moderation affects their attachments to their content. Through a lens of algorithmic agency, I think through the contrasting alignments between freedom of expression and normative representation of bodies in public space. The Instagram assemblage of content moderation presents a lens with which to view this issue and highlights the contrast between content making, user agency, and the ways more-than-human processes can affect human feelings about bodies and where they do and do not belong.https://www.frontiersin.org/articles/10.3389/fcomm.2024.1385869/fullcontent moderationfeminismgenderalgorithmsagencyInstagram
spellingShingle Marissa Willcox
Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyome
Frontiers in Communication
content moderation
feminism
gender
algorithms
agency
Instagram
title Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyome
title_full Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyome
title_fullStr Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyome
title_full_unstemmed Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyome
title_short Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyome
title_sort algorithmic agency and fighting back against discriminatory instagram content moderation iwanttoseenyome
topic content moderation
feminism
gender
algorithms
agency
Instagram
url https://www.frontiersin.org/articles/10.3389/fcomm.2024.1385869/full
work_keys_str_mv AT marissawillcox algorithmicagencyandfightingbackagainstdiscriminatoryinstagramcontentmoderationiwanttoseenyome