Silencing in data science practices
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | SAGE Publishing, 2025-09-01 |
| Series: | Big Data & Society |
| Online Access: | https://doi.org/10.1177/20539517251365228 |
| Summary: | This article examines the relationship between data science practices and epistemic injustice, with a particular focus on the phenomenon of silencing. Our practice-oriented analysis of the data science pipeline – data collection, cleaning, model training and implementation – reveals a vicious cycle of silencing that perpetuates and amplifies existing biases. We demonstrate how initial biases in data collection can lead to the development of models that silence minority voices and how, once deployed, these models further marginalise these groups. Importantly, we argue that the relationship between data science and epistemic injustice is not inherently negative – data science methods can detect biases, mitigate injustices and translate critical reflections into specifications for inclusive systems. By bridging discussions in data science and the philosophy of epistemic injustice, this article contributes to the ongoing discourse on the ethical implications of big data and artificial intelligence, underscoring the importance of embedding epistemic justice considerations throughout the data science lifecycle. |
| ISSN: | 2053-9517 |