Technological bias, illusory impartiality, and the injustice of hermeneutical obstruction

Research Article

DOI: 10.1080/02580136.2024.2373610
Author(s): Monique Whitaker, Philosophy, University of KwaZulu-Natal, South Africa

Abstract

The algorithms and processes of modern technologies affect almost all aspects of our lives. No human being is making individual subjective choices in each, or any, instance of these processes—all cases are treated with perfect technological indifference. Hence, a commonsense assumption about the technologies we interact with, and the algorithms many of them implement, is that they are neutral and impartial. Of course, this is simply not the case. As has been extensively documented and discussed, bias exists in all facets of technology, from how and why it was first conceived and then developed to its inputs, processes and outputs. I examine instances of technological bias in visual representation and criminal justice technologies, and argue that they produce, among other epistemic concerns, a form of hermeneutical injustice by means of obstructing the deployment of the necessary and otherwise available hermeneutical resources needed to accurately understand these technologies and their effects. Such epistemic harm is inevitable, unless the relationship between our human biases and the technological processes produced by us is made explicit and significantly rethought.
