Editorial: #5 Spectres of Artificial Intelligence
Research output: Journal contributions › Other (editorial matter etc.) › Research
In: spheres - Journal for Digital Cultures, Vol. 5, 01.11.2019, p. 1-4.
RIS
TY - JOUR
T1 - Editorial
T2 - #5 Spectres of Artificial Intelligence
AU - spheres Editorial Collective
AU - Apprich, Clemens
AU - Luchs, Inga
AU - Beverungen, Armin
AU - Wiedemann, Carolin
AU - Yoosuf, Hana
AU - Hille, Laura
AU - Ganesh, Maya Indira
AU - Lohmüller, Stina
PY - 2019/11/1
Y1 - 2019/11/1
N2 - Artificial intelligence (AI) is arguably the new spectre of digital cultures. By filtering information out of existing data, it determines the way we see the world and how the world sees us. Yet the vision algorithms have of our future is built on our past. What we teach these algorithms ultimately reflects back on us and it is therefore no surprise when artificial intelligence starts to classify on the basis of race, class and gender. This odd ‘hauntology’ is at the core of what is currently discussed under the labels of algorithmic bias or pattern discrimination. By imposing identity on input data, in order to filter, that is to discriminate signals from noise, machine learning algorithms invoke a ghost story that works at two levels. First, it proposes that there is a reality that is not this one, and that is beyond our reach; to consider this reality can be unnerving. Second, the ghost story is about the horror of the past – its ambitions, materiality and promises – returning compulsively and taking on a present form because of something that went terribly wrong in the passage between one conception of reality and the next. The spectre does not exist, we claim, and yet here it is in our midst, creating fear, and re-shaping our grip on reality.
AB - Artificial intelligence (AI) is arguably the new spectre of digital cultures. By filtering information out of existing data, it determines the way we see the world and how the world sees us. Yet the vision algorithms have of our future is built on our past. What we teach these algorithms ultimately reflects back on us and it is therefore no surprise when artificial intelligence starts to classify on the basis of race, class and gender. This odd ‘hauntology’ is at the core of what is currently discussed under the labels of algorithmic bias or pattern discrimination. By imposing identity on input data, in order to filter, that is to discriminate signals from noise, machine learning algorithms invoke a ghost story that works at two levels. First, it proposes that there is a reality that is not this one, and that is beyond our reach; to consider this reality can be unnerving. Second, the ghost story is about the horror of the past – its ambitions, materiality and promises – returning compulsively and taking on a present form because of something that went terribly wrong in the passage between one conception of reality and the next. The spectre does not exist, we claim, and yet here it is in our midst, creating fear, and re-shaping our grip on reality.
KW - Digital media
KW - Media and communication studies
UR - https://spheres-journal.org/wp-content/uploads/spheres-5_Editorial.pdf
UR - https://spheres-journal.org/issue/5-spectres-of-ai/
M3 - Other (editorial matter etc.)
VL - 5
SP - 1
EP - 4
JO - spheres - Journal for Digital Cultures
JF - spheres - Journal for Digital Cultures
SN - 2363-8621
ER -