Editorial: #5 Spectres of Artificial Intelligence

Publication: Journal contributions › Other (forewords, editorials, etc.) › Research


Editorial: #5 Spectres of Artificial Intelligence. / spheres Editorial Collective ; Apprich, Clemens; Beverungen, Armin et al.
In: spheres - Journal for Digital Cultures, Vol. 5, 01.11.2019, pp. 1-4.



Bibtex

@article{9b7816a014a44ab9a299229e65e39151,
title = "Editorial: #5 Spectres of Artificial Intelligence",
abstract = "Artificial intelligence (AI) is arguably the new spectre of digital cultures. By filtering information out of existing data, it determines the way we see the world and how the world sees us. Yet the vision algorithms have of our future is built on our past. What we teach these algorithms ultimately reflects back on us and it is therefore no surprise when artificial intelligence starts to classify on the basis of race, class and gender. This odd {\textquoteleft}hauntology{\textquoteright} is at the core of what is currently discussed under the labels of algorithmic bias or pattern discrimination. By imposing identity on input data, in order to filter, that is to discriminate signals from noise, machine learning algorithms invoke a ghost story that works at two levels. First, it proposes that there is a reality that is not this one, and that is beyond our reach; to consider this reality can be unnerving. Second, the ghost story is about the horror of the past – its ambitions, materiality and promises – returning compulsively and taking on a present form because of something that went terribly wrong in the passage between one conception of reality and the next. The spectre does not exist, we claim, and yet here it is in our midst, creating fear, and re-shaping our grip on reality.",
keywords = "Digital media, Media and communication studies",
author = "{spheres Editorial Collective} and Clemens Apprich and Inga Luchs and Armin Beverungen and Carolin Wiedemann and Hana Yoosuf and Laura Hille and Ganesh, {Maya Indira} and Stina Lohm{\"u}ller",
year = "2019",
month = nov,
day = "1",
language = "English",
volume = "5",
pages = "1--4",
journal = "spheres - Journal for Digital Cultures",
issn = "2363-8621",
publisher = "Centre for Digital Cultures L{\"u}neburg",

}

RIS

TY - JOUR

T1 - Editorial

T2 - #5 Spectres of Artificial Intelligence

AU - spheres Editorial Collective

AU - Apprich, Clemens

AU - Luchs, Inga

AU - Beverungen, Armin

AU - Wiedemann, Carolin

AU - Yoosuf, Hana

AU - Hille, Laura

AU - Ganesh, Maya Indira

AU - Lohmüller, Stina

PY - 2019/11/1

Y1 - 2019/11/1

N2 - Artificial intelligence (AI) is arguably the new spectre of digital cultures. By filtering information out of existing data, it determines the way we see the world and how the world sees us. Yet the vision algorithms have of our future is built on our past. What we teach these algorithms ultimately reflects back on us and it is therefore no surprise when artificial intelligence starts to classify on the basis of race, class and gender. This odd ‘hauntology’ is at the core of what is currently discussed under the labels of algorithmic bias or pattern discrimination. By imposing identity on input data, in order to filter, that is to discriminate signals from noise, machine learning algorithms invoke a ghost story that works at two levels. First, it proposes that there is a reality that is not this one, and that is beyond our reach; to consider this reality can be unnerving. Second, the ghost story is about the horror of the past – its ambitions, materiality and promises – returning compulsively and taking on a present form because of something that went terribly wrong in the passage between one conception of reality and the next. The spectre does not exist, we claim, and yet here it is in our midst, creating fear, and re-shaping our grip on reality.

AB - Artificial intelligence (AI) is arguably the new spectre of digital cultures. By filtering information out of existing data, it determines the way we see the world and how the world sees us. Yet the vision algorithms have of our future is built on our past. What we teach these algorithms ultimately reflects back on us and it is therefore no surprise when artificial intelligence starts to classify on the basis of race, class and gender. This odd ‘hauntology’ is at the core of what is currently discussed under the labels of algorithmic bias or pattern discrimination. By imposing identity on input data, in order to filter, that is to discriminate signals from noise, machine learning algorithms invoke a ghost story that works at two levels. First, it proposes that there is a reality that is not this one, and that is beyond our reach; to consider this reality can be unnerving. Second, the ghost story is about the horror of the past – its ambitions, materiality and promises – returning compulsively and taking on a present form because of something that went terribly wrong in the passage between one conception of reality and the next. The spectre does not exist, we claim, and yet here it is in our midst, creating fear, and re-shaping our grip on reality.

KW - Digital media

KW - Media and communication studies

UR - https://spheres-journal.org/wp-content/uploads/spheres-5_Editorial.pdf

UR - https://spheres-journal.org/issue/5-spectres-of-ai/

M3 - Other (editorial matter etc.)

VL - 5

SP - 1

EP - 4

JO - spheres - Journal for Digital Cultures

JF - spheres - Journal for Digital Cultures

SN - 2363-8621

ER -
