Editorial: #5 Spectres of Artificial Intelligence

Research output: Journal contributions › Other (editorial matter etc.) › Research


Artificial intelligence (AI) is arguably the new spectre of digital cultures. By filtering information out of existing data, it determines the way we see the world and how the world sees us. Yet the vision algorithms have of our future is built on our past. What we teach these algorithms ultimately reflects back on us, and it is therefore no surprise when artificial intelligence starts to classify on the basis of race, class and gender. This odd ‘hauntology’ is at the core of what is currently discussed under the labels of algorithmic bias or pattern discrimination. By imposing identity on input data in order to filter, that is, to discriminate signals from noise, machine learning algorithms invoke a ghost story that works at two levels. First, it proposes that there is a reality that is not this one, and that is beyond our reach; to consider this reality can be unnerving. Second, the ghost story is about the horror of the past – its ambitions, materiality and promises – returning compulsively and taking on a present form because of something that went terribly wrong in the passage between one conception of reality and the next. The spectre does not exist, we claim, and yet here it is in our midst, creating fear and re-shaping our grip on reality.
Original language: English
Journal: spheres - Journal for Digital Cultures
Volume: 5
Pages (from-to): 1-4
Number of pages: 4
ISSN: 2363-8621
Publication status: Published - 01.11.2019