Title
Role of emotions in responsible military AI
Author
van Diggelen, J.
Metcalfe, J.S.
van den Bosch, K.
Neerincx, M.
Kerstholt, J.
Publication year
2023
Abstract
Following the rapid rise of military Artificial Intelligence (AI), commentators have warned that humankind's withdrawal from immediate war-related events would result in the “dehumanization” of war (Joerden, 2018). The prospect of machines deciding what is destroyed and what is spared, and when to take a human life, is considered deeply morally objectionable. After all, a machine is not a conscious being, has no emotions such as empathy, compassion, or remorse, and is not subject to military law (Sparrow, 2007). This argument has sparked the call for meaningful human control, which requires that moral decisions be made by humans, not machines (Amoroso & Tamburrini, 2020). The United States has proposed a similar principle, termed “appropriate levels of human judgment”. Likewise, the NATO principles of responsible use of AI in Defence (NATO, 2021) state that “AI applications will be developed and used with appropriate levels of judgment and care”.
Subject
Emotions
Military
Artificial Intelligence
AI
Humans
Role
To reference this document use:
http://resolver.tudelft.nl/uuid:da2a158e-cc90-45f2-bfba-e66812a420ae
DOI
https://doi.org/10.1007/s10676-023-09695-w
TNO identifier
982704
ISSN
1388-1957
Source
Ethics and Information Technology, 25 (25)
Document type
article