Title: The design and validation of an intuitive confidence measure
Authors: van der Waa, J.S.; van Diggelen, J.; Neerincx, M.A.
Contributors: Said, A. (editor); Komatsu, T. (editor)
Publication year: 2018
Abstract: Explainable AI becomes increasingly important as the use of intelligent systems becomes more widespread in high-risk domains. In these domains it is important that the user knows to what degree the system's decisions can be trusted. To facilitate this, we present the Intuitive Confidence Measure (ICM): a lazy learning meta-model that predicts how likely it is that a given decision is correct. ICM is intended to be easy to understand, which we validated in an experiment. We compared ICM with two other methods of computing confidence measures: the numerical output of the model and an actively learned meta-model. The validation was performed using a smart assistant for maritime professionals. Results show that ICM is easier to understand, but that each user is unique in their desire for explanations. This user study with domain experts shows what users need in their explanations and that personalization is crucial. © 2018
Subjects: Certainty; Confidence; Experiment; Explainability; ICM; Instance based; Lazy learning; Machine learning; Measure; User; Validation; Experiments; Learning systems; Numerical methods; User interfaces; Intelligent systems
To reference this document use: http://resolver.tudelft.nl/uuid:1078ab7a-e938-405c-9ef7-f689ad1c9471
DOI: https://doi.org/10.1145/1235
TNO identifier: 788264
Publisher: CEUR-WS
ISSN: 1613-0073
Source: 2018 Joint ACM IUI Workshops (ACMIUI-WS 2018), 11 March 2018, volume 2068
Series: CEUR Workshop Proceedings
Document type: conference paper
Files: To receive the publication files, please send an e-mail request to TNO Library.
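The abstract describes ICM as a lazy-learning (instance-based) meta-model that estimates how likely a given decision is to be correct. As an illustration only, below is a minimal Python sketch of one common way such a confidence measure can be built, assuming a k-nearest-neighbour lookup over past cases; the class and parameter names are hypothetical, and this is not the paper's exact ICM algorithm.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

class LazyConfidenceMeasure:
    """Illustrative lazy-learning confidence meta-model (hypothetical; not
    the paper's exact ICM). It stores past inputs together with a flag
    recording whether the underlying model decided correctly on them;
    confidence for a new input is the fraction of its k most similar past
    cases that the model got right."""

    def __init__(self, k=10):
        self.k = k
        self._nn = NearestNeighbors(n_neighbors=k)
        self._correct = None

    def fit(self, X, model_was_correct):
        # X: (n, d) feature vectors of past inputs.
        # model_was_correct: length-n boolean array (True = decision was correct).
        self._correct = np.asarray(model_was_correct, dtype=float)
        self._nn.fit(np.asarray(X))
        return self

    def confidence(self, x):
        # Look up the k nearest past cases; return the share the model got right.
        _, idx = self._nn.kneighbors(np.atleast_2d(x))
        return float(self._correct[idx[0]].mean())

    def explain(self, x):
        # The same neighbours double as an instance-based explanation:
        # "the model was right/wrong on these similar past cases".
        _, idx = self._nn.kneighbors(np.atleast_2d(x))
        return idx[0], self._correct[idx[0]].astype(bool)
```

A lookup like this is "lazy" in the sense that no meta-model is trained up front (in contrast to the actively learned meta-model the abstract compares against): all work happens at query time, and the retrieved neighbours themselves can be shown to the user as concrete, relatable evidence for the confidence score.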