Arnold T, Kasenberg D, Scheutz M. Explaining in Time: Meeting Interactive Standards of Explanation for Robotic Systems. ACM Transactions on Human-Robot Interaction (THRI). 2021;10(3): Article 25, pp. 1-23. https://doi.org/10.1145/3457183
Summary: This paper examines the need for robust explainability in robotic systems operating in human environments. It critiques current AI explanations, which are often limited to post-hoc interpretations, and argues that social robots require more rigorous explanations aligned with human norms and expectations. The authors contend that explanations in social robotics must encompass causal, purposive, and justificatory dimensions, reflecting the robot's goals, decisions, and recognition of social norms. They propose a cognitive robotic architecture for Human-Robot Interaction (HRI) that generates explanations from the robot's real-time decision-making and its adherence to social and normative principles, supporting interactive and meaningful dialogue with human users.
Comments