Reasoning with preferences in service robots

Service robots should be able to reason about preferences when assisting people in common daily tasks. This functionality is useful, for instance, to respond to action directives that conflict with the user's interest or wellbeing, or when commands are underspecified. Preferences are defeasible knowledge, as they can change with time or context, and should be stored in a non-monotonic knowledge-base system capable of expressing incomplete knowledge, updating defaults and exceptions dynamically, and handling multiple extensions. In this paper, a knowledge-base system with such expressive power is presented. Non-monotonicity is handled using a generalization of the Principle of Specificity, which states that in case of a knowledge conflict the most specific proposition should be preferred. Reasoning about preferences is invoked on demand through conversational protocols that are generic and domain independent. We describe the general principles underlying such protocols and their implementation in the SitLog programming language. We also present a demonstration scenario in which the robot Golem-III assists human users using such protocols and preferences stored in its non-monotonic knowledge-base service.
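The Principle of Specificity mentioned above can be illustrated with a minimal sketch. This is not the paper's SitLog implementation; it is a hypothetical Python example in which defaults are attached to classes in an is-a hierarchy, and a conflict between a default and an exception is resolved in favor of the more specific class:

```python
# Hypothetical sketch of specificity-based default reasoning.
# The classes, properties, and rules below are illustrative only.

# Is-a hierarchy: each class points to its immediate superclass.
IS_A = {
    "penguin": "bird",
    "bird": "animal",
}

# Defeasible defaults attached to classes; "penguin" is an
# exception to the "bird" default.
DEFAULTS = {
    "bird": {"flies": True},
    "penguin": {"flies": False},
}

def ancestors(cls):
    """Return cls followed by its superclasses, most specific first."""
    chain = [cls]
    while cls in IS_A:
        cls = IS_A[cls]
        chain.append(cls)
    return chain

def conclude(cls, prop):
    """Principle of Specificity: the most specific applicable default wins."""
    for c in ancestors(cls):  # walk from most to least specific
        if prop in DEFAULTS.get(c, {}):
            return DEFAULTS[c][prop]
    return None  # no applicable default: knowledge is incomplete

print(conclude("penguin", "flies"))  # exception overrides default: False
print(conclude("bird", "flies"))     # default applies: True
```

Updating a default or adding an exception amounts to editing `DEFAULTS`, which makes the resulting conclusions non-monotonic: new knowledge can retract a previously derived one.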

Torres, I., Hernández, N., Rodríguez, A., Fuentes, G., & Pineda, L. A. (2019). Reasoning with preferences in service robots. Journal of Intelligent & Fuzzy Systems, 36(5), 5105-5114.