Abstract: For social robots to be successfully integrated and accepted within society, they need to be able to interpret human social cues displayed through natural modes of communication. Service robots interact directly with people, so finding a more natural and easy-to-use interface is of fundamental importance. In particular, a key challenge in the design of social robots is developing the robot's ability to recognize a person's affective states (emotions, moods, and attitudes) in order to respond appropriately during social Human–Robot Interactions (HRIs). In this project, the development of a social robot that can autonomously determine a person's degree of accessibility is proposed. The work will evaluate the performance of our automated system in recognizing and classifying a person's accessibility levels, and will investigate how people interact with an accessibility-aware robot that determines its own behaviours based on those levels.

Keywords: Rehabilitation, Body gesture recognition, Facial expressions, Body gesture modelling.