Inferring Human Activities from Observation via Semantic Reasoning: A novel method for transferring skills to robots

November 6,  2015, 11 am




One fundamental issue for autonomous robots in task domains is their capability to learn new skills and to re-use past experiences in different situations as efficiently, intuitively and reliably as possible. A promising mechanism for achieving this is learning from demonstration. This thesis presents a framework that includes a new learning technique based on semantic representations, designed for re-usability and for the robust incorporation of new skills. The obtained semantic representations extract the essence of the observed activities in terms of human motions and object properties. The framework has been assessed on a humanoid robot using different perceptual modalities, under different constraints and in several scenarios, each posing a distinct and challenging level of complexity, to demonstrate that the framework does not depend on the analyzed task. The results show that a robot equipped with this framework correctly recognizes human behaviors from on-line demonstrations with an accuracy of 87.44%, which exceeds the accuracy of a randomly selected human participant recognizing the same demonstrations (about 76.68%).