
Cartucho J., Ventura R., Veloso M.

Proc. of IROS 2018 - IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain
2018

Abstract:

Despite the recent success of state-of-the-art deep learning algorithms in object recognition, we observed that, when deployed as-is on a mobile service robot, they fail to recognize many objects in real human environments. In this paper, we introduce a learning algorithm in which the robot addresses this flaw by asking humans for help, an approach known as symbiotic autonomy. In particular, we bootstrap YOLOv2, a state-of-the-art deep neural network, and create a HUMAN neural net trained only on the collected data. Using an RGB camera and an on-board tablet, the robot proactively seeks human input to assist in labeling surrounding objects. Pepper, based at CMU, and Monarch Mbot, based at ISR-Lisbon, are the social robots we used to validate the proposed approach. We conducted a study in a realistic domestic environment over the course of 20 days with 6 research participants. To improve object recognition, we used the two neural nets, YOLOv2 + HUMAN, in parallel. By asking, the robot also collects data on where an object is located and to whom it belongs, which enabled us to introduce an approach where the robot can search for a specific person's object. We view the contribution of this paper as relevant for service robots in general, beyond Pepper and Mbot. Following this methodology, the robot was able to detect twice as many objects as the initial YOLOv2, with improved average precision.
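The abstract describes running the generic YOLOv2 model and the HUMAN model (trained on human-labeled data) in parallel and using both sets of detections. The sketch below is not from the paper; it is a minimal illustration, assuming each detector returns labeled, scored bounding boxes, of one plausible way to merge two detectors' outputs with per-class non-maximum suppression. All names (Detection, merge_detections, the example class labels) are hypothetical placeholders.

# Illustrative sketch (not the paper's implementation): merge detections
# produced by two object detectors run in parallel on the same image.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    label: str                              # object class, e.g. "chair"
    score: float                            # detector confidence in [0, 1]
    box: Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

def iou(a, b) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    if inter <= 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def merge_detections(dets_a: List[Detection],
                     dets_b: List[Detection],
                     iou_thresh: float = 0.5) -> List[Detection]:
    """Union of both detectors' outputs with per-class greedy NMS,
    keeping the higher-confidence box when the two models overlap."""
    merged: List[Detection] = []
    for det in sorted(dets_a + dets_b, key=lambda d: d.score, reverse=True):
        duplicate = any(kept.label == det.label and
                        iou(kept.box, det.box) >= iou_thresh
                        for kept in merged)
        if not duplicate:
            merged.append(det)
    return merged

# Example usage with made-up detections from the two models:
yolo_dets = [Detection("chair", 0.82, (10, 20, 120, 200))]
human_dets = [Detection("chair", 0.61, (12, 22, 118, 198)),
              Detection("mug", 0.74, (200, 50, 260, 110))]
print(merge_detections(yolo_dets, human_dets))

Under these assumptions, the merged list keeps the higher-scoring "chair" box once and adds the "mug" box that only the second model found, which mirrors the abstract's claim that the combined detectors recognize more objects than YOLOv2 alone.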