Multimodal Human-Computer Interaction Technologies and Validation for the Robot Astronaut
- Paper number
IAC-16,B6,3,12,x32441
- Coauthor
Prof. Chen Meng, Institute of Aerospace System Engineering Shanghai, CASC, China
- Coauthor
Mr. Liangliang HAN, Institute of Aerospace System Engineering Shanghai, CASC, China
- Coauthor
Mr. Ping TANG, Institute of Aerospace System Engineering Shanghai, CASC, China
- Year
2016
- Abstract
To reduce the risks of astronaut extra-vehicular activities (EVAs) and to decrease the frequency and complexity of operation tasks, a robot astronaut system with two multi-degree-of-freedom arms and dexterous hands has been built to replace the astronaut; it uses a variety of human-machine interaction modes for fine control of extra-vehicular operations. The system adopts a quad-core embedded controller as its operating platform and realizes distributed control of the humanoid robot over an EtherCAT bus, giving it a flexible and efficient network topology and a modular software architecture. Using multimodal human-computer interaction, the robot astronaut provides many functions, including task-level tele-operation, command input with mouse or keyboard, path planning, virtual scene display, data management, and system communication. It accepts command input and data feedback through a vision camera, a force/torque-sensing hand controller, a data glove, a stereo display and microphone, and other interactive devices. Based on the operator interface software, key technologies such as path planning and obstacle avoidance, visual servoing, and compliant control at the end of the arm have been demonstrated, and typical operation tasks, including grasping, screwing, plugging, switching, moving, and collaborative handling with two arms, have been validated. Experiments show that the system ensures real-time performance and reliability of operation, with high control accuracy and stability.
- Abstract document
- Manuscript document
(absent)