Robotic Programming Using Controlled Natural Languages
Modern approaches to programming a robot, such as procedural coding, teleoperation, and teaching by demonstration, can become inefficient and unreliable in some circumstances as the flexibility and complexity of manufacturing processes increase. Compared with these approaches, natural-language communication is a more convenient, effective, and favorable way to program robots. This project aims to develop a framework for robotic programming via controlled natural languages, which enables a robot to understand human instructions and to interact with humans when ambiguity must be resolved. The proposed framework parses human instructions written in controlled natural languages, such as WikiHow tutorials, and generates a sequence of robot action and sensing tasks.
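The parsing step above can be sketched as a small pattern-based translator. This is a minimal illustration, not the project's actual grammar: the instruction templates and task names (`grasp`, `place`, `detect`) are hypothetical, and unmatched sentences are flagged for the clarification dialogue the project envisions.

```python
import re

# Hypothetical CNL grammar: each pattern maps an instruction template to a
# parameterized robot task. Templates and task names are illustrative only.
GRAMMAR = [
    (re.compile(r"pick up the (\w+)"), lambda m: ("grasp", m.group(1))),
    (re.compile(r"put the (\w+) on the (\w+)"),
     lambda m: ("place", m.group(1), m.group(2))),
    (re.compile(r"look for the (\w+)"), lambda m: ("detect", m.group(1))),
]

def parse_instruction(sentence):
    """Parse one CNL sentence into a robot task, or None if not understood."""
    s = sentence.strip().lower().rstrip(".")
    for pattern, build in GRAMMAR:
        m = pattern.fullmatch(s)
        if m:
            return build(m)
    return None  # unknown/ambiguous: ask the human for clarification

def parse_tutorial(sentences):
    """Turn a sequence of CNL sentences into a plan of acting/sensing tasks."""
    plan, unresolved = [], []
    for s in sentences:
        task = parse_instruction(s)
        if task:
            plan.append(task)
        else:
            unresolved.append(s)
    return plan, unresolved

plan, unresolved = parse_tutorial([
    "Look for the cup.",
    "Pick up the cup.",
    "Put the cup on the table.",
])
print(plan)  # [('detect', 'cup'), ('grasp', 'cup'), ('place', 'cup', 'table')]
```

A real system would replace the regular expressions with a proper CNL parser, but the input/output contract, sentences in, a task sequence plus clarification requests out, stays the same.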
MagicHand: In-Hand Perception for Dexterous Manipulation
The project aims to advance in-hand perception technology for dexterous manipulation using an anthropomorphic robotic hand. Humans can determine a proper grasping strategy for an object according to its physical attributes and the task context. This project proposes an approach that determines a grasping strategy for an anthropomorphic robotic hand based solely on in-hand perception and natural-language descriptions.
Traffic Condition Understanding for Autonomous Driving
The project aims to develop an approach to perceiving dynamic road conditions for assistive and autonomous driving under difficult weather conditions. Despite advances in assistive and autonomous driving technologies, driving under adverse weather has not been adequately addressed. By fusing multiple sensing modalities, this project seeks to estimate road adhesion coefficients, identify unexpected slippery regions on roads, and develop a vehicle control model that guarantees safety while maximizing driving efficiency.
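One common way to fuse per-modality estimates of a quantity like the adhesion coefficient is inverse-variance weighting, where less certain sensors contribute less. The sketch below is an assumption about how such fusion could look, not the project's actual estimator; the sensor values and variances are made up for illustration.

```python
def fuse_adhesion_estimates(estimates):
    """Inverse-variance weighted fusion of per-sensor adhesion estimates.

    estimates: list of (mu, variance) pairs, one per sensing modality
    (e.g. camera-based surface classification, wheel-slip dynamics).
    Returns the fused adhesion coefficient and its variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    mu = sum(w * m for w, (m, _) in zip(weights, estimates)) / total
    return mu, 1.0 / total

# Hypothetical readings: a noisy camera estimate and a more confident
# wheel-slip estimate; the fused value leans toward the confident sensor.
mu, var = fuse_adhesion_estimates([(0.8, 0.04), (0.6, 0.01)])
print(mu, var)  # approximately 0.64 and 0.008
```

A fused coefficient falling below a calibrated threshold could then mark a road region as unexpectedly slippery for the vehicle control model.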
Wearable Robotic Perception for Smart Health
The goal of this project is to research wearable processing algorithms on smart glasses to assist visually impaired and blind (VIB) people in unknown dynamic environments. Assistive devices in the form of ordinary glasses are more favorable than current handheld or ambient devices because they are convenient and nonintrusive; however, restoring the necessary abilities the VIB have lost with their vision is very challenging, owing to the complexity of dynamic environments and the constraints of onboard sensors on wearable assistive systems.
Affective Intelligence for Social Robotics
Imagine a world in which intelligent robots live among us, learn and grow with us, make us laugh and cry, and ultimately forge long-term endearing emotional bonds with us. In such a world, robots are accepted as members of society because they have every feature of an intelligent sentient being. They learn new knowledge about people and the world around them, accumulate these memories as ‘experience’, and have expressive capabilities for conveying emotions and affection.