ARVL/RAMs Current Research Areas
Coordinated Control of Communication-Enabled Mobile Robots and Vehicles
Our mobile robots emulate autonomous, connected vehicles with vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), and vehicle-to-everything (V2X) communications, the latter covering other road users such as pedestrians and bicyclists. The robots are equipped with various sensors and on-board processors linked to other robots and vehicles and to the surrounding environment. We are developing and evaluating hybrid hierarchical control algorithms for autonomous driving, platooning, merging, collision avoidance, and other dynamic coordinated functions of intelligent vehicles and robots in complex environments.
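To make the platooning piece concrete, here is a minimal sketch of a constant-time-gap car-following law, a common baseline for the longitudinal layer of such controllers. It is an illustration under assumed gains (KP, KV), time gap, and acceleration limits, not the lab's hybrid hierarchical controller.

```python
import numpy as np

# Minimal constant-time-gap platooning sketch (illustrative only).
# Each follower regulates its gap to the vehicle ahead using spacing
# and relative-speed feedback; all parameters are placeholder values.

DT = 0.1           # control period [s]
TIME_GAP = 1.0     # desired time gap [s]
STANDSTILL = 5.0   # desired standstill distance [m]
KP, KV = 0.5, 0.8  # spacing / relative-speed feedback gains

def follower_accel(gap, v_self, v_lead):
    """Acceleration command for one follower (constant-time-gap policy)."""
    desired_gap = STANDSTILL + TIME_GAP * v_self
    spacing_err = gap - desired_gap
    return np.clip(KP * spacing_err + KV * (v_lead - v_self), -3.0, 2.0)

# Simulate a 4-vehicle platoon: leader plus 3 followers on a straight road.
pos = np.array([60.0, 40.0, 20.0, 0.0])   # positions [m], index 0 = leader
vel = np.array([15.0, 12.0, 14.0, 10.0])  # speeds [m/s]

for _ in range(600):  # 60 s of simulated time
    acc = np.zeros_like(vel)              # leader holds its speed
    for i in range(1, len(pos)):
        gap = pos[i - 1] - pos[i]
        acc[i] = follower_accel(gap, vel[i], vel[i - 1])
    vel = np.maximum(vel + acc * DT, 0.0)
    pos = pos + vel * DT

print("final gaps [m]:", np.round(pos[:-1] - pos[1:], 2))
# Gaps converge toward STANDSTILL + TIME_GAP * v = 5 + 1.0 * 15 = 20 m.
```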
AI and ML Applications in Autonomous Robots
In this research area, we investigate AI and machine learning methods and develop algorithms that let robots navigate unstructured, previously unseen environments. Problems ranging from target tracking to collision avoidance to learning from past experience are explored in simulation and in real-world experiments with mobile robots. The same techniques will be studied for humanoid robots as they interact with unfamiliar environments and perform tasks they have not seen before.
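As a hedged illustration of learning from experience, the sketch below applies tabular Q-learning to navigate a small, initially unknown grid with obstacles. The grid layout, reward values, and hyperparameters (alpha, gamma, eps) are placeholder assumptions, not the lab's algorithms.

```python
import numpy as np

# Tabular Q-learning on a toy grid: the robot starts with no map and
# learns a policy to reach a goal while avoiding obstacle cells.
# All rewards and hyperparameters are illustrative placeholders.

rng = np.random.default_rng(0)
N = 5
OBSTACLES = {(1, 1), (2, 3), (3, 1)}
GOAL = (4, 4)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

def step(state, a):
    r, c = state
    dr, dc = ACTIONS[a]
    nr = min(max(r + dr, 0), N - 1)
    nc = min(max(c + dc, 0), N - 1)
    if (nr, nc) in OBSTACLES:
        return state, -5.0, False       # bump penalty, stay in place
    if (nr, nc) == GOAL:
        return (nr, nc), 10.0, True
    return (nr, nc), -0.1, False        # small per-step cost

Q = np.zeros((N, N, len(ACTIONS)))
alpha, gamma, eps = 0.2, 0.95, 0.2

for _ in range(2000):                   # training episodes
    s = (0, 0)
    for _ in range(200):                # cap episode length
        a = int(rng.integers(4)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r, done = step(s, a)
        Q[s][a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s][a])
        s = s2
        if done:
            break

# Greedy rollout of the learned policy from the start cell.
s, path = (0, 0), [(0, 0)]
for _ in range(20):
    s, _, done = step(s, int(np.argmax(Q[s])))
    path.append(s)
    if done:
        break
print("learned path:", path)
```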
Cooperative Perception and Control of Autonomous Robots
When multiple robots need to work together, they must be able to sense the environment collectively. Cooperative sensing enlarges their combined field of view, improves the quality (and quantity) of their perception, and gives them far more robust situational awareness for the task at hand. For example, cooperative perception among connected vehicles makes it possible to see beyond occluded scenes in autonomous driving: the ego vehicle can use information from the cars ahead to track where a pedestrian or bicyclist is going even when its own view is blocked by other vehicles. More generally, for cooperative robots and drones, this capability allows understanding of the environment beyond the ego robot's own field of view. We research computationally efficient methods that let multiple robots or autonomous cars perceive their surrounding environment, share the data and scenes relevant to their task, and cooperatively plan their actions. We also develop control systems for cooperating robots and agents (e.g., autonomous cars) that work together to perform tasks.
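One basic ingredient of cooperative perception can be sketched simply: each connected vehicle transforms its local detections into a shared world frame, and detections of the same object are merged by confidence-weighted averaging. The poses, detections, confidences, and association radius below are made-up values for illustration.

```python
import numpy as np

# Cooperative perception sketch: fuse object detections shared by two
# connected vehicles into one world-frame object list. Poses, detections,
# and the association radius are invented values for illustration.

def to_world(pose, local_pts):
    """Transform Nx2 detections from a vehicle frame to the world frame."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return local_pts @ R.T + np.array([x, y])

def fuse(detections, radius=1.5):
    """Greedy association: merge detections closer than `radius` [m]."""
    fused = []
    for p, w in detections:                      # (position, confidence)
        for f in fused:
            if np.linalg.norm(f[0] - p) < radius:
                f[0] = (f[0] * f[1] + p * w) / (f[1] + w)  # weighted mean
                f[1] += w
                break
        else:
            fused.append([p.copy(), w])
    return fused

# Ego vehicle and a vehicle ahead, each with an (x, y, heading) world pose.
ego_pose, lead_pose = (0.0, 0.0, 0.0), (20.0, 0.0, 0.0)

# Ego sees one object; the lead vehicle sees the same object plus a
# pedestrian that is occluded from the ego's point of view.
ego_dets = to_world(ego_pose, np.array([[12.0, 3.0]]))
lead_dets = to_world(lead_pose, np.array([[-7.8, 3.1], [15.0, -2.0]]))

shared = [(p, 0.9) for p in ego_dets] + [(p, 0.8) for p in lead_dets]
for pos, conf in fuse(shared):
    print(np.round(pos, 2), "confidence mass:", round(conf, 2))
```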
Advanced Driver Assistance Systems
Our lab's work in advanced driver assistance systems (ADAS) and active safety includes a particular interest in mixed modes of autonomous and manual driving. In one area, we seek to understand drivers' cognitive perception-response abilities through brain, eye, and physiological monitoring; we also seek to model human motor control actions. An in-depth understanding of the driver's perception-response to external stimuli enables the development of more coherent and rational ADAS, and thus more intelligent vehicles: vehicles that interact with drivers as needed and automatically augment driving functions as a continuously supporting co-pilot. Because driving involves continuous sensing, decision-making, and perception-response tasks, it is a suitable testbed for brain-monitoring studies. Some of our research discoveries would also be applicable to other brain-controlled tasks and functions, such as limb control, patient rehabilitation, and brain control of machines.
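As a small illustration of perception-response measurement, the snippet below estimates a driver's brake reaction time from two logged signals: a stimulus-onset flag and brake-pedal position. The signal names, 100 Hz sampling rate, and 5% pedal threshold are assumptions for the sketch, not the lab's experimental protocol.

```python
import numpy as np

# Estimate perception-response (brake reaction) time from logged signals.
# `stimulus` marks when a hazard appears; `brake` is pedal position in [0, 1].
# Sampling rate and the 5% pedal threshold are illustrative assumptions.

FS = 100.0  # assumed sampling rate [Hz]

def brake_reaction_time(stimulus, brake, threshold=0.05):
    """Seconds from first stimulus onset to first brake application."""
    onset = np.argmax(stimulus > 0)          # first sample with stimulus
    pressed = np.argmax(brake[onset:] > threshold)
    if brake[onset + pressed] <= threshold:  # driver never braked
        return None
    return pressed / FS

# Synthetic 5 s trial: hazard appears at t = 1.0 s, braking ramps up
# starting at t = 1.72 s.
t = np.arange(0, 5, 1 / FS)
stimulus = (t >= 1.0).astype(float)
brake = np.clip((t - 1.72) * 2.0, 0.0, 1.0)

rt = brake_reaction_time(stimulus, brake)
print(f"brake reaction time: {rt:.2f} s")    # ~0.75 s
```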
Robotic Drone Research
The Autonomous Robots and Vehicles Laboratory (ARVL) has an indoor Robotic Drone System for state-of-the-art research, including the design and testing of new algorithms based on sensor integration and advanced control laws (see the Capabilities and Equipment section for details and specifications). It consists of nano drones that can be flown individually or as a swarm, carrying sensors for guidance, navigation, and control, as well as video cameras with AI computer vision capability. The drones can also interact with small ground robotic vehicles wirelessly connected to a ground station located in the lab.
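A minimal sketch of the kind of swarm behavior such a system can host: a standard consensus (rendezvous) update in which each drone steers toward the mean position of its neighbors. The initial positions, neighbor radius, and gain are illustrative placeholders unrelated to the lab's flight code.

```python
import numpy as np

# Consensus (rendezvous) sketch for a small drone swarm in the plane.
# Each drone moves toward the mean position of neighbors within RADIUS.
# Initial positions, radius, and gain are illustrative placeholders.

rng = np.random.default_rng(1)
pos = rng.uniform(0, 10, size=(6, 2))   # six drones in a 10 m x 10 m area
RADIUS, GAIN, DT = 6.0, 0.8, 0.1

for _ in range(300):
    new = pos.copy()
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = pos[(d < RADIUS) & (d > 0)]     # neighbors, excluding self
        if len(nbrs):
            new[i] += GAIN * DT * (nbrs.mean(axis=0) - pos[i])
    pos = new

print("spread after consensus [m]:", round(float(pos.std(axis=0).max()), 3))
```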
Our research addresses swarm drone sensor fusion, trajectory planning, and task coordination across a variety of applications. Projects integrating intelligent drones and land vehicles apply computer vision and AI algorithms, both onboard and at the ground station, to extract real-time road information such as lane lines, traffic signs, and objects that pose potential hazards. Drones can also assist both autonomous and remotely operated ground robots, providing closed-loop control commands to autonomous vehicles and advanced driver assistance to human remote operators.
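As a hedged sketch of the lane-line extraction mentioned above, the code below uses a classical Canny-edge plus probabilistic Hough pipeline with OpenCV. The region-of-interest polygon and thresholds are assumptions tuned for a generic forward-facing camera; a learned detector could replace this pipeline.

```python
import cv2
import numpy as np

# Classical lane-line detection sketch: edges + Hough transform over a
# road-shaped region of interest. Thresholds and the ROI trapezoid are
# illustrative assumptions for a forward-facing camera frame.

def detect_lane_lines(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)

    # Keep only a trapezoidal region where lane lines are expected.
    h, w = edges.shape
    roi = np.zeros_like(edges)
    poly = np.array([[(0, h), (w, h), (int(0.55 * w), int(0.6 * h)),
                      (int(0.45 * w), int(0.6 * h))]], dtype=np.int32)
    cv2.fillPoly(roi, poly, 255)
    masked = cv2.bitwise_and(edges, roi)

    # Probabilistic Hough transform returns line segments (x1, y1, x2, y2).
    lines = cv2.HoughLinesP(masked, rho=2, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=25)
    return [] if lines is None else [l[0] for l in lines]

# Demo on a synthetic frame with two painted lane lines.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
cv2.line(frame, (140, 480), (300, 300), (255, 255, 255), 8)
cv2.line(frame, (500, 480), (340, 300), (255, 255, 255), 8)
print("lane segments found:", len(detect_lane_lines(frame)))
```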