Shared spaces are characterized by a mix of road users (e.g., pedestrians, cyclists, and vehicles) and reduced traffic infrastructure. While such designs aim to increase social interaction and non-verbal communication (e.g., via hand gestures), priority conflicts and confusion can occur. In unclear situations, collisions involving distracted pedestrians or cyclists can put pedestrians at risk. Augmented reality (AR) wearables, with their 3D visualization capabilities, could be used to superimpose virtual safety signals. These devices also include RGBD sensors that capture image and point-cloud data for environment sensing.
This project explores both the sensing and 3D visualization capabilities of an AR wearable for pedestrian safety. A perception pipeline that first detects pedestrians in the ego view and then predicts their future positions can visualize this information on the AR glasses. With this prediction information about surrounding road users, the walking behavior of the AR headset wearer could be influenced, for example when they decide to make alternative route choices or detours. However, this requires that both the visualization and the detection pipeline be intuitive and low-latency. This project focuses on solving key research questions for a real-time AR motion prediction pipeline that influences walking behavior. While the topic centers on pedestrian safety using AR wearables, the approach could also be extended to cyclists in shared spaces.
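To make the sense-detect-predict-visualize loop concrete, here is a minimal sketch in Python. It assumes hypothetical `sensor`, `detector`, and `renderer` interfaces (no real headset API is implied), uses a simple constant-velocity model for prediction, and treats the horizon and warning-radius values as illustrative placeholders; the actual pipeline and models are the subject of the project.

```python
# Illustrative sketch of a per-frame AR pedestrian-prediction loop.
# get_rgbd_frame, detect, and render_warning are hypothetical placeholders,
# not calls from any specific AR headset SDK.

import time
import numpy as np

PREDICTION_HORIZON_S = 1.0  # how far ahead to extrapolate (assumed value)
COLLISION_RADIUS_M = 1.5    # distance at which a warning is shown (assumed value)


def predict_position(track, dt, horizon):
    """Constant-velocity extrapolation from the last two observed positions."""
    if len(track) < 2:
        return track[-1]
    velocity = (track[-1] - track[-2]) / dt
    return track[-1] + velocity * horizon


def run_pipeline(sensor, detector, renderer, dt=0.033):
    """Per-frame loop: sense -> detect -> predict -> visualize."""
    tracks = {}  # pedestrian id -> list of observed 3D positions (wearer-centric)
    while True:
        start = time.perf_counter()
        rgb, point_cloud = sensor.get_rgbd_frame()  # hypothetical RGBD sensor API
        for ped_id, position in detector.detect(rgb, point_cloud):  # hypothetical detector
            tracks.setdefault(ped_id, []).append(np.asarray(position, dtype=float))
            future = predict_position(tracks[ped_id], dt, PREDICTION_HORIZON_S)
            # Warn the wearer if the predicted position enters their safety zone.
            if np.linalg.norm(future) < COLLISION_RADIUS_M:
                renderer.render_warning(ped_id, future)  # hypothetical AR overlay call
        # Per-frame latency must stay low for the overlay to feel real-time.
        print(f"frame latency: {(time.perf_counter() - start) * 1000:.1f} ms")
```

The constant-velocity predictor is only a baseline; in practice it would be replaced by a learned motion-prediction model, and the latency printout stands in for the real-time profiling the research questions call for.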
Researcher: Vinu Kamalasanan