Abstract: From localizing a self-driving vehicle on the street to reconstructing a digital asset through intelligent wearables, spatial perception plays a pivotal role in realizing embodied intelligence systems. While spatial perception has advanced considerably over the last decade, robustness against adverse conditions (e.g., sensing degradation) remains a challenge. This talk will present our recent contributions to improving the practical robustness of spatial perception against visual degradation, one of the most common adverse conditions threatening a perception system in the wild. The trajectory of our research is characterized by the use of increasingly powerful and compact sensors operating in the non-visible spectrum, combined with recent breakthroughs in artificial intelligence and multimodal data fusion. Two works on 3D localization and environment mapping will be introduced to demonstrate robust perception for both mobile robotics and intelligent wearables.