Our perception of our surrounding environment is limited by the constraints of human biology. The field of augmented perception asks how our sensory capabilities can be usefully extended through computational means. We argue that spatial awareness can be enhanced by exploiting recent advances in computer vision that make high-accuracy, real-time object detection feasible in everyday settings. We introduce HindSight, a wearable system that increases spatial awareness by detecting relevant objects in live 360-degree video and sonifying their position and class through bone conduction headphones. HindSight uses a deep neural network to locate, and attach semantic labels to, objects surrounding the user as seen through a head-worn panoramic camera. It then uses bone conduction headphones, which preserve natural auditory acuity, to transmit audio notifications for detected objects of interest. We develop an application using HindSight to warn cyclists of approaching vehicles outside their field of view. To evaluate HindSight, we first conduct an exploratory study with 15 users. We next create a VR platform to simulate realistic traffic scenarios and use it to evaluate HindSight in a controlled user study with 21 participants. Participants using HindSight had fewer collisions, kept greater distances from other vehicles, experienced reduced cognitive load, and reported a perceived increase in awareness.
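To make the sonification step concrete, the following is a minimal sketch of how a detection's horizontal position in an equirectangular 360-degree frame could be mapped to a stereo pan for an audio notification. The frame width, function names, and the sine-based panning law are illustrative assumptions, not the system's actual implementation.

```python
import math

FRAME_WIDTH = 1920  # assumed width of the equirectangular panoramic frame, in pixels


def azimuth_deg(x_center, frame_width=FRAME_WIDTH):
    """Map a detection's horizontal pixel position to an azimuth in degrees,
    with 0 = straight ahead and +/-180 = directly behind the rider."""
    return (x_center / frame_width) * 360.0 - 180.0


def stereo_pan(azimuth):
    """Map an azimuth to a stereo pan in [-1, 1] (-1 = full left, +1 = full
    right); rear azimuths fold onto the same left/right side."""
    return math.sin(math.radians(azimuth))


def notification(label, x_center):
    """Build a (class label, pan) pair for one detected object of interest."""
    return (label, stereo_pan(azimuth_deg(x_center)))
```

For example, a vehicle detected at the frame's horizontal midpoint (directly ahead) would yield a pan of 0, while one a quarter-frame to the right would pan fully right; a real system would also encode distance or urgency, e.g. via volume or repetition rate.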




