The rise of autonomous and artificially intelligent systems promises efficient, near-optimal performance and unprecedented functionality across many fields of human endeavor. As varied as these fields are, a common property of most autonomous systems is that they convert raw data about their environment into actions that achieve a desired goal. This conversion can be divided into the sub-tasks of sensing and perception, planning, and actuation. Perception is a critical part of this pipeline: it comprises the algorithms that turn raw data into useful information on which planning can act.

At its core, perception can be viewed through an estimation framework: perception algorithms seek to estimate information from noisy, incomplete raw data obtained from sensors. The accuracy of the estimated information (and consequently the effectiveness of the actions the system takes) depends on the assumptions and models used in designing the perception algorithms, so it is important to design models that are both accurate and tractable. Unfortunately, many perception tasks involve complex functions and sensor models relating observed data to information, and these functions are difficult to derive from physical first principles. In such cases, data-driven methods provide complementary techniques that yield excellent models of very complex systems.
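The dissertation's specific algorithms are not reproduced here; as a minimal illustration of the "perception as estimation" view, the sketch below fuses noisy scalar sensor readings of a fixed quantity with a 1-D Kalman-style update. The function name, variable names, and noise parameters are illustrative assumptions, not the author's method.

```python
import random

def estimate(measurements, meas_var):
    """Fuse noisy scalar measurements of a constant quantity.

    Each update weights the running estimate against the new
    measurement by their variances (a 1-D Kalman-style filter).
    Illustrative sketch only; not the dissertation's algorithm.
    """
    est, est_var = measurements[0], meas_var
    for z in measurements[1:]:
        k = est_var / (est_var + meas_var)   # gain: how much to trust new data
        est = est + k * (z - est)            # blend estimate with measurement
        est_var = (1.0 - k) * est_var        # fused estimate grows more certain
    return est, est_var

# Simulated sensor: true value 5.0, Gaussian noise with variance 0.25.
random.seed(0)
true_value = 5.0
readings = [true_value + random.gauss(0.0, 0.5) for _ in range(200)]
est, var = estimate(readings, meas_var=0.25)
```

After 200 readings the estimate converges near the true value and the posterior variance shrinks to roughly `meas_var / n`, illustrating how more raw data yields sharper information.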

In this dissertation, we present three perception algorithms designed for applications in autonomous driving, energy systems, and mixed reality. We use data-driven methods to approximate complex sensor models and present tractable estimation algorithms for turning raw data into information. We verify our approach on both synthetic and real data and report excellent results.
