Description

Autonomous and semi-autonomous systems can encounter situations where the timely attention of a human operator is required to take over some aspect of decision-making or control. For certain human-robot interaction (HRI) applications, like Autonomous Vehicle (AV) operations, these decisions could be both time-critical and safety-critical. Given this, it is important to ensure that the human is brought into the decision-making loop in a manner that enables them to make a timely and correct decision. In this paper, we consider one such application, which we refer to as the perception hand-off problem: the driver is brought into the loop when the perception module of an AV is uncertain about the environment. We formalize the perception hand-off problem using a Partially Observable Markov Decision Process (POMDP) model with a problem-specific structure. This model captures the latent cognitive state of the driver, which can be influenced through a query-based Human-Machine Interface (HMI). Through a human-study experiment on the perception hand-off problem for object recognition, we learn such a model and validate our hypotheses about the hand-off problem and the impact of our query-based HMI. The results from this study show that the driver's state of attentiveness does indeed impact human performance, and our proposed active information gathering (AIG) actions, or queries, result in 7% faster responses from the human. We also use this experimental data to learn the proposed POMDP model parameters. Simulations with this identified model show that a policy for deploying the AIG actions improves the percentage of correct responses from the human in the perception hand-off by around 5.5%, outperforming other baselines while also using fewer of these actions.
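As a rough illustration of the POMDP structure described above, the sketch below maintains a two-state belief over the driver's latent attentiveness and shows how an AIG query action, together with the resulting observation from the HMI, updates that belief via a standard Bayes filter. All state, action, and observation labels and all numeric probabilities here are hypothetical placeholders, not values learned in the paper.

    import numpy as np

    # Latent cognitive states of the driver (hidden from the autonomy).
    STATES = ["attentive", "distracted"]            # hypothetical labels
    # Actions available to the autonomy: issue an AIG query or wait.
    ACTIONS = ["query", "wait"]                     # hypothetical labels
    # Observations returned through the HMI.
    OBSERVATIONS = ["response", "no_response"]      # hypothetical labels

    # T[a][s, s']: probability of moving from state s to s' under action a.
    # A query is assumed here to make the driver more likely to become attentive.
    T = {
        "query": np.array([[0.95, 0.05],
                           [0.60, 0.40]]),
        "wait":  np.array([[0.80, 0.20],
                           [0.10, 0.90]]),
    }

    # O[a][s', o]: probability of observing o after action a lands in state s'.
    O = {
        "query": np.array([[0.90, 0.10],
                           [0.30, 0.70]]),
        "wait":  np.array([[0.50, 0.50],
                           [0.05, 0.95]]),
    }

    def belief_update(belief, action, observation):
        """Standard Bayes-filter update of the belief over the latent state."""
        o = OBSERVATIONS.index(observation)
        predicted = T[action].T @ belief         # prediction step
        corrected = O[action][:, o] * predicted  # correction with the observation
        return corrected / corrected.sum()

    # Example: start uncertain about attentiveness, issue a query, see a response.
    b0 = np.array([0.5, 0.5])
    b1 = belief_update(b0, "query", "response")
    print(dict(zip(STATES, b1.round(3))))        # belief shifts toward "attentive"

A policy for deploying AIG actions, as studied in the paper, would map such beliefs to a decision about whether to query the driver or hand off perception directly; the paper learns the corresponding model parameters from human-study data rather than assuming them as in this sketch.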
