
Description

With the explosive growth of wearable devices across medical applications, monitoring a broad range of biosignals becomes increasingly viable. However, these sensors face limitations in hardware resources and battery life. Low-power in-sensor intelligence can replace the costly transmission of raw data streams and thereby improve battery life. For a neural prosthetic, for example, various biosensors can be used to intelligently map the user's intended movements into prosthetic actuation. Performing this locally can extend the lifetime of the device and reduce latency, significantly improving the user experience. This thesis leverages the emerging brain-inspired hyperdimensional computing (HDC) paradigm, which uses an inherently simple binary representation, to address this need.
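To make the binary representation concrete, the following is a minimal sketch of the standard HDC primitives: random binary hypervectors combined by XOR binding and majority-vote bundling, and compared with Hamming similarity. The dimensionality and operator choices are illustrative assumptions, not the exact parameters used in the thesis.

    import numpy as np

    D = 10000  # hypervector dimensionality (illustrative choice)
    rng = np.random.default_rng(0)

    def random_hv():
        # Dense binary hypervector with i.i.d. random bits
        return rng.integers(0, 2, size=D, dtype=np.uint8)

    def bind(a, b):
        # Binding via element-wise XOR (associates two hypervectors)
        return np.bitwise_xor(a, b)

    def bundle(hvs):
        # Bundling via bit-wise majority vote (superposes hypervectors)
        return (np.sum(hvs, axis=0) > len(hvs) / 2).astype(np.uint8)

    def hamming_sim(a, b):
        # Normalized similarity: 1.0 identical, ~0.5 unrelated
        return 1.0 - np.count_nonzero(a != b) / D

    # Example: encode a (channel, value) pair and compare to noise
    channel, value = random_hv(), random_hv()
    record = bind(channel, value)
    print(hamming_sim(record, bind(channel, value)))  # 1.0 (same encoding)
    print(hamming_sim(record, random_hv()))           # ~0.5 (unrelated)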

The first section of this thesis explores the energy efficiency of HDC for machine learning, including a comparison against traditional ML algorithms, through the design and post-layout simulation of biosignal classification ASICs. By generating hypervectors on the fly rather than storing them in memory, and by folding vectors, the proposed architecture achieves 39.1 nJ/prediction, a 4.9x improvement over the state-of-the-art HDC processor and a 9.5x improvement over an optimized SVM processor, paving the way for HDC to become the paradigm of choice for in-sensor classification.
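The sketch below illustrates the idea of on-the-fly generation and nearest-prototype classification only in spirit: item hypervectors are regenerated from a deterministic seed when needed instead of being held in an item memory. The seeded-PRNG stand-in, the encode/classify helpers, and the labels are assumptions for illustration; the ASIC's actual generation circuit and vector folding are not modeled here.

    import numpy as np

    D = 2000  # reduced dimension for the sketch

    def item_hv(symbol_id, seed=1234):
        # Regenerate the item hypervector from a deterministic seed on demand,
        # instead of holding a full item memory in on-chip storage.
        rng = np.random.default_rng(seed + symbol_id)
        return rng.integers(0, 2, size=D, dtype=np.uint8)

    def encode(sample_ids):
        # Bundle the regenerated item hypervectors for one biosignal window
        hvs = np.stack([item_hv(s) for s in sample_ids])
        return (hvs.sum(axis=0) > len(sample_ids) / 2).astype(np.uint8)

    def classify(query, prototypes):
        # Nearest class prototype by Hamming distance
        dists = [(np.count_nonzero(query != p), label)
                 for label, p in prototypes.items()]
        return min(dists)[1]

    # Hypothetical usage: two class prototypes built from labeled windows
    prototypes = {"rest": encode([1, 2, 3]), "grasp": encode([7, 8, 9])}
    print(classify(encode([1, 2, 4]), prototypes))  # -> "rest"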

The second section of this thesis explores the use of the paradigm for robotics, including the development of a novel reactive robotics algorithm with a weighted heterogeneous sensor encoding scheme that intelligently prioritizes successful behaviors. This scheme boosts the success rate in a 2-D navigation task by over 30%, even when integrated into a neural network.
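As a rough illustration of weighted heterogeneous sensor encoding, the sketch below bundles sensor-action records with weights that favor behaviors observed to succeed more often. The specific sensors, actions, and weighting rule are assumptions; the thesis's exact scheme may differ.

    import numpy as np

    D = 2000
    rng = np.random.default_rng(0)

    def random_hv():
        return rng.integers(0, 2, size=D, dtype=np.uint8)

    def weighted_bundle(hvs, weights):
        # Weighted majority vote: each record contributes in proportion
        # to its weight (here, a success-based score for its behavior)
        votes = np.zeros(D)
        for hv, w in zip(hvs, weights):
            votes += w * hv
        return (votes > sum(weights) / 2).astype(np.uint8)

    # Hypothetical heterogeneous sensors bound to the action they triggered
    sensors = {"sonar": random_hv(), "imu": random_hv()}
    actions = {"turn_left": random_hv(), "forward": random_hv()}

    records = [
        np.bitwise_xor(sensors["sonar"], actions["turn_left"]),
        np.bitwise_xor(sensors["imu"], actions["forward"]),
    ]
    # Behaviors that succeeded more often receive larger weights (assumed rule)
    program = weighted_bundle(records, weights=[3.0, 1.0])

    # Recall: unbind with the sonar hypervector and compare against actions
    query = np.bitwise_xor(program, sensors["sonar"])
    best = min(actions, key=lambda a: np.count_nonzero(query != actions[a]))
    print(best)  # -> "turn_left"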

The final section of this thesis pulls together the prior elements to realize a user-adaptive neural prosthetic with shared control. The controller recognizes the user's behaviors, predicts their next action based on habitual sequences, and determines prosthetic actuation through intelligent deliberation between the user's goal and sensor feedback-driven autonomy. With each layer designed for hardware efficiency to enable in-sensor implementation, the system achieves an overall accuracy of 93%.
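The control flow of the three layers can be sketched schematically as below. Every model, function name, and the confidence-threshold deliberation rule is a placeholder assumption standing in for the HDC-based components described in the thesis.

    import numpy as np

    # Schematic shared-control loop; every model below is a placeholder.

    def recognize_behavior(biosignal_window, classifier):
        # Layer 1: classify the user's current behavior from biosignals
        return classifier(biosignal_window)

    def predict_next(behavior_history, sequence_model):
        # Layer 2: predict the next intended action from habitual sequences
        return sequence_model(tuple(behavior_history))

    def deliberate(user_goal, autonomy_goal, autonomy_confidence, threshold=0.8):
        # Layer 3: defer to sensor-feedback-driven autonomy only when it is
        # confident; otherwise follow the user's predicted goal (assumed rule)
        return autonomy_goal if autonomy_confidence > threshold else user_goal

    # Hypothetical toy models standing in for the HDC-based layers
    classifier = lambda window: "reach" if np.mean(window) > 0 else "rest"
    sequence_model = {("rest", "reach"): "grasp"}.get

    history = ["rest", recognize_behavior(np.ones(64), classifier)]
    user_goal = predict_next(history, sequence_model)
    action = deliberate(user_goal, autonomy_goal="release", autonomy_confidence=0.3)
    print(action)  # -> "grasp"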
