The robotics industry is on the rise: an increasing number of robots are being used in factories and households. Our capstone project, Controls for Assistive Robots, addresses this market demand. We aim to implement and integrate various subsystems of UC Berkeley’s InterACT lab to enhance its infrastructure. My technical contribution is a robot perception system that gives the robot the ability to “see” and interact with its environment. Specifically, we implemented a vision system using Microsoft’s Kinect v2 sensor and turned the Kinect into a quantitative sensor ready for use in research. We also integrated it with the lab’s other subsystems and built a system with which the robot can detect the pose of certain objects in the real world and plan grasps for them in a 3D virtual environment. With the systems we developed, future researchers in the lab will be able to conduct their own research and continue making a meaningful impact on the robotics industry.
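The abstract mentions detecting the pose of objects from Kinect depth data. As an illustrative sketch only (not the lab's actual pipeline, whose implementation details are not given here), one common baseline is to estimate an object's pose from a segmented 3D point cloud using the centroid for translation and principal component analysis for orientation:

```python
import numpy as np

def estimate_pose(points):
    """Estimate a 6-DoF pose of an object from its segmented 3D point cloud:
    the centroid gives the translation, and the principal axes of the
    centered points (via SVD) give an orientation frame."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)               # translation estimate
    centered = pts - centroid
    # Right-singular vectors are the principal axes of the cloud
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    rotation = vt.T                           # columns = principal axes
    # Flip one axis if needed so the frame is right-handed (det = +1)
    if np.linalg.det(rotation) < 0:
        rotation[:, -1] *= -1
    return centroid, rotation

# Synthetic box-shaped cloud whose longest side lies along x,
# shifted to a hypothetical position in the camera frame
rng = np.random.default_rng(0)
cloud = rng.uniform([-0.10, -0.02, -0.01],
                    [ 0.10,  0.02,  0.01], size=(500, 3)) + [0.5, 0.0, 0.3]
t, R = estimate_pose(cloud)
```

Here `t` recovers the box's center and the first column of `R` aligns with the box's long axis (up to sign), which a grasp planner could then use to orient the gripper.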



