If robots are to function in unconstrained, dynamic, real-world environments, it is critical that they are able to interact with deformable materials. Deformable materials have historically been overlooked in traditional robotics due to the prevailing assumption that both robots and objects are rigid. These assumptions, though appropriate for constrained environments such as factory settings, are often violated in the real world. At the same time, deformable interactions are uniquely challenging due to the infinite-dimensional, nonlinear nature of the materials involved. To address the challenges of introducing and implementing deformation in robotics, we present this thesis work in two parts. First, we study deformation on the robot side, where we design a soft tactile sensor and demonstrate its use in human-robot interaction and automation. Second, we explore deformation on the object side, where in particular we focus on optimizing grasp strategies over 3D field quantities.

In our first body of work, we propose and fabricate a novel soft tactile device that utilizes an embedded 3D depth-sensing camera to produce interpretable signals for geometry and force sensing. We demonstrate that this sensor is inherently safe and effective for applications including physical upper-limb assistance for humans, contour following for domestic wiping tasks, and geometry-aware learning from demonstration for general contact-rich manipulation. In the second part of the thesis, we develop grasp planners for 3D deformable objects (e.g., fruits, internal organs, containers) for applications in food processing, robotic surgery, and household automation. In particular, we optimize for field quantities that are not only inaccessible in the real world, but have also been, until recent years, computationally intractable to model. We create DefGraspSim, a finite element method-based physics simulator of arbitrary grasps on arbitrary 3D meshes over a wide range of material parameters. We also create DefGraspNets, a graph neural network-based forward dynamics model that is not only up to 1500x faster than DefGraspSim, but also enables gradient-based grasp optimization. For both methods, we demonstrate generalized performance across multiple test sets, including in real-world experiments.
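The key idea behind gradient-based grasp optimization through a differentiable forward model can be illustrated with a minimal sketch. Everything below is a hypothetical stand-in, not the thesis's actual model: a toy quadratic cost plays the role of a learned surrogate (such as a graph neural network) that predicts a deformation-related cost from grasp pose parameters, and its gradient drives plain gradient descent toward a better grasp.

```python
# Illustrative sketch only: a toy differentiable surrogate standing in for a
# learned forward dynamics model. All names and the quadratic cost are
# hypothetical; a real model would supply gradients via autodiff.

def surrogate_cost(grasp, target=(0.3, -0.1, 0.45)):
    """Toy stand-in for a learned model: predicted cost is the squared
    distance of the grasp parameters from a low-deformation grasp `target`
    (unknown to the optimizer)."""
    return sum((g - t) ** 2 for g, t in zip(grasp, target))

def surrogate_grad(grasp, target=(0.3, -0.1, 0.45)):
    """Analytic gradient of the toy cost w.r.t. the grasp parameters."""
    return [2.0 * (g - t) for g, t in zip(grasp, target)]

def refine_grasp(grasp, lr=0.1, steps=100):
    """Gradient descent on the surrogate's predicted cost."""
    g = list(grasp)
    for _ in range(steps):
        grad = surrogate_grad(g)
        g = [gi - lr * dgi for gi, dgi in zip(g, grad)]
    return g

initial = [0.0, 0.0, 0.0]   # initial grasp pose parameters
refined = refine_grasp(initial)
print(refined)              # converges toward the low-cost grasp
```

The design point the sketch captures is that once the forward model is differentiable, grasp refinement reduces to first-order optimization, avoiding the expense of querying a full finite element simulation at every candidate grasp.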
