Tool use has long been a feature of human intelligence, as well as a practical problem to solve for a wide range of robotic applications. But machines still struggle to exert the right amount of force to control tools that aren’t rigidly attached to their hands.
To manipulate tools more robustly, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with the Toyota Research Institute (TRI), designed a system that can grasp tools and apply the appropriate amount of force for a given task, like wiping up a liquid or writing a word with a pen.
The system, dubbed Series Elastic End Effectors, or SEED, uses soft bubble grippers and built-in cameras to sense how the grippers deform over a six-dimensional space (think of an airbag inflating and deflating) and apply force to a tool. With six degrees of freedom, the tool can be moved left and right, up and down, back and forth, and also rolled, pitched, and yawed. The closed-loop controller – a self-regulating system that maintains a desired state without human intervention – uses SEED and visuotactile feedback to adjust the position of the robot arm so that it applies the desired force.
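The closed-loop idea can be sketched in a few lines: keep comparing the sensed force with the commanded one and nudge the arm along the contact normal until they match. Everything here – the gain, the linear contact model, and the function names – is an illustrative assumption, not the SEED implementation.

```python
# Minimal closed-loop force regulation sketch (illustrative, not SEED's code).
# The controller moves the arm toward or away from the surface until the
# sensed force matches the commanded force.

def regulate_force(f_desired, f_sensed, position, gain=0.001):
    """One control step: move along the contact normal to close the force gap."""
    error = f_desired - f_sensed
    return position + gain * error  # press harder if the sensed force is too low

# Simulated contact: sensed force grows linearly with penetration depth.
stiffness = 500.0   # N/m, assumed surface/gripper stiffness
pos, f_des = 0.0, 5.0
for _ in range(200):
    f_sensed = stiffness * pos
    pos = regulate_force(f_des, f_sensed, pos)
# pos converges toward f_des / stiffness = 0.01 m
```

In the real system the "position" is a full 6D pose and the force comes from the bubble deformation, but the feedback structure is the same.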
This could be useful, for example, for someone using tools when there is uncertainty in the height of a table, as a pre-programmed trajectory could completely miss the table. “We relied heavily on the work of Mason, Raibert, and Craig on what we call a hybrid force-position controller,” says Hyung Ju Suh, a PhD student in electrical engineering and computer science at MIT, CSAIL affiliate, and lead author of a new paper on SEED. “It’s the idea that, if you actually had three dimensions to move in when you write on a blackboard, you want to be able to control position on some of the axes, while controlling force on the remaining axis.”
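The hybrid force-position idea from the quote can be written down directly: a diagonal selection matrix picks which axes are position-controlled, and its complement picks the force-controlled axes. This is a minimal sketch of that decomposition; the gains and error values are made up for illustration.

```python
import numpy as np

# Hybrid force-position control sketch (after Raibert & Craig).
# S selects the position-controlled axes; (I - S) the force-controlled ones.

def hybrid_command(S, x_err, f_err, kp_pos=1.0, kp_force=0.01):
    """Combine position and force errors into one motion command.

    S      : (3, 3) diagonal selection matrix (1 = position-controlled axis)
    x_err  : position error per axis (m)
    f_err  : force error per axis (N)
    """
    I = np.eye(3)
    # Position feedback acts only on selected axes; force feedback only
    # on the complementary axes.
    return S @ (kp_pos * x_err) + (I - S) @ (kp_force * f_err)

# Writing on a blackboard: control position in the board plane (x, y),
# control force along the axis into the board (z).
S = np.diag([1.0, 1.0, 0.0])
cmd = hybrid_command(S,
                     x_err=np.array([0.02, -0.01, 0.5]),
                     f_err=np.array([3.0, 3.0, 2.0]))
# The z component of cmd comes from the force error alone; x and y
# come from the position error alone.
```

The large z position error (0.5 m) is deliberately ignored by the controller: on the force-controlled axis, only the force error matters.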
Rigid-bodied robots can only take us so far; softness and compliance offer the luxury and the ability to deform, to sense the interaction between the tool and the hand.
With SEED, every measurement the robot takes is a recent 3D image of the grippers, allowing real-time tracking of how the grippers change shape around an object. These images are used to reconstruct the tool’s position, and the robot uses a learned model to map that position to the measured force. The learned model is built from the robot’s previous experience, in which it pokes at a force-torque sensor to work out the stiffness of the bubble grippers. Once the robot senses the force, it compares it with the force the user is commanding, and might say to itself, “It turns out that the force I’m feeling right now isn’t quite there. I have to press harder.” It would then move in the direction that increases the force, all in 6D space.
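The calibration step described above can be sketched as a simple fit: poke at known deformations, record the forces, and fit a stiffness model that later maps deformation back to force. The linear one-parameter model and the synthetic data here are assumptions for illustration; SEED learns its model from real force-torque measurements.

```python
import numpy as np

# Sketch of stiffness calibration and force estimation (illustrative only).
rng = np.random.default_rng(0)
true_K = 800.0  # N/m, assumed bubble stiffness

# Calibration: press into a force-torque sensor at known deformations
# and record the (noisy) measured forces.
deformations = np.linspace(0.001, 0.01, 20)                 # m
forces = true_K * deformations + rng.normal(0, 0.05, 20)    # N

# Least-squares fit of the one-parameter model F = K * d.
K_hat = np.sum(deformations * forces) / np.sum(deformations ** 2)

def estimate_force(deformation):
    """Map an observed bubble deformation back to an applied force."""
    return K_hat * deformation

# At runtime, the estimated force is compared against the commanded
# force, and the arm presses harder or eases off accordingly.
f_est = estimate_force(0.005)   # close to true_K * 0.005 = 4 N
```

In the actual system both the deformation and the force are six-dimensional (forces and torques), so the scalar stiffness becomes a stiffness matrix, but the estimation logic is analogous.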
In the “squeegee task,” SEED applied just the right amount of force to wipe up liquid on a plane, where baseline methods struggled to get the right swipe. When asked to put pen to paper, the bot wrote out “MIT,” and it was also able to apply the right amount of force to drive in a screw.
While SEED knew it had to command a force or torque for a given task, gripping too hard would inevitably make the object slip, so there is an upper limit on the stiffness it can exert. Also, a stiff robot can simulate systems that are softer than its natural mechanical stiffness, but not the other way around.
Currently, the system assumes a very specific tool geometry: the tool must be cylindrical, and there are still many limitations on how it generalizes when encountering new types of shapes. Future work may involve extending the framework to different shapes so that it can handle arbitrary tools in the wild.
“No one will be surprised that compliance can help with tools, or that force sensing is a good idea; the question here is where on the robot the compliance should go and how soft it should be,” says paper co-author Russ Tedrake, Toyota Professor of Electrical Engineering and Computer Science, Aeronautics and Astronautics, and Mechanical Engineering at MIT and CSAIL principal investigator. “Here we explore regulating a fairly soft six-degree-of-freedom stiffness directly at the hand/tool interface, and show that there are some nice benefits to doing so.”
Suh wrote the paper alongside Naveen Kuppuswamy, senior researcher at the Toyota Research Institute; Tao Pang, a PhD student in mechanical engineering at MIT and CSAIL affiliate; Paul Mitiguy and Alex Alspach of TRI; and Tedrake. They will present the work at the IEEE/RSJ International Conference on Intelligent Robots and Systems in October.
The Toyota Research Institute provided funds to support this work.