This robotic arm is about 1.5 meters long, has three joints and a four-fingered hand, and stands upright. The human ability to catch objects draws on many faculties working in concert, and replicating it in a robot is a formidable task. A robot that can catch or dodge a complex object flying at it in full motion needs not only highly responsive hardware, but also techniques that predict the moving object's dynamics and generate a corresponding movement in response.
It should be noted that most machines in use today are pre-programmed and cannot adapt to rapidly changing data. Tasks like catching an object mid-air require a robot to perform complex calculations in a fraction of a second and act on the results immediately.
Inspired by the human ability to learn through imitation and trial and error, researchers at EPFL's LASA set out to replicate that adaptability and speed. They employed a technique known as "Programming by Demonstration", which does not give the robot explicit instructions: instead, they repeatedly threw objects at the robotic arm along different trajectories and let it learn, attempt by attempt, how to use its arm to catch them.
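The core idea behind Programming by Demonstration can be illustrated with a toy sketch: each demonstration pairs an observed object state with the arm posture used to catch it, and at run time the robot reuses the posture from the most similar demonstration. This is only a minimal illustration of the concept; the feature layout, the example values, and the nearest-neighbor lookup are assumptions for this sketch, not the actual LASA learning method.

```python
import math

# Toy "demonstration" database: each entry pairs an observed object
# state (position and velocity flattened into one feature vector) with
# the joint angles a human guided the 3-joint arm into for that throw.
# All numbers here are invented for illustration.
demonstrations = [
    # (object state features,                joint angles)
    ([0.9, 0.2, 1.4, -2.0, 0.1, 3.5], [0.10, 0.85, -0.30]),
    ([0.5, -0.1, 1.1, -1.5, 0.0, 4.0], [0.25, 0.60, -0.10]),
    ([1.2, 0.4, 1.6, -2.5, 0.2, 3.0], [-0.05, 1.00, -0.45]),
]

def recall_posture(state):
    """Return the demonstrated joint angles whose recorded object
    state is closest (Euclidean distance) to the current observation."""
    return min(demonstrations, key=lambda demo: math.dist(demo[0], state))[1]
```

A real system would generalize between demonstrations (for example by fitting a continuous model over them) rather than simply replaying the nearest one, but the lookup captures the spirit of learning from repeated throws instead of hand-coding the motion.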
The researchers conducted throwing and catching experiments with different objects: a ball, a half-full bottle, a hammer, an empty bottle, and a tennis racket. These five common objects were selected because they cover a varied range of situations in which the part of the object the robot has to catch (the handle of the racket, for example) does not correspond to its center of gravity. In the case of the half-full bottle, for instance, the center of gravity shifts several times during flight. When projected into the air, all of these items make even more complex movements, often around several axes.
The robotic arm is surrounded by a series of cameras that it uses to build a model of each object's kinetics from parameters such as speed, trajectory, and rotational movement. Whenever an object is thrown, the robot uses a pre-programmed equation to position itself in the right direction very quickly, then refines and corrects its trajectory within a few milliseconds for a real-time, high-precision catch. The researchers further improved efficiency by developing controllers that couple and synchronize the movements of the hand and fingers.
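To see why this prediction can be done in milliseconds, consider a simplified version of the problem: fitting a ballistic model to the first few camera samples of a throw and extrapolating the object's position at catch time. The sketch below assumes an idealized point object under gravity alone (no drag, no shifting center of gravity) and uses a closed-form least-squares fit; it is a minimal illustration, not the actual EPFL model.

```python
G = 9.81  # gravitational acceleration, m/s^2

def fit_linear(ts, ys):
    """Closed-form ordinary least-squares fit of y = a + b*t."""
    n = len(ts)
    mt = sum(ts) / n
    my = sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    return my - b * mt, b

def predict_catch_point(samples, t_catch):
    """Predict the object's position at time t_catch from early camera
    samples [(t, x, y, z), ...], assuming ballistic flight: x and y are
    linear in t, while z follows z0 + vz*t - 0.5*G*t**2."""
    ts = [s[0] for s in samples]
    x0, vx = fit_linear(ts, [s[1] for s in samples])
    y0, vy = fit_linear(ts, [s[2] for s in samples])
    # Add back the known gravity term so z also becomes linear in t.
    z0, vz = fit_linear(ts, [s[3] + 0.5 * G * s[0] ** 2 for s in samples])
    return (x0 + vx * t_catch,
            y0 + vy * t_catch,
            z0 + vz * t_catch - 0.5 * G * t_catch ** 2)
```

In this simplified setting the fit reduces to a handful of sums, which is why it runs comfortably within a millisecond budget; the real system must additionally handle rotation, drag, and objects whose grasp point is offset from the center of gravity.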
Do take a look at the video put together by the EPFL team:
What are your thoughts on that? Share with us in comments below.
Source: EPFL Research News