Artificial Intelligence strengthens robots’ perception

AI is advancing quickly, and it seems that robots will soon be able to perceive their surroundings through combined senses of touch and sight.

Scientists at the Massachusetts Institute of Technology's (MIT) Computer Science and Artificial Intelligence Laboratory have developed a new system that allows robots to infer how objects and textures feel just by looking at them.

The system builds a record of tactile information from the representations the robot obtains by touching each object. By viewing a visual sequence, the model can then predict the sensation of touching a flat surface or a sharp object.

Similarly, by touching blindly, the model can predict its interactions with the environment based solely on tactile sensations. Together, the two senses can assist the robot and reduce the data it needs for tasks that involve grasping and manipulating objects.
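The two-way prediction described above can be pictured as a pair of learned mappings between visual and tactile feature spaces. The toy linear "model" below is purely illustrative (all names and weights are assumptions, not MIT's actual architecture, which uses deep networks trained on paired camera and tactile-sensor images):

```python
# Illustrative sketch of cross-modal prediction between vision and touch.
# The linear maps and all feature dimensions here are hypothetical; the real
# system learns these mappings with deep neural networks from paired data.

def predict_touch_from_vision(visual_features, weights):
    """Predict a tactile feature vector from visual features (toy linear map)."""
    return [sum(w * v for w, v in zip(row, visual_features)) for row in weights]

def predict_vision_from_touch(tactile_features, weights):
    """Predict a visual feature vector from tactile features (toy linear map)."""
    return [sum(w * t for w, t in zip(row, tactile_features)) for row in weights]

# Toy example: 3-dim visual features mapped to 2-dim tactile features.
vision = [0.2, 0.5, 0.1]            # e.g. edge/texture cues from a camera frame
v2t = [[1.0, 0.0, 0.5],             # hypothetical learned weights
       [0.0, 2.0, 0.0]]
touch = predict_touch_from_vision(vision, v2t)
print(touch)
```

In the actual research the "features" are full images (camera frames and tactile-sensor readings), but the principle is the same: each modality can be translated into a prediction of the other.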

Yunzhu Li, lead author of the work behind this new system, argues that this Artificial Intelligence technology will allow robots to handle real objects more naturally. According to the team, it is the first method that can convincingly translate between visual and tactile signals, which makes it a potentially very useful approach for robotics.

It seems that we will soon see robots adapt more effectively to their environment and to the objects that surround them. The model has shown strong performance in predicting the feel of what it holds.

L.Sáenz

Source: Fayer Wayer 
