MIT's sensor-based glove teaches AI to identify objects

The system reached 76% accuracy in tests, a promising result for robotics and medicine, particularly for the design of prostheses for people with disabilities

A group of researchers at the Massachusetts Institute of Technology (MIT) has developed the Scalable TActile Glove (STAG), whose purpose is to help determine how humans identify what they touch, using the 550 pressure mini-sensors built into it.

The glove will make it possible to establish patterns for the design of robotic manipulators and of advanced hand prostheses for people with disabilities or who have suffered an amputation after an accident.

The MIT team has recorded a large dataset with the artificial intelligence built into the STAG glove. According to the institute, the 550 mini-sensors allowed the researchers to collect 135,000 frames of tactile data from 26 everyday objects, such as spoons, soda cans, scissors, cups and pens.

During the grasping process, in which the glove holds an object, the neural network works from semi-randomly sampled pressure frames that together build up a complete picture of the object. The scientists then restricted these semi-random frames to groups of images strictly linked to the grasp itself, so that the AI does not learn from irrelevant data.
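As a rough illustration of this kind of pipeline, and not the authors' actual code, the sketch below classifies an object from a handful of semi-randomly sampled pressure frames using a small convolutional network. The frame size (32x32), the number of frames per grasp, and the network shape are all assumptions made for the example.

```python
# Illustrative sketch only: classify an object from a few tactile pressure
# frames, assuming 32x32 pressure maps, 26 object classes, and a small CNN.
# None of the sizes or names come from the MIT work; they are placeholders.
import torch
import torch.nn as nn

NUM_CLASSES = 26      # everyday objects (spoons, cans, scissors, ...)
FRAMES_PER_GRIP = 8   # semi-randomly sampled frames per grasp (assumed)

class TactileClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Per-frame feature extractor over the pressure map.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, NUM_CLASSES)

    def forward(self, frames):
        # frames: (batch, FRAMES_PER_GRIP, 1, 32, 32) pressure maps
        b, f, c, h, w = frames.shape
        feats = self.features(frames.view(b * f, c, h, w)).view(b, f, -1)
        # Average features over the sampled frames of one grasp, so that
        # individual frames unrelated to the grip contribute little signal.
        return self.classifier(feats.mean(dim=1))

if __name__ == "__main__":
    model = TactileClassifier()
    fake_grip = torch.rand(1, FRAMES_PER_GRIP, 1, 32, 32)  # synthetic data
    print(model(fake_grip).argmax(dim=1))  # predicted object class index
```

Pooling the per-frame features before classification is one simple way to combine several tactile snapshots of the same grasp into a single prediction.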

The accuracy achieved was 76%, a highly satisfactory result for the researchers, who also highlight two characteristics of the system: its low cost and its high sensitivity.

Compared with similar gloves on the market, the new glove uses materials that are easy to obtain and costs barely US$10, whereas comparable devices cost thousands of dollars.

M.Pino

Source: Teckcrispy
