Researchers just developed an electronic glove that gives robots a sense of touch.

A team of engineers at Stanford University has created an electronic glove equipped with touch sensors. If the technology proves scalable, it could give robotic hands better dexterity in the future. In a study published in Science Robotics, the Stanford researchers detailed how the sensors enabled a robotic hand to touch a berry without squashing it.

“This technology puts us on a path to one day giving robots the sort of sensing capabilities found in human skin,” said Zhenan Bao, a chemical engineer at Stanford University and lead author of the study.

According to Bao, the sensors on the tips of the glove’s fingers can measure the intensity and the direction of pressure at the same time. This allows the glove to achieve human-like manual dexterity.

At the moment, Bao and her team have yet to perfect the technology so that the sensors work automatically. The goal is for a robot wearing the glove to hold an egg between its forefinger and thumb without breaking it or letting it slip.


Imitating the Human Sense of Touch

The electronic glove mimics the way human skin works, giving our hands the sensitivity they need to function normally.

The sensors the Stanford researchers put on the fingertips of the electronic glove mimic the mechanism behind the human sense of touch. The team built each of the three sensors from three flexible layers that work together.

The top and bottom layers are made up of electrical lines laid in grids perpendicular to each other. This structure creates a dense array of small sensing pixels and makes both surfaces electrically active.

Between the two layers sits a rubber insulator that keeps the two electrodes apart. The gap plays a crucial role, because it is where electrical energy is stored. When pressure is applied to a sensor, the top layer pushes closer to the bottom one, increasing the amount of stored energy.
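The behavior described above is essentially that of a parallel-plate capacitor: squeezing the insulating gap raises the capacitance, and with it the energy stored at a fixed voltage. A minimal sketch of that relationship (all dimensions, permittivity, and voltage below are illustrative assumptions, not values from the study):

```python
# Illustrative parallel-plate capacitor model of one sensing pixel.
# All numeric values are assumptions for demonstration, not from the study.
EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m


def capacitance(area_m2, gap_m, rel_permittivity):
    """C = eps0 * eps_r * A / d for a parallel-plate capacitor."""
    return EPSILON_0 * rel_permittivity * area_m2 / gap_m


def stored_energy(cap_f, voltage_v):
    """Energy held at a fixed voltage: E = 1/2 * C * V^2."""
    return 0.5 * cap_f * voltage_v ** 2


area = 1e-6   # 1 mm^2 pixel (assumed)
eps_r = 3.0   # relative permittivity of the rubber insulator (assumed)
volts = 1.0   # drive voltage (assumed)

c_rest = capacitance(area, 100e-6, eps_r)    # 100-micron gap at rest
c_pressed = capacitance(area, 80e-6, eps_r)  # gap shrinks under pressure

# Pressing the top layer closer raises capacitance and stored energy.
assert c_pressed > c_rest
assert stored_energy(c_pressed, volts) > stored_energy(c_rest, volts)
```

Reading out how much the stored energy changed at each pixel tells the electronics how hard that spot is being pressed.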

The bumpy bottom layer maps the intensity and the direction of pressure to particular points on the perpendicular grids, much the way human skin does.
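Because each sensing pixel sits at the crossing of one top-layer line and one bottom-layer line, the electronics can locate pressure by scanning every row/column intersection. A hypothetical sketch of that idea (the scan routine and the fake readings below are illustrative, not the study's actual readout code):

```python
# Hypothetical row-column scan of a sensor grid: each pixel sits at the
# intersection of one top-layer row line and one bottom-layer column line.
# The simulated readings are made-up numbers, not data from the study.

def scan_grid(read_pixel, n_rows, n_cols):
    """Build a 2D pressure map by addressing every row/column crossing."""
    return [[read_pixel(r, c) for c in range(n_cols)] for r in range(n_rows)]


def peak_pressure(pressure_map):
    """Locate the most strongly pressed pixel and its reading."""
    best = (0, 0)
    for r, row in enumerate(pressure_map):
        for c, value in enumerate(row):
            if value > pressure_map[best[0]][best[1]]:
                best = (r, c)
    return best, pressure_map[best[0]][best[1]]


# Fake reader: pressure concentrated around pixel (1, 2).
fake_reading = lambda r, c: max(0.0, 5.0 - abs(r - 1) - abs(c - 2))
grid = scan_grid(fake_reading, 4, 4)
location, value = peak_pressure(grid)
print(location, value)  # (1, 2) 5.0
```

Knowing both where the peak is and how the readings fall off around it is what lets a controller judge the direction of the applied force, not just its strength.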

“We can program a robotic hand to touch a raspberry without crushing it, but we’re a long way from being able to touch and detect that it is a raspberry and enable the robot to pick it up,” Bao said.

What are the possible implications of giving robots human-like senses?
