Robotic Finger Digs, Finds, Identifies Small Buried Objects

With the help of AI and a neural network, this wiggling robotic finger digs into substances such as sand to locate and identify buried objects. Robotic hands, fingers, grippers, and their various sensors are usually deployed to function in “free space,” for obvious reasons. But in the real world, it’s often necessary to poke and probe into materials such as sand while looking for objects like buried nails or coins (best case) or explosives (a far-worse case).

Addressing this challenge, a team at the Massachusetts Institute of Technology (MIT) has devised and tested their second-generation robotic finger specifically designed for this class of mission with impressive success: Their “Digger Finger” was able to dig through granular media such as sand and rice and correctly sense the shapes of submerged items it encountered. The objective is an electromechanical finger that replicates the capabilities of a human one, which can find, feel, and “see” (mentally) the shape of a buried object based on sensing and learned experience.

The project consists of two major aspects: a robotic finger that can penetrate a granular mass to get close enough to illuminate an object and capture a reflected—albeit highly distorted—pattern, followed by artificial-intelligence/neural-network data-extraction and analysis algorithms. At the core of the hardware design is GelSight tactile sensing, adapted by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) into a slender, sharp-tipped robot finger.
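The analysis stage described above—feeding a distorted, reflected tactile image into a trained network that outputs a shape estimate—can be illustrated with a toy sketch. Everything below is an assumption for illustration only: the class labels, image resolution, layer sizes, and randomly initialized weights are hypothetical stand-ins, not the Digger Finger team’s actual model or training pipeline.

```python
import numpy as np

# Illustrative sketch only: a tiny feed-forward network that maps a
# flattened tactile "image" to one of a few candidate buried-object shapes.
# All names and dimensions are hypothetical, not from the MIT paper.

rng = np.random.default_rng(0)

SHAPE_CLASSES = ["cube", "sphere", "cylinder"]  # hypothetical labels
IMG_PIXELS = 16 * 16                            # hypothetical low-res frame

# Randomly initialized weights stand in for a trained model.
W1 = rng.standard_normal((IMG_PIXELS, 32)) * 0.1
b1 = np.zeros(32)
W2 = rng.standard_normal((32, len(SHAPE_CLASSES))) * 0.1
b2 = np.zeros(len(SHAPE_CLASSES))

def classify(tactile_image: np.ndarray) -> str:
    """Forward pass: flatten -> hidden ReLU layer -> softmax over shapes."""
    x = tactile_image.reshape(-1)
    h = np.maximum(0.0, x @ W1 + b1)   # hidden activations
    logits = h @ W2 + b2
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()               # softmax probabilities
    return SHAPE_CLASSES[int(np.argmax(probs))]

# A synthetic, noise-distorted frame stands in for a real tactile capture.
frame = rng.standard_normal((16, 16))
print(classify(frame))
```

In a real system the weights would come from training on labeled tactile captures, and a convolutional architecture would be a more natural fit for image input; the point here is only the overall shape of the inference step: capture, flatten, forward pass, predicted class.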

What you’ll learn:

  • Why and how a complicated robotic-finger design was improved by a research team.
  • How this finger is used to dig into bulk materials to locate and “see” small objects.
  • How the reflected image of the object is the basis for the data set that determines the actual shape of the object.
  • How AI and a neural net were employed to analyze the distorted, reflected object images that were captured.

Click here to read the full article in Electronic Design.

[PODCAST] Tech of Sports: Felix Breitschädel, PhD, Norwegian Olympic Sports Center

On the ‘Tech of Sports’ podcast, Rick talks with Felix Breitschädel, PhD, of the Norwegian Olympic Sports Centre, about how the Norwegian Ski Team is using unique technology to give its skiers an edge on the slopes by arming them with detailed data, thanks to GelSight—whose handheld 3D measurement device the team is using to, quite literally, measure snow.

The skiers need to understand how the surfaces of their skis will interact with the snow. Other measurement tools could capture the snow’s topography, but the snow had to be taken out of its natural environment and into the lab, which affected results. With the handheld GelSight Mobile device, Felix and his team can examine the snow’s surface right on the slopes or rink, identify the snow grains, and capture images of the surface for future reference. The measurements also help inform R&D decisions on developing new grinds for equipment ahead of upcoming competitions.

Click here to listen to the episode.

How Computers With Humanlike Senses Will Change Our Lives

The Future of Everything covers the innovation and technology transforming the way we live, work and play, with monthly issues on health, money, cities and more. This month is Artificial Intelligence, online starting July 2 and in the paper on July 9.

Even the smartest computers cannot fully understand the world without the ability to see, hear, smell, taste or touch. But in the decadeslong race to make software think like humans—and beat them at “Jeopardy!”—the idea of endowing a machine with humanlike senses seemed far-fetched. Not anymore, engineers and researchers say.

Capabilities powered by artificial intelligence, like image or voice recognition, are already commonplace features of smartphones and virtual assistants. Now, customized sensors, machine learning and neural networks—a subset of AI that mimics the way our brains work—are pushing digital senses to the next level, creating robots that can tell when a package is fragile, sniff out an overheated radiator or identify phony Chardonnay.

Hype around AI is running high, and much of the research is in early stages. Here, we look at 10 working models and prototypes of AI with sensory abilities.

Click here to read the full article in the Wall Street Journal (subscription may be required).