Teaching robots to perceive, understand, and interact through touch

Facebook battles the challenges of tactile sensing

Facebook is enabling a new generation of touchy-feely robots

Mobile Device for Surface Characterization

GelSight introduced the new version of its GelSight Mobile probe, the Series 2. The mobile device offers a sleek form factor that is one-third lighter and less than half the volume of its predecessor, allowing it to scan surfaces in tighter spaces while maintaining accuracy, speed, and field of view. The technology enables digital tactile sensing with the sensitivity and resolution of human touch. Data captured by the elastomeric tactile sensing platform is processed with proprietary software and algorithms to provide detailed, accurate surface characterization. The handheld device can be deployed on production and assembly lines to enable rapid, documented quality assurance decisions.

This article was originally posted on inVision News.

GelSight Mobile Series 2 Offers Compact and Ergonomic Tactile Measurement

GelSight has introduced the latest version of its GelSight Mobile probe, the Series 2. This new generation of GelSight’s mobile device offers a sleek form factor that is one-third lighter and less than half the volume of its predecessor, allowing it to scan surfaces in tighter spaces while maintaining accuracy, speed, and field of view.

GelSight’s technology enables digital tactile sensing with the sensitivity and resolution of human touch. Data captured by GelSight’s elastomeric tactile sensing platform is processed with proprietary software and algorithms to provide detailed, accurate surface characterization that can generate significant gains in productivity both on the production floor and in the field, while also reducing the costs associated with manual or tool-based visual inspection. The handheld GelSight Mobile device can be deployed on production and assembly lines to enable rapid and well-documented quality assurance decisions. Dimensions of scratches, dents, hits, gaps, offsets, hole diameters, and fastener flushness can be measured in high resolution, on any surface, in seconds.

Click here to read the full article in Metrology News.

Robotic Finger Digs, Finds, Identifies Small Buried Objects

With the help of AI and a neural network, this wiggling robotic finger digs into substances such as sand to locate and identify buried objects. Robotic hands, fingers, grippers, and their various sensors are usually deployed to function in “free space,” for obvious reasons. But in the real world, it’s often necessary to poke and probe into materials such as sand while looking for objects like buried nails or coins (best case) or explosives (a far worse case).

Addressing this challenge, a team at the Massachusetts Institute of Technology (MIT) has devised and tested a second-generation robotic finger specifically designed for this class of mission, with impressive success: their “Digger Finger” was able to dig through granular media such as sand and rice and correctly sense the shapes of submerged items it encountered. The objective is an electromechanical finger that replicates the capabilities of a human one, which can find, feel, and mentally “see” the shape of a buried object based on sensing and learned experience.

The project consists of two major aspects: a robotic finger that can penetrate a granular mass to get close enough to illuminate a buried object and capture a reflected—albeit highly distorted—pattern, followed by artificial-intelligence/neural-network data-extraction and analysis algorithms. At the core of the hardware design is GelSight, a camera-based tactile sensor developed by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and reshaped here into a sharp-tipped robot finger.
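The second aspect can be pictured as a small convolutional network that maps a distorted tactile image to an object-shape label. The sketch below is purely illustrative: the layer sizes, the six-way shape vocabulary, and the input resolution are assumptions for demonstration, not details from the MIT paper.

```python
# Hypothetical sketch of the data-extraction stage: a small convolutional
# network that classifies object shape from a distorted tactile image.
# All layer sizes and the class count are illustrative assumptions.
import torch
import torch.nn as nn

class TactileShapeNet(nn.Module):
    def __init__(self, num_shapes=6):  # e.g., cube, sphere, cylinder, ...
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),   # fixed-size features for any input
        )
        self.classifier = nn.Linear(32 * 4 * 4, num_shapes)

    def forward(self, x):              # x: (batch, 3, height, width) images
        return self.classifier(self.features(x).flatten(1))

model = TactileShapeNet()
dummy = torch.rand(1, 3, 120, 160)     # stand-in for a captured gel image
print(model(dummy).shape)              # torch.Size([1, 6]) shape logits
```

In practice such a network would be trained on labeled presses of known objects; the point here is only the overall image-in, shape-label-out structure of the analysis stage.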

What you’ll learn:

  • Why and how a complicated robotic-finger design was improved by a research team.
  • How this finger is used to dig into bulk materials to locate and “see” small objects.
  • How the reflected image of the object is the basis for the data set that determines the actual shape of the object.
  • How AI and a neural net were employed to analyze the distorted, reflected object images that were captured.

Click here to read the full article in Electronic Design.

[PODCAST] Tech of Sports: Felix Breitschädel, PhD, Norwegian Olympic Sports Centre

On the ‘Tech of Sports’ podcast, Rick talks with Felix Breitschädel, PhD, of the Norwegian Olympic Sports Centre about how the Norwegian Ski Team is using unique technology to give its skiers an edge on the slopes by arming them with detailed data, thanks to GelSight, whose handheld 3D measurement device the team uses to quite literally measure snow.

The skiers need to understand how the surfaces of their skis will interact with the snow. Other tools could measure the snow’s topography, but the samples had to be taken out of the natural environment and into the lab, which affected the results. With the handheld GelSight Mobile device, Felix and his team can examine the snow’s surface right on the slopes or at the rink, identifying the snow grains and capturing images of the surface for future reference. The data also informs R&D decisions, such as developing new grinds for equipment ahead of upcoming competitions.

Click here to listen to the episode.

How Computers with Humanlike Senses Will Change Our Lives

The Future of Everything covers the innovation and technology transforming the way we live, work and play, with monthly issues on health, money, cities and more. This month is Artificial Intelligence, online starting July 2 and in the paper on July 9.

Even the smartest computers cannot fully understand the world without the ability to see, hear, smell, taste or touch. But in the decadeslong race to make software think like humans—and beat them at “Jeopardy!”—the idea of endowing a machine with humanlike senses seemed far-fetched. Not anymore, engineers and researchers say.

Capabilities powered by artificial intelligence, like image or voice recognition, are already commonplace features of smartphones and virtual assistants. Now, customized sensors, machine learning and neural networks—a subset of AI that mimics the way our brains work—are pushing digital senses to the next level, creating robots that can tell when a package is fragile, sniff out an overheated radiator or identify phony Chardonnay.

Hype around AI is running high, and much of the research is in early stages. Here, we look at 10 working models and prototypes of AI with sensory abilities.

Click here to read the full article in the Wall Street Journal (subscription may be required).

Enabling 3D Measurement for Shot-Peened Surface Characterization

GelSight’s Chief Product Officer and co-founder, Dr. Kimo Johnson, spoke with the team at Shot Peener Magazine about how our technology can help provide more accurate assessments of shot peening coverage, rate, and other process parameters.

Click here to learn more in the full feature on page 36 of their Summer 2021 issue.

Slender robotic finger senses buried items

Over the years, robots have gotten quite good at identifying objects — as long as they’re out in the open.

Discerning buried items in granular material like sand is a taller order. To do that, a robot would need fingers that were slender enough to penetrate the sand, mobile enough to wriggle free when sand grains jam, and sensitive enough to feel the detailed shape of the buried object.

MIT researchers have now designed a sharp-tipped robot finger equipped with tactile sensing to meet the challenge of identifying buried objects. In experiments, the aptly named Digger Finger was able to dig through granular media such as sand and rice, and it correctly sensed the shapes of submerged items it encountered. The researchers say the robot might one day perform various subterranean duties, such as finding buried cables or disarming buried bombs.

The research will be presented at the next International Symposium on Experimental Robotics. The study’s lead author is Radhen Patel, a postdoc in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). Co-authors include CSAIL PhD student Branden Romero, Harvard University PhD student Nancy Ouyang, and Edward Adelson, the John and Dorothy Wilson Professor of Vision Science in CSAIL and the Department of Brain and Cognitive Sciences.

Seeking to identify objects buried in granular material — sand, gravel, and other types of loosely packed particles — isn’t a brand-new quest. Previously, researchers have used technologies that sense the subterranean from above, such as ground-penetrating radar or ultrasonic vibrations. But these techniques provide only a hazy view of submerged objects. They might struggle to differentiate rock from bone, for example.

“So, the idea is to make a finger that has a good sense of touch and can distinguish between the various things it’s feeling,” says Adelson. “That would be helpful if you’re trying to find and disable buried bombs, for example.” Making that idea a reality meant clearing a number of hurdles.

The team’s first challenge was a matter of form: The robotic finger had to be slender and sharp-tipped.

In prior work, the researchers had used a tactile sensor called GelSight. The sensor consisted of a clear gel covered with a reflective membrane that deformed when objects pressed against it. Behind the membrane were three colors of LED lights and a camera. The lights shone through the gel and onto the membrane, while the camera collected the membrane’s pattern of reflection. Computer vision algorithms then extracted the 3D shape of the contact area where the soft finger touched the object. The contraption provided an excellent sense of artificial touch, but it was inconveniently bulky.
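A concrete way to picture the computer-vision step: because the three colored lights strike the membrane from different directions, each pixel’s RGB value encodes the local slope of the deformed surface, and those slopes can be integrated into a height map. Here is a minimal Python sketch of that idea, assuming a calibrated linear mapping from color to surface gradient; the matrix `A` is a stand-in for whatever per-sensor calibration a real device uses.

```python
# Minimal sketch of GelSight-style photometric reconstruction (hypothetical
# parameters): RGB values from three directional lights are mapped to
# surface gradients, which are then integrated into a height map.
import numpy as np

def rgb_to_gradients(img, A):
    """Map an HxWx3 tactile image to surface gradients (dz/dx, dz/dy).

    A is a 2x3 linear photometric model, an illustrative stand-in for a
    calibrated per-pixel lookup built from targets of known geometry.
    """
    flat = img.reshape(-1, 3)          # one RGB triple per pixel
    pq = flat @ A.T                    # linear color-to-gradient model
    h, w, _ = img.shape
    return pq[:, 0].reshape(h, w), pq[:, 1].reshape(h, w)

def integrate_gradients(p, q):
    """Recover a height map from gradients via FFT integration."""
    h, w = p.shape
    u, v = np.meshgrid(np.fft.fftfreq(w) * 2 * np.pi,
                       np.fft.fftfreq(h) * 2 * np.pi)
    denom = u**2 + v**2
    denom[0, 0] = 1.0                  # avoid division by zero at DC
    Z = (-1j * u * np.fft.fft2(p) - 1j * v * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                      # height is recovered up to an offset
    return np.real(np.fft.ifft2(Z))

# Toy usage with random data standing in for a real capture and calibration.
img = np.random.rand(240, 320, 3)
A = np.random.randn(2, 3) * 0.1       # placeholder calibration matrix
p, q = rgb_to_gradients(img, A)
depth = integrate_gradients(p, q)
print(depth.shape)                     # (240, 320) height map of the contact
```

Frankot–Chellappa integration in the Fourier domain, used above, is one standard way to turn a gradient field into a surface; real GelSight processing may differ in its details.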

For the Digger Finger, the researchers slimmed down their GelSight sensor in two main ways. First, they changed the shape to be a slender cylinder with a beveled tip. Next, they ditched two-thirds of the LED lights, using a combination of blue LEDs and colored fluorescent paint. “That saved a lot of complexity and space,” says Ouyang. “That’s how we were able to get it into such a compact form.” The final product featured a device whose tactile sensing membrane was about 2 square centimeters, similar to the tip of a finger.

With size sorted out, the researchers turned their attention to motion, mounting the finger on a robot arm and digging through fine-grained sand and coarse-grained rice. Granular media have a tendency to jam when numerous particles become locked in place, making them difficult to penetrate. So, the team added vibration to the Digger Finger’s capabilities and put it through a battery of tests.

“We wanted to see how mechanical vibrations aid in digging deeper and getting through jams,” says Patel. “We ran the vibrating motor at different operating voltages, which changes the amplitude and frequency of the vibrations.” They found that rapid vibrations helped “fluidize” the media, clearing jams and allowing for deeper burrowing — though this fluidizing effect was harder to achieve in sand than in rice.
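For intuition on why a single voltage knob changes both quantities: in a common eccentric-rotating-mass (ERM) vibration motor, drive voltage sets rotation speed, which determines the vibration frequency directly and the vibration force quadratically. The back-of-the-envelope sketch below uses purely illustrative constants; the team’s actual motor parameters are not given here.

```python
# Back-of-the-envelope model of an ERM vibration motor (illustrative
# constants only): voltage -> rotation speed -> frequency and force.
import math

def erm_vibration(voltage, kv_rpm_per_volt=3000.0,
                  eccentric_mass_kg=0.5e-3, eccentric_radius_m=1.0e-3):
    """Return (vibration frequency in Hz, peak centripetal force in N)."""
    rpm = kv_rpm_per_volt * voltage          # speed scales with voltage
    freq_hz = rpm / 60.0                     # one vibration per rotation
    omega = 2.0 * math.pi * freq_hz          # angular velocity, rad/s
    force_n = eccentric_mass_kg * eccentric_radius_m * omega**2  # F = m*r*w^2
    return freq_hz, force_n

for volts in (1.5, 2.5, 3.3):
    f, force = erm_vibration(volts)
    print(f"{volts:.1f} V -> {f:.0f} Hz, {force:.2f} N")
```

Raising the voltage therefore raises amplitude and frequency together, which matches the researchers’ description of tuning both through the operating voltage.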

They also tested various twisting motions in both the rice and sand. Sometimes, grains of each type of media would get stuck between the Digger Finger’s tactile membrane and the buried object it was trying to sense. When this happened with rice, the trapped grains were large enough to completely obscure the shape of the object, though the occlusion could usually be cleared with a little robotic wiggling. Trapped sand was harder to clear, though the grains’ small size meant the Digger Finger could still sense the general contours of the target object.

Patel says that operators will have to adjust the Digger Finger’s motion pattern for different settings “depending on the type of media and on the size and shape of the grains.” The team plans to keep exploring new motions to optimize the Digger Finger’s ability to navigate various media.

Adelson says the Digger Finger is part of a program extending the domains in which robotic touch can be used. Humans use their fingers amidst complex environments, whether fishing for a key in a pants pocket or feeling for a tumor during surgery. “As we get better at artificial touch, we want to be able to use it in situations when you’re surrounded by all kinds of distracting information,” says Adelson. “We want to be able to distinguish between the stuff that’s important and the stuff that’s not.”

Funding for this research was provided, in part, by the Toyota Research Institute through the Toyota-CSAIL Joint Research Center; the Office of Naval Research; and the Norwegian Research Council.

Original article published on MIT News: https://news.mit.edu/2021/robotic-finger-buried-underground-0526