‘Digging’ robot can locate objects concealed by granular media

A researcher from the Massachusetts Institute of Technology (MIT) has developed a robot that can search for objects buried beneath sand and other granular media.

The so-called Digger Finger, which features a GelSight sensor — a vision-based tactile sensor — can reportedly penetrate granular media to locate concealed objects.

According to the research, the prototype penetrates granular media via mechanical vibrations that cause the surrounding grains to take on fluid-like properties, whereas existing tactile sensors attempting similar tasks are typically jammed by the media. High-resolution tactile sensing then enables the identification of the concealed objects.
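
The fluidization effect the researchers exploit can be illustrated with a common rule of thumb from granular physics: a vibrated bed behaves fluid-like once the dimensionless peak acceleration Γ = A(2πf)²/g exceeds roughly 1. The sketch below is illustrative only; the amplitude and frequency values are assumptions, not the paper's operating parameters.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def vibration_gamma(amplitude_m: float, freq_hz: float) -> float:
    """Dimensionless peak acceleration of a sinusoidally vibrating probe."""
    return amplitude_m * (2 * math.pi * freq_hz) ** 2 / G

def fluidizes(amplitude_m: float, freq_hz: float) -> bool:
    """Common rule of thumb: grains take on fluid-like behaviour once gamma > 1."""
    return vibration_gamma(amplitude_m, freq_hz) > 1.0

# A 0.5 mm stroke at 100 Hz gives a gamma of roughly 20, well past the threshold.
print(fluidizes(0.0005, 100.0))
```

The same criterion explains why a slowly pushed probe jams: with no vibration, Γ is zero and the grains lock together under pressure.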

The prototype is being considered for applications including deep-sea exploration and mining. The robot could also potentially be used for explosive ordnance disposal, improvised explosive device (IED) detection and buried cable retrieval operations.

The Digger Finger is detailed in a paper titled “Digger Finger: GelSight Tactile Sensor for Object Identification Inside Granular Media,” which is available as a preprint on arXiv.

To contact the author of this article, email donlon@globalspec.com

Original article published on Engineering360: https://insights.globalspec.com/article/16051/digging-robot-can-locate-objects-concealed-by-granular-media

Researcher turns £1 make-up sponge into sensor with potential to improve medical care

A Liverpool Hope University student has found a revolutionary way to improve medical care using a simple silicone make-up sponge – bought from a high street chemist.

Alexander Co Abad discovered the £1 beauty product could form the main component of a sensor designed to be as sensitive as human skin when it comes to providing ‘touch’ feedback.

He also says there’s the potential for it to be used in a range of medical procedures – from helping robotic arms to grip instruments during surgery, to even detecting tumours.

Alexander’s work has been centred on advancing something called the ‘GelSight’ sensor, first created back in 2009 by scientists at America’s Massachusetts Institute of Technology (MIT).

Alexander is from Manila and is currently working towards a PhD at Hope through a scholarship from the Philippine government. He had been “fascinated” by the sensor since it was introduced to him by his research supervisor, but wanted to make it even more accessible to medical professionals.

To do that he created a homemade, low-cost version of the sensor which utilises an ultraviolet (UV) torch, an LED light and a webcam to provide a highly detailed visual 3D ‘map’ of any surface it touches.

He says the gadget can spot details not detectable with the human eye, as well as picking up slight vibrations and variations in pressure.
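
The core idea behind such vision-based tactile sensors can be sketched as simple image differencing: compare each webcam frame of the gel against a reference frame captured at rest, and flag the pixels that change when something presses in. The array sizes, pixel values and threshold below are all invented for illustration, not taken from Alexander's device.

```python
import numpy as np

def contact_map(reference: np.ndarray, frame: np.ndarray, threshold: int = 10) -> np.ndarray:
    """Boolean mask of pixels where the gel image departs from its at-rest reference."""
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    return diff > threshold

# Simulated 8-bit grayscale frames: a flat gel at rest, then a small indentation.
rest = np.full((8, 8), 120, dtype=np.uint8)
pressed = rest.copy()
pressed[3:5, 3:5] = 80  # darker patch where an object presses into the gel

mask = contact_map(rest, pressed)
print(mask.sum())  # 4 of the 64 pixels register contact
```

Tracking how this mask grows, shrinks or shifts between frames is what lets a sensor of this kind report pressure changes and vibrations over time.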

Alexander, who is studying computer science and informatics while specialising in robotics, said: “The idea of the sensor is to be as sensitive as human skin when it comes to ‘touch’ – meaning it can detect the smallest details on the surface of a coin, for example, or even the tiniest vibrations.

“When attached to a robot’s finger, it is able to sense your pulse. And it can even differentiate between rough and smooth surfaces.

“The overall concept is to enable a robot to feel and sense like a human would, which gives it many advantages for medical work.

“Another application it can be used for is to detect things under the skin, so potentially in the future it could be used to check for tumours or to detect possible lumps in the breast.”

Alexander believes that another key application is in helping a robot arm to grip. He said: “Right now there’s a problem in the robotics community when it comes to gripping.

“Sometimes the object a robotic arm touches is so smooth there’s a tendency for it to slide when gripped. We need a sensor to measure this slip so that we can increase the gripping force, and that’s where this technology comes into play.”
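
The slip-then-squeeze behaviour Alexander describes can be sketched as a simple feedback rule: while the tactile sensor still reports slip, raise the grip force in small steps up to a safety limit. The sensor callback, step size and force limits below are hypothetical placeholders for whatever a real tactile sensor and gripper would expose.

```python
def regulate_grip(reports_slip, force: float, step: float = 0.5, max_force: float = 20.0) -> float:
    """Raise the grip force until the tactile sensor stops reporting slip."""
    while reports_slip(force) and force < max_force:
        force += step
    return force

# Toy stand-in for the sensor: the object keeps slipping below 5 N of grip.
slips_below_5n = lambda f: f < 5.0
print(regulate_grip(slips_below_5n, force=1.0))  # settles at 5.0 N
```

The max_force cap matters in practice: without it, a sensor fault that always reports slip would make the gripper crush the object.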

Alexander’s dream is for the sensor to be used during surgery. He explained that endoscopic procedures, where a doctor examines internal organs using a camera, are another area where the sensor could be effective, as it could give doctors extra feedback about activity within the human body.

The student began his studies at Hope back in 2019, having come to Liverpool from De La Salle University in Manila, and set out to create a low-cost version of a sensor that replicates the qualities of human skin.

He said: “I thought the best place to start would be shops that sell cosmetics.

“I first noticed a pink sponge and realised that, if I took off the layer of coloured sponge, I had the silicone I needed for just £1 and that’s the basic component for the technology.

“I then put a camera on one side so you can see and record the object touching the silicone, and managed to make something for a very low cost.”

A teacher in the electronics and communications engineering department at De La Salle University, Alexander plans eventually to take his newfound knowledge back to the Philippines and continue his research in his home city once he has graduated from Hope.

Now in the final year of his studies in Liverpool, Alexander is continuing his research and looking for new ways to improve his device, including the addition of temperature sensing capabilities.

What Robots Need to Become Better Helpers

Both the government and private sector continue to work on building more functional robots to accomplish various tasks, especially ones that aren’t suited or safe for humans. For example, NASA’s fully robotic Mars Perseverance mission is scheduled to land on Mars next week. In addition to the Perseverance rover itself, the mission is carrying the Ingenuity Mars Helicopter, a robotic drone specially designed to fly around and explore within the thin atmosphere of Mars.

But mobility is only one aspect of creating the advanced robots and robotic tools of the future. For the most part, we have the locomotion part down. We already have thousands of flying drones and robots, plus specialized models that can climb up the side of cliffs or work completely in or under the water.

The problem is that once we get those robots into inaccessible or inhospitable places, they need to be able to actually manipulate their environment in the same way that a human would. And for that, they pretty much need hands, ideally ones with fingers and maybe a thumb. I recently talked with a researcher at the Army Research Laboratory who told me that the ability to manipulate physical space, through either some type of actuator or robotic hand, would be an important key to successful robot deployments in the future.

Last week, we got a first look at what that might look like. Boston Dynamics, one of the most advanced robot-making companies in the world, upgraded its well-known dog-like robot, Spot, with a very functional robotic hand. Previously, Spot robots were able to traverse rough terrain and even stairs but were stymied by things like a closed door. The company released a fascinating video showing Spot making good use of its new appendage. The hand is mounted on the end of an articulated arm in the center of the robot, which lets it extend in almost any direction.

“Since first launching Spot, we have worked closely with our customers to identify how the robot could best support their mission-critical applications,” said Robert Playter, CEO of Boston Dynamics. “Our customers want reliable data collection in remote, hazardous and dynamic worksites. We developed the new Spot products with these needs in mind, and with the goal of making it easy to regularly and remotely perform critical inspections, improving safety and operations.”

The video shows Spot performing some very fine manipulations with its hand, including planting a sapling (after first digging a hole for it) without snapping the delicate young tree in half. It also does some other tasks in the video including collecting a bundle of cloth out in the snowy woods, opening an office door and shutting off a valve to stop a leaking pipe. All of that is impressive, but one wonders just how delicate Spot, or any robot, can really be without a real sense of touch.

It’s a question that Professor Ted Adelson of the Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory has been working on for many years. He has designed a way for robots to simulate a sense of touch, which he believes will eventually enable them to be as precise as a human hand.

“In order for robots to do good manipulation, they need good fingers,” he said. “We’re trying to make fingers that can match the capabilities of human fingers.”

The technology that Adelson and the team at MIT developed is called GelSight, and it involves deploying a soft covering over a robotic hand. Tiny cameras in the material monitor the surrounding soft “skin” and record how much it deforms as the hand grips objects. That data is then fed into a computer model that helps the robot “see” how much pressure is needed to grasp an object without squeezing it too hard. The fingers can also be used to measure force, shear and slip.
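
One simple way to turn such a deformation measurement into a force estimate is to treat the soft skin as a bed of independent springs (a Winkler foundation model), summing the indentation depth over the contact area. This is a generic modelling sketch, not MIT's actual pipeline, and the stiffness and pixel-area constants are invented; a real GelSight setup would calibrate against known loads.

```python
import numpy as np

def estimate_normal_force(depth_map_mm: np.ndarray,
                          stiffness_n_per_mm3: float = 0.02,
                          pixel_area_mm2: float = 0.01) -> float:
    """Spring-bed model: force is proportional to total indented volume."""
    return stiffness_n_per_mm3 * pixel_area_mm2 * float(depth_map_mm.sum())

# A 1 mm indentation over a 10x10 pixel patch of an otherwise flat gel.
depth = np.zeros((100, 100))
depth[40:50, 40:50] = 1.0
print(round(estimate_normal_force(depth), 4))  # roughly 0.02 N under these toy constants
```

Shear and slip would need the tangential motion of surface markers between frames, which this normal-force sketch deliberately leaves out.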

He talked about this new technology and its importance in an interview posted on YouTube last month. From his explanation, it seems like the next step is making sure that robots can use the collected touch data intelligently with their new hands so they can accomplish a variety of tasks requiring everything from brute strength to fine motor skills.

Of course, the other thing that our robots need to help usher in the future is the ability to perform tasks independently, without human intervention. Technically, the definition of a robot is a device that can carry out complex actions automatically. So having a human piloting a device’s every move means that it’s not technically a robot at all. But we are working on that too, with great strides in artificial intelligence and machine learning being made every day.

It’s just that when working on something as complex and powerful as creating artificial intelligence, it’s easy to forget little things, like the power and the necessity of touch. Hopefully, we are starting to see the tip of the iceberg now in that new area of artificial senses, with future robots literally getting a helping hand from the latest research.

Original article published on NextGov: https://www.nextgov.com/ideas/2021/02/what-robots-need-become-better-helpers/171962/