AI has given robots the ability to “hear” and “see” the world so they can understand human commands and perform tasks better, but Meta’s AI researchers are now testing ways to let robots mimic the sense of touch as well. Meta’s Fundamental AI Research (FAIR) division has just introduced a suite of tools that allow robotic instruments to detect, decipher, and respond to what they touch. That could make even the most basic robotic arm sensitive enough to handle delicate objects without damaging them, and useful in a wider range of environments.
Meta showcased a combination of new technologies and features working together to give robots the ability to sense things. Sparsh, a touch-sensing AI technology, gives a robot a way to identify qualities like pressure, texture and movement without the need for a massive database. It’s like an AI version of how you can feel something in the dark and describe what it’s like even if you don’t know what you’re touching.
To send information about what the robot is touching to the AI model, Meta worked with a company called GelSight to essentially create a robot fingertip called Digit 360. The sensors in Digit 360 are sensitive enough that the AI can not only determine details about what the robot is touching, but also apply pressure appropriate to a task involving the object, such as lifting or turning it.
For the rest of the robot hand (or an equivalent device), Meta worked with Wonik Robotics to create a system called Plexus that distributes multiple touch sensors across the device. Meta claims that Plexus mimics the human sense of touch well enough to handle fragile or awkwardly shaped objects. You can see below how the three technologies work together in a robotic hand.
Sensitive AI
“The human hand is great at communicating information to the brain, from the fingertips to the palm. This allows the muscles in the hand to be controlled when making decisions, such as how to type on a keyboard or how to handle an object that is too hot,” Meta explained in a blog post. “Achieving embodied AI requires similar coordination between the tactile detection and motor activation of a robotic hand.”
There are many ways in which robotic hands that can “feel,” paired with AI that can interpret those sensations, could be useful. Imagine robotic surgical assistants that sense small changes in the body and respond quickly, with precise but gentle movements that match or even surpass human responses. The same goes for manufacturing delicate devices without breaking them, and perhaps for better coordination between multiple robotic hands, the way humans use their own pair. Touch sensing could also make virtual experiences feel more real, with insight into how objects and environments should feel informing their virtual counterparts.
Touch isn’t the only human sense that AI is replicating for machines. Researchers at Penn State recently showed how AI models linked to an electronic tongue can simulate a sense of taste well enough to detect small differences in flavor. Meanwhile, a company called Osmo has taught AI models to mimic a sense of smell that’s far better than a human’s. The company demonstrated how its AI can analyze a scent accurately enough to recreate it from scratch, picking and combining chemicals without human intervention.