Meta AI researchers are giving robots a sense of touch and we’re all getting creepy feelings

AI has given robots the ability to “hear” and “see” the world so they can understand human commands and perform tasks better, and Meta’s AI researchers are now testing ways to let robots mimic the sense of touch as well. Meta’s Fundamental AI Research (FAIR) division has just introduced a suite of tools that let robotic hardware detect, decipher, and respond to what it touches. That could make even the most basic robotic arm sensitive enough to handle delicate objects without damaging them, making it useful in far more environments.

Meta showcased a combination of new technologies and features working together to give robots the ability to sense things. The touch-sensing model Sparsh gives AI a way to identify qualities like pressure, texture, and movement without needing a massive labeled database. It’s like an AI version of how you can feel something in the dark and describe what it’s like even if you don’t know what you’re touching.
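To make the label-free idea concrete, here is a minimal, purely illustrative sketch of how self-supervised learning on raw tactile data can work: random patches of a sensor frame are masked out, and a model would be trained to reconstruct them, so the training signal comes from the touch data itself rather than human annotations. Every name, shape, and parameter below is an assumption for illustration, not Meta's actual Sparsh code or API.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_patches(frame, patch=4, mask_ratio=0.5, rng=rng):
    """Zero out a random subset of square patches in a tactile frame.

    Returns the masked frame and a boolean mask marking hidden pixels.
    (Hypothetical helper; patch size and ratio are illustrative.)
    """
    h, w = frame.shape
    ph, pw = h // patch, w // patch
    n_patches = ph * pw
    hidden = rng.permutation(n_patches)[: int(n_patches * mask_ratio)]
    patch_mask = np.zeros((ph, pw), dtype=bool)
    patch_mask.flat[hidden] = True
    # Expand the per-patch mask to per-pixel resolution.
    pixel_mask = np.kron(patch_mask, np.ones((patch, patch), dtype=bool))
    masked = frame.copy()
    masked[pixel_mask] = 0.0
    return masked, pixel_mask

# A toy "tactile frame": a pressure bump on a 16x16 sensor grid.
y, x = np.mgrid[0:16, 0:16]
frame = np.exp(-((x - 8) ** 2 + (y - 8) ** 2) / 20.0)

masked, mask = mask_patches(frame)

# A real model would learn to predict frame[mask] from the visible
# pixels; the reconstruction error is the self-supervised signal.
baseline_error = np.mean((frame[mask] - masked[mask]) ** 2)
```

The point of the sketch is the training setup, not the model: because the target is the sensor's own reading, the system can learn what pressure and texture "look like" from unlabeled touch recordings alone.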