When it comes to AI, the final frontier may be giving robots a soul and telling them that they’re real (and maybe stamping a Tyrell Corporation logo somewhere on them for good measure). But one would-be sensitive robot soul is already getting a first chance at developing less abstract feelings, thanks to a new initiative at MIT.
Scientists at the school’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are trying to give a robot the literal feels by training it to sense the tactile nuances of an object’s texture and helping the 'bot associate those “sensations” with learned visual cues based on how the object looks.
Just as a person develops an understanding of what a golf ball will feel like based on its appearance, MIT’s 'bot is getting a crash course in making a similar connection.
According to Engadget, CSAIL’s modified robotic arm is learning to anticipate how stuff feels, before even making contact, by studying a human-curated assembly of real, physical items alongside corresponding footage of the machine touching each one.
Drawing on a library of 12,000 videos and a related batch of the 200 or so items portrayed in them — including fabric, tools, and other household objects — the 'bot is being conditioned to relate the visual information it sees to how an object’s surface is likely to feel. That may be a far cry from knowing what to do with the information, but it's a definite first step toward adding one of our five senses to the mix as AI learning grows ever more integrated.
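To get a rough feel for the idea (pun intended), here's a minimal, purely illustrative sketch — not CSAIL's actual model, and the feature sizes and data are made up — of learning a mapping from "visual" features of an object to "tactile" features, so that the feel of a new object can be predicted from its appearance alone:

```python
import numpy as np

# Hypothetical setup: each of ~200 objects has a visual feature vector
# (e.g. an image embedding) and a tactile feature vector (how it feels).
rng = np.random.default_rng(0)
n_objects, n_visual, n_tactile = 200, 64, 16

# Synthetic stand-in data: appearance and feel are related by an
# unknown linear map, plus some sensor noise.
true_map = rng.normal(size=(n_visual, n_tactile))
visual = rng.normal(size=(n_objects, n_visual))
tactile = visual @ true_map + 0.1 * rng.normal(size=(n_objects, n_tactile))

# Learn the vision-to-touch map with ridge regression:
# W = (X^T X + lam*I)^-1 X^T Y
lam = 1e-2
W = np.linalg.solve(visual.T @ visual + lam * np.eye(n_visual),
                    visual.T @ tactile)

# Predict the "feel" of a new, unseen object from its appearance alone.
new_visual = rng.normal(size=(1, n_visual))
predicted_touch = new_visual @ W
print(predicted_touch.shape)  # one tactile feature vector per object
```

The real system learns a far richer, nonlinear association from video, but the core idea is the same: pair visual observations with touch measurements and train a predictor that maps one to the other.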
As with all things robotic, there’s an ocean of distance between the coldly deliberate, highly controlled research environment of today’s lab and the sensory elegance of movie-bots (like WALL-E or Blade Runner’s Rachael) that can light up with emotion at a single touch. But for us humans, maybe that’s a good thing. In a world where some of us still feel guilty for letting our robot vacuums take an occasional nose-dive down the stairs, we may not yet be emotionally prepared for the day when machines are finally able to say "ouch."