Event-Driven Visual-Tactile Sensing and Learning for Robots

Human beings accomplish a wide range of actions using multiple sensory modalities, while consuming far less energy than the multi-modal deep neural networks used in recent artificial systems. A new study on arXiv.org proposes an asynchronous, event-driven visual-tactile perception system inspired by biological systems.

A novel fingertip tactile sensor is developed, and a visual-tactile spiking neural network is formulated. Unlike conventional neural networks, it can process discrete spikes asynchronously. The robots had to determine the type of container they were handling and the amount of liquid held inside, and to detect rotational slip. The spiking neural networks achieved competitive performance compared with artificial neural networks while consuming approximately 1,900 times less power than a GPU in a real-time simulation. This study opens the door to next-generation, energy-efficient real-time autonomous robots.
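To give a feel for what "processing discrete spikes asynchronously" means, here is a minimal, illustrative sketch (not the paper's implementation) of a leaky integrate-and-fire (LIF) neuron driven by a sparse stream of timestamped events. Computation happens only when an event arrives, which is where event-driven systems save energy compared to networks that process dense frames at a fixed rate. All parameter values below are made up for illustration.

```python
import numpy as np

def lif_response(events, weight=0.6, tau=20.0, threshold=1.0):
    """Process a sorted list of input spike times (ms) with a LIF neuron.

    Returns the timestamps at which the neuron fires. The membrane
    potential leaks between events, so only temporally dense input
    bursts push it over threshold.
    """
    v = 0.0          # membrane potential
    last_t = 0.0     # time of the previous input event
    out_spikes = []
    for t in events:
        v *= np.exp(-(t - last_t) / tau)  # passive leak since last event
        v += weight                       # integrate the incoming spike
        if v >= threshold:                # fire and reset
            out_spikes.append(t)
            v = 0.0
        last_t = t
    return out_spikes

# A tight burst drives the neuron over threshold; the same number of
# widely spaced spikes leaks away without producing any output.
print(lif_response([1, 2, 3]))      # -> [2]
print(lif_response([1, 100, 200]))  # -> []
```

Note that the loop touches only the timestamps that actually occurred; there is no per-frame work when the input is silent, which mirrors the event-based design of the sensors described below.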

This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Similarly, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using NeuTouch and the Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe good accuracies relative to standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach toward intelligent, energy-efficient robot systems.
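The core idea of multi-modal fusion in a network like VT-SNN can be sketched very roughly: spike activity from the tactile sensor and the event camera is combined into a single representation before classification. The snippet below is a deliberately simplified stand-in, not the paper's architecture: it concatenates per-time-bin spike tensors from the two modalities and applies a linear readout over accumulated spike counts, whereas the real VT-SNN keeps spiking dynamics end to end. All shapes, channel counts, and class counts here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 50                                   # number of time bins (illustrative)
tactile = rng.integers(0, 2, (T, 39))    # binary spikes from 39 taxels (illustrative)
vision = rng.integers(0, 2, (T, 64))     # downsampled event-camera spikes (illustrative)

# Fuse modalities by concatenating spike channels at each time bin.
fused = np.concatenate([tactile, vision], axis=1)   # shape (T, 103)

# A linear readout over total spike counts stands in for the spiking
# classifier; a real SNN would exploit the temporal structure as well.
counts = fused.sum(axis=0)                  # spike count per channel
W = rng.standard_normal((counts.size, 4))   # 4 container classes (illustrative)
logits = counts @ W
print("predicted class:", int(np.argmax(logits)))
```

The key design point this illustrates is that event-based representations from heterogeneous sensors share a common format (spikes over time), which makes combining them straightforward compared to fusing raw images with analog pressure readings.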

Link: https://arxiv.org/abs/2009.07083