Visuotactile Sensor Fusion for Bio-inspired Multisensory Integration

M.U.K. Sadaf, N.U. Sakib, A. Pannone, H. Ravichandran, S. Das
Pennsylvania State University,
United States

Keywords: neuromorphic computing, 2D materials


The brain has a remarkable ability to integrate information from multiple senses rather than relying on a single sensory input, particularly when the signal from any one sense is weak. This process involves specialized neurons that receive input from two or more sensory modalities, enabling multisensory integration. In artificial intelligence, replicating the response of these multisensory neurons with solid-state devices holds great promise for advancing neuromorphic computing and bridging the gap between natural and artificial intelligence.

In our research, we present an artificial visuotactile neuron built by integrating a photosensitive monolayer MoS2 memtransistor with a triboelectric tactile sensor. This combination captures three essential features of multisensory integration: the "super-additive" response, the "inverse effectiveness" effect, and "temporal congruency." Our approach extends beyond device demonstration: we also designed a circuit that processes information like the brain by encoding visuotactile information into digital spike trains.

By capturing these three essential features and demonstrating the spike-encoding circuit, we believe this work can help propel neuromorphic computing and artificial intelligence forward. With current technology largely centered on unisensory sensing and information processing, our work broadens the scope of possibilities, opening new avenues for more sophisticated and nuanced artificial intelligence systems capable of performing tasks even in extremely resource-constrained environments. This work has been published in Nature Communications 14(1), 5729.
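The behavioral signatures named above can be illustrated with a minimal toy model, independent of the device physics. The sketch below is purely hypothetical (the threshold, slope, and encoder parameters are invented for illustration and are not taken from the paper): a sigmoidal rate nonlinearity makes the response to two weak, co-occurring inputs exceed the sum of the unisensory responses (super-additivity), the relative enhancement shrinks as stimuli grow stronger (inverse effectiveness), and a simple deterministic integrate-and-fire loop converts the resulting rate into a digital spike train.

```python
import math

THETA = 1.0   # hypothetical firing threshold of the toy neuron
SLOPE = 0.2   # hypothetical steepness of the nonlinearity

def neuron_response(drive):
    """Sigmoidal firing-rate response of a toy multisensory neuron."""
    return 1.0 / (1.0 + math.exp(-(drive - THETA) / SLOPE))

def enhancement(visual, tactile):
    """Multisensory enhancement index (percent) relative to the
    strongest unisensory response."""
    r_v, r_t = neuron_response(visual), neuron_response(tactile)
    r_vt = neuron_response(visual + tactile)  # temporally congruent inputs sum
    return 100.0 * (r_vt - max(r_v, r_t)) / max(r_v, r_t)

def encode_spikes(rate, n_steps=20):
    """Deterministic integrate-and-fire style encoding of a firing
    rate (0..1) into a binary spike train."""
    acc, train = 0.0, []
    for _ in range(n_steps):
        acc += rate            # accumulate drive each time step
        if acc >= 1.0:         # threshold crossed: emit a spike
            train.append(1)
            acc -= 1.0
        else:
            train.append(0)
    return train
```

With weak stimuli (e.g. 0.6 visual, 0.6 tactile), the combined drive crosses the threshold that neither input crosses alone, so the combined response is super-additive and the enhancement index is large; with strong stimuli (e.g. 1.5 each), both unisensory responses already saturate and the enhancement is modest, mirroring inverse effectiveness. The stronger combined rate also yields a denser spike train from the encoder.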