Robot-assisted minimally invasive surgery is hampered by a complete lack of haptic feedback. Adding feedback has been shown to improve performance in multiple ways; however, the algorithms driving the feedback often need to be tuned for a specific task and typically are not designed around the class of actuator. We hypothesized that if sensor information were converted to drive a vibration actuator and a pneumatic actuator in ways consistent with how that information is parsed in the skin, then performance would be superior to that achieved with previous control functions.
We used an algorithm that simulates the responses of peripheral touch receptors, with the vibration actuator preferentially exciting rapidly adapting afferents and the pneumatic actuator preferentially exciting slowly adapting afferents. We tested this bio-inspired algorithm against two standard algorithms and against a no-feedback condition, with 10 expert robot-assisted surgeons and 15 novices performing a tumor-detection task on a da Vinci surgical robot. Of the three algorithms, the novel bio-inspired algorithm was the only one that allowed both novices and expert robot-assisted surgeons to easily identify the locations of hard, medium, and soft tumors, and they did so with reduced contact force and tumor contact time. Although tested with only a single sensor and actuator of each class, the algorithm is designed to handle multiple inputs and simulates outputs across the entire surface of the finger pad, allowing it to scale up for greater coverage and sensitivity in future iterations.
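The abstract does not specify the control functions, but the described mapping suggests splitting the sensed contact force into a sustained component (to drive the pneumatic actuator, echoing slowly adapting afferents) and a transient component (to drive the vibration actuator, echoing rapidly adapting afferents). The sketch below is purely illustrative, not the authors' algorithm: it assumes a sampled force signal and uses a hypothetical first-order low-pass filter with an arbitrary cutoff to perform the split.

```python
import numpy as np

def split_haptic_channels(force, fs=1000.0, cutoff_hz=20.0):
    """Split a sampled contact-force signal into two drive channels.

    Illustrative assumption: a first-order low-pass filter separates
    the sustained load (slow channel, pneumatic / SA-like) from the
    residual transients (fast channel, vibration / RA-like).

    force     -- 1-D array of force samples
    fs        -- sampling rate in Hz (assumed)
    cutoff_hz -- low-pass cutoff in Hz (arbitrary choice)
    """
    alpha = 1.0 / (1.0 + fs / (2.0 * np.pi * cutoff_hz))
    slow = np.empty(len(force), dtype=float)
    acc = float(force[0])
    for i, f in enumerate(force):
        acc += alpha * (f - acc)   # exponential smoothing (low-pass)
        slow[i] = acc
    fast = np.asarray(force, dtype=float) - slow  # high-pass residual
    return slow, fast

# A step in contact force: the slow channel tracks the sustained load,
# while the fast channel spikes at contact onset and then decays.
force = np.concatenate([np.zeros(100), np.ones(900)])
slow, fast = split_haptic_channels(force)
```

In this toy decomposition, the onset transient appears only in the fast channel, while the steady grip force appears only in the slow channel, mirroring the RA/SA division of labor the abstract describes.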