Institute for Anthropomatics and Robotics - Intelligent Process Automation and Robotics Lab

Development of a tactile display for minimally invasive surgery

  • contact:

    Jan Hergenhan

  • project group:

  • funding:

    DFG Graduiertenkolleg 1126

  • startdate:

    August 2011

  • enddate:

    July 2014

The project is part of the interdisciplinary Research Training Group 1126 “Intelligent Surgery – Development of new computer-based methods for the future workplace in surgery”. Its goal is the development of a tactile display that enables the surgeon to feel palpated tissue during minimally invasive surgery.

The haptic attributes of the palpated tissue can be divided into tactile information (textures) and kinaesthetic information (compliance). These will be reproduced by the tactile display and the haptic input device, respectively. The information is obtained from the tactile sensor being developed in the single-port project.

The tactile display is based on electric stimulation of the mechanoreceptors’ nerve fibres through the skin at the fingertip. In this way, the tactile display is intended to produce the same sensation as mechanical stimulation of the receptors.
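To illustrate the idea, the following sketch maps a sensed palpation pressure to an electrotactile pulse amplitude, clamped to a safe stimulation range. All names and numeric values here are illustrative assumptions, not parameters of the project’s actual display:

```python
def pulse_amplitude_ma(pressure_kpa: float,
                       p_min: float = 1.0,   # assumed perception threshold [kPa]
                       p_max: float = 50.0,  # assumed saturation pressure [kPa]
                       i_min: float = 0.5,   # assumed minimum perceivable current [mA]
                       i_max: float = 4.0) -> float:  # assumed safety limit [mA]
    """Linearly map palpation pressure [kPa] to a stimulation pulse
    current [mA], clamped so the current never exceeds i_max."""
    if pressure_kpa <= p_min:
        return 0.0  # below the perception threshold: no stimulation
    frac = min((pressure_kpa - p_min) / (p_max - p_min), 1.0)
    return i_min + frac * (i_max - i_min)
```

In a real device the mapping would be calibrated per user and per skin site, and the pulse width and frequency would be controlled as well; the linear ramp above is only a minimal stand-in for that calibration.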

Kinaesthetic information will be rendered by the actively controllable gripper of the haptic input device: while gripping, a counterforce simulates the compliance of the tissue.

Eventually, the tactile display will be mounted on the gripper under the surgeon’s finger. In this way, tactile and kinaesthetic information can be displayed simultaneously, just as when touching tissue directly.