Sensorized HALS. Sensorized Environment for Hand-Assisted Laparoscopic Surgery

Funding entity: Ministry of Economy and Competitiveness - National R&D Plan.
Reference: DPI2013-47196-C3-3-R
Participating entity: ITAP (Institute of Advanced Production Technologies), University of Valladolid. Project coordinated with the robotics group of the University of Málaga and the Nbio group of the Miguel Hernández University of Elche.
Duration: 1 January 2014 - 31 December 2016


Hand-assisted laparoscopic surgery (HALS) is an intermediate approach between classical laparoscopic surgery and open laparotomy. The surgeon introduces one hand into the abdomen to manipulate the organs, while using the conventional instruments of minimally invasive surgery with the other. This approach is useful in complex situations where classical laparoscopic surgery cannot be applied, and it has been shown that patient recovery times are no longer than with the purely laparoscopic technique.

The project addressed the development of a robotic system oriented to the HALS approach and framed within the “co-worker” robot concept, in which the machine works side by side with the surgeon, collaborating in the surgical maneuvers and learning from practice.

To this end, the proposed robotic system is formed by one manipulator capable of operating an articulated laparoscopic tool and an endoscope, and another specialized in moving mini-robots within the abdominal cavity. It also includes a human-machine interface based on an intelligent surgical glove and the ability to emulate the “transparent abdomen” concept by combining the real image with augmented reality.
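As a purely illustrative sketch of this architecture (the component names, fields and values below are assumptions, not the project's actual software), the composition of the platform could be summarized as follows:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List


class ToolType(Enum):
    """Instruments handled by the two manipulators described above."""
    ARTICULATED_TOOL = auto()
    ENDOSCOPE = auto()
    MINI_ROBOT_CAMERA = auto()


@dataclass
class Manipulator:
    """One robotic arm and the instruments it drives."""
    name: str
    tools: List[ToolType]


@dataclass
class SurgicalGlove:
    """Human-machine interface: streams hand pose and gesture data."""
    sample_rate_hz: float = 100.0  # assumed value


@dataclass
class HALSSystem:
    """Top-level composition of the sensorized HALS platform."""
    instrument_arm: Manipulator     # articulated tool and endoscope
    camera_arm: Manipulator         # mini-robots inside the abdominal cavity
    glove: SurgicalGlove
    augmented_reality: bool = True  # "transparent abdomen" overlay


system = HALSSystem(
    instrument_arm=Manipulator("arm_1", [ToolType.ARTICULATED_TOOL, ToolType.ENDOSCOPE]),
    camera_arm=Manipulator("arm_2", [ToolType.MINI_ROBOT_CAMERA]),
    glove=SurgicalGlove(),
)
print(system.instrument_arm.tools)
```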

The system recognizes the current phase of the intervention from the surgeon's hand gestures, the movements performed with the laparoscopic instrument, or readings of physiological signals combined with a patient-intervention model. With this information, the robotic arms act collaboratively with the surgeon: they assist with the articulated tool and place the endoscope and mini-robots at the appropriate locations to provide a complete and adequate view of the surgical field.
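The following toy example illustrates how evidence from the glove, the instrument motion and the physiological signals could be fused into a single phase estimate. It is only a sketch: the phase names, source weights and likelihood inputs are assumptions, not the recognizer developed in the project.

```python
# Toy phase-recognition fusion: each source reports per-phase likelihoods
# in [0, 1]; a weighted sum picks the most likely surgical phase.

PHASES = ["exploration", "dissection", "knotting", "clipping", "cauterization"]
WEIGHTS = {"gesture": 0.5, "instrument": 0.3, "physiology": 0.2}  # assumed reliabilities


def estimate_phase(evidence):
    """Return the phase with the highest weighted evidence score."""
    scores = {phase: 0.0 for phase in PHASES}
    for source, weight in WEIGHTS.items():
        for phase, likelihood in evidence.get(source, {}).items():
            if phase in scores:
                scores[phase] += weight * likelihood
    return max(scores, key=scores.get)


print(estimate_phase({
    "gesture":    {"knotting": 0.8, "dissection": 0.1},
    "instrument": {"knotting": 0.6, "cauterization": 0.3},
    "physiology": {"knotting": 0.4},
}))  # -> knotting
```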

The project first addressed techniques and methodologies for collaborative movements with laparoscopic instruments, combining position and force control to assist in maneuvers such as knotting, clipping and cauterization. Secondly, it addressed the detection of HALS surgical gestures, which, integrated with the information from physiological signals, helps identify the current phase of the intervention. Thirdly, methods were developed to emulate the “transparent abdomen” concept through personalized virtual models of the patient, a vision system consisting of an endoscope and mini-robot cameras, and information from the glove.
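As a rough illustration of the combination of position and force control mentioned above (a minimal impedance-style sketch with made-up gains and limits, not the controllers developed in the project):

```python
import numpy as np

# Impedance-style sketch of combined position/force assistance: a
# spring-damper law pulls the tool tip toward a desired pose, and the
# commanded force is clamped to a tissue-safe bound. All gains and
# limits are illustrative assumptions.

STIFFNESS = np.diag([300.0, 300.0, 300.0])  # N/m (assumed)
DAMPING = np.diag([20.0, 20.0, 20.0])       # N*s/m (assumed)
FORCE_LIMIT = 5.0                           # N (assumed safety bound)


def commanded_force(x_desired, x_actual, v_actual):
    """Force command for the tool tip given desired/actual pose and velocity."""
    error = np.asarray(x_desired) - np.asarray(x_actual)
    force = STIFFNESS @ error - DAMPING @ np.asarray(v_actual)
    norm = np.linalg.norm(force)
    if norm > FORCE_LIMIT:  # clamp to protect tissue
        force *= FORCE_LIMIT / norm
    return force


print(commanded_force([0.10, 0.00, 0.05], [0.09, 0.00, 0.05], [0.0, 0.0, 0.0]))
```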