Purdue University researchers are developing a system that recognizes hand gestures to control a robotic scrub nurse or to tell a computer to display a patient's medical images during an operation.
Juan Pablo Wachs, of Purdue University, said that both the hand-gesture recognition and robotic nurse innovations might help to reduce the length of surgeries and the potential for infection.
The "vision-based hand gesture recognition" technology could have other applications, including the coordination of emergency response activities during disasters.
The new approach is a system that uses a camera and specialized algorithms to recognize hand gestures as commands to instruct a computer or robot.
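The article does not describe the algorithms themselves, but the idea of turning recognized gestures into commands can be sketched as a simple lookup with a confidence gate. Everything here is illustrative: the gesture labels, command names, and threshold are invented for this sketch, not taken from the Purdue system.

```python
# Hypothetical gesture-to-command dispatcher. Gesture labels and
# command names are invented for illustration only.
GESTURE_COMMANDS = {
    "swipe_left": "previous_image",
    "swipe_right": "next_image",
    "pinch": "zoom_in",
    "spread": "zoom_out",
    "point_hold": "select_instrument",
}

def dispatch(gesture: str, confidence: float, threshold: float = 0.8):
    """Map a recognized gesture to a command, rejecting low-confidence
    detections so the system does not act on unintended movements."""
    if confidence < threshold:
        return None  # likely an unintended gesture; ignore it
    return GESTURE_COMMANDS.get(gesture)

print(dispatch("swipe_left", 0.95))  # → previous_image
print(dispatch("swipe_left", 0.50))  # low confidence → None
```

The confidence gate is one simple way to address the intended-versus-unintended gesture problem mentioned below; a real system would likely combine it with temporal and contextual cues.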
At the same time, a robotic scrub nurse is a potential new tool that could improve operating-room efficiency, said Wachs.
"One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions," said Wachs.
Other challenges include providing computers with the ability to understand the context in which gestures are made and to discriminate between intended gestures versus unintended gestures.
A scrub nurse assists the surgeon and hands the proper surgical instruments to the doctor when needed.
"While it will be very difficult using a robot to achieve the same level of performance as an experienced nurse who has been working with the same surgeon for years, often scrub nurses have had very limited experience with a particular surgeon, maximizing the chances for misunderstandings, delays and sometimes mistakes in the operating room. In that case, a robotic scrub nurse could be better," he said.
The hand-gesture recognition system uses a new type of camera developed by Microsoft, called Kinect, which senses three-dimensional space.
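One reason a depth-sensing camera like the Kinect helps with gesture recognition is that the gesturing hand, being the object closest to the sensor, can be separated from the background by depth alone. The following is a minimal sketch of that idea, assuming a depth frame given as rows of per-pixel distances in millimetres; the depth band values are invented for illustration.

```python
# Illustrative sketch (not the researchers' code): segment the hand
# from a depth frame by keeping only pixels within a near-depth band.
def segment_hand(depth_frame, near_mm=500, far_mm=900):
    """Return a binary mask marking pixels whose depth (in mm) falls
    inside the band where the gesturing hand is expected to appear."""
    return [
        [1 if near_mm <= d <= far_mm else 0 for d in row]
        for row in depth_frame
    ]

frame = [
    [2000, 2000, 600],   # hand pixel at 600 mm, background around 2 m
    [2000,  700, 650],
]
print(segment_hand(frame))  # → [[0, 0, 1], [0, 1, 1]]
```

With an ordinary 2-D camera, this separation would require much more fragile cues such as skin color or motion, which is why the 3-D sensing matters.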
The findings have been detailed in the journal Communications of the ACM, the flagship publication of the Association for Computing Machinery.