Physician-guided robots already operate routinely on patients at most major hospitals, but the next generation of robots could eliminate a surprising element from that scenario – the doctor. Feasibility studies conducted by Duke University bioengineers have demonstrated that a robot can locate a man-made, or phantom, lesion in simulated human organs, guide a device to the lesion and take multiple samples during a single session.
The researchers believe that, as the technology is further developed, autonomous robots could someday perform many simple surgical tasks on their own.
The Duke team combined a “souped-up” version of an existing robot arm with an ultrasound system of their own design. The ultrasound serves as the robot’s eyes, collecting scan data and locating its target. The robot is controlled not by a physician but by an artificial-intelligence program that takes the real-time 3-D information, processes it and issues specific commands for the robot to carry out. The robot arm has a mechanical hand that can manipulate the same biopsy plunger device physicians use to reach a lesion and take samples.
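The control flow described above – scan, locate the target, translate its position into arm commands – can be sketched in miniature. This is purely an illustrative toy, assuming a made-up data layout; every name here (`find_lesion`, `plan_commands`, the `Target` class) is invented for the example and is not the Duke team's actual software.

```python
# Toy sketch of a sense-process-act loop: locate a "lesion" in a tiny
# simulated 3-D ultrasound volume, then plan arm moves toward it.
# All names and data structures are hypothetical illustrations.
import math
from dataclasses import dataclass


@dataclass
class Target:
    x: float
    y: float
    z: float


def find_lesion(voxels):
    """Return the coordinates of the strongest echo in the volume.

    Stands in for the real image-processing step, where the AI program
    analyzes the 3-D ultrasound data to locate the lesion.
    """
    best, best_val = None, -math.inf
    for (x, y, z), intensity in voxels.items():
        if intensity > best_val:
            best, best_val = Target(x, y, z), intensity
    return best


def plan_commands(arm_pos, target):
    """Convert a located target into simple move commands for the arm."""
    return [
        ("move_x", target.x - arm_pos[0]),
        ("move_y", target.y - arm_pos[1]),
        ("move_z", target.z - arm_pos[2]),
        ("fire_plunger", 0.0),  # take the biopsy sample
    ]


# Simulated scan: a phantom lesion at (2, 1, 3) echoes most strongly.
scan = {(0, 0, 0): 0.1, (2, 1, 3): 0.9, (1, 1, 1): 0.3}
lesion = find_lesion(scan)
commands = plan_commands((0.0, 0.0, 0.0), lesion)
print(commands)
```

In the real system this loop would run continuously on live 3-D ultrasound frames rather than on a static dictionary, which is why the researchers stress data-acquisition and processing speed.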
In the latest series of experiments, the robot successfully guided the plunger to eight different locations on the simulated prostate tissue in 93 percent of its attempts. Taking multiple samples matters because they can also reveal the extent of a lesion. An important remaining challenge is the speed of data acquisition and processing, though the researchers are confident that faster processors and better algorithms will address it; to be clinically useful, all of the robot’s actions would need to happen in real time, they said. Advances in ultrasound technology made these latest experiments possible, the researchers added, by generating detailed, moving 3-D images in real time.
This is a shortened version of an article that can be found here.