Okay, you may have already met Raven. Researchers unveiled the latest version of this robotic surgical assistant in January. But if you missed it, like me, here’s what’s up.
The surgical manipulator has two winglike arms that end in tiny claws to help surgeons see and navigate around, say, the heart. The naked eye just can’t see everything, and even a surgeon’s trained hands can’t feel everything.
Blake Hannaford at the University of Washington and Jacob Rosen at the University of California, Santa Cruz, built the original Raven for telerobotic surgery research back in 2005 for around $250,000. Now they’ve developed a new version: Raven II is smaller, more dexterous, and can hold surgical tools during operations.
Their approach uses 3D ultrasound imaging to show internal organs in real time. Volumetric images are taken, and fast image processing software locates the target tissue and the instrument. (The same graphics processors that produce high-quality computer-game images are ideal for real-time medical imaging.)
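To make the idea concrete, here is a minimal sketch of how software might locate an instrument in a volumetric frame. It is not the researchers' actual pipeline, which relies on GPU-accelerated processing; this toy version uses the crude assumption that the metal instrument reflects ultrasound strongly, so the brightest voxel marks its tip. All names and the sample volume are illustrative.

```python
# Sketch: find an instrument tip in a 3D intensity grid by searching
# for the brightest voxel (a stand-in for real matched filtering).

def locate_instrument(volume):
    """Return the (x, y, z) index of the brightest voxel."""
    best, best_pos = float("-inf"), None
    for x, plane in enumerate(volume):
        for y, row in enumerate(plane):
            for z, v in enumerate(row):
                if v > best:
                    best, best_pos = v, (x, y, z)
    return best_pos

# Tiny 2x2x2 test volume with a strong echo at index (1, 0, 1).
vol = [[[0.1, 0.2], [0.0, 0.3]],
       [[0.2, 0.9], [0.1, 0.4]]]
print(locate_instrument(vol))  # (1, 0, 1)
```

A production system would run this kind of search as a parallel reduction on the GPU, which is exactly the workload graphics processors excel at.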
They’ve created software to work with the Robot Operating System, a popular open-source robotics framework, so labs can easily connect the Raven to other devices and share ideas. The Linux-based system lets anyone modify and improve the original code, giving researchers a common platform for experimenting and collaborating, the Economist explains.
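The Robot Operating System connects devices by letting independent nodes publish and subscribe to named topics. The toy, dependency-free sketch below illustrates that pattern; it is not the actual ROS API, and the topic and message names are invented.

```python
# Toy publish/subscribe bus illustrating the ROS communication model.

class Bus:
    def __init__(self):
        self.subscribers = {}  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Deliver the message to every node listening on this topic.
        for cb in self.subscribers.get(topic, []):
            cb(message)

bus = Bus()
received = []
# A lab device listens for commands to the robot's first arm.
bus.subscribe("/raven/arm1/command", received.append)
# A controller node publishes a joint-angle command.
bus.publish("/raven/arm1/command", {"joint": 2, "angle": 0.35})
print(received)  # [{'joint': 2, 'angle': 0.35}]
```

Because nodes only agree on topic names and message formats, a lab can swap in its own controller or sensor without touching the rest of the system, which is what makes sharing code across institutions practical.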
Ravens have been deployed to biorobotics labs around the country, including Harvard, Johns Hopkins University, the University of Nebraska-Lincoln, UCLA, and UC Berkeley (Go Bears). Some things Raven can do:
- A Harvard team synchronizes the robot with beating heart tissue. When the surgical instrument reaches the tissue, a control loop is closed around the two so that the instrument automatically moves in tandem with the beating heart. (It’s almost as if the surgeon is working on a stationary heart.)
- A Johns Hopkins team is investigating whether Raven could make invasive operations safer. In functional endoscopic sinus surgery, a surgeon guides an endoscope to find and treat nasal polyps and sinus inflammation, in territory close to the eyes and brain. Raven uses imaging to identify the 3D location of the endoscope’s tip as the surgeon maneuvers through the sinuses.
- By superimposing the endoscope’s view on standard medical images, the system spares surgeons from constantly glancing back and forth between a map of the patient’s particular anatomy and the view through the endoscope.
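The beating-heart idea above can be simulated in a few lines. This is an assumed, simplified model, not the Harvard team's controller: the heart surface moves sinusoidally, the tool is servoed to the most recent measurement (one sample of latency), and we compare the worst tool-to-tissue error against a tool that simply holds still.

```python
import math

def heart_position(t, amplitude=5.0, rate_hz=1.2):
    """Toy 1D model of heart-surface motion (mm) at time t (s)."""
    return amplitude * math.sin(2 * math.pi * rate_hz * t)

def residual_error(track_heart, duration=1.0, dt=0.005):
    """Worst tool-to-tissue distance over a run, with or without tracking."""
    worst, tool = 0.0, 0.0
    for i in range(int(duration / dt)):
        tissue = heart_position(i * dt)
        if track_heart:
            # Servo the tool to the last available measurement.
            tool = heart_position((i - 1) * dt)
        worst = max(worst, abs(tool - tissue))
    return worst

print(residual_error(True) < residual_error(False))  # True: tracking wins
```

Even with a sample of delay, the tracking tool stays within a fraction of a millimeter of the tissue, while the stationary tool sees the full 5 mm excursion, which is why the heart appears stationary to the surgeon once the loop is closed.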
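Overlaying the endoscope's tip on an anatomy map boils down to a coordinate-frame change: the tip is localized in the tracker's frame, then mapped into the preoperative image's frame with a rigid transform. The sketch below shows that step in 2D for brevity; the rotation angle and translation are invented values, not a real calibration.

```python
import math

def to_image_frame(point, angle_rad, translation):
    """Rigid transform: rotate a 2D point about the origin, then translate."""
    x, y = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    tx, ty = translation
    return (c * x - s * y + tx, s * x + c * y + ty)

# Endoscope tip measured in the tracker frame (mm), mapped into the
# image frame with a hypothetical 90-degree rotation and (5, 5) offset.
tip_tracker = (10.0, 0.0)
tip_image = to_image_frame(tip_tracker, math.pi / 2, (5.0, 5.0))
print(tip_image)  # approximately (5.0, 15.0)
```

In practice the transform comes from registering the tracker to the patient's CT scan, and the result is drawn directly onto the anatomy map so the surgeon sees the tip in context.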
Watch a video of Raven at work.
“To do superhuman surgery will require robots to have enough intelligence to recognize what the surgeon is doing and to offer appropriate assistance, remotely setting up no-fly zones for safety, superimposing images,” says Gregory Hager of Johns Hopkins. “All of that is coming down the road.”
[Via Popular Mechanics]
Image: University of Washington