Researchers take aim at next-generation image-guided surgery systems

Just as virtual guided imaging is beginning to pay off in diagnostics, a group of techno-wizards has taken aim at the next goal: putting powerful, intuitive, real-time image guidance systems into surgeons' hands. With the U.S. government dangling major research funds, the systems may come about sooner rather than later.

The rapidly evolving field of image-guided surgery promises to advance two of medicine's most cherished goals: lower costs and better patient outcomes. Driven by a new generation of navigational technology and clinical applications, these systems will bring dramatic changes to healthcare, according to Ramin Shahidi, Ph.D., director of the Stanford University Medical School's Image Guidance Laboratories in Stanford, CA.

"When images are fused and registered to the patient's anatomy in the operating room to give surgeons x-ray vision, then you're talking about a whole new paradigm in image guidance -- and the evolution will turn into a revolution," he said. "That's something we've been doing...and in fact everyone on the panel has been moving toward."

Shahidi co-moderated a June 30 panel discussion on the evolution of image-guided surgery at the 14th annual Computer Assisted Radiology and Surgery (CARS 2000) conference in San Francisco.

The panelists, a virtual who's who of image-guidance pioneers, talked about what's been accomplished and what must still be done in order to build the next generation of therapeutic systems.

Since the 1970s, researchers have faced daunting technical, mathematical, and computational challenges in their efforts to tie radiologic images to surgery.

In order for systems to work, image data must be segmented as a first step in producing clinically relevant 3-D images. The images are then registered, or aligned, with the patient's anatomy. Surgical instruments must be tracked using video cameras, CCDs, LEDs, or electromagnetic devices. Finally, the images must be postprocessed in order to create visually useful displays for surgeons.
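
To give a flavor of the registration step, the following minimal sketch aligns a handful of image-space fiducials to their patient-space counterparts with a least-squares rigid fit; the coordinates and helper names are made up for illustration and are not drawn from any system discussed here.

```python
import numpy as np

def rigid_register(image_pts, patient_pts):
    """Least-squares rigid alignment (rotation R, translation t) that maps
    image-space fiducials onto patient-space fiducials (Kabsch/Horn method)."""
    img_c = image_pts.mean(axis=0)
    pat_c = patient_pts.mean(axis=0)
    H = (image_pts - img_c).T @ (patient_pts - pat_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
    t = pat_c - R @ img_c
    return R, t

# Hypothetical fiducial coordinates in mm: markers located in the preoperative
# scan and the same markers touched with a tracked probe in the operating room.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([5.0, -3.0, 12.0])
image_fiducials = np.array([[10.0, 20.0, 30.0],
                            [40.0, 25.0, 32.0],
                            [15.0, 60.0, 28.0],
                            [35.0, 55.0, 70.0]])
patient_fiducials = image_fiducials @ R_true.T + t_true

R, t = rigid_register(image_fiducials, patient_fiducials)
print("recovered rotation:\n", R)
print("recovered translation:", t)
```

In a clinical system the fit would be computed from tracked fiducial measurements and then checked against independent targets, a validation issue the panelists return to below.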

The work has produced many useful applications to date, including CCD camera-equipped endoscopes; the StealthStation, first deployed at St. Louis University in 1990; and the intraoperative MRI unit developed at the Harvard Medical School Surgical Planning Laboratory headed by panelist Dr. Ron Kikinis. Although much has been accomplished, the panelists said, it's only a start.

The work ahead

Doctors and engineers will need to work together to develop advanced volumetric visualization tools to construct the images, and intuitive human/machine interfaces to guide the surgery. Performance metrics must be advanced and standardized in order to translate human movement into machine movement, and coordinate systems must be developed to tie the myriad technologies together, the panelists said.
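
As a rough sketch of what a common coordinate framework involves, the example below chains homogeneous transforms to carry an instrument tip from tool coordinates through tracker and patient space into image coordinates; all of the transforms, offsets, and names are assumed for illustration only.

```python
import numpy as np

def make_transform(R, t):
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical transforms, e.g. from a probe calibration, a tracker mount
# survey, and a fiducial registration:
T_tracker_from_tool    = make_transform(np.eye(3), [0.0, 0.0, 150.0])
T_patient_from_tracker = make_transform(np.eye(3), [-20.0, 5.0, 0.0])
T_image_from_patient   = make_transform(
    np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]]),
    [5.0, -3.0, 12.0])

# Composing the chain maps a point on the instrument into image coordinates,
# which is what lets a display overlay the tool on the preoperative scan.
T_image_from_tool = T_image_from_patient @ T_patient_from_tracker @ T_tracker_from_tool

tip_in_tool = np.array([0.0, 0.0, 0.0, 1.0])   # tip at the tool-frame origin
tip_in_image = T_image_from_tool @ tip_in_tool
print("instrument tip in image coordinates (mm):", tip_in_image[:3])
```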

The systems will need to be highly automated, and able to execute human and machine commands accurately and precisely during procedures. Surgeons will require high-quality, real-time feedback of both data and images while avoiding information overload. Finally, intuitive machine/user interfaces must be crafted to manipulate the machines accurately during surgery.

The potential payoff is immense. Doctors will be able to target and treat pathologies more precisely and completely than ever, and develop new procedures that were simply not possible before, the panelists said.

Among them, Dr. William Bucholz, director of image-guided surgery at the St. Louis University School of Medicine, was the chief surgeon behind the 1990 development of the current state-of-the-art guided surgery system, the StealthStation (now manufactured by Medtronic Sofamor Danek of Memphis). He said many factors must coalesce in order to advance the systems.


Image: FluoroNav StealthStation, courtesy of Medtronic Sofamor Danek.

"I think a critical shared resource for any group looking at this area is to have an open operating room; that is, an operating room capable of using the prototypes very quickly. The surgical environment is unique, and almost impossible to replicate in any type of engineering or modeling environment," Bucholz said.

At the same time, engineers must carefully balance technical progress against the requirements of the clinical environment to ensure patient safety, he said.

"Our surgical interventions are the few minutes patients have to be either cured of the disease or live with the disease debilitated due to problems that occurred in surgery. The doctors and surgeons really have to be involved, and really have to want this [technology] developed."

Next-generation machines will require coordinated sharing of information among researchers to save resources now wasted by continually reinventing things, he said, as well as federally funded research to support defined research areas across university lines.

"It's a rare institution that will have both the medical and engineering group dedicated to the area of image guidance," Bucholz said. Image-guided surgery offers the opportunity to build bridges between engineering and medicine -- and between the past and the future, he said.

Image guidance in action

Richard Robb, Ph.D., biomedical imaging director and professor of molecular neuroscience at the Mayo Clinic in Rochester, MN, has devoted nearly 30 years to the core problems of imaging systems, including image segmentation and registration, and the measurement, modeling, and rapid rendering of image data.

The work has led to the development of commercially available software that enables advanced multidimensional visualization and analysis of organs such as the heart, Robb said.

"We're able to obtain real-time images of the beating heart simultaneously with electrical activation mappings, with electro-baskets placed in the patient, to create structurofunctional displays of activation and anatomy at the same time, and then use those displays to guide the electrophysiologist or cardiologist to the tissue [that needs] to be ablated," Robb said. The patient spends less time in the OR and has a better result, he added.

Advancement of the systems will require better coordination and communication of research efforts, and standardization in areas such as coordinate systems, which are now completely different for every guidance system, he said. A core problem is that the accuracy and precision of the new systems can't be validated without years of basic work in performance metrics.
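
One concrete form such a performance metric can take is registration error measured on points held out of the fit. The sketch below, with entirely hypothetical numbers, contrasts error at the fiducials used for registration with error at held-out anatomical targets, the figure that matters clinically.

```python
import numpy as np

def rms_error(R, t, source_pts, measured_pts):
    """RMS distance (mm) between source points mapped by the estimated rigid
    transform (R, t) and the corresponding points measured on the patient."""
    residuals = np.linalg.norm(source_pts @ R.T + t - measured_pts, axis=1)
    return float(np.sqrt(np.mean(residuals ** 2)))

# Hypothetical numbers: an estimated registration and two sets of check points.
R_est = np.eye(3)
t_est = np.array([0.5, -0.2, 1.0])
fiducials_image   = np.array([[10.0, 20.0, 30.0], [40.0, 25.0, 32.0], [15.0, 60.0, 28.0]])
fiducials_patient = fiducials_image + t_est + 0.3    # points used in the fit
targets_image     = np.array([[25.0, 40.0, 45.0], [30.0, 35.0, 50.0]])
targets_patient   = targets_image + t_est + 0.8      # held-out anatomical targets

print("fiducial registration error (mm):", rms_error(R_est, t_est, fiducials_image, fiducials_patient))
print("target registration error (mm):  ", rms_error(R_est, t_est, targets_image, targets_patient))
```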

More data of all kinds needs to be created and shared among institutions, he said. This will require both real data and phantom studies, which allow researchers to control data parameters to get a better idea of how the algorithms are performing. The resulting information should be compared with data from other centers and multiple trials to validate the methods as robustly as possible for specificity, sensitivity, and reproducibility, as well as more subjective factors.

"The new techniques have sometimes very compelling advantages in comparison to other techniques, and it's really a shame that we don't start using them on patients...for lack of what we call FDA-level validation," Robb said.

Dr. Kikinis also noted a lack of validation, and said researchers must begin to share resources in order to develop robust algorithms that can be used in multiple applications. "Particularly in image segmentation we often lack a gold standard," he said.
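
As a toy illustration of scoring a segmentation against a gold standard, the sketch below compares a hypothetical automatic mask with a reference mask using Dice overlap together with the sensitivity and specificity Robb mentioned; the arrays are invented for the example.

```python
import numpy as np

def segmentation_scores(predicted, reference):
    """Dice overlap, sensitivity, and specificity of a binary segmentation
    against a gold-standard (reference) mask."""
    predicted = predicted.astype(bool)
    reference = reference.astype(bool)
    tp = np.sum(predicted & reference)
    fp = np.sum(predicted & ~reference)
    fn = np.sum(~predicted & reference)
    tn = np.sum(~predicted & ~reference)
    dice = 2.0 * tp / (2.0 * tp + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return dice, sensitivity, specificity

# Hypothetical 2-D slices standing in for a full 3-D volume: a reference mask
# drawn by an expert and an automatic segmentation that slightly over-segments.
reference = np.zeros((8, 8), dtype=bool)
reference[2:6, 2:6] = True
predicted = np.zeros((8, 8), dtype=bool)
predicted[2:7, 2:6] = True

dice, sens, spec = segmentation_scores(predicted, reference)
print(f"Dice {dice:.3f}, sensitivity {sens:.3f}, specificity {spec:.3f}")
```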

Robb said technological convergence will bring about such wonders as the tracking of surgeons' hands without gloves, smart rooms, voice-controlled systems, and scores of other innovations. As a centralized institution, the U.S. government will play a crucial role in coordinating their development, he said, in part by funding pilot projects and centers of excellence.

Toward a plan of action

The panel identified several research opportunities that will result from a new push to develop image-guided surgery systems, including:

  • Standards for image segmentation, from acquisition to interpretation
  • Standards for assessing system accuracy and precision
  • Intraoperative and real-time data visualization and registration for multiple modalities
  • Tissue deformation compensation models (see the sketch after this list)
  • Precise mathematical models for tracking and registration
  • Automated and smart tools for machine and human interfaces
  • New clinical applications to take advantage of the technologies
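
For the tissue deformation item above, one common modeling approach (offered here only as an illustration, not as the panel's prescription) is to interpolate the displacement of a few intraoperatively measured landmarks across the rest of the volume; the sketch below does this with a thin-plate spline and made-up numbers.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical landmarks: preoperative positions and the same points measured
# intraoperatively after the tissue has shifted (e.g. brain shift), in mm.
preop = np.array([[10.0, 10.0, 10.0],
                  [40.0, 12.0, 15.0],
                  [15.0, 45.0, 12.0],
                  [38.0, 42.0, 30.0],
                  [25.0, 25.0, 50.0]])
intraop = preop + np.array([[2.0, 0.5, -1.0],
                            [1.5, 0.0, -0.5],
                            [2.5, 1.0, -1.5],
                            [1.0, 0.5, -0.5],
                            [0.5, 0.0,  0.0]])

# Thin-plate-spline interpolation of the measured displacements estimates how
# any other point in the preoperative image has moved.
deformation = RBFInterpolator(preop, intraop - preop, kernel="thin_plate_spline")

query = np.array([[20.0, 20.0, 20.0], [35.0, 30.0, 25.0]])
print("estimated displacement (mm):\n", deformation(query))
```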

Lawrence Clarke, branch chief for imaging technology development for the Biomedical Imaging Program (BIP) of the National Cancer Institute (part of the National Institutes of Health), was on hand to discuss the U.S. agency's plans to promote the development of image-guided systems through grants aimed at the research community, and especially at small businesses developing related technologies.

New funds resulting from growing political support for cancer research mean that BIP will bestow about $100 million in imaging-related research funding this year, Clarke said, with budgets expected to continue growing over the next several years.

Clarke is appearing at medical meetings across the U.S. in order to solicit ideas from researchers, small businesses, and the imaging industry on how research efforts should be organized -- and the money spent -- to promote development of the systems.

By Eric Barnes
AuntMinnie.com staff writer
July 17, 2000


Copyright © 2000 AuntMinnie.com
