A system for microscope-assisted guided interventions

Stereotact Funct Neurosurg. 1999;72(2-4):107-11. doi: 10.1159/000029708.

Abstract

We present a system for surgical navigation in which stereo overlays, aligned to the operative scene, are displayed in the operating microscope. This augmented reality system provides 3D information about nearby structures and offers a significant advance over pointer-based guidance, which gives only the location of a single point and requires the surgeon to look away from the operative scene. A previous version of this system demonstrated feasibility, but it became clear that very high alignment accuracy was required to achieve convincing guidance through the magnified microscope view. We have made progress on several aspects of the system, including automated calibration, error simulation, bone-implanted fiducials and a dental attachment for tracking. We have performed experiments to establish the visual display parameters required to perceive overlaid structures beneath the operative surface. Easy perception of real and virtual structures with the correct transparency has been demonstrated both in the laboratory and through the microscope. The result is a system with a predicted accuracy of 0.9 mm and phantom errors of 0.5 mm. In clinical practice, errors are 0.5-1.5 mm, rising to 2-4 mm when brain deformation occurs.
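The abstract does not give implementation details, but the fiducial-based registration it mentions is conventionally computed as a least-squares rigid alignment between corresponding point sets, with the residual used as a fiducial registration error. The following Python/NumPy sketch illustrates that standard SVD-based solution (Arun/Horn); it is only an illustrative assumption about how such a step might be computed, and all function and variable names are hypothetical, not taken from the paper.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points.

    src, dst: (N, 3) arrays of corresponding fiducial coordinates, e.g.
    bone-implanted fiducials localized in the preoperative image (src)
    and in tracked physical space (dst). Standard SVD solution.
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_mean - R @ src_mean
    return R, t

def fiducial_registration_error(src, dst, R, t):
    """RMS distance between transformed src fiducials and their dst counterparts."""
    residuals = dst - (src @ R.T + t)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())

if __name__ == "__main__":
    # Hypothetical example: four image-space fiducials and their noisy
    # counterparts measured in tracked physical space (millimetres).
    rng = np.random.default_rng(0)
    image_pts = np.array([[0.0, 0.0, 0.0],
                          [50.0, 0.0, 0.0],
                          [0.0, 60.0, 0.0],
                          [0.0, 0.0, 40.0]])
    true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))    # random orthogonal matrix
    if np.linalg.det(true_R) < 0:
        true_R[:, 0] *= -1                               # make it a proper rotation
    true_t = np.array([10.0, -5.0, 30.0])
    physical_pts = image_pts @ true_R.T + true_t + rng.normal(scale=0.3, size=(4, 3))

    R, t = rigid_register(image_pts, physical_pts)
    print("FRE (mm):", fiducial_registration_error(image_pts, physical_pts, R, t))
```

In a navigation system of this kind, the recovered transform would be combined with microscope calibration to place the overlays; the millimetre-level errors quoted in the abstract refer to the end-to-end system, not to this registration step alone.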

MeSH terms

  • Bone Cysts / pathology
  • Bone Cysts / surgery
  • Calibration
  • Computer Simulation
  • Equipment Design
  • Facial Paralysis / surgery
  • Feasibility Studies
  • Geniculate Ganglion / surgery
  • Humans
  • Intracranial Arteriovenous Malformations / pathology
  • Intracranial Arteriovenous Malformations / surgery
  • Intraoperative Care
  • Man-Machine Systems
  • Microscopy / instrumentation*
  • Microscopy / methods
  • Models, Anatomic
  • Neurosurgical Procedures / instrumentation
  • Neurosurgical Procedures / methods*
  • Preoperative Care
  • Prostheses and Implants
  • Stereotaxic Techniques / instrumentation*