Neurochirurgie 60 (2014) 304–306


Technical note

Augmented reality-assisted skull base surgery
Chirurgie de la base du crâne assistée par la réalité augmentée

I. Cabrilo a,∗, A. Sarrafzadeh a, P. Bijlenga a, B.N. Landis b, K. Schaller a

a Neurosurgery division, department of clinical neurosciences, faculty of medicine, Geneva university medical center, rue Gabrielle-Perret-Gentil 4, 1211 Geneva 14, Switzerland
b Rhinology-olfactology unit, division of otorhinolaryngology, head and neck surgery, department of clinical neurosciences, faculty of medicine, Geneva university medical center, rue Gabrielle-Perret-Gentil 4, 1211 Geneva 14, Switzerland

Article history: Received 7 March 2014; received in revised form 29 May 2014; accepted 19 July 2014; available online 20 September 2014.

Keywords: Neuronavigation; Augmented reality; Skull base surgery; Image-guided surgery

Abstract

Neuronavigation is widely considered a valuable tool during skull base surgery. Advances in neuronavigation technology, with the integration of augmented reality, present advantages over traditional point-based neuronavigation. However, this development has not yet made its way into routine surgical practice, possibly due to a lack of acquaintance with these systems. In this report, we illustrate the usefulness and easy application of augmented reality-based neuronavigation through a case example of a patient with a clivus chordoma. We also demonstrate how augmented reality can help throughout all phases of a skull base procedure, from the verification of neuronavigation accuracy to intraoperative image-guidance.

© 2014 Elsevier Masson SAS. All rights reserved.


∗ Corresponding author.
E-mail address: [email protected] (I. Cabrilo).
http://dx.doi.org/10.1016/j.neuchi.2014.07.001

1. Introduction

The use of intraoperative neuronavigation is widely considered a mainstay of skull base procedures: first, because it helps the surgeon orient within the complex anatomy of the cranial base; second, because neuronavigation is less subject to brain shift during such procedures [1]. However, standard neuronavigation is based on the use of a bayonet probe – or, more recently, electromagnetic devices – requiring the surgeon to look away from the operating microscope towards the neuronavigation workstation and to mentally match neuronavigational image data to what is actually seen in the operating field. Furthermore, placing a bayonet probe into a deep-seated operating field can not only be troublesome, but also potentially hazardous.

Augmented reality-based neuronavigation, as described in this report, is a practical solution to this problem, as preoperatively segmented structures can be injected into the microscope's ocular for simultaneous visualization of both the neuronavigation data and the real-time operating field [2–5]. However, although this technology has been available for several years, it has not yet found its way into daily surgical routine.

Fig. 1. A. Sagittal sequence of a preoperative gadolinium-enhanced T1-weighted MRI of a patient with a clivus chordoma recurrence extending into the retropharyngeal space. B and C. Intraoperative image-injection into the operating microscope's ocular of the patient's segmented face upon his real-world face, demonstrating the accuracy of patient-neuronavigation co-registration; the transparency of the injected images can be accentuated (B) or decreased (C). D. 3D image-injection of the tumour (in yellow) and of both vertebral arteries (in red). E. Same angle of view as in (D), after placement of the mouth and soft-palate retractors: image-injection of the tumour guides resection; here the image of the tumour is injected in 2D, where the tumour in the plane of focus appears in full colour and the tumour below the plane of focus is circumscribed by a dotted line. Note the 3D image-injection of both internal carotid arteries (upper right corner and left side of the picture).

Possibly, this owes to a perception among many surgeons that the setup is complicated and offers no advantage over traditional neuronavigation. In this paper, we illustrate the easy application and usefulness of augmented reality-guided surgery during a skull base procedure, using a commercially available system found in many operating theaters; furthermore, we demonstrate how image-injection can also be used to verify neuronavigation co-registration from the very start of the procedure.

2. Clinical case

A 46-year-old male patient was admitted to our department for a recurrence of an inferior clivus chordoma with retropharyngeal extension (Fig. 1A). He had previously undergone surgery on three occasions (transoral, transmaxillary Le Fort I and transnasal approaches) and had received postoperative radiotherapy totaling 74 Gy. Because of the significant tumour regrowth demonstrated on radiological follow-up, he recently underwent a combined transoral and endoscopic transnasal redo surgery.

The patient was placed on the operating table and his head was immobilized in a head-holder carrying a neuronavigation reference star. Patient-neuronavigation co-registration was performed using surface-matching systems (Z-touch® and Softouch®, Kolibri™; BrainLAB, Feldkirchen, Germany). A reference star was then placed on the neuronavigable operating microscope (Zeiss Pentero 600; Zeiss, Oberkochen, Germany) and microscope calibration was performed, centered on the patient's reference star. This process of neuronavigation co-registration of the patient and the microscope lasted approximately 10 min.

Structures of interest (the patient's skin, the tumour, and the carotid and vertebral arteries) were preoperatively segmented from high-definition thin-slice MRI and CT scans using BrainLAB's iPlan platform (BrainLAB, Feldkirchen, Germany); segmentation took approximately 15 min with the automatic segmentation function. These structures were then selectively injected into the microscope's ocular, either in 2D or in 3D. They could be turned on or off during the procedure and their transparency level could be changed (Fig. 1B and C).
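The adjustable transparency mentioned above amounts to simple alpha blending of the rendered structure with the live microscope image. The following minimal Python/NumPy sketch is purely illustrative – it is not the BrainLAB implementation, and the function and parameter names (inject_overlay, alpha, the synthetic frame) are hypothetical – but it shows the principle behind Fig. 1B and C.

```python
# Minimal illustration (not the commercial system's code) of blending a
# segmented structure into a video frame with adjustable transparency.
import numpy as np

def inject_overlay(frame, mask, color=(255, 255, 0), alpha=0.4):
    """Blend a binary segmentation mask into an RGB frame.

    frame : (H, W, 3) uint8 image, e.g. from the microscope video stream.
    mask  : (H, W) boolean array, True where the projected structure lies.
    color : RGB colour of the injected structure (yellow for the tumour here).
    alpha : transparency level, 0.0 = invisible, 1.0 = fully opaque.
    """
    out = frame.astype(np.float32)
    overlay = np.array(color, dtype=np.float32)
    out[mask] = (1.0 - alpha) * out[mask] + alpha * overlay
    return out.astype(np.uint8)

# Example: a synthetic 480x640 frame with a circular "tumour" mask.
frame = np.full((480, 640, 3), 80, dtype=np.uint8)
yy, xx = np.mgrid[0:480, 0:640]
mask = (yy - 240) ** 2 + (xx - 320) ** 2 < 50 ** 2
faint = inject_overlay(frame, mask, alpha=0.2)   # accentuated transparency (cf. Fig. 1B)
strong = inject_overlay(frame, mask, alpha=0.8)  # decreased transparency (cf. Fig. 1C)
```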

3. Discussion

Augmented reality is defined as the overlay of virtual images upon real-world structures; when it is used during surgery, preoperative image datasets, e.g. CT or MRI, are projected onto the operating field for image-guidance [2]. A broad variety of systems has been reported in the literature, based on digital photograph superposition [6], image projection with a projector [7], mirror reflection [8], augmented image-video superposition on a separate monitor [9–12], endoscopy [13], a tablet PC [14], integral videography [15], a head-mounted operating binocular [16], or image-injection into the operating microscope [3–5,17], as in this report. In this setup, the microscope itself serves as a virtual neuronavigation pointer whose tip is, in effect, the microscope's point of focus.

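To make the "microscope as a virtual pointer" idea concrete, the sketch below shows, under simplifying assumptions, how a point of a preoperatively segmented structure could be mapped into the eyepiece view: a rigid registration transform takes it from image (MRI/CT) space to patient space, the tracked microscope pose takes it to camera space, and a pinhole projection – whose parameters would in practice be derived from the microscope's calibration, zoom and focus – yields pixel coordinates. All transforms, intrinsics and names here are illustrative placeholders, not the commercial system's internals.

```python
# Conceptual sketch of projecting a registered 3D point into the microscope view.
import numpy as np

def to_homogeneous(p):
    return np.append(p, 1.0)

def project_point(p_image, T_patient_from_image, T_camera_from_patient, K):
    """Return pixel coordinates (u, v) of a 3D point given in image-data space."""
    p_patient = T_patient_from_image @ to_homogeneous(p_image)   # patient co-registration
    p_camera = T_camera_from_patient @ p_patient                 # tracked microscope pose
    x, y, z = p_camera[:3]
    u, v, w = K @ np.array([x, y, z])                            # pinhole projection
    return u / w, v / w

# Illustrative 4x4 rigid transforms and 3x3 intrinsics (zoom would change the focal length).
T_patient_from_image = np.eye(4)
T_camera_from_patient = np.eye(4)
T_camera_from_patient[2, 3] = 300.0                              # ~300 mm working distance
K = np.array([[2000.0, 0.0, 320.0],
              [0.0, 2000.0, 240.0],
              [0.0, 0.0, 1.0]])
print(project_point(np.array([10.0, -5.0, 0.0]), T_patient_from_image, T_camera_from_patient, K))
```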

Whereas certain reported setups rely on manual adjustment for image alignment [6–9], our system integrates spatial, focus and zoom parameters in real-time, so that the injected images appear in accordance with all of these, relative to the navigated patient. Furthermore, both the real-world and the augmented environment are seen through the same channel of vision, i.e. the microscope, allowing for 3D stereoscopic visualization, which is also provided by a small number of other systems [15,16]. However, in contrast to most previous reports [6–12,14–16], no additional hardware is required beyond the neuronavigation station and the microscope, which further contributes to the ergonomics of the system.

Prior to draping, image-injection into the microscope's eyepiece of a 3D model of the patient's face was used to visually estimate the accuracy of patient-neuronavigation co-registration (Fig. 1B and C). A shift is easily noticed when the 3D virtual image is overlaid on the corresponding real-world structure, and its spatial perception is more intuitive than with the neuronavigation pointer. If a shift occurs, the surgeon can correct the mismatch by recalibrating the microscope on the patient's reference star. It is conceivable, however, that further technological development could allow automatic mismatch recognition and correction.

Once the accuracy of co-registration is confirmed, image-injection of the tumour can be used to optimize the position of the patient's head, as its "see-through" property allows the surgeon to anticipate the angle of surgery (Fig. 1D). After draping, image-injection of the tumour indicated where to incise the mucosa and guided resection by giving real-time information on the three-dimensional extent of the lesion (Fig. 1E). This was of particular interest in the presented patient because fibrosis from previous operations had cleaved the tumour into layers in which depth was difficult to appreciate. Furthermore, image-injection of the neighboring carotid and vertebral arteries made it possible to "keep an eye" on these structures during the course of resection.

We believe that augmented reality-assisted neuronavigation has a definite advantage over traditional neuronavigation during skull base procedures, in that it largely spares the surgeon the task of mentally merging the neuronavigational data with the operating field: traditional neuronavigation is point-based, whereas augmented reality-based neuronavigation integrates neuronavigational and real-world data into one comprehensive view. Furthermore, image-injection allows the surgeon to keep his or her eyes on the surgical field at all times. Although microscope calibration adds a step to the neuronavigation setup, the technique does not disrupt the surgical workflow and may even save operating time, because it provides real-time information and increases the surgeon's comfort. The cumbersome bayonet probe, as well as electromagnetic probes, could therefore become largely obsolete.

The transoral phase of the operation was followed by a same-session endoscopic transnasal approach. Although endoscopes can be neuronavigated [18,19], only a few reports of endoscopic image-injection exist [13,20], indicating that its clinical use is currently limited; endoscopy-assisted skull base surgery still seems to rely on traditional neuronavigation – with neuronavigation probes inserted through already cramped nostrils – leaving space for novel technological development [21].
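The automatic mismatch recognition speculated on above could, in one very simple form, amount to re-acquiring a set of surface points and computing their residual distance to the positions predicted by the current registration. The sketch below is a hypothetical illustration of such a check (the function name, the 2 mm shift and the 1.5 mm tolerance are invented for the example); it is not a feature of the system described in this report.

```python
# Hypothetical co-registration check: RMS residual between re-acquired surface
# points and the positions predicted by the current patient registration.
import numpy as np

def registration_residual(measured_points, registered_points):
    """Root-mean-square distance (in mm) between corresponding point pairs."""
    return float(np.sqrt(((measured_points - registered_points) ** 2).sum(axis=1).mean()))

# Toy example: re-acquired skin points that sit 2 mm away from where the
# registered model predicts them, simulating a co-registration shift.
rng = np.random.default_rng(0)
predicted = rng.uniform(-80.0, 80.0, size=(200, 3))      # predicted positions in patient space (mm)
measured = predicted + np.array([2.0, 0.0, 0.0])          # systematic 2 mm shift
error = registration_residual(measured, predicted)
if error > 1.5:                                            # illustrative tolerance threshold
    print(f"co-registration mismatch ≈ {error:.1f} mm – consider recalibrating")
```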
4. Conclusion

Augmented reality-based neuronavigation is a useful asset during skull base procedures, as it directly overlays 3D neuronavigational data onto the operating field. It helps during all phases of surgery – from verification of neuronavigation accuracy to intraoperative orientation – and represents an intuitive form of image-guided surgery, especially in previously operated patients. Furthermore, integrating image-injection into the endoscope could be an interesting and practical technological development in the current general effort to render neurosurgical procedures minimally invasive.

Disclosure of interest

The authors declare that they have no conflicts of interest concerning this article.

References

[1] Sure U, Alberti O, Petermeyer M, Becker R, Bertalanffy H. Advanced image-guided skull base surgery. Surg Neurol 2000;53(6):563–72.
[2] Shuhaiber JH. Augmented reality in surgery. Arch Surg 2004;139(2):170–4.
[3] Edwards PJ, King AP, Maurer Jr CR, de Cunha DA, Hawkes DJ, Hill DL, et al. Design and evaluation of a system for microscope-assisted guided interventions (MAGI). IEEE Trans Med Imaging 2000;19(11):1082–93.
[4] Brinker T, Arango G, Kaminsky J, Samii A, Thorns U, Vorkapic P, et al. An experimental approach to image-guided skull base surgery employing a microscope-based neuronavigation system. Acta Neurochir 1998;140(9):883–9.
[5] Nijmeh AD, Goodger NM, Hawkes D, Edwards PJ, McGurk M. Image-guided navigation in oral and maxillofacial surgery. Br J Oral Maxillofac Surg 2005;43(4):294–302.
[6] Lovo EE, Quintana JC, Puebla MC, Torrealba G, Santos JL, Lira IH, et al. A novel, inexpensive method of image coregistration for applications in image-guided surgery using augmented reality. Neurosurgery 2007;60(4 Suppl. 2):366–71.
[7] Mahvash M, Besharati Tabrizi L. A novel augmented reality system of image projection for image-guided neurosurgery. Acta Neurochir 2013;155(5):943–7.
[8] Fichtinger G, Deguet A, Masamune K, Balogh E, Fischer GS, Mathieu H, et al. Image overlay guidance for needle insertion in CT scanner. IEEE Trans Biomed Eng 2005;52(8):1415–24.
[9] Gleason PL, Kikinis R, Altobelli D, Wells W, Alexander 3rd E, Black PM, et al. Video registration virtual reality for nonlinkage stereotactic surgery. Stereotact Funct Neurosurg 1994;63(1–4):139–43.
[10] Kockro RA, Tsai YT, Ng I, Hwang P, Zhu C, Agusanto K, et al. Dex-ray: augmented reality neurosurgical navigation with a handheld video probe. Neurosurgery 2009;65(4):795–807.
[11] Low D, Lee CK, Dip LL, Ng WH, Ang BT, Ng I. Augmented reality neurosurgical planning and navigation for surgical excision of parasagittal, falcine and convexity meningiomas. Br J Neurosurg 2010;24(1):69–74.
[12] Inoue D, Cho B, Mori M, Kikkawa Y, Amano T, Nakamizo A, et al. Preliminary study on the clinical application of augmented reality neuronavigation. J Neurol Surg A Cent Eur Neurosurg 2013;74(2):71–6.
[13] Kawamata T, Iseki H, Shibasaki T, Hori T. Endoscopic augmented reality navigation system for endonasal transsphenoidal surgery to treat pituitary tumors: technical note. Neurosurgery 2002;50(6):1393–7.
[14] Deng W, Li F, Wang M, Song Z. Easy-to-use augmented reality neuronavigation using a wireless tablet PC. Stereotact Funct Neurosurg 2014;92(1):17–24.
[15] Suenaga H, Hoang Tran H, Liao H, Masamune K, Dohi T, Hoshi K, et al. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: a pilot study. Int J Oral Sci 2013;5(2):98–102.
[16] Birkfellner W, Figl M, Huber K, Watzinger F, Wanschitz F, Hummel J, et al. A head-mounted operating binocular for augmented reality visualization in medicine: design and initial evaluation. IEEE Trans Med Imaging 2002;21(8):991–7.
[17] Cabrilo I, Bijlenga P, Schaller K. Augmented reality in the surgery of cerebral aneurysms: a technical report. Neurosurgery 2014;10(Suppl. 2):252–61.
[18] Rhoten RL, Luciano MG, Barnett GH. Computer-assisted endoscopy for neurosurgical procedures: technical note. Neurosurgery 1997;40(3):632–7.
[19] Mert A, Gan LS, Knosp E, Sutherland GR, Wolfsberger S. Advanced cranial navigation. Neurosurgery 2013;72(Suppl. 1):43–53.
[20] Freysinger W, Gunkel AR, Thumfart WF. Image-guided endoscopic ENT surgery. Eur Arch Otorhinolaryngol 1997;254(7):343–6.
[21] Marcus HJ, Cundy TP, Hughes-Hallett A, Yang GZ, Darzi A, Nandi D. Endoscopic and keyhole endoscope-assisted neurosurgical approaches: a qualitative survey on technical challenges and technological solutions. Br J Neurosurg 2014, http://dx.doi.org/10.3109/02688697.2014.887654.
