1. What is the Future of Minimally Invasive Sinus Surgery: Computer-Assisted Navigation, 3D-Surgical Planner, Augmented Reality in the Operating Room with ‘in the Air’ Surgeon’s Commands as ‘Biomechanics’ of the New Era in Personalized Contactless Hand-Gesture Non-Invasive Surgeon-Computer Interaction?
- Author
Kubat Goranka, Berlengi Nedjeljka, Majhen Zlatko, Žagar Martin, Benić Igor, Kostelac Milan, Duspara Alen, Klapan Ivica, and Zemba Mladen
- Subjects
Computer science, Non-invasive, Biomechanics, General Medicine, Virtual reality, Sinus surgery, Planner, Human–computer interaction, Gesture recognition, Gesture control, Voice commands, Region of interest, 3D volume rendering, Leap Motion, OsiriX MD, Virtual endoscopy, Virtual surgery, Contactless surgery, Swarm intelligence, Augmented reality, Gesture
- Abstract
Research Article | Received: July 11, 2019 | Published: July 23, 2019
Corresponding author: Ivica Klapan, Klapan Medical Group Polyclinic, Ilica 191A, HR-10000 Zagreb, Croatia, EU
DOI: 10.26717/BJSTR.2019.19.003377
Abstract
Purpose: We focused on developing a personal 3D-navigation system and applying augmented reality in the operating room through personalized contactless hand-gesture, non-invasive surgeon-computer interaction, aiming at higher intraoperative safety, shorter operating time, and shorter postoperative patient recovery.
Methods: We combined video imaging, 3D anatomic fields, and spatial navigation using our original plug-in for the OsiriX platform, which enables the surgeon to use the Leap Motion (LM) sensor as an interface for camera positioning in 3D volume rendering (3DVR) and virtual endoscopy (VE) views, and integrates speech recognition as a voice-command (VC) solution, in an original way.
Results: The system supports management of 2D/3D image and video medical documentation, as well as control of marker-based virtual-reality simulation in real time during a real operation, by means of our personalized contactless “in the air” surgeon’s commands.
Conclusion: The use of modern technologies in head and neck surgery over the last 30 years (e.g., FESS, NESS, and robotic surgery) has enabled surgeons to demonstrate spatial anatomic elements in the operating field in ways that were previously quite inconceivable.
This approach has not yet been used in rhinosinusology or otorhinolaryngology and, to our knowledge, not even in general surgery. The question we must now ask is: which prerequisites, realizable in the future, must be met before our “on the fly” gesture-controlled, incisionless virtual surgical interventions can satisfy the most demanding requirements in the operating room (OR)?
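The Methods describe mapping Leap Motion hand movements to camera positioning in the 3DVR/VE views. A minimal sketch of what such a gesture-to-camera mapping could look like, written in Python for illustration only (all names, gains, and limits here are invented assumptions, not the authors' actual OsiriX plug-in, whose internals are not described in this abstract):

```python
from dataclasses import dataclass


@dataclass
class CameraPose:
    yaw: float    # degrees, rotation around the vertical axis
    pitch: float  # degrees, rotation around the lateral axis
    zoom: float   # multiplicative zoom factor


def clamp(value, lo, hi):
    """Limit a value to the range [lo, hi]."""
    return max(lo, min(hi, value))


def update_camera(pose, palm_dx, palm_dy, palm_dz,
                  gain_rot=0.5, gain_zoom=0.002):
    """Map a palm displacement (in mm, per gesture frame) to a new camera pose.

    Horizontal/vertical hand motion (palm_dx, palm_dy) rotates the virtual
    camera; motion toward or away from the sensor (palm_dz) zooms.
    Gains and clamp limits are purely illustrative.
    """
    return CameraPose(
        yaw=clamp(pose.yaw + gain_rot * palm_dx, -180.0, 180.0),
        pitch=clamp(pose.pitch + gain_rot * palm_dy, -90.0, 90.0),
        zoom=clamp(pose.zoom * (1.0 + gain_zoom * palm_dz), 0.25, 4.0),
    )


# Example: hand moved 20 mm right, 10 mm up, 50 mm toward the screen
pose = update_camera(CameraPose(0.0, 0.0, 1.0), 20.0, 10.0, 50.0)
```

In a real system, the per-frame displacements would come from the Leap Motion SDK's hand-tracking frames, and clamping keeps a jittery or oversized gesture from flipping the endoscopic view, which matters for intraoperative safety.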
- Published
- 2019