12 results for "Christopher Wedlake"
Search Results
2. Object identification accuracy under ultrasound enhanced virtual reality for minimally invasive cardiac surgery.
- Author
- Andrew D. Wiles, John Moore, Cristian A. Linte, Christopher Wedlake, Anis Ahmad, and Terry M. Peters
- Published
- 2008
3. Navigation accuracy for an intracardiac procedure using ultrasound enhanced virtual reality.
- Author
- Andrew D. Wiles, Gerard M. Guiraudon, John Moore, Christopher Wedlake, Cristian A. Linte, Daniel Bainbridge, Douglas L. Jones, and Terry M. Peters
- Published
- 2007
4. A navigation platform for guidance of beating heart transapical mitral valve repair
- Author
- John Moore, Michael W.A. Chu, Gerard M. Guiraudon, Terry M. Peters, Daniel Bainbridge, Christopher Wedlake, Maria E. Currie, Bob Kiaii, Martin Rajchl, and Rajni V. Patel
- Subjects
Beating heart, Swine, Biomedical Engineering, Less invasive, Coronary Artery Bypass (Off-Pump), Internal medicine, Cardiopulmonary bypass, Animals, Cardiac Surgical Procedures, Mitral valve repair, Surgical approach, Ultrasound, Reproducibility of Results, Surgery, Ultrasound guidance, Image-guided surgery, Computer-Assisted Surgery, Echocardiography, Cardiology, Mitral Valve
- Abstract
Traditional surgical approaches for repairing diseased mitral valves (MVs) have relied on placing the patient on cardiopulmonary bypass (on pump), stopping the heart, and accessing the arrested heart directly. However, because this approach carries the potential for adverse neurological, vascular, and immunological sequelae, less invasive beating heart alternatives are desirable. Emerging beating heart techniques have been developed to offer high-risk patients MV repair using ultrasound guidance alone, without stopping the heart. This paper describes the first porcine trials of the NeoChord DS1000 (Minnetonka, MN), employed to attach neochordae to an MV leaflet using the traditional ultrasound-guided protocol augmented by dynamic virtual geometric models. The distance errors of the tracked tool tip from the intended midline trajectory (5.2 ± 2.4 mm versus 16.8 ± 10.9 mm, p = 0.003), navigation times (16.7 ± 8.0 s versus 92.0 ± 84.5 s, p = 0.004), and total path lengths (225.2 ± 120.3 mm versus 1128.9 ± 931.1 mm, p = 0.003) were all significantly smaller with augmented ultrasound than with ultrasound-only navigation, indicating a substantial improvement in the safety and simplicity of the procedure.
(A sketch of how such tracking metrics are computed follows this entry.)
- Published
- 2012
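The guidance metrics reported in the abstract above (deviation from an intended midline trajectory and total tool path length) can be computed directly from tracked tool-tip samples. The following is a minimal sketch, assuming NumPy; all coordinates and names are invented for illustration, not data from the paper.

```python
import numpy as np

def distance_from_midline(tips, origin, direction):
    """Perpendicular distance (mm) of each tracked tool-tip sample from the
    intended midline trajectory, modeled as a line through `origin` along
    `direction`."""
    d = direction / np.linalg.norm(direction)
    offsets = tips - origin                      # vectors from the line origin
    along = offsets @ d                          # scalar projection onto the line
    nearest = origin + np.outer(along, d)        # closest points on the line
    return np.linalg.norm(tips - nearest, axis=1)

def total_path_length(tips):
    """Sum of distances between consecutive tip samples (mm)."""
    return np.linalg.norm(np.diff(tips, axis=0), axis=1).sum()

# Made-up tip samples (mm); real data would come from the tracker.
tips = np.array([[0.0, 0.0, 0.0], [1.2, 0.4, 5.1], [2.0, -0.3, 10.4]])
dev = distance_from_midline(tips, np.zeros(3), np.array([0.0, 0.0, 1.0]))
print(f"mean deviation {dev.mean():.1f} mm, path {total_path_length(tips):.1f} mm")
```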
5. Augmented reality image guidance during off-pump mitral valve replacement through the Guiraudon Universal Cardiac Introducer
- Author
- Gerard M. Guiraudon, Douglas L. Jones, Daniel Bainbridge, Cristian Linte, Danielle Pace, John Moore, Christopher Wedlake, Pencilla Lang, and Terry M. Peters
- Subjects
Pulmonary and Respiratory Medicine, Surgery, General Medicine, Cardiology and Cardiovascular Medicine
- Abstract
Objective: We report our experience with ultrasound augmented reality (US-AR) guidance for mitral valve prosthesis (MVP) implantation in the pig using off-pump, closed, beating intracardiac access through the Guiraudon Universal Cardiac Introducer attached to the left atrial appendage.
Methods: Before testing US-AR guidance, a feasibility pilot study was performed on nine pigs using US alone. US-AR guidance, first tested on a heart phantom, was subsequently used in three pigs (~65 kg) using a tracked transesophageal echocardiography probe, augmented with registration of a 3D computed tomography scan and virtual representations of the MVP and clip-delivering tool (Clipper); three further pigs were used to test feature-based registration.
Results: Navigation of the MVP was facilitated by the 3D anatomic display. AR displayed the MVP and the Clipper within the Atamai Viewer, with excellent accuracy for tool placement. Positioning the Clipper was hampered by the design of the MVP holder and Clipper. These limitations were well displayed by AR, which provided guidance for improved tool design.
Conclusions: US-AR provided informative image guidance and documented the flaws of the current implantation technology; this information could not be obtained by any other method of evaluation. These evaluations provided guidance for designing an integrated tool: an unobtrusive valve holder that allows the MVP to function properly as soon as it is positioned, combined with an anchoring system whose clips can be released one at a time, and retracted if necessary, for optimal results. The portability of real-time US-AR may prove to be the ideal practical image guidance system for all closed intracardiac interventions.
(A sketch of the transform composition behind such tool overlays follows this entry.)
- Published
- 2012
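Overlaying a tracked tool such as the Clipper in a registered CT scene comes down to composing homogeneous transforms: the tool tip is expressed in its sensor's frame, the tracker reports the sensor pose, and the registration maps tracker coordinates into CT coordinates. A minimal sketch, with all transforms and numbers invented for illustration:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Invented transforms (identity rotations for brevity):
# registration (tracker -> CT), sensor pose (sensor -> tracker),
# and the tool tip expressed in the sensor's own frame.
T_ct_from_tracker = make_transform(np.eye(3), np.array([10.0, -5.0, 2.0]))
T_tracker_from_sensor = make_transform(np.eye(3), np.array([100.0, 40.0, -30.0]))
tip_in_sensor = np.array([0.0, 0.0, 150.0, 1.0])   # homogeneous point (mm)

# Compose: sensor frame -> tracker frame -> CT frame.
tip_in_ct = T_ct_from_tracker @ T_tracker_from_sensor @ tip_in_sensor
print("tool tip in CT coordinates (mm):", tip_in_ct[:3])
```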
6. Virtual reality imaging with real-time ultrasound guidance for facet joint injection: a proof of concept
- Author
- John W. Moore, Timothy O. Wilson, Collin Clarke, Christopher Wedlake, Terry M. Peters, Su Ganapathy, Maher Salbalbal, Donald H. Lee, and Daniel Bainbridge
- Subjects
Male, Real time ultrasound, Virtual reality, Zygapophyseal Joint, Cadaver, Computer Graphics, Image Processing (Computer-Assisted), Humans, Computer vision, Ultrasonography, Phantoms (Imaging), Ultrasound, Facet joint injection, Spine, Anesthesiology and Pain Medicine, Proof of concept, Needles, Radiology, Artificial intelligence, Tomography (X-Ray Computed)
- Abstract
Facet interventions continue to be used in pain management. Computed tomographic (CT) images can be registered into a virtual world that includes images generated by an ultrasound (US) probe tracked in real time, permitting guidance of tracked needles. We acquired CT-generated three-dimensional (3D) images of two models and a cadaver. 3D representations of a US probe and needle were generated, and a magnetic system tracked both. Using the US, the 3D CT images were registered to the model/cadaver, and the images were fused on a single interface. Facet injections with radio-opaque markers were performed in the models and the cadaver, and a post-procedure CT image confirmed appropriate placement. The virtual reality system described demonstrates technical innovations that may lead to future advancements in percutaneous interventions for the management of pain.
(A sketch of mapping a tracked tip into CT voxel coordinates follows this entry.)
- Published
- 2010
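Once the CT is registered to the tracking space, a tracked needle tip can be displayed on the CT by mapping its physical position into voxel indices. A sketch under the simplifying assumption of an axis-aligned volume (no direction-cosine rotation); the geometry values are hypothetical:

```python
import numpy as np

def world_to_voxel(p_world, origin, spacing):
    """Map a physical point (mm, in CT patient coordinates) to voxel indices,
    assuming an axis-aligned CT volume with known origin and spacing."""
    return np.round((np.asarray(p_world) - origin) / spacing).astype(int)

# Hypothetical CT geometry and a registered needle-tip position.
origin = np.array([-250.0, -250.0, -100.0])    # mm, corner of the volume
spacing = np.array([0.98, 0.98, 1.25])         # mm per voxel
needle_tip_ct = np.array([12.4, -33.0, 41.7])  # mm, after registration

ijk = world_to_voxel(needle_tip_ct, origin, spacing)
print("needle tip voxel index (i, j, k):", ijk)
```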
7. Fusion of stereoscopic video and laparoscopic ultrasound for minimally invasive partial nephrectomy
- Author
- Stephen E. Pautler, Terry M. Peters, Carling L. Cheung, Christopher Wedlake, John Moore, and Anis Suriati Ahmad
- Subjects
Image-Guided Therapy, Computer science, Stereoscopy, Computer vision, Intraoperative imaging, Prostatectomy, Distortion (optics), Ultrasound, Laparoscopic ultrasound, Nephrectomy, Visualization, Endoscopy, Ultrasound imaging, Abdomen, Augmented reality, Artificial intelligence, Ultrasonography, Depth perception
- Abstract
The development of an augmented reality environment that combines laparoscopic video and ultrasound imaging for image-guided minimally invasive abdominal surgical procedures, such as partial nephrectomy and radical prostatectomy, is an ongoing project in our laboratory. Our system overlays magnetically tracked ultrasound images onto endoscopic video to create a more intuitive visualization for mapping lesions intraoperatively and to give the ultrasound image context in 3D space. By presenting data in a common environment, this system allows surgeons to visualize multimodality information without having to switch between different images. A stereoscopic laparoscope from Visionsense Limited enhances our current system by providing surgeons with additional visual information through improved depth perception. In this paper, we develop and validate a calibration method that determines the transformation between the images from the stereoscopic laparoscope and the 3D locations of structures represented by a tracked laparoscopic ultrasound probe. We first calibrate the laparoscope with a checkerboard pattern and measure the accuracy of the transformation from image space to tracking space. We then perform a target localization task using our fused environment. Our initial experience has demonstrated an RMS registration accuracy in 3D of 2.21 mm for the laparoscope and 1.16 mm for the ultrasound in a working volume of 0.125 m³, indicating that magnetically tracked stereoscopic laparoscope and ultrasound images may be appropriately combined, as long as steps are taken to ensure that the magnetic field generated by the system is not distorted by surrounding objects close to the working volume.
(A sketch of the checkerboard calibration step follows this entry.)
- Published
- 2009
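The checkerboard calibration mentioned in the abstract is commonly done with OpenCV; the sketch below shows the standard monocular version and reports the RMS reprojection error. The file pattern, board size, and square size are assumptions, and the paper's stereoscopic and tracker-to-camera calibrations are not covered here.

```python
import glob
import cv2
import numpy as np

# Board geometry (inner corners and square size are assumptions).
pattern = (9, 6)
square_mm = 5.0
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

obj_points, img_points, size = [], [], None
for path in glob.glob("checkerboard_*.png"):          # hypothetical image files
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]                       # (width, height)

# rms is the RMS reprojection error in pixels over all views.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print(f"RMS reprojection error: {rms:.2f} px")
```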
8. Object identification accuracy under ultrasound enhanced virtual reality for minimally invasive cardiac surgery
- Author
- John Moore, Andrew D. Wiles, Anis Suriati Ahmad, Cristian A. Linte, Christopher Wedlake, and Terry M. Peters
- Subjects
Image-Guided Therapy, Ultrasound, Mitral valve replacement, Minimally invasive cardiac surgery, Virtual reality, Guidance system, Imaging phantom, Rigid transformation, Biomedical engineering
- Abstract
A 2D ultrasound enhanced virtual reality surgical guidance system has been under development for some time in our lab. The new surgical guidance platform has been shown to be effective in both laboratory and clinical settings; however, the accuracy of the tracked 2D ultrasound has not been investigated in detail for the applications in which we intend to use it (i.e., mitral valve replacement and atrial septal defect closure). This work focuses on the development of an accuracy assessment protocol specific to the calibration methods used to determine the rigid transformation between the ultrasound image and the tracked sensor. Specifically, we test a Z-bar phantom calibration method and a phantomless calibration method, and compare the accuracy of tracking ultrasound images from neuro, transesophageal, intracardiac, and laparoscopic ultrasound transducers. This work provides a fundamental quantitative description of the image-guided accuracy that can be obtained with this new surgical guidance system.
(A sketch of how such a calibration is evaluated follows this entry.)
- Published
- 2008
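A calibration of this kind yields a rigid transform from the US image plane to the tracked sensor; its accuracy can be assessed by mapping image points through the calibration and the sensor pose, then comparing against independently measured positions. A minimal sketch with invented transforms, scale, and points:

```python
import numpy as np

def to_homogeneous(points):
    """Append a 1 to each row of an (N, 3) point array."""
    return np.hstack([points, np.ones((len(points), 1))])

def calibration_rms(pixels, scale, T_image_to_sensor, T_sensor_to_tracker, truth):
    """RMS distance (mm) between US-image points mapped into tracker space
    and independently measured target positions."""
    mm = pixels * scale                               # pixel -> mm in the image plane
    plane = np.column_stack([mm, np.zeros(len(mm))])  # z = 0 in the image plane
    mapped = (T_sensor_to_tracker @ T_image_to_sensor @ to_homogeneous(plane).T).T[:, :3]
    return np.sqrt(np.mean(np.sum((mapped - truth) ** 2, axis=1)))

# Invented example: identity transforms, two targets, 0.3 mm/px scale.
pixels = np.array([[100.0, 200.0], [150.0, 180.0]])
truth = np.array([[30.0, 60.0, 0.0], [45.0, 54.0, 0.0]])
rms = calibration_rms(pixels, np.array([0.3, 0.3]), np.eye(4), np.eye(4), truth)
print(f"calibration RMS error: {rms:.2f} mm")
```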
9. From pre-operative cardiac modeling to intra-operative virtual environments for surgical guidance: an in vivo study
- Author
- Douglas L. Jones, Terry M. Peters, Marcin Wierzbicki, Christopher Wedlake, John Moore, Cristian A. Linte, Gerard M. Guiraudon, Daniel Bainbridge, and Andrew D. Wiles
- Subjects
Engineering, Image-Guided Therapy, Virtual reality, Intracardiac injection, Visualization, Augmented reality, Computer vision, Artificial intelligence, Data integration
- Abstract
As part of an ongoing theme in our laboratory on reducing morbidity during minimally invasive intracardiac procedures, we developed a computer-assisted intervention system that provides safe access inside the beating heart and sufficient visualization to deliver therapy to intracardiac targets while maintaining the efficacy of the procedure. Integrating pre-operative information, 2D trans-esophageal ultrasound for real-time intra-operative imaging, and surgical tool tracking using the NDI Aurora™ magnetic tracking system in an augmented virtual environment, our system allows the surgeons to navigate instruments inside the heart in spite of the lack of direct target visualization. This work focuses on further enhancing intracardiac visualization and navigation by supplying the surgeons with detailed 3D dynamic cardiac models constructed from high-resolution pre-operative MR data and overlaid onto the intra-operative imaging environment. Here we report our experience during an in vivo porcine study. A feature-based registration technique previously explored and validated in our laboratory was employed for the pre-operative to intra-operative mapping. This registration method is suitable for in vivo interventional applications, as it involves the selection of easily identifiable landmarks while ensuring a good alignment of the pre-operative and intra-operative surgical targets. The resulting augmented reality environment fuses the pre-operative cardiac model with the intra-operative real-time US images with approximately 5 mm accuracy for structures located in the vicinity of the valvular region. Therefore, we strongly believe that our augmented virtual environment significantly enhances intracardiac navigation of surgical instruments, while on-target detailed manipulations are performed under real-time US guidance.
(A sketch of landmark-based rigid registration follows this entry.)
Keywords: Image-Guided Cardiac Procedures, Pre-operative Modeling, Intra-operative Imaging, Augmented Virtual Reality, Data Integration for the Clinic/OR
- Published
- 2008
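The abstract names a landmark-based (feature-based) registration; one standard way to solve it is a least-squares rigid fit (the Kabsch/Procrustes method), sketched below on invented landmark pairs. The paper's own algorithm may differ; this is only an illustration of the general technique.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch method) mapping landmark set
    `src` onto `dst`; both are (N, 3) arrays of corresponding points."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)                       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, sign])                       # guard against reflection
    R = Vt.T @ D @ U.T
    t = dc - R @ sc
    return R, t

# Invented pre-operative landmarks and their intra-operative picks (mm).
pre = np.array([[0, 0, 0], [30, 0, 0], [0, 40, 0], [0, 0, 25]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)  # 90 deg about z
intra = pre @ Rz.T + np.array([5.0, -2.0, 10.0])
R, t = rigid_register(pre, intra)
fre = np.linalg.norm((pre @ R.T + t) - intra, axis=1)
print(f"mean fiducial registration error: {fre.mean():.2f} mm")
```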
10. Navigation accuracy for an intracardiac procedure using ultrasound enhanced virtual reality
- Author
- Douglas L. Jones, Andrew D. Wiles, Gerard M. Guiraudon, Terry M. Peters, Daniel Bainbridge, Christopher Wedlake, John Moore, and Cristian A. Linte
- Subjects
Engineering, Image-Guided Therapy, System of measurement, Ultrasound, Calibration, Virtual reality, Intracardiac injection, Imaging phantom, Biomedical engineering
- Abstract
Minimally invasive techniques for use inside the beating heart, such as mitral valve replacement and septal defect repair, are the focus of this work. Traditional techniques for these procedures require an open chest approach and a cardiopulmonary bypass machine. New techniques are being developed that use port access and a combined surgical guidance tool that overlays a two-dimensional ultrasound image in a virtual reality environment. To test this technique, a cardiac phantom was developed to simulate the anatomy. The phantom consists of an acrylic box filled with a 7% glycerol solution with ultrasound properties similar to human tissue; plate inserts mounted in the box simulate the physical anatomy. An accuracy assessment was completed to evaluate the performance of the system. Using the cardiac phantom, a 2 mm diameter glass toroid was attached to a vertical plate as the target location, with an elastic material placed between the target and the plate to simulate the target lying on a soft tissue structure. The target was measured using an independent measurement system and was represented as a sphere in the virtual reality system. The goal was to test the ability of a user to probe the target using three guidance methods: (i) 2D ultrasound only, (ii) virtual reality only, and (iii) ultrasound enhanced virtual reality. Three users attempted the task three times each for each method, and the independent measurement system was used to validate each measurement. Ultrasound imaging alone was poor at locating the target (5.42 mm RMS), while the other two methods proved to be significantly better (1.02 mm RMS and 1.47 mm RMS, respectively). The ultrasound enhancement is expected to be more useful in a dynamic environment where the system registration may be disturbed.
(A sketch of the RMS computation follows this entry.)
- Published
- 2007
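The RMS figures above summarize repeated probing attempts; a minimal sketch of the computation, with made-up per-trial errors standing in for the nine trials per method:

```python
import numpy as np

def rms(errors_mm):
    """Root-mean-square of point-to-target distances (mm)."""
    e = np.asarray(errors_mm, float)
    return np.sqrt(np.mean(e ** 2))

# Illustrative per-trial tip-to-target distances (mm) for the three
# guidance methods (3 users x 3 trials each); values are made up.
trials = {
    "US only":        [6.1, 4.8, 5.9, 5.2, 6.3, 4.4, 5.7, 5.0, 5.3],
    "VR only":        [1.1, 0.9, 1.0, 1.2, 0.8, 1.1, 1.0, 0.9, 1.1],
    "US-enhanced VR": [1.6, 1.3, 1.5, 1.4, 1.7, 1.2, 1.5, 1.6, 1.4],
}
for method, errs in trials.items():
    print(f"{method:>15}: {rms(errs):.2f} mm RMS")
```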
11. NeoNav: Augmented Reality Echocardiographic Navigation and Guidance for Beating Heart Mitral Valve Repair
- Author
- Terry M. Peters, Richard C. Daly, D. Bainbridge, John Moore, Gerard M. Guiraudon, David McCarty, Christopher Wedlake, Bob Kiaii, Michael W.A. Chu, Maria E. Currie, and Rajni V. Patel
- Subjects
Mitral valve repair, Beating heart, Internal medicine, Cardiology, Medicine, Augmented reality, Cardiology and Cardiovascular Medicine
- Published
- 2013
12. Sci-Thur PM: YIS - 02: Intraoperative Guidance for Minimally Invasive Abdominal Surgery Using Fused Video and Ultrasound Images: A Phantom Study
- Author
- Christopher Wedlake, John Moore, Stephen E. Pautler, Carling L. Cheung, and Terry M. Peters
- Subjects
Ultrasound, Tracking system, General Medicine, Imaging phantom, Visualization, Augmented reality, Computer vision, Artificial intelligence, Radiology, Image sensor, Depth perception, Abdominal surgery
- Abstract
Many abdominal surgery procedures are now performed minimally invasively. We consider tumour resection, where surgeons use a laparoscopic camera to view the organ surface and a laparoscopic ultrasound (US) probe to visualize the tumour in order to plan and perform the excision. Conventionally, the images are displayed separately and are typically presented in 2D. The surgeon therefore has to look back and forth between the images and mentally map the US onto the video to determine the tumour location relative to the surface; furthermore, the 2D nature of the images decreases depth perception. To address these limitations, we developed an augmented reality visualization that fuses the images in a common 3D environment. Instruments were tracked using sensors spatially identified with a magnetic field generator, and, through calibration, their image locations were determined in real time. The accuracy of the camera and US calibrations was determined both relative to the tracking system and to each other using target localization. We evaluated the efficacy of the fusion with a phantom experiment: a surgeon performed tumour resections on polyvinyl alcohol-cryogel phantoms under the guidance of the conventional visualization and of the fusion system presented in 2D and in 3D. The target localization error was 1.20 ± 0.08 mm for the camera, 1.85 ± 0.14 mm for the US, and 2.38 ± 0.11 mm between the camera and the US. Early results demonstrate a faster resection planning time using fusion compared to the conventional setup, while maintaining similar margin accuracy. This study supports the implementation of fusion for guidance of time-sensitive resection tasks performed under conditions of warm ischemia.
(A sketch of the target localization error computation follows this entry.)
- Published
- 2010
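Target localization error is the distance between each localized point and the known target position. A sketch of the summary statistics on simulated localizations; the abstract does not say whether its ± values are standard deviations or standard errors, so standard error is assumed here.

```python
import numpy as np

def target_localization_error(localized, truth):
    """Distances (mm) between localized points and the known target,
    summarized as mean and standard error (the +/- convention is an
    assumption; the abstract does not state it)."""
    err = np.linalg.norm(np.asarray(localized) - np.asarray(truth), axis=1)
    return err.mean(), err.std(ddof=1) / np.sqrt(len(err))

# Simulated repeated localizations of one known target (mm).
truth = np.array([50.0, 20.0, 30.0])
rng = np.random.default_rng(0)
localized = truth + rng.normal(0.0, 1.2, size=(20, 3))
mean, se = target_localization_error(localized, truth)
print(f"TLE: {mean:.2f} +/- {se:.2f} mm")
```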