One unmistakable topic at RSNA 2017 was virtual and augmented reality—and how advancements will affect medicine. With numerous presentations and interactive booths at the annual conference in Chicago, the technologies are clearly growing in popularity in interventional radiology and medical imaging.
Eliot Siegel, MD, professor and vice chair of information systems at the University of Maryland, discussed defining and differentiating virtual reality (VR) and augmented reality (AR) in a Nov. 28 session. Though the two overlap, there are notable distinctions. VR is immersive technology that can simulate an experience in real time, Siegel explained. No matter where users are physically located, they all undergo the same experience. AR also simulates an experience in real time, but it projects that experience onto the user’s physical surroundings.
According to Siegel, the potential to view MRI and CT images in real time through VR and AR could be revolutionary for diagnostic practice and interventional radiology.
One effective form of VR in medicine is 3D printing. In the last five years, 3D printing has exploded, taking clinical data to an interactive level by allowing clinicians of all subspecialties to touch, feel, and see medical images from various angles. Siegel predicts that 3D printing will remain very beneficial for surgical planning but is unlikely to be the future of virtual reality in medicine, citing its high cost and slow rate of production.
Startups are currently using VR and AR technologies for employee training, which can familiarize trainees with different environments, such as a trauma center, and let users approach cases in different healthcare roles before actually being put to the test in the OR. Both technologies are also being used in rehabilitation, as a distraction tool during pediatric testing, and to access and project patient data onto physical spaces.
Siegel discussed implications for VR and AR in diagnostic imaging specifically: supplementing traditional displays, creating virtual multi-monitor workstations, and enabling novel 3D visualization and surgical planning. Still, AR has more real-time applications that can be more beneficial for clinicians.
“AR technology can help you [simulate] before surgery or fuse previously taken images with patient images to save time, reduce x-ray dosage and ultimately reduce the number of surgical complications,” Siegel said.
AR technologies currently on the market, such as the Microsoft HoloLens and Google Glass, are driving prototype medical applications at the University of Maryland, Siegel explained.
A 3D viewer for DICOM data can serve as in-vivo navigation for CT-guided intervention. According to Siegel and his team, it has the potential to dynamically increase the number of virtual monitors a radiologist has when examining images or medical records. It can also render imaging data slice by slice onto a physical space, and it can record a surgeon performing procedures so that the surgeon can personally review their work.
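The slice-by-slice rendering described above rests on a simple idea: a CT or MRI series is a stack of 2D slices that a viewer assembles into a 3D volume, from which it can extract slices along any anatomical plane. The following is a minimal sketch of that concept using only NumPy and synthetic slice data; the function names and the synthetic series are illustrative, not part of any actual viewer described in the talk.

```python
import numpy as np

def build_volume(slices):
    """Stack 2D slices (already sorted by position) into a 3D volume."""
    return np.stack(slices, axis=0)

def extract_slice(volume, plane, index):
    """Return one 2D slice from the volume along a named anatomical plane."""
    if plane == "axial":
        return volume[index, :, :]
    if plane == "coronal":
        return volume[:, index, :]
    if plane == "sagittal":
        return volume[:, :, index]
    raise ValueError(f"unknown plane: {plane}")

# Synthetic stand-in for a CT series: 4 slices of 8x8 pixels each.
slices = [np.full((8, 8), i, dtype=np.int16) for i in range(4)]
vol = build_volume(slices)
print(vol.shape)                               # (4, 8, 8)
print(extract_slice(vol, "coronal", 0).shape)  # (4, 8)
```

In a real application the slices would come from DICOM files (e.g. via a DICOM parsing library, sorted by slice position) rather than synthetic arrays, but the volume-and-reslice structure is the same.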
With Google Glass, clinicians can potentially use its wireless monitor for ultrasound-guided procedures or contact a subspecialist during cases when additional guidance and input are needed. In ultrasound-guided procedures, it can also allow the operator to visualize the needle and the ultrasound image at the same time.
VR technology also makes possible touchless voice control and the projection of cross-sectional images onto the patient. Teleradiology, or tele-VR, can allow people in different places to “be in the room together virtually,” Siegel said, leading to mobile, lower-cost workstations.
Accurate head tracking, head-mounted cameras and camera-to-eye registration remain hurdles to increasing VR and AR use. But the technologies have the potential to improve stand-alone, small-footprint workstations and increase synergy with artificial intelligence.