Technology Lets Doctors “See” Their Patients’ Pain
Many patients, especially those who are anesthetized or emotionally challenged, cannot communicate precisely about their pain. That’s why researchers at the University of Michigan developed a technology that helps clinicians “see” and map patient pain in real time through special augmented reality glasses.
The device was tested on 21 volunteer dental patients. The portable CLARAi (clinical augmented reality and artificial intelligence) platform combines visualization with neuroimaging data, letting clinicians navigate through a patient’s brain.
“It’s hard to measure and express our pain, including its expectation and associated anxiety,” say the researchers. “Right now, we have a one to 10 rating system, but that’s far from a reliable and objective pain measurement.”
Hassan Jassar (seated) wears a sensor-outfitted cap that detects changes in blood flow and oxygenation, thus sensing brain activity. That information is transmitted to a computer and interpreted. Thiago Nascimento, left, views this brain activity in real time while wearing augmented reality glasses, and the computer image shows that particular pain signature in the brain. From left to right, also pictured: Dr. Alex DaSilva, Dajung Kim, Manyoel Lim, and Xiao-su Hu.
In the study, researchers triggered pain by applying cold to the teeth. They used the resulting brain pain data to develop algorithms that, coupled with new software and neuroimaging hardware, predicted pain or its absence about 70% of the time.
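The article does not disclose how the algorithms work internally. Purely as an illustration of the classification step, a minimal nearest-centroid "pain vs. no pain" classifier over hypothetical oxygenation features might look like this (all feature names and values below are invented, not from the study):

```python
# Illustrative sketch only: a nearest-centroid "pain vs. no pain" classifier
# over made-up oxygenation features. The actual CLARAi algorithms are not
# published here; this just shows the shape of the classification step.

def centroid(rows):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def predict(sample, pain_centroid, rest_centroid):
    """Label a feature vector by its nearer class centroid (squared distance)."""
    d_pain = sum((a - b) ** 2 for a, b in zip(sample, pain_centroid))
    d_rest = sum((a - b) ** 2 for a, b in zip(sample, rest_centroid))
    return "pain" if d_pain < d_rest else "no pain"

# Hypothetical training trials: [oxy-Hb change, deoxy-Hb change] per trial.
pain_trials = [[0.9, -0.4], [1.1, -0.5], [0.8, -0.3]]
rest_trials = [[0.1, 0.0], [0.0, 0.1], [0.2, -0.1]]

c_pain = centroid(pain_trials)
c_rest = centroid(rest_trials)

print(predict([1.0, -0.4], c_pain, c_rest))   # pain-like sample -> "pain"
print(predict([0.05, 0.05], c_pain, c_rest))  # rest-like sample -> "no pain"
```

A real system would train on many labeled trials per subject and report accuracy on held-out data, which is roughly what the reported 70% figure describes.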
Participants wore a sensor-studded cap that detected changes in blood flow and oxygenation, thus measuring brain activity and responses to pain. That information was transmitted to a computer and interpreted.
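Caps of this kind typically use functional near-infrared spectroscopy (fNIRS): changes in light attenuation at two wavelengths are converted to oxy- and deoxy-hemoglobin concentration changes via the modified Beer-Lambert law. A sketch of that conversion step, under that assumption (the extinction coefficients and geometry values below are placeholders, not calibrated constants):

```python
# Sketch of the modified Beer-Lambert step used in fNIRS-style sensing:
# attenuation changes at two wavelengths -> oxy-/deoxy-hemoglobin changes.
# All numeric constants here are illustrative placeholders.

def hemoglobin_changes(d_od_1, d_od_2, eps, distance, dpf):
    """Invert dOD = eps @ [dHbO, dHbR] * distance * dpf (two wavelengths).

    eps is a 2x2 matrix of extinction coefficients:
    rows = wavelengths, columns = (HbO, HbR).
    """
    path = distance * dpf  # source-detector distance scaled by pathlength factor
    (a, b), (c, d) = eps
    det = (a * d - b * c) * path
    d_hbo = (d * d_od_1 - b * d_od_2) / det
    d_hbr = (-c * d_od_1 + a * d_od_2) / det
    return d_hbo, d_hbr

# Placeholder extinction coefficients for two near-infrared wavelengths.
EPS = [[1.5, 3.8],   # wavelength 1: (HbO, HbR)
       [2.5, 1.8]]   # wavelength 2: (HbO, HbR)

d_hbo, d_hbr = hemoglobin_changes(0.02, 0.03, EPS, distance=3.0, dpf=6.0)
print(d_hbo, d_hbr)
```

The resulting oxy-/deoxy-hemoglobin time series per sensor channel is what the computer then interprets as brain activity.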
Wearing special augmented reality glasses (in this case, the Microsoft HoloLens), researchers viewed the subject’s brain activity in real time on a reconstructed brain template, while subjects sat in the clinical chair. Red and blue dots on the image denoted locations and levels of brain activity, and this “pain signature” was mirror-displayed on the augmented reality screen. The more pain signatures the algorithm learns to read, the more accurate the pain assessment.
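The article does not specify the exact rendering, but a dot-coloring scheme like the one described could be sketched as a simple mapping from a normalized activation level to a blue-to-red color (the [-1, 1] normalization range here is an assumption for illustration):

```python
# Illustrative sketch: map a channel's normalized activation level
# (assumed to lie in [-1, 1]) to an RGB dot color, blue (low/negative)
# through red (high/positive), like the red and blue dots described above.

def activation_color(level):
    """Return an (r, g, b) tuple in 0-255 for a level clamped to [-1, 1]."""
    level = max(-1.0, min(1.0, level))
    t = (level + 1.0) / 2.0          # 0 = fully blue, 1 = fully red
    r = round(255 * t)
    b = round(255 * (1.0 - t))
    return (r, 0, b)

print(activation_color(1.0))   # strong activation -> (255, 0, 0)
print(activation_color(-1.0))  # strong deactivation -> (0, 0, 255)
print(activation_color(0.0))   # baseline -> purple midpoint
```

Each colored dot would then be drawn at its channel's position on the reconstructed brain template shown in the headset.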