Technology by Researchers at Ann Arbor’s U-M Allows Clinicians to See Pain

Researchers at the University of Michigan in Ann Arbor have developed a technology to help clinicians see and map patient pain in real time through augmented reality glasses. The solution is helpful for patients who are anesthetized or cannot communicate precisely about their pain.
U-M researchers have created a device that uses augmented reality glasses to view pain. Alex DaSilva is third from the left. // Photo courtesy of the University of Michigan


The technology was tested on 21 volunteer dental patients, and researchers hope to one day extend it to other types of pain and conditions. However, it’s years away from widespread use in a clinical setting, says Alex DaSilva, associate professor at the U-M School of Dentistry and director of the Headache and Orofacial Pain Effort Lab.

The portable platform pairs augmented reality visualization with neuroimaging data, letting clinicians navigate a patient’s brain activity while the patient is in the chair.
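The article does not describe the software itself, but the basic chairside flow it implies (read a frame of brain data, estimate pain, push the estimate to the glasses) can be sketched roughly as follows; the hardware, model, and display in this sketch are simulated placeholders, not the U-M system.

```python
# Illustrative sketch of the chairside loop implied by the article: read a frame
# of brain data, estimate pain, and push the estimate to the AR display.
# The device, scoring rule, and display here are simulated stand-ins, not U-M's system.
import math
import random
import time


def read_frame(n_channels: int = 16) -> list[float]:
    """Simulate one frame of neuroimaging channel values (stand-in for real hardware)."""
    return [random.gauss(0.0, 1.0) for _ in range(n_channels)]


def score_pain(frame: list[float]) -> float:
    """Toy pain score in [0, 1]: mean channel activity squashed through a sigmoid."""
    mean = sum(frame) / len(frame)
    return 1.0 / (1.0 + math.exp(-mean))


def send_to_glasses(probability: float) -> None:
    """Stand-in for the AR display: print the estimate instead of rendering it."""
    print(f"estimated pain probability: {probability:.2f}")


def run_chairside_loop(seconds: float = 2.0, hz: float = 5.0) -> None:
    """Read a frame, score it, and push the result to the display, several times per second."""
    deadline = time.time() + seconds
    while time.time() < deadline:
        send_to_glasses(score_pain(read_frame()))
        time.sleep(1.0 / hz)


if __name__ == "__main__":
    run_chairside_loop()
```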

“It’s very hard for us to measure and express our pain, including its expectation and associated anxiety,” says DaSilva. “Right now, we have a one-to-10 rating system, but that’s far from a reliable and objective pain measurement.”

In the study, researchers triggered pain by applying cold to patients’ teeth. They used the resulting brain data to develop algorithms that, coupled with software and neuroimaging hardware, predicted the presence or absence of pain about 70 percent of the time.
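The article does not name the algorithm or the features it relies on. As a rough, hypothetical illustration of the general approach, a simple classifier trained on per-channel brain features can be scored with cross-validation; the synthetic data and the logistic-regression choice below are assumptions, not the team’s method.

```python
# Toy illustration of training a pain / no-pain classifier from brain-channel
# features and checking its accuracy. The data are synthetic and the model choice
# is an assumption; the article does not disclose the actual algorithm or features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels = 200, 16

# Synthetic "brain activity": painful trials get a small mean shift per channel,
# sized so the toy accuracy lands near the roughly 70 percent reported in the article.
labels = rng.integers(0, 2, size=n_trials)          # 1 = pain, 0 = no pain
features = rng.normal(size=(n_trials, n_channels)) + 0.25 * labels[:, None]

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, features, labels, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f}")
```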

The augmented reality glasses allowed researchers to view the subject’s brain activity in real time on a reconstructed brain template while the subject sat in the clinical chair. Red and blue dots on the image denoted the location and level of brain activity, and this pain signature was displayed on the augmented reality screen.
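How the overlay is computed is not detailed in the article. One hypothetical way to map per-region activity to red (elevated) and blue (reduced) markers on a brain template is sketched below; the region names, coordinates, and threshold are invented for illustration.

```python
# Illustrative mapping from per-region brain activity to colored dots on a
# reconstructed brain template, echoing the red/blue display described in the
# article. Region names, coordinates, and the activity threshold are invented.
from dataclasses import dataclass


@dataclass
class Dot:
    region: str
    x: float          # template coordinates (arbitrary units)
    y: float
    activity: float   # normalized activity in [-1, 1]
    color: str


def activity_to_color(activity: float, threshold: float = 0.2) -> str:
    """Red for activity above the threshold, blue below it, gray in between."""
    if activity > threshold:
        return "red"
    if activity < -threshold:
        return "blue"
    return "gray"


def build_overlay(activity_by_region: dict[str, tuple[float, float, float]]) -> list[Dot]:
    """Turn {region: (x, y, activity)} into colored dots for the AR overlay."""
    return [
        Dot(region, x, y, act, activity_to_color(act))
        for region, (x, y, act) in activity_by_region.items()
    ]


if __name__ == "__main__":
    demo = {
        "somatosensory_cortex": (0.31, 0.72, 0.85),   # strongly active -> red dot
        "prefrontal_cortex": (0.55, 0.88, -0.40),     # reduced activity -> blue dot
    }
    for dot in build_overlay(demo):
        print(dot)
```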

The more pain signatures the algorithm learns to read, the more accurate the pain assessment will be.