By: Cassidy Delamarter, University Communications and Marketing
How do you know when a newborn is in pain, especially one too weak to cry? A groundbreaking interdisciplinary project at the University of South Florida is answering that question with artificial intelligence, aiming to transform treatment in Neonatal Intensive Care Units across the country.
Funded by the National Institutes of Health, researchers from the USF Health Morsani College of Medicine and the USF Bellini College of Artificial Intelligence, Cybersecurity and Computing are working with Tampa General Hospital to improve the comfort of newborns through real-time pain detection, using technology and algorithms that have been in the making for nearly a decade.

Dr. Thao "Tina" Ho in the Tampa General Hospital NICU

Each family is asked if they would like to participate in the study before any data collection begins.

Photos by: Andres Faza, University Communications and Marketing
“There are many challenges in managing pain in non-verbal and vulnerable preterm infants,” said Dr. Thao “Tina” Ho, associate professor of pediatrics at the USF Health Morsani College of Medicine and practicing neonatologist at TGH. “This study will develop a reliable system to detect pain continuously by interpreting the infant’s physical movements, facial expressions, heart rate and respiratory rate all together. This will allow the infant’s bedside nurses to respond to their pain in a timely manner to optimize their comfort and minimize medication exposure.”
Approximately one in 10 babies is admitted to the NICU each year to receive critical care. Pain assessments typically rely on nurses’ observations and scoring, which can vary widely. The new AI system aims to deliver a standardized, continuous and objective assessment using non-invasive, affordable sensors and cameras that integrate easily into existing hospital settings.
For parents like Mari Womack, whose daughter was born prematurely and spent time in the NICU, the potential for such a tool is deeply personal.

Womack after her baby was born

About a week into her NICU stay
“She was too little and weak to even cry, so it was hard to know if she was in pain,” Womack said. Born prematurely and appearing blue from a lack of oxygen, a condition known as cyanosis, her daughter needed to be monitored closely and given special medications for nearly a week to help her improve. “I feel like this technology would have given us peace of mind. It could really help other families like ours.”

Sun in his robotics lab with students
“This is exactly the kind of project that shows what USF is all about: innovation that has real-world health care impact,” said Yu Sun, principal investigator of the project and a professor of computer science and engineering at USF who specializes in AI and health care applications. “We’re combining our expertise in AI with pediatric medicine to create tools that improve care for vulnerable patients.”
The team is gathering photos, videos and vital signs of infants before and after surgery. In addition to TGH, they’re collecting data from partnering NICUs at Stanford University Hospital and Inova Hospital in Virginia. The data will be analyzed for patterns to better understand how pain presents in newborns through their facial expressions, movements and vital signs. USF computer science doctoral student Jacqueline Hausmann and Distinguished University Professor Dmitry Goldgof will then train artificial intelligence to recognize those key indicators of pain and create a system that notifies nurses of distress in real time.
“The goal is to improve pain management,” Sun said. “Our automated neonatal pain assessment system will detect pain before it spikes so it could be treated early. So the pain management will be more effective and rely less on opioid-based medications.”
Researchers hope to advance the study to randomized clinical trials in two years to evaluate the system鈥檚 effectiveness in real-time pain management and outcomes.
The collected data set is partially available to other researchers by request, with safeguards in place to protect patient privacy.