Biomedical engineers at Duke University have developed an Artificial Intelligence (AI)-based automated process that can trace the shapes of active neurons as accurately as human researchers, and in a fraction of the time.
The new technique, which uses AI to interpret video images, addresses a critical roadblock in neuron analysis, allowing researchers to rapidly gather and process neuronal signals for real-time behavioural studies, according to a paper published in the Proceedings of the National Academy of Sciences.
"As a critical step towards complete mapping of brain activity, we were tasked with the formidable challenge of developing a fast automated algorithm that is as accurate as humans for segmenting a variety of active neurons imaged under different experimental settings," said Sina Farsiu, the Paul Ruffin Scarborough Associate Professor of Engineering at Duke.
To measure neural activity, researchers use a process known as "two-photon calcium imaging", which allows them to record the activity of individual neurons in the brains of live animals.
These recordings enable researchers to track which neurons are firing, and how they potentially correspond to different behaviours.
A researcher can spend anywhere from four to 24 hours segmenting neurons in a 30-minute video recording.
In contrast, a new open-source automated algorithm developed by image-processing and neuroscience researchers in Duke's Department of Biomedical Engineering can accurately identify and segment neurons in minutes.
"The data analysis bottleneck has existed in neuroscience for a long time. Data analysts have spent hours and hours processing minutes of data, but this algorithm can process a 30-minute video in 20 to 30 minutes," said Yiyang Gong, assistant professor in Duke BME.
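To give a sense of what "segmenting active neurons" from a calcium-imaging video means computationally, here is a minimal toy sketch. It is not the Duke group's method, which relies on a trained deep network; it simply builds a synthetic movie with two flickering "neurons", thresholds each pixel's brightness fluctuation over time, and groups the resulting pixels into candidate footprints.

```python
import numpy as np
from scipy import ndimage

# Toy illustration (not the published algorithm): find "active" regions in a
# calcium-imaging movie by thresholding per-pixel temporal fluctuation.

rng = np.random.default_rng(0)

# Synthetic movie: 100 frames of a 64x64 field with background noise...
movie = rng.normal(0.0, 0.05, size=(100, 64, 64))

# ...plus two flickering "neurons" whose brightness varies over time.
t = np.arange(100)
movie[:, 10:16, 10:16] += 0.5 * (np.sin(t / 5.0)[:, None, None] > 0)
movie[:, 40:48, 30:38] += 0.5 * (np.cos(t / 7.0)[:, None, None] > 0)

# Active pixels fluctuate far more than background, so threshold the
# per-pixel standard deviation across time.
activity = movie.std(axis=0)
mask = activity > 3 * np.median(activity)

# Group the thresholded pixels into connected candidate neuron footprints.
labels, n_neurons = ndimage.label(mask)
print(f"found {n_neurons} candidate neurons")
```

A human analyst effectively performs a far more careful version of this grouping by eye, frame after frame, which is why segmenting a 30-minute recording by hand can take hours.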
The advance is a critical step towards allowing neuroscientists to track neural activity in real time.