Pioneering Real-Time Big Data Processing System Enables Whole-Brain Optical Brain-Machine Interface-Mediated Virtual Reality and a Closed-Loop Neuroscience Research Paradigm


On March 11, 2024, Nature Neuroscience published a paper titled "Real-time Analysis of Large-scale Neuronal Imaging Enables Closed-loop Investigation of Neural Dynamics," showcasing a collaborative effort by the groups of DU Jiulin and MU Yu at the Center for Excellence in Brain Science and Intelligence Technology (CEBSIT), Chinese Academy of Sciences (CAS), and the group of HAO Jie at the Institute of Automation, CAS. The work has also been granted the invention patent "Optical Brain-Machine Interface System and Method" (Patent No.: ZL202310131178.9).


This study adapted data-processing techniques from astronomy and adopted an FPGA-GPU hybrid architecture to perform real-time registration, signal extraction, and analysis on data streams of up to 500 MB/s. With this breakthrough technology, the research team achieved, for the first time, real-time analysis of hundreds of thousands of neurons in the zebrafish brain, enabling the activities of arbitrarily selected neuron ensembles to be decoded to control external devices. This achievement marks a crucial step toward applying techniques such as virtual reality based on whole-brain, cellular-resolution optical imaging, together with optogenetic control, to closed-loop, whole-brain-scale research.


Whole-brain neuronal activity imaging is a powerful tool for deciphering the principles of the brain. However, its enormous data-processing demands have become a bottleneck in technological development, making real-time analysis and closed-loop studies of brain function challenging. Inspired by fast radio burst detection technology in astronomy, the researchers adopted the FX-system design and exploited the flexibility of FPGA programming to build an optical neural-signal preprocessing system. This system regularizes signals from the optical sensors and sends them to a GPU-based real-time processing system for high-speed nonlinear registration and neural signal extraction and decoding, yielding feedback signals for controlling external devices. The system generates feedback by continuously monitoring the activity of zebrafish whole-brain neurons, with a feedback delay of less than 70.5 milliseconds.
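The acquire-register-extract-decode-feedback loop described above can be caricatured in a few lines. This is only a toy sketch under stated assumptions: the function names, frame size, ROIs, and the trivial "registration" and threshold "decoder" are all hypothetical stand-ins, not the paper's GPU algorithms.

```python
import time
import numpy as np

rng = np.random.default_rng(0)

def acquire_frame():
    """Stand-in for one frame of the FPGA-regularized sensor stream (hypothetical)."""
    return rng.random((256, 256)).astype(np.float32)

def register(frame, template):
    """Toy registration: correct a global intensity offset against a template.
    The real system performs high-speed nonlinear registration on the GPU."""
    return frame - (frame.mean() - template.mean())

def extract_signals(frame, rois):
    """Mean fluorescence per ROI, standing in for per-neuron signal extraction."""
    return np.array([frame[r].mean() for r in rois])

template = rng.random((256, 256)).astype(np.float32)
rois = [np.s_[0:64, 0:64], np.s_[64:128, 64:128]]  # two toy ROIs

t0 = time.perf_counter()
frame = acquire_frame()
signals = extract_signals(register(frame, template), rois)
feedback = signals.mean() > 0.5                 # toy decoder -> device command
latency_ms = (time.perf_counter() - t0) * 1e3   # loop delay for this iteration
print(f"feedback={bool(feedback)}, latency={latency_ms:.2f} ms")
```

The point of the sketch is the structure of the loop, not its speed; the published system keeps this whole cycle under 70.5 ms at up to 500 MB/s.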


The performance of the system was demonstrated in three closed-loop brain science research scenarios: real-time optogenetic stimulation locked to the activity of arbitrarily selected neuron ensembles, real-time visual stimulation locked to specific brain functional states, and virtual reality directly driven by neuronal activities in the brain.


Closed-loop optogenetics based on real-time neuronal ensemble activity: By functionally clustering neurons across the whole brain, the spontaneous activity of selected ensembles was used as a trigger signal to deliver real-time optogenetic stimulation to target neuron ensembles. Compared with open-loop stimulation, closed-loop stimulation activated downstream brain areas more effectively.
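The triggering logic here is essentially a rising-edge detector on ensemble activity. A minimal sketch, assuming a hypothetical z-scored threshold and toy data (the ensemble indices, threshold value, and trace format are illustrative, not the study's parameters):

```python
import numpy as np

rng = np.random.default_rng(1)

THRESHOLD = 1.5  # hypothetical z-scored activity threshold

def ensemble_activity(traces, ensemble_idx):
    """Mean activity of the selected neuron ensemble at the current frame."""
    return traces[ensemble_idx].mean()

def closed_loop_opto(trace_stream, ensemble_idx, threshold=THRESHOLD):
    """Emit a stimulation command on each rising-edge threshold crossing of
    the ensemble's spontaneous activity (activity-locked triggering)."""
    stim_frames = []
    above = False
    for t, traces in enumerate(trace_stream):
        act = ensemble_activity(traces, ensemble_idx)
        if act > threshold and not above:
            stim_frames.append(t)   # trigger a light pulse at this frame
        above = act > threshold
    return stim_frames

# toy stream: 100 frames of z-scored activity from 10 neurons
stream = [rng.normal(size=10) for _ in range(100)]
pulses = closed_loop_opto(stream, ensemble_idx=[0, 1, 2])
print(len(pulses), "stimulation triggers")
```

Because the pulse is time-locked to the ensemble's own spontaneous activity, the stimulation arrives when the circuit is already engaged, which is one intuition for why closed-loop stimulation drives downstream areas more effectively than open-loop stimulation.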


Real-time visual stimulation locked to specific brain functional states: By monitoring the activity of the locus coeruleus (LC) norepinephrinergic system in real time, visual stimulation was applied during the excitatory phase of LC neurons, which represents the animal's awake state, eliciting stronger responses from neurons across the brain. This indicates that brain states modulate the processing of visual information, and that closed-loop sensory stimulation facilitates the study of interactions between internal brain states and the external environment.
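State-locked stimulation amounts to gating stimulus delivery on a monitored brain-state signal. A toy sketch under stated assumptions: the LC trace is simulated as a slow oscillation, and the zero threshold marking the "excitatory phase" is a hypothetical choice, not the study's classifier.

```python
import numpy as np

rng = np.random.default_rng(2)

def lc_state(lc_signal, threshold=0.0):
    """Classify each frame as excitatory (awake-like) when the monitored
    LC signal exceeds a hypothetical threshold."""
    return lc_signal > threshold

def state_locked_stimuli(lc_signal, threshold=0.0):
    """Frame indices at which the visual stimulus would be delivered:
    only during the excitatory phase of the LC signal."""
    return np.flatnonzero(lc_state(lc_signal, threshold))

# toy LC trace: slow oscillation plus noise
t = np.arange(200)
lc = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
stim_frames = state_locked_stimuli(lc)
```

Comparing responses to stimuli delivered inside versus outside the gated windows is what lets the experiment isolate how internal state modulates visual processing.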


Virtual reality based on an optical brain-machine interface (BMI): Real-time dimensionality reduction of whole-brain neuronal activity into multiple neuron ensembles, coupled in closed loop with the visual environment, enabled the establishment of a virtual reality system directly driven by the activity of neurons in the brain. In this virtual reality, the gain coupling between neuronal activity and the environment can be adjusted arbitrarily, and the neuron ensemble controlling the environment adaptively adjusts its output in response to gain changes. Leveraging real-time analysis of big data streams and high-throughput whole-brain imaging, the researchers plan to screen for neuronal activity features suited to optical BMI, uncover the underlying mechanisms, and develop more efficient optical BMI technologies.
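The adjustable gain coupling can be illustrated with a one-dimensional toy environment: the decoded ensemble output, scaled by a gain, advances the environment state each frame. Everything here (the decoder output values, the gain schedule, the scalar environment) is a hypothetical simplification of the actual VR system.

```python
import numpy as np

rng = np.random.default_rng(3)

def vr_step(env_position, ensemble_output, gain):
    """Advance the visual environment by the decoded ensemble output,
    scaled by the adjustable neuron-environment coupling gain."""
    return env_position + gain * ensemble_output

def run_vr(outputs, gain_schedule):
    """Simulate a session in which the gain changes mid-way; the trajectory
    shows how the same neural output moves the environment differently
    under different gains."""
    pos, trajectory = 0.0, []
    for out, gain in zip(outputs, gain_schedule):
        pos = vr_step(pos, out, gain)
        trajectory.append(pos)
    return trajectory

outputs = rng.random(100)                          # decoded ensemble activity
gains = np.r_[np.full(50, 1.0), np.full(50, 0.2)]  # gain drops at frame 50
traj = run_vr(outputs, gains)
```

In the real experiment, the interesting question is the converse of this simulation: when the gain drops, does the controlling ensemble raise its output to restore the intended movement, which is the adaptive adjustment the study reports.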


DU Jiulin, HAO Jie, and MU Yu are co-corresponding authors. SHANG Chunfeng (principal investigator at Jinan University/Shenzhen Institutes of Neuroscience, formerly an associate researcher in DU Jiulin's lab), WANG Yufan (postdoctoral researcher in DU Jiulin's lab), and Dr. ZHAO Meiting (assistant researcher in HAO Jie's lab) are co-first authors. FAN Qiuxiang, ZHAO Shan, QIAN Yu, and XU Shengjin also made significant contributions. This work was supported by the Ministry of Science and Technology, the National Natural Science Foundation of China, and the Chinese Academy of Sciences.


Left: An artist’s interpretation of how this innovative system seamlessly translates the full spectrum of neuronal dialogue into an interface that connects the depths of the neural “sea” in the inner world with the vast “sky” of the external real-world environment.

Right: Virtual reality enabled by ensemble neuronal activity-triggered sensory feedback.