4D-Brains
Extracting Activity from Large 4D Whole-Brain Image Datasets
Abstract
Whole-brain recordings hold promise to revolutionize neuroscience. In the last decade, innovations in fast 3D microscopy, protein engineering, genetics, and microfluidics have allowed brain researchers to simultaneously read out calcium activity at high temporal resolution from many neurons in the brains of Caenorhabditis elegans, Danionella translucida, Hydra, and zebrafish. This technology is considered a game changer for neuroscience because it leaves far fewer variables hidden than recordings that capture only a tiny fraction of neuronal activity. Many fundamental and challenging questions of neuroscience can now be pursued:
- What global brain activity determines an organism’s responses to stimuli?
- How are decisions computed by networks of neurons?
- What is the idle activity of an unstimulated brain?
However, the field suffers from a critical bottleneck: neuronal activities are recorded as local intensity changes in 4D microscopy images, and extracting this information for a moving animal is very labor-intensive and requires expertise. The promise of whole-brain recordings cannot be fully realized unless this image analysis problem is solved.
There are several challenges:
A) 3D images are generally difficult to annotate manually
B) The worm moves, rotates, bends, and compresses rapidly
C) To avoid motion blur, exposure times must be kept short, which limits image quality
D) The resolution in the z-direction is low
Started
July 2021
ONGOING
PI / Partners
Laboratory of the Physics of Biological Systems (EPFL):
- Prof. Sahand Rahi
- Dr. Elif Gençtürk
- Alice Gross
- Mahsa Barzegarkeshteli
- Matthieu Schmidt
Description
Problem:
The goals of the collaboration are to identify specific neurons across 4D images (segmentation and tracking), to map every pixel in the 4D images onto a 3D reference (registration), and to speed up these tasks enough to provide real-time feedback to the animal.
Proposed approach:
The SDSC will help to design a robust, efficient algorithm for tracking a specific set of neurons in videos of freely moving worms, and will propose machine learning techniques to align the worm images within each video and extract the activities of the neurons.
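As a rough, non-authoritative illustration of what the activity-extraction step might look like, the sketch below assumes the 4D recording is held in a NumPy array and that 3D neuron positions have already been tracked; the function name `extract_activity`, the array layout, and the simple cube-averaging rule are assumptions for illustration, not the collaboration's actual pipeline.

```python
import numpy as np

def extract_activity(video, centers, radius=2):
    """Mean fluorescence in a small cube around each tracked neuron center.

    video   : float array of shape (T, Z, Y, X), the 4D calcium recording
    centers : int array of shape (T, N, 3), tracked (z, y, x) positions of N neurons
    radius  : half-width of the cube used to average intensities
    """
    T = video.shape[0]
    N = centers.shape[1]
    traces = np.zeros((T, N))
    for t in range(T):
        for n in range(N):
            z, y, x = centers[t, n]
            cube = video[t,
                         max(z - radius, 0):z + radius + 1,
                         max(y - radius, 0):y + radius + 1,
                         max(x - radius, 0):x + radius + 1]
            traces[t, n] = cube.mean()
    return traces
```

In practice the hard part is obtaining reliable `centers` for a deforming animal, which is exactly the tracking problem this collaboration addresses.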
Impact:
Efficient image analysis techniques would reduce the burden of manual annotation and unleash the growth of the field. Faster image analysis would mean that:
- a larger and more diverse range of experiments can be performed
- more animals can be analyzed “per paper”, making results more statistically rigorous
- more scientists could perform such experiments
- “high-throughput neuroscience” with freely moving animals will become possible
- new questions will become accessible: For example, individual differences between animals cannot be studied in a statistically rigorous way with the few worms that are usually analyzed “per paper”.

We aim to identify pieces of neurons or whole neurons in 3D images and track them over time. This can be done by mapping 3D images from different time points onto the same reference 3D image.
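As a minimal sketch of the "map each time point onto a reference" idea, and assuming only a rigid translation between volumes (far simpler than the non-rigid deformations of a freely moving worm), one could use phase cross-correlation from scikit-image; the function `align_to_reference` and the array layout are hypothetical.

```python
import numpy as np
from scipy import ndimage
from skimage.registration import phase_cross_correlation

def align_to_reference(reference, frames):
    """Rigidly align each 3D frame to a reference volume by translation only.

    reference : 3D array (Z, Y, X)
    frames    : 4D array (T, Z, Y, X)
    """
    aligned = np.empty_like(frames)
    for t, frame in enumerate(frames):
        # Estimate the 3D shift that best overlays this frame onto the reference.
        shift, _, _ = phase_cross_correlation(reference, frame)
        # Resample the frame with the estimated shift applied.
        aligned[t] = ndimage.shift(frame, shift, order=1)
    return aligned
```

Real worm recordings deform non-rigidly, which is why the project relies on learned, non-rigid registration rather than a closed-form alignment like this one.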
Publications
- C. F. Park, M. B. Keshteli, K. Korchagina, A. Delrocq, V. Susoy, C. L. Jones, A. D. T. Samuel, S. J. Rahi. Automated neuron tracking inside moving and deforming animals using deep learning and targeted augmentation. In bioRxiv 2022.03.15.484536, 2022.
Related Pages
The code for the paper “Automated neuron tracking inside moving and deforming animals using deep learning and targeted augmentation” is publicly available at https://github.com/lpbsscientist/targettrack
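Purely as a generic, hypothetical illustration of deformation-based augmentation (not the targeted augmentation implemented in targettrack), one could warp an annotated volume and its label mask with the same smooth random displacement field to synthesize additional training examples:

```python
import numpy as np
from scipy import ndimage

def deform_volume_and_labels(volume, labels, sigma=8.0, max_shift=4.0, seed=0):
    """Warp a 3D image and its segmentation with one smooth random displacement field.

    Generic elastic-deformation augmentation, shown only to illustrate the idea of
    synthesizing deformed training examples from a few annotated frames.
    """
    rng = np.random.default_rng(seed)
    coords = np.indices(volume.shape).astype(float)
    for axis in range(volume.ndim):
        # Smooth random displacement field for this axis, scaled to max_shift voxels.
        field = ndimage.gaussian_filter(rng.standard_normal(volume.shape), sigma)
        field *= max_shift / (np.abs(field).max() + 1e-8)
        coords[axis] += field
    warped_volume = ndimage.map_coordinates(volume, coords, order=1)
    # Nearest-neighbour interpolation keeps label values integer.
    warped_labels = ndimage.map_coordinates(labels, coords, order=0)
    return warped_volume, warped_labels
```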