Research

SUMMARY

The human brain has remarkable information processing capacities, which rely on the coordinated activity of 80 billion neurons, each of them interacting with thousands of other neurons. To understand how this collective neural dynamics ultimately gives rise to the brain’s information processing capacities, I work on a particular class of collective states, critical states, because in models these maximize information processing capacities. In this context, I recently provided the first evidence that the human brain does not operate in a critical state, but keeps a distance to criticality [VP et al., 2013], despite the computational advantages of criticality. I confirmed these results for highly parallel spike recordings from rats, cats, and monkeys [VP et al., 2014]. This indicates that maximizing information processing is not the only goal function of the brain. Based on these results, I suggest that the brain tunes itself closer to criticality only when strictly necessary for information processing. Otherwise, it maintains a larger safety margin, thereby losing processing capacity but avoiding instability [Wilting et al., 2018; Wilting & VP, 2018]. In the near future, I want to reveal how the brain should allocate its resources and tune its distance to criticality for a given task. My long-term goal is to understand how nervous systems self-organize to optimize their performance on a given task.

PROJECT DETAILS

I. Subsampling. In most large networks, it is impossible to sample the activity of all nodes in parallel. For example, the human brain comprises 80 billion neurons, but current techniques allow sampling the activity of only a few hundred neurons at a time. I showed that for collective states of networks, subsampling can severely impede inferences about the properties of the full system. Specifically, subsampling in critical models can distort the expected power-law relations, so that a critical system can be misinterpreted as sub- or supercritical (see the sketch below). I am developing approaches to overcome subsampling effects by extending methods from finite-size scaling theory and harnessing the theory of non-equilibrium phase transitions.
Selected References:
Levina & Priesemann, Nature Communications (2017)
VP et al., Frontiers in Systems Neuroscience (2014)
Wilting & VP, Nature Communications (2018)
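
To make the distortion concrete, the following Python sketch simulates avalanches of a critical branching process and compares the avalanche-size statistics of the fully sampled system with those seen on a small observed subset. It is purely illustrative and not code from the publications; the network size, the number of observed units, and the Poisson offspring model are assumptions chosen for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_obs = 10_000, 100                         # network size and observed units (assumed)
observed = rng.choice(N, n_obs, replace=False)

def avalanche(m=1.0, cap=10_000):
    """One avalanche of a branching process with branching parameter m on N
    units; returns (true size, size seen on the observed subset)."""
    active = rng.integers(0, N, 1)                       # a single seed unit
    s_full = 1
    s_sub = int(np.isin(active, observed).sum())
    while active.size and s_full < cap:
        n_next = int(rng.poisson(m, active.size).sum())  # offspring of active units
        active = rng.integers(0, N, n_next)              # targets drawn uniformly
        s_full += n_next
        s_sub += int(np.isin(active, observed).sum())
    return s_full, s_sub

sizes = np.array([avalanche() for _ in range(10_000)])
for label, s in zip(("fully sampled", "subsampled"), sizes.T):
    s = s[s > 0]            # many avalanches are entirely invisible when subsampled
    print(f"{label:>13}: P(size >= 10) = {np.mean(s >= 10):.3f}, "
          f"P(size >= 100) = {np.mean(s >= 100):.4f}")
```

Because large avalanches are only partially visible on the observed subset, the subsampled size distribution decays much faster than the true power law, mimicking subcritical dynamics.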

In experiments, only a tiny fraction (right) of all neurons can be sampled in parallel (figure generated with TREES).

II. Reverberating, Subcritical Dynamics in vivo. In neuroscience, a popular hypothesis states that collective neural dynamics should self-organize to a critical state, because at criticality simple models maximize their information processing capacities. However, criticality also comes with the risk of spontaneous runaway activity (epilepsy). I obtained the first evidence that neural dynamics maintains a distance to criticality, and thus keeps a safety margin against runaway activity. This distance to criticality is consistent across different species, but changes from wakefulness to deep sleep. Most recently, I succeeded in taming the out-of-equilibrium effects imposed by ongoing external stimulation, and as a result developed a very effective method to quantify the distance to criticality from very little data (sketched below). This opens novel avenues to study how networks tune their distance to criticality from one moment to the next, depending on task requirements and context.
Selected References:
VP et al., PLOS Computational Biology (2013)
VP et al., Frontiers in Systems Neuroscience (2014)
Wilting & VP, Nature Communications (2018)
Wilting & VP, Cerebral Cortex (2019)
Zierenberg et al., Physical Review X (2018)
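
The core idea of the estimator, following the multistep regression of Wilting & VP (2018), can be sketched in a few lines of Python. Under subsampling, the lag-k regression slope of the activity obeys r_k = b · m^k, where the unknown prefactor b absorbs the subsampling bias; fitting an exponential across many lags therefore recovers the branching parameter m even though each individual slope is biased. All parameter values below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Driven branching process A_t in a reverberating (subcritical) regime,
# observed through binomial subsampling; all parameters are illustrative.
m_true, h, T, p_obs = 0.98, 10, 100_000, 0.05
A = np.zeros(T, dtype=np.int64)
for t in range(1, T):
    A[t] = rng.poisson(m_true * A[t - 1] + h)
a = rng.binomial(A, p_obs)                    # subsampled activity

# Multistep regression: the lag-k slope follows r_k = b * m^k, where the
# unknown factor b absorbs the subsampling bias, so m remains recoverable.
lags = np.arange(1, 41)
r = np.array([np.corrcoef(a[:-k], a[k:])[0, 1] for k in lags])
(b_hat, m_hat), _ = curve_fit(lambda k, b, m: b * m**k, lags, r,
                              p0=(0.5, 0.9), bounds=([0, 0], [10, 1]))

print(f"true m = {m_true}, naive one-step estimate = {r[0]:.3f}, "
      f"multistep estimate = {m_hat:.3f}")
```

The naive one-step estimate r_1 = b · m underestimates m under subsampling, while the multistep fit recovers it.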

III. Information Theory. I harness information theory in two complementary ways: to infer the coding principles of living neural networks, which were developed, tested, and selected over millions of years of evolution, and as a guiding principle to design networks for optimized processing. Here, information theory provides the ideal language, because it is independent of semantics. It thereby enables a comparison of coding principles across very diverse architectures and implementations. This is useful, for example, when comparing coding across species and brain areas, and has been key for us to unravel a clear hierarchy of processing across cortex (a toy example of one such measure is sketched below). Our ongoing and future work addresses the role of phase transitions in processing; learning as symmetry breaking; architectures to learn Gibbs sampling despite strong correlations; and the invariance of information flow under coarse-graining (contact me if any of this catches your interest!).
Selected References:
Wibral, Lizier & VP, Frontiers in Robotics and AI (2015)
Wibral, Finn, Wollstadt, Lizier, VP, Entropy (2018)
Wollstadt et al., PLOS Computational Biology (2017)
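
To give a flavor of the quantities involved, here is a minimal plug-in estimator of transfer entropy, a directed measure of information flow, for binary time series with history length one. This naive estimator is for illustration only (the work cited above relies on carefully bias-corrected estimators), and the toy coupling in the example is an assumption:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits, history length 1:
    TE = sum p(y1,y0,x0) * log2[ p(y1,y0,x0) p(y0) / (p(y0,x0) p(y1,y0)) ]."""
    n = len(y) - 1
    c3 = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t) counts
    c_yx = Counter(zip(y[:-1], x[:-1]))        # (y_t, x_t)
    c_yy = Counter(zip(y[1:], y[:-1]))         # (y_{t+1}, y_t)
    c_y = Counter(y[:-1])                      # y_t
    return sum(
        (k / n) * np.log2(k * c_y[y0] / (c_yx[(y0, x0)] * c_yy[(y1, y0)]))
        for (y1, y0, x0), k in c3.items()
    )

# Toy example (assumed coupling): y is a delayed, noisy copy of x,
# so information flows from x to y but not the other way around.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 50_000)
flip = rng.random(x.size) < 0.1                # 10 % transmission noise
y = np.empty_like(x)
y[0] = 0
y[1:] = np.where(flip[1:], 1 - x[:-1], x[:-1])
print(f"TE(x->y) = {transfer_entropy(x, y):.3f} bits, "
      f"TE(y->x) = {transfer_entropy(y, x):.3f} bits")
```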

Partial information decomposition (PID) allows one to assess fingerprints of neural processing [figure from Wibral et al., Entropy (2017)].

IV. Music. Long-range correlations dominate neural activity. Therefore music, which may be considered a mirror of the soul or the brain, should also reflect these correlations. In collaboration with Theo Geisel, we investigate how long-range correlations and information theoretic quantities change with genre, and whether long-range correlations play a central role in making Swing swing (one standard way to quantify such correlations is sketched below).
Selected References:
Sogorski, Geisel & VP, PLOS ONE (2018)
Datseris et al., arXiv (2019)
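
A standard way to quantify long-range correlations in such timing series is detrended fluctuation analysis (DFA). The sketch below is a generic, minimal DFA implementation, run on white noise as a stand-in for a series of inter-beat intervals; it does not reproduce the beat-extraction pipeline of the paper:

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis (first order): for each window size s,
    integrate the series, remove a linear trend per window, and return the
    RMS fluctuation F(s). The slope alpha of log F vs. log s is ~0.5 for
    uncorrelated noise and > 0.5 for long-range correlated series."""
    y = np.cumsum(x - np.mean(x))                   # integrated profile
    F = []
    for s in scales:
        n_win = len(y) // s
        segs = y[:n_win * s].reshape(n_win, s)
        t = np.arange(s)
        coeffs = np.polyfit(t, segs.T, 1)           # linear fit per window
        trend = np.outer(coeffs[0], t) + coeffs[1][:, None]
        F.append(np.sqrt(np.mean((segs - trend) ** 2)))
    return np.array(F)

rng = np.random.default_rng(0)
ibi = rng.standard_normal(2**14)                    # stand-in for inter-beat intervals
scales = np.unique(np.logspace(1.5, 3.2, 12).astype(int))
alpha = np.polyfit(np.log(scales), np.log(dfa(ibi, scales)), 1)[0]
print(f"DFA exponent alpha = {alpha:.2f}  (white noise: ~0.5)")
```

For white noise the exponent is about 0.5; long-range correlated series yield alpha > 0.5.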

To analyze long-range correlations in the tempo fluctuations of music performances, the beat of each piece must be extracted with millisecond precision [workflow from Sogorski, Geisel & Priesemann, 2018].

RECENT HIGHLIGHTS

* Operating in a reverberating regime enables rapid tuning of network states to task requirements: Wilting et al., Front. Syst. Neurosci. (2018)

* What makes in vitro and in vivo dynamics so different? We derive the central role that the strength of external input plays in shaping collective dynamics: Zierenberg et al., PRX (2018)

* We developed two analytical solutions to overcome subsampling-induced bias, one for time series (Wilting & Priesemann, arXiv / Nature Comm., 2018) and one for spatial structures, such as degree distributions in a graph or avalanches (Levina & Priesemann, Nature Comm., 2017)