Research

SELF-ORGANIZATION, SPREADING DYNAMICS & INFORMATION PROCESSING

The human brain displays remarkable information processing capacities, which rely on the coordinated activity of 80 billion neurons, each interacting with thousands of other neurons. I want to understand how these neurons jointly form a functional network for information processing. Given that trillions of neural connections (synapses) have to be coordinated to perform meaningful information processing, this “infogenesis” must be driven primarily by self-organizing principles. A key role falls to self-organization through local learning rules, i.e., principles that update the connection strength between neurons based only on locally available information. My aim is to derive from first principles how such local learning rules can optimize a network for information processing – ideally even before it is ever exposed to specific stimuli from the external world. If identified, such a self-organization mechanism can, on the one hand, improve the initialization and effectiveness of artificial neural networks, and on the other hand shed light on the riddle of how our complex brains cope so seemingly effortlessly with a complex environment.

PROJECT DETAILS

I. Subsampling. To understand information processing in the living brain, we need to sample neural activity. However, in the brain, as in most large systems, it is impossible to sample the activity of all units in parallel. The human brain, for example, comprises 80 billion neurons, but current techniques allow sampling the activity of only a few thousand neurons at a time. How can we infer properties of the full system if we only observe a tiny fraction of it? Unfortunately, such inferences are at times not straightforward, but systematically biased. For example, subsampling in critical models can distort the expected power-law relations, so that a critical system can be misinterpreted as sub- or super-critical. We are developing approaches to overcome subsampling effects by extending methods from finite-size scaling theory, employing stochastic processes, and harnessing the theory of non-equilibrium phase transitions. For example, we now know how to infer spreading dynamics even if only a tiny fraction of all neurons is sampled – under idealized conditions, sampling a single unit would be sufficient to estimate the spreading rate!
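The core idea behind this subsampling-invariant estimation (Wilting & VP, 2018) can be illustrated in a few lines. The following is a minimal sketch, not the published implementation: a branching-type process is simulated, only a small fraction of its activity is observed, and the spreading rate m is recovered from multi-step regression slopes r_k ∝ m^k, where the subsampling bias is absorbed into the prefactor. All parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

rng = np.random.default_rng(0)

# --- Toy branching (autoregressive) process with external drive ---
m_true, h, T = 0.95, 100, 50_000           # spreading rate, drive, number of time steps
A = np.empty(T)
A[0] = h / (1 - m_true)                    # start near the stationary mean
for t in range(1, T):
    A[t] = rng.poisson(m_true * A[t - 1] + h)   # stochastic update

# --- Subsampling: each event is observed only with probability alpha ---
alpha = 0.01
a = rng.binomial(A.astype(int), alpha)

# --- Naive one-step regression is strongly biased under subsampling ---
m_naive = linregress(a[:-1], a[1:]).slope

# --- Multistep regression: the slopes r_k scale as b * m**k, with the
#     subsampling bias absorbed into the prefactor b ---
ks = np.arange(1, 41)
r = np.array([linregress(a[:-k], a[k:]).slope for k in ks])
(b_fit, m_fit), _ = curve_fit(lambda k, b, m: b * m**k, ks, r, p0=(0.1, 0.9))

print(f"true m = {m_true}, naive estimate = {m_naive:.3f}, MR estimate = {m_fit:.3f}")
```

In this toy setting the one-step regression underestimates m by an order of magnitude, while the multistep fit recovers it despite observing only one percent of the activity.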

Selected References:
VP et al., Frontiers in Systems Neuroscience (2014)
Levina & VP, Nature Communications (2017)
Wilting & VP, Nature Communications (2018)
Neto, Spitzner & VP, arXiv

In experiments only a tiny fraction (right) of all neurons can be sampled in parallel (figure from Wilting & VP, 2018, generated with TREES).

 

II. Reverberating, Subcritical Dynamics in vivo. In neuroscience, a popular hypothesis states that collective neural dynamics should self-organize to a critical state, because at criticality simple models maximize their information processing capacities. However, criticality may also come with the risk of spontaneous runaway activity (epilepsy). I obtained clear evidence that neural dynamics maintains a distance to criticality, thereby keeping a safety margin to runaway activity. This distance to criticality is consistent across different species, but changes from wakefulness to deep sleep. Most recently, I succeeded in taming the out-of-equilibrium effects imposed by ongoing external stimulation, and as a result developed a very effective method to quantify the distance to criticality from very little data. This opens novel avenues to study how networks tune their distance to criticality – and hence their information processing – from one moment to the next, depending on task requirements and context.
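For intuition on what a given distance to criticality means in terms of timescales: for a branching process with branching parameter m, autocorrelations decay as m^(t/Δt), i.e. with an intrinsic timescale τ = −Δt / ln m. The snippet below only evaluates this relation; the 4 ms bin size and the example values of m are illustrative, not measured values.

```python
import numpy as np

def intrinsic_timescale(m, dt_ms=4.0):
    """Autocorrelation time tau for branching parameter m and bin size dt:
    autocorrelations decay as m**(t/dt) = exp(-t/tau), so tau = -dt / ln(m)."""
    return -dt_ms / np.log(m)

for m in (0.90, 0.98, 0.999):   # subcritical ... reverberating ... near-critical
    print(f"m = {m:5.3f}   distance to criticality = {1 - m:5.3f}   "
          f"tau ~ {intrinsic_timescale(m):7.1f} ms")
```

A branching parameter of m = 0.98, for instance, corresponds to network timescales of roughly 200 ms, clearly separated both from uncorrelated (m close to 0) and from near-critical (m close to 1) dynamics.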

Selected References:
Cramer et int., VP, Nature Communications (2020)
VP et al., PLOS Comp. Biol. (2013)
Wilting & VP, Cerebral Cortex (2019)
Wilting & VP, Current Opinion Neurobiol. (2019)
Zierenberg, Wilting & VP, Physical Review X (2018)

III. Information Theory. Information theory is the natural language to quantify computation. I employ it in two complementary ways: to infer the coding principles of living neural networks, which were developed, tested and selected over millions of years of evolution, and to design networks for optimized processing from first principles. Going a step further, we do not only design the networks directly, but also derive the learning rules that shape a network autonomously. In this realm, information theory provides the ideal language, because it is independent of semantics. It thereby enables a comparison of coding principles across very diverse architectures and implementations. This is useful, for example, when comparing coding across species and brain areas, where it has been key to unraveling a clear hierarchy of processing across cortex. Our ongoing and future work addresses the role of phase transitions in information processing; local learning as a form of symmetry breaking; architectures that optimize Gibbs sampling despite strong correlations; and the invariance of information flow under coarse-graining.
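As a concrete illustration of one of these tools: transfer entropy quantifies the directed information flow from a source X to a target Y. The sketch below is a bare plug-in estimator for binary time series with history length one, written for transparency rather than rigor; serious analyses require longer histories, bias correction and significance testing, for example with dedicated toolboxes such as IDTxl. The toy data and function name are ours, for illustration only.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in estimate of TE(X -> Y) = I(Y_{t+1}; X_t | Y_t) for small-alphabet
    time series with history length 1: count joint symbol frequencies and sum
    p * log[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    x, y = np.asarray(x), np.asarray(y)
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))        # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))              # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))               # (y_{t+1}, y_t)
    singles_y = Counter(y[:-1])                          # (y_t)
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p = c / n
        p_cond_joint = c / pairs_yx[(y0, x0)]            # p(y1 | y0, x0)
        p_cond_marg = pairs_yy[(y1, y0)] / singles_y[y0] # p(y1 | y0)
        te += p * np.log(p_cond_joint / p_cond_marg)
    return te / np.log(base)

# Toy example: y copies x with a one-step delay (plus noise), so TE(x->y) > TE(y->x).
rng = np.random.default_rng(1)
x = rng.integers(0, 2, 100_000)
noise = rng.random(x.size) < 0.1
y = np.roll(x, 1) ^ noise                # y_t ~ x_{t-1}, flipped 10% of the time
print(f"TE(x->y) = {transfer_entropy(x, y):.3f} bits, "
      f"TE(y->x) = {transfer_entropy(y, x):.3f} bits")
```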

Selected References:
Wibral, Lizier & Priesemann, Frontiers in Robotics and AI (2015)
Wibral, Finn, Wollstadt, Lizier, Priesemann, Entropy (2017)
Wollstadt et al., PLOS Computational Biology (2017)

Figure from Wibral et al. (2017).

 

IV. Epidemics and Infodemics. With the outbreak of COVID-19, it turned out that our work on spreading dynamics in the brain is highly valuable for investigating the spread of SARS-CoV-2. During the pandemic, not only the spread of the virus, but also that of news and fake news (the “infodemic”) challenged societies. We study the interaction of pandemic and infodemic spread using the example of the COVID-19 pandemic, because of the abundance of disease and news data. Understanding the spread of (mis)information will be crucial when facing future crises.
In more detail, (1) we have shown how information about the viral spread can mitigate the extent of pandemic waves. (2) We investigated how the reproduction number R changes as a consequence of a given intervention, such as closing schools or imposing a contact ban. (3) We then investigated the challenges of mitigating COVID-19 spread via test-trace-isolate (TTI) strategies. These strategies are inherently imperfect, and their capacity is limited. We identified an important tipping point: if case numbers are too high, the local health authorities can no longer test and trace all contacts, are thus too slow to break the infection chains, and increasingly many carriers remain unaware of their infection. These unaware carriers drive a self-accelerating spread. The conclusion is straightforward: at low case numbers, controlling the spread is much easier; beyond the tipping point, regaining control becomes increasingly difficult.
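The tipping point can be illustrated with a deliberately stripped-down toy model (not the compartmental model of Contreras et al., 2021): each traced-and-isolated carrier infects on average R_traced < 1 people, each untraced carrier R_hidden > 1, and the health authorities can trace at most a fixed number of new cases per generation. All numbers are made up for illustration.

```python
import numpy as np

# Toy test-trace-isolate (TTI) model with limited tracing capacity: traced carriers
# are isolated and barely transmit, untraced ("hidden") carriers transmit fully.
# Once new cases exceed the tracing capacity, a growing share stays untraced and
# the spread self-accelerates (the tipping point).
R_hidden, R_traced = 1.8, 0.5      # reproduction numbers without / with isolation
capacity = 500                     # new cases the health authority can trace per generation

def simulate(n0, steps=30):
    new = n0
    for _ in range(steps):
        traced = min(new, capacity)
        hidden = new - traced
        new = traced * R_traced + hidden * R_hidden
    return new

for n0 in (300, 700, 900):         # start below, slightly above, and far above capacity
    print(f"initial cases {n0:4d} -> cases after 30 generations: {simulate(n0):12.0f}")
```

Starting below, or only slightly above, the tracing capacity, case numbers decline; far above it, the hidden fraction grows and the outbreak runs away.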
In parallel, we are developing a hierarchical Bayesian model to investigate the impact of non-pharmaceutical interventions (NPIs), using European as well as federal-state data.
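A minimal forward model in the spirit of these change-point analyses (Dehning et al., 2020), but not the actual hierarchical Bayesian implementation (available via the GitHub link below): discrete SIR dynamics in which an intervention lowers the spreading rate λ at a change point. The Bayesian model places priors on the change-point times and magnitudes and infers them from reported case numbers; all values below are illustrative.

```python
import numpy as np

N = 83_000_000                       # population size (roughly Germany)
mu = 0.125                           # recovery rate per day, so R_t = lambda_t / mu
lam_before, lam_after, t_change = 0.40, 0.10, 20   # spreading rate drops at the change point

S, I = N - 1000.0, 1000.0
new_cases = []
for t in range(60):
    lam = lam_before if t < t_change else lam_after
    infections = lam * S * I / N     # new infections in this daily time step
    S, I = S - infections, I + infections - mu * I
    new_cases.append(infections)

print(f"R before intervention ~ {lam_before / mu:.1f}, after ~ {lam_after / mu:.1f}")
print(f"daily new cases just before the change point: {new_cases[t_change - 1]:.0f}, "
      f"at the end of the simulation: {new_cases[-1]:.0f}")
```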

Based on this research, I became a member of the national COVID-19 expert panel of the German federal government (Expertenrat). I initiated a number of international, interdisciplinary consensus statements, published e.g. in The Lancet, and I co-authored a number of statements published by the presidents of the Fraunhofer, Helmholtz, Leibniz and Max Planck Societies, or by the Leopoldina.

Selected References:
Alwan et al., The Lancet, 2020
Bauer et int., VP, PLOS Computational Biology, 2021
Contreras et int., VP, Nature Communications, 2021 / arXiv
Contreras et int., VP, Science Advances, 2021
Dehning et int., VP, Science, 2020 / arXiv
Dehning et int., VP, medRxiv
Linden et int., VP, Deutsches Ärzteblatt Int., 2020 / arXiv
Priesemann et al., The Lancet, 2021a,b,c

Github: https://github.com/Priesemann-Group/

Meyer-Hermann*, Pigeot*, Priesemann*, Schöbel*: Adaptive Strategien zur Eindämmung der COVID-19 Epidemie (Adaptive strategies to contain the COVID-19 epidemic)
Meyer-Hermann*, Pigeot*, Priesemann*, Schöbel*: Gemeinsam können wir es schaffen: Jeder einzelne Beitrag schützt Gesundheit, Gesellschaft und Wirtschaft (Together we can do it: every individual contribution protects health, society and the economy)

 

V. Music. Long-range correlations characterize neural activity in the brain. Therefore music, which may be considered a mirror of the soul or the brain, should also reflect these correlations. In collaboration with Theo Geisel, we investigate how long-range correlations and information-theoretic quantities change across genres, and whether long-range correlations play a central role in making Swing swing.
References:
Sogorski, Geisel & VP, PLOS ONE (2018)
Datseris et al., arXiv / in press (2019)

To analyze long-range correlations in the tempo fluctuations of music performances, the beat of each piece must be extracted with millisecond precision [workflow from Sogorski, Geisel & Priesemann, 2018].
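Once the beats are extracted, long-range correlations in the resulting tempo (inter-beat-interval) fluctuations can be quantified, for example with detrended fluctuation analysis (DFA). The sketch below is a bare-bones DFA, not the published analysis pipeline; the white-noise input only serves as a sanity check (α ≈ 0.5), whereas long-range correlated tempo fluctuations would yield α > 0.5.

```python
import numpy as np

def dfa_exponent(x, n_scales=15):
    """Detrended fluctuation analysis: integrate the signal, detrend it linearly in
    windows of size s, and fit the rms fluctuation F(s) ~ s**alpha.
    alpha ~ 0.5: uncorrelated fluctuations; alpha > 0.5: long-range correlations."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())                        # integrated profile
    scales = np.unique(np.logspace(1, np.log10(len(x) // 4), n_scales).astype(int))
    F = []
    for s in scales:
        n_win = len(y) // s
        segments = y[: n_win * s].reshape(n_win, s)
        t = np.arange(s)
        # residuals of a linear fit in every window, then their overall rms
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segments]
        F.append(np.sqrt(np.mean(np.concatenate(resid) ** 2)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

# Sanity check on surrogate data: white-noise "inter-beat intervals" give alpha ~ 0.5.
rng = np.random.default_rng(2)
ibi = 500 + 10 * rng.standard_normal(4096)             # inter-beat intervals in ms
print(f"DFA exponent: {dfa_exponent(ibi):.2f}")
```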

RECENT HIGHLIGHTS (03/2022)

Mikulasch, Rudelt, Priesemann, Local dendritic balance enables learning of efficient representations in networks of spiking neurons, Proceedings of the National Academy of Sciences, 2021

RECENT HIGHLIGHTS (10/2018)

Operating in a reverberating regime enables rapid tuning of network states to task requirements: Wilting et al., Front. Syst. Neurosci. (2018)

What makes in vitro and in vivo dynamics so different? – We derive the central role that the strength of external input plays in shaping collective dynamics: Zierenberg et al., PRX (2018)

We developed two analytical solutions to overcome subsampling-induced bias,
one for time series: Wilting & Priesemann, arXiv / Nature Comm. (2018),
and one for spatial structures, such as degree distributions in a graph or avalanches: Levina & Priesemann, Nature Comm. (2017)