The human brain has remarkable information processing capacities, which rely on the coordinated activity of 80 billion neurons, each interacting with thousands of others. To understand how this collective neural dynamics ultimately gives rise to the brain’s information processing capacities, I study a particular class of collective states, so-called critical states, because these maximize information processing capacities in models. In this context, I recently provided the first evidence that the human brain does not operate in a critical state, but keeps a distance to criticality [Priesemann et al., 2013], despite the computational advantages of criticality. I confirmed these results for highly parallel spike recordings from rats, cats, and monkeys [Priesemann et al., 2014]. This indicates that maximizing information processing is not the brain’s only goal function. Based on these results, I suggest that the brain tunes itself closer to criticality only when strictly necessary for information processing; otherwise, it maintains a larger safety margin, sacrificing processing capacity but avoiding instability. In the near future, I want to reveal how the brain should allocate its resources and tune its distance to criticality for a given task. In the long term, this will provide insight into how information processing emerges from collective neural dynamics in nervous systems.
I. Subsampling. In most large networks, it is impossible to sample the activity of all nodes in parallel. For example, the human brain comprises 80 billion neurons, but current techniques allow sampling the activity of only a few hundred neurons at a time. I showed that for collective states of networks, subsampling can severely impede inferences about the properties of the full system. In particular, subsampling in critical models can distort the expected power-law relations, so that a critical system may be misinterpreted as sub- or supercritical. I am currently developing an approach to overcome subsampling effects by extending methods from finite-size scaling theory.
References: Priesemann et al., 2009, 2013, 2014; Wilting & Priesemann, arXiv; Levina & Priesemann, 2017
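The subsampling problem can be illustrated with a minimal toy sketch (my own illustration, not the estimators from the cited papers; all function names and parameters are hypothetical): avalanches of a near-critical branching process, where each spike is recorded only with probability p, as when observing a few hundred out of billions of neurons.

```python
import numpy as np

rng = np.random.default_rng(0)

def avalanche_size(m=0.99, cap=100_000):
    """Total size of one avalanche of a branching process with branching ratio m.

    Each active unit triggers Poisson(m) successors; the avalanche ends
    when no units remain active (capped to avoid runaway at criticality)."""
    active, size = 1, 1
    while active and size < cap:
        active = rng.poisson(m * active)
        size += active
    return size

def observed_size(size, p=0.01):
    """Subsampling: each spike of the avalanche is recorded
    independently with probability p."""
    return int(rng.binomial(size, p))

full = [avalanche_size() for _ in range(2000)]
obs = [observed_size(s) for s in full]
# Under subsampling, most avalanches appear empty or tiny, and the
# apparent size distribution is distorted relative to the true power law.
```

Comparing histograms of `full` and `obs` shows how a critical system can masquerade as subcritical under subsampling.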
II. Subcritical Dynamics in vivo. In neuroscience, a popular hypothesis is that collective neural dynamics should self-organize to a critical state, because at criticality simple models maximize their information processing capacities. However, criticality also comes with the risk of spontaneous runaway activity (epilepsy). I recently obtained the first evidence that neural dynamics maintains a distance to criticality, and thus keeps a safety margin against runaway activity. This distance to criticality is consistent across species, but changes with vigilance state. Currently, I am refining the methods to quantify the distance to criticality, with the aim of understanding how this distance is adjusted to changing demands.
References: Priesemann et al., 2014; Wilting & Priesemann, arXiv; Levina & Priesemann, 2017
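A common way to quantify the distance to criticality is via the branching ratio m, with m = 1 marking the critical point. The following is a minimal sketch under my own simplifying assumptions (a driven branching process and a naive one-step regression estimator; the published multistep-regression estimator additionally corrects for subsampling bias):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_activity(m=0.98, h=1.0, steps=20_000):
    """Driven branching process: A_{t+1} ~ Poisson(m * A_t + h),
    where h is an external drive keeping activity alive."""
    a = np.empty(steps, dtype=np.int64)
    a[0] = 10
    for t in range(steps - 1):
        a[t + 1] = rng.poisson(m * a[t] + h)
    return a

def branching_ratio(a):
    """Naive estimate of m: slope of the regression of A_{t+1} on A_t.
    (Biased under subsampling, which is what the cited multistep
    estimator is designed to overcome.)"""
    return np.polyfit(a[:-1], a[1:], 1)[0]

a = simulate_activity(m=0.98)
m_hat = branching_ratio(a)
distance = 1.0 - m_hat  # distance to criticality; m = 1 is critical
```

For subcritical dynamics (m < 1) with full sampling, `m_hat` recovers the true branching ratio and `distance` stays positive, i.e. the safety margin is visible in the data.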
III. Information theory. To understand information processing in the brain, I co-developed an open-source toolbox (TRENTOOL) specialized in estimating information-theoretic quantities from neural recordings. Using this toolbox, I showed that transfer entropy can detect both the interaction delay and the direction of information flow between brain areas, using my own ex vivo turtle recordings. Currently, the toolbox is being extended to also quantify active information storage, and to return time-resolved versions of both active information storage and transfer.
References: Lindner et al., 2011; Wibral, Lizier & Priesemann, 2015, 2017; Wibral et al., 2013, 2014, 2017a, 2017b; Wollstadt et al., 2017
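To make the delay-detection idea concrete, here is a toy plug-in estimator of transfer entropy for discrete time series (my own minimal sketch with history length 1, not TRENTOOL's implementation, which uses nearest-neighbor estimators and proper embedding): scanning the source-target delay and taking the maximum recovers the interaction delay.

```python
import math
import random
from collections import Counter

def transfer_entropy(src, tgt, delay=1):
    """Plug-in transfer entropy (bits) from src to tgt at a given delay,
    with history length 1: I(tgt_t ; src_{t-delay} | tgt_{t-1})."""
    start = max(delay, 1)
    joint = Counter()  # counts of (tgt_t, tgt_{t-1}, src_{t-delay})
    for t in range(start, len(tgt)):
        joint[(tgt[t], tgt[t - 1], src[t - delay])] += 1
    n = sum(joint.values())
    c_y0x, c_y1y0, c_y0 = Counter(), Counter(), Counter()
    for (y1, y0, x), c in joint.items():
        c_y0x[(y0, x)] += c
        c_y1y0[(y1, y0)] += c
        c_y0[y0] += c
    te = 0.0
    for (y1, y0, x), c in joint.items():
        # p(y1,y0,x) * log2[ p(y1 | y0, x) / p(y1 | y0) ]
        te += (c / n) * math.log2(c * c_y0[y0] / (c_y0x[(y0, x)] * c_y1y0[(y1, y0)]))
    return te

random.seed(0)
src = [random.randint(0, 1) for _ in range(5000)]
# tgt is a noisy copy of src with a true interaction delay of 2 steps
tgt = [0, 0] + [s if random.random() < 0.9 else 1 - s for s in src[:-2]]
te = {d: transfer_entropy(src, tgt, d) for d in (1, 2, 3)}
# te should peak at the true delay d = 2
```

The same scan over delays, run in both directions, also reveals the direction of information flow: TE is large only from the driving to the driven series at the correct delay.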
IV. Music. Long-range correlations dominate neural activity. Therefore music, which is considered a mirror of the soul and the brain, should also reflect these correlations. In collaboration with Theo Geisel, we are currently investigating how long-range correlations and information-theoretic quantities change across genres, and whether long-range correlations play a central role in making Swing swing.
Reference: Sogorski, Geisel & Priesemann, under review
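Long-range correlations in a time series (neural activity, or e.g. timing fluctuations in a recording) are commonly quantified with detrended fluctuation analysis (DFA). Below is a minimal textbook-style DFA sketch (a standard method, not necessarily the pipeline used in the study): an exponent near 0.5 indicates no long-range correlations, while values toward 1.0 indicate strong ones.

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis: fluctuation F(s) per window size s.

    Integrates the signal, removes a linear trend in each window,
    and returns the root-mean-square residual fluctuation."""
    profile = np.cumsum(signal - np.mean(signal))
    fluct = []
    for s in scales:
        n_win = len(profile) // s
        t = np.arange(s)
        f2 = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.array(fluct)

rng = np.random.default_rng(2)
white = rng.normal(size=4096)          # control: uncorrelated noise
scales = [16, 32, 64, 128, 256]
alpha = np.polyfit(np.log(scales), np.log(dfa(white, scales)), 1)[0]
# white noise has no long-range correlations, so alpha is near 0.5
```

Applied to, say, beat-to-beat timing deviations of a drummer, an exponent well above 0.5 would be the signature of long-range correlations whose role in Swing we investigate.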
We developed two analytical solutions to overcome subsampling-induced bias: one for time series (Wilting & Priesemann, arXiv / Nature Communications, in press), and one for spatial structures, such as degree distributions in graphs (Levina & Priesemann, Nature Communications, 2017).