**Information, Complexity and Consciousness**

Quantifying integrated information is a leading approach towards building a fundamental theory of consciousness. Integrated Information Theory (IIT) has gained attention in this regard due to its theoretically strong framework. However, it faces limitations such as dependence on the current state of the system, high computational cost, and inapplicability to real brain data. The Perturbational Complexity Index (PCI), on the other hand, is a clinical measure for distinguishing different levels of consciousness. Though PCI claims to capture the functional differentiation and integration in brain networks (similar to IIT), its link to integrated information theories is rather weak. Inspired by these two approaches, we propose a new measure, Φ^C, based on a novel compression-complexity perspective that serves, for the first time, as a bridge between the two. Φ^C is founded on complexity measures based on lossless data compression, which characterize the dynamical complexity of brain networks. Φ^C exhibits the following salient innovations: (i) it is mathematically well bounded; (ii) it has negligible dependence on the current state, unlike Φ; (iii) it measures integrated information as compression-complexity rather than as an information-theoretic quantity; and (iv) it is faster to compute, since the number of atomic bipartitions scales linearly with the number of nodes of the network, thus avoiding combinatorial explosion. Our computer simulations show that Φ^C follows a hierarchy similar to that of ⟨Φ⟩ for several multiple-node networks, and that it demonstrates a rich interplay between differentiation, integration and entropy of the nodes of a network. Φ^C is a promising heuristic measure for characterizing the quantity of integrated information (and hence the quantity of consciousness) in large networks such as the human brain, and it provides an opportunity to test the predictions of brain complexity on real neural data.

For further details, see: https://arxiv.org/abs/1608.08450
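As a rough illustration of the compression-complexity idea (not the paper's exact Φ^C definition), one can compare the compressed size of a network's joint time series with the compressed sizes of each atomic bipartition, one scan per node, so the number of bipartitions grows linearly with network size. Here `zlib` stands in for the compressor, and the combination rule is a hypothetical sketch:

```python
import random
import zlib

def cc(bits: str) -> int:
    """Proxy for compression-complexity: length of the zlib-compressed string."""
    return len(zlib.compress(bits.encode()))

def phi_c_sketch(series: dict) -> float:
    """Hypothetical compression-complexity integration score.

    `series` maps each node name to a binary string (its time series).
    For each atomic bipartition {node} vs. the rest, compare the sum of
    the parts' complexities with the joint complexity, then average over
    nodes. Only len(series) bipartitions are examined -- linear in the
    number of nodes, not combinatorial.
    """
    nodes = sorted(series)
    T = len(series[nodes[0]])
    # interleave node states time step by time step to form the joint record
    joint = "".join(series[n][t] for t in range(T) for n in nodes)
    c_joint = cc(joint)
    scores = []
    for n in nodes:
        rest = "".join(series[m][t] for t in range(T) for m in nodes if m != n)
        scores.append(cc(series[n]) + cc(rest) - c_joint)
    return sum(scores) / len(nodes)

random.seed(1)
net = {f"n{i}": "".join(random.choice("01") for _ in range(400)) for i in range(4)}
print(phi_c_sketch(net))
```

The design choice being illustrated is only the scaling argument: a real implementation would use the paper's specific compressor and normalization rather than raw zlib lengths.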

**Neural Signal Multiplexing**

Transport of neural signals in the brain is challenging owing to neural interference and neural noise. There is experimental evidence of multiplexing of sensory information across populations of neurons, particularly in the vertebrate visual and olfactory systems. Recently, it has been discovered that in the lateral intraparietal cortex, decision signals are multiplexed with decision-irrelevant visual signals. Furthermore, it is well known that several cortical neurons exhibit chaotic spiking patterns. Multiplexing of chaotic neural signals, and their successful demultiplexing in the neurons amidst interference and noise, is difficult to explain. In this work, we propose a novel compressed sensing model for efficient multiplexing of chaotic neural signals constructed using the Hindmarsh-Rose spiking model. The signals are multiplexed from a pre-synaptic neuron to its neighbouring post-synaptic neuron in the presence of 10,000 interfering noisy neural signals, and demultiplexed using compressed sensing techniques.

For further details, see: https://arxiv.org/abs/1601.03214
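The paper's specific construction uses chaotic Hindmarsh-Rose signals; as a generic, self-contained illustration of the underlying principle, the sketch below recovers a sparse vector from far fewer random linear measurements than unknowns, using orthogonal matching pursuit (a standard compressed-sensing decoder, assumed here rather than taken from the paper):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit the support by least squares."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 200, 50, 5                          # 200 unknowns, 50 measurements, 5 nonzeros
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement (mixing) matrix
y = A @ x                                     # the compressed, "multiplexed" observation
x_hat = omp(A, y, k)
print(np.max(np.abs(x_hat - x)))
```

The analogy is loose: in the paper the sparse structure comes from spiking neural signals and the mixing from synaptic transmission amid interferers, whereas here both are generic random constructions.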

**Perspectives on Complexity**

There is no single universally accepted definition of "**Complexity**". There are several perspectives on complexity and on what constitutes complex behaviour or complex systems, as opposed to regular, predictable behaviour and simple systems. We explore the following perspectives on complexity: "effort-to-describe" (Shannon entropy H, Lempel-Ziv complexity LZ), "effort-to-compress" (ETC complexity) and "degree-of-order" (subsymmetry, or SubSym). While Shannon entropy and LZ are very popular and widely used, ETC is a recently proposed measure for time series. We also propose a novel normalized measure, SubSym, based on the existing idea of counting the number of subsymmetries or palindromes within a sequence. We compare the performance of these complexity measures on the following tasks: (a) characterizing the complexity of short binary sequences of lengths 4 to 16, (b) distinguishing periodic and chaotic time series from the 1D logistic map and the 2D Hénon map, and (c) distinguishing between tonic and irregular spiking patterns generated by the "adaptive exponential integrate-and-fire" neuron model. Our study reveals that each perspective has its own advantages and uniqueness, while also overlapping with the others.

For further details, see: https://arxiv.org/abs/1611.00607
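A minimal sketch of two of these perspectives, plus the subsymmetry count, for short binary sequences (the Lempel-Ziv parsing follows the 1976 exhaustive-history definition; the normalization SubSym uses in the paper is omitted here):

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Effort-to-describe: Shannon entropy H of the symbol distribution."""
    n = len(s)
    return -sum(c / n * log2(c / n) for c in Counter(s).values())

def lz_complexity(s: str) -> int:
    """Lempel-Ziv (1976) complexity: number of phrases in the
    exhaustive-history parsing of s."""
    i, c = 0, 0
    while i < len(s):
        l = 1
        # grow the phrase while it already occurs in the preceding text
        while i + l <= len(s) and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def subsym(s: str) -> int:
    """Degree-of-order: number of palindromic substrings of length >= 2."""
    n = len(s)
    return sum(1 for i in range(n) for j in range(i + 2, n + 1)
               if s[i:j] == s[i:j][::-1])

for seq in ("0000", "0101", "0001101"):
    print(seq, shannon_entropy(seq), lz_complexity(seq), subsym(seq))
```

Note how the perspectives diverge: "0000" has zero entropy and minimal LZ complexity yet the maximal subsymmetry count for its length, which is exactly the kind of overlap-with-uniqueness the comparison explores.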

**Aging and Cardiovascular Complexity**

As we age, our hearts undergo changes that reduce the complexity of the physiological
interactions between different control mechanisms. This increases the
risk of cardiovascular diseases, which are the number one cause of death globally.
Since cardiac signals are nonstationary and nonlinear in nature, complexity
measures are better suited to handle such data. In this study, three complexity
measures are used, namely Lempel–Ziv complexity (LZ), Sample Entropy (SampEn)
and Effort-To-Compress (ETC). We determined the minimum length of RR tachogram
required for characterizing complexity of healthy young and healthy old hearts.
All three measures indicated significantly lower complexity values for
older subjects than for younger ones. However, the minimum length of heart-beat
interval data needed differs across the three measures, with LZ and ETC needing as
few as 10 samples, whereas SampEn requires at least 80 samples. Our study
indicates that complexity measures such as LZ and ETC are good candidates for
the analysis of cardiovascular dynamics since they are able to work with very
short RR tachograms.

For further details, see: https://peerj.com/articles/2755/
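Sample Entropy, one of the three measures used, can be sketched as follows. This is a standard textbook formulation with tolerance r times the series' standard deviation; the parameter choices and toy "RR" series below are illustrative, not the study's:

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m
    templates within tolerance r*std (Chebyshev distance), and A is
    the same count for templates of length m+1."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def pairs(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        dist = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
        return (np.sum(dist <= tol) - len(templ)) / 2   # drop self-matches
    B, A = pairs(m), pairs(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

rng = np.random.default_rng(0)
periodic = np.tile([0.8, 1.0], 40)      # a perfectly regular interval series
irregular = 0.8 + 0.2 * rng.random(80)  # an irregular series over the same range
print(sampen(periodic), sampen(irregular))
```

The regular series scores near zero while the irregular one scores much higher, which is the direction of the young-vs-old contrast reported above; the pairwise template matching also hints at why SampEn needs longer recordings than LZ or ETC to stabilize.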