Information Decomposition in Multivariate Systems: Definitions, Implementation and Application to Cardiovascular Networks
Figure 1. <p>Information diagram (<b>a</b>) and mutual information diagram (<b>b</b>,<b>c</b>) depicting the relations between the basic information-theoretic measures defined for three random variables <span class="html-italic">X</span>, <b><span class="html-italic">Z</span></b>, <b><span class="html-italic">U</span></b>: the information <span class="html-italic">H</span>(∙), the conditional information <span class="html-italic">H</span>(∙|∙), the mutual information <span class="html-italic">I</span>(∙;∙), the conditional mutual information <span class="html-italic">I</span>(∙;∙|∙), and the interaction information <span class="html-italic">I</span>(∙;∙;∙). Note that the interaction information <span class="html-italic">I</span>(<span class="html-italic">X</span>;<b><span class="html-italic">Z</span></b>;<b><span class="html-italic">U</span></b>) = <span class="html-italic">I</span>(<span class="html-italic">X</span>;<b><span class="html-italic">Z</span></b>) − <span class="html-italic">I</span>(<span class="html-italic">X</span>;<b><span class="html-italic">Z|U</span></b>) can take both positive and negative values. In this study, all interaction information terms are depicted with gray shaded areas, and the areas are intended as positive quantities; accordingly, the case of positive interaction information is depicted in (<b>b</b>), and that of negative interaction information is depicted in (<b>c</b>).</p>
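The identity in this caption, I(X;Z;U) = I(X;Z) − I(X;Z|U), is easy to verify numerically for jointly Gaussian variables, where every entropy follows from the determinant of a covariance submatrix. A minimal sketch (function names are illustrative, not from the paper's code):

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of a Gaussian with covariance matrix cov."""
    cov = np.atleast_2d(cov)
    d = cov.shape[0]
    return 0.5 * np.log(((2 * np.pi * np.e) ** d) * np.linalg.det(cov))

def mutual_info(cov, a, b):
    """I(A;B) from the joint covariance; a and b are lists of variable indices."""
    H = lambda idx: gaussian_entropy(cov[np.ix_(idx, idx)])
    return H(a) + H(b) - H(a + b)

def interaction_info(cov, x, z, u):
    """I(X;Z;U) = I(X;Z) - I(X;Z|U); negative values denote net synergy."""
    H = lambda idx: gaussian_entropy(cov[np.ix_(idx, idx)])
    # conditional mutual information: I(X;Z|U) = H(X,U) + H(Z,U) - H(X,Z,U) - H(U)
    i_xz_u = H(x + u) + H(z + u) - H(x + z + u) - H(u)
    return mutual_info(cov, x, z) - i_xz_u
```

For instance, a target correlated with two mutually uncorrelated sources yields a negative (synergistic) interaction information; the measure is also symmetric in the two conditioning variables, as the diagram suggests.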
Figure 2. <p>Graphical representation of the information-theoretic quantities resulting from the decomposition of the information carried by the target <span class="html-italic">Y</span> of a network of interacting stationary processes <b><span class="html-italic">S</span></b> = {<b><span class="html-italic">X</span></b>,<span class="html-italic">Y</span>} = {<b><span class="html-italic">V</span></b>,<b><span class="html-italic">W</span></b>,<span class="html-italic">Y</span>}. (<b>a</b>) Exemplary realizations of a six-dimensional process <b><span class="html-italic">S</span></b> composed of the target process <span class="html-italic">Y</span> and the source processes <b><span class="html-italic">V</span></b> = {<span class="html-italic">V</span><sub>1</sub>,<span class="html-italic">V</span><sub>2</sub>} and <b><span class="html-italic">W</span></b> = {<span class="html-italic">W</span><sub>1</sub>, <span class="html-italic">W</span><sub>2</sub>, <span class="html-italic">W</span><sub>3</sub>}, with representation of the variables used for information domain analysis: the present of the target, <math display="inline"> <semantics> <mrow> <msub> <mi>Y</mi> <mi>n</mi> </msub> </mrow> </semantics> </math>, the past of the target, <math display="inline"> <semantics> <mrow> <msubsup> <mi>Y</mi> <mi>n</mi> <mo>−</mo> </msubsup> </mrow> </semantics> </math>, and the past of the sources, <math display="inline"> <semantics> <mrow> <msubsup> <mi mathvariant="bold-italic">V</mi> <mi>n</mi> <mo>−</mo> </msubsup> </mrow> </semantics> </math> and <math display="inline"> <semantics> <mrow> <msubsup> <mi mathvariant="bold-italic">W</mi> <mi>n</mi> <mo>−</mo> </msubsup> </mrow> </semantics> </math>.
(<b>b</b>) Venn diagram showing that the information of the target process <span class="html-italic">H<sub>Y</sub></span> is the sum of the new information (<span class="html-italic">N<sub>Y</sub></span>, yellow-shaded area) and the predictive information (<span class="html-italic">P<sub>Y</sub></span>, all other shaded areas with labels); the latter is expanded according to the predictive information decomposition (PID) as the sum of the information storage (<span class="html-italic">S<sub>Y</sub></span> = <span class="html-italic">S<sub>Y|<b>X</b></sub></span> + <span class="html-italic">I<sup>Y</sup><sub>Y</sub></span><sub>;<b><span class="html-italic">V</span></b>|<b><span class="html-italic">W</span></b></sub> + <span class="html-italic">I<sup>Y</sup><sub>Y</sub></span><sub>;<b><span class="html-italic">W</span></b>|<b><span class="html-italic">V</span></b></sub> + <span class="html-italic">I<sup>Y</sup><sub>Y</sub></span><sub>;<b><span class="html-italic">W</span></b>;<b><span class="html-italic">V</span></b></sub>) and the information transfer (<span class="html-italic">T<b><sub>X</sub></b></span><sub>→</sub><span class="html-italic"><sub>Y</sub></span> = <span class="html-italic">T<b><sub>V</sub></b></span><sub>→</sub><span class="html-italic"><sub>Y</sub></span><sub>|<b><span class="html-italic">W</span></b></sub> + <span class="html-italic">T<b><sub>W</sub></b></span><sub>→</sub><span class="html-italic"><sub>Y</sub></span><sub>|<b><span class="html-italic">V</span></b></sub> + <span class="html-italic">I<sup>Y</sup><b><sub>V</sub></b></span><sub>;<b><span class="html-italic">W</span></b>|Y</sub>); the information storage decomposition dissects <span class="html-italic">S<sub>Y</sub></span> as the sum of the internal information (<span class="html-italic">S<sub>Y|<b>X</b></sub></span>), conditional interaction terms (<span class="html-italic">I<sup>Y</sup><sub>Y</sub></span><sub>;<b><span class="html-italic">V</span></b>|<b><span class="html-italic">W</span></b></sub> and <span class="html-italic">I<sup>Y</sup><sub>Y</sub></span><sub>;<b><span class="html-italic">W</span></b>|<b><span class="html-italic">V</span></b></sub>) and multivariate interaction (<span class="html-italic">I<sup>Y</sup><sub>Y</sub></span><sub>;<b><span class="html-italic">W</span></b>;<b><span class="html-italic">V</span></b></sub>). The information transfer decomposition dissects <span class="html-italic">T<b><sub>X</sub></b></span><sub>→</sub><span class="html-italic"><sub>Y</sub></span> as the sum of conditional information transfer terms (<span class="html-italic">T<b><sub>V</sub></b></span><sub>→</sub><span class="html-italic"><sub>Y</sub></span><sub>|<b><span class="html-italic">W</span></b></sub> and <span class="html-italic">T<b><sub>W</sub></b></span><sub>→</sub><span class="html-italic"><sub>Y</sub></span><sub>|<b><span class="html-italic">V</span></b></sub>) and interaction information transfer (<span class="html-italic">I<sup>Y</sup><b><sub>V</sub></b></span><sub>;<b><span class="html-italic">W</span></b>|Y</sub>).</p>
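For Gaussian processes, every term of the decomposition described in this caption can be estimated from determinants of lagged covariance matrices, and the identity H_Y = N_Y + S_Y + T_X→Y then holds exactly by construction. The sketch below is illustrative (the function names and the fixed lag k are assumptions, not the paper's model-order selection); it estimates the three PID terms from a target series y and an array X of source series:

```python
import numpy as np

def gauss_h(cov):
    """Differential entropy (nats) of a Gaussian with covariance matrix cov."""
    return 0.5 * np.log(np.linalg.det(2 * np.pi * np.e * np.atleast_2d(cov)))

def predictive_decomposition(y, X, k=2):
    """Entropy-based PID terms H_Y = N_Y + S_Y + T_{X->Y}, estimated under a
    linear-Gaussian approximation truncating past histories at k lags."""
    n = len(y)
    # each row collects [Y_n, Y_{n-1},...,Y_{n-k}, source lags 1..k]
    rows = [np.concatenate(([y[t]], y[t - k:t][::-1], X[t - k:t][::-1].ravel()))
            for t in range(k, n)]
    C = np.cov(np.array(rows).T)
    iy, ipy = [0], list(range(1, k + 1))
    ipx = list(range(k + 1, C.shape[0]))
    H = lambda idx: gauss_h(C[np.ix_(idx, idx)])
    H_Y = H(iy)                                   # information of the target
    N_Y = H(iy + ipy + ipx) - H(ipy + ipx)        # new information H(Y_n | Y^-, X^-)
    S_Y = H(iy) + H(ipy) - H(iy + ipy)            # storage I(Y_n; Y^-)
    T_XY = H(iy + ipy) + H(ipy + ipx) - H(ipy) - H(iy + ipy + ipx)  # transfer I(Y_n; X^- | Y^-)
    return H_Y, N_Y, S_Y, T_XY
```

Because all four quantities come from the same joint covariance, the three terms sum exactly to H_Y, so the decomposition can be checked numerically on any simulated series.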
Figure 3. <p>Graphical representation of the trivariate VAR process of Equation (21) with parameters set according to the first configuration reproducing basic dynamics and interactions (<b>a</b>) and to the second configuration reproducing realistic cardiovascular and cardiorespiratory dynamics and interactions (<b>b</b>). The theoretical power spectral densities of the three processes <span class="html-italic">V</span>, <span class="html-italic">W</span> and <span class="html-italic">Y</span> corresponding to the parameter setting with <span class="html-italic">c</span> = 1 are also depicted in panel (<b>c</b>) (see text for details).</p>
Figure 4. <p>Information decomposition for the stationary Gaussian VAR process composed of the target <span class="html-italic">Y</span> and the sources <b><span class="html-italic">X</span></b> = {<span class="html-italic">V</span>,<span class="html-italic">W</span>}, generated according to Equation (21). The Venn diagrams of the predictive information decomposition (PID), information storage decomposition (ISD) and information transfer decomposition (ITD) are depicted on the left. The interaction structures of the VAR process, set according to the two types of simulation, are depicted on the top. The information measures relevant to (<b>a</b>–<b>d</b>) PID (<math display="inline"> <semantics> <mrow> <msub> <mi>H</mi> <mi>Y</mi> </msub> <mo>=</mo> <msub> <mi>N</mi> <mi>Y</mi> </msub> <mo>+</mo> <msub> <mi>S</mi> <mi>Y</mi> </msub> <mo>+</mo> <msub> <mi>T</mi> <mrow> <mi mathvariant="bold-italic">X</mi> <mo>→</mo> <mi>Y</mi> </mrow> </msub> </mrow> </semantics> </math>), (<b>e</b>–<b>h</b>) ISD (<math display="inline"> <semantics> <mrow> <msub> <mi>S</mi> <mi>Y</mi> </msub> <mo>=</mo> <msub> <mi>S</mi> <mrow> <mi>Y</mi> <mo stretchy="false">|</mo> <mi mathvariant="bold-italic">X</mi> </mrow> </msub> <mo>+</mo> <msubsup> <mi>I</mi> <mrow> <mi>Y</mi> <mo>;</mo> <mi mathvariant="bold-italic">V</mi> <mo stretchy="false">|</mo> <mi mathvariant="bold-italic">W</mi> </mrow> <mi>Y</mi> </msubsup> <mo>+</mo> <msubsup> <mi>I</mi> <mrow> <mi>Y</mi> <mo>;</mo> <mi mathvariant="bold-italic">W</mi> <mo stretchy="false">|</mo> <mi mathvariant="bold-italic">V</mi> </mrow> <mi>Y</mi> </msubsup> <mo>+</mo> <msubsup> <mi>I</mi> <mrow> <mi>Y</mi> <mo>;</mo> <mi mathvariant="bold-italic">V</mi> <mo>;</mo> <mi mathvariant="bold-italic">W</mi> </mrow> <mi>Y</mi> </msubsup> </mrow> </semantics> </math>) and (<b>i</b>–<b>l</b>) ITD (<math display="inline"> <semantics> <mrow> <msub> <mi>T</mi> <mrow> <mi mathvariant="bold-italic">X</mi> <mo>→</mo> <mi>Y</mi> </mrow> </msub> <mo>=</mo> <msub> <mi>T</mi>
<mrow> <mi mathvariant="bold-italic">V</mi> <mo>→</mo> <mi>Y</mi> <mo stretchy="false">|</mo> <mi mathvariant="bold-italic">W</mi> </mrow> </msub> <mo>+</mo> <msub> <mi>T</mi> <mrow> <mi mathvariant="bold-italic">W</mi> <mo>→</mo> <mi>Y</mi> <mo stretchy="false">|</mo> <mi mathvariant="bold-italic">V</mi> </mrow> </msub> <mo>+</mo> <msubsup> <mi>I</mi> <mrow> <mi mathvariant="bold-italic">V</mi> <mo>;</mo> <mi mathvariant="bold-italic">W</mi> <mo stretchy="false">|</mo> <mi>Y</mi> </mrow> <mi>Y</mi> </msubsup> </mrow> </semantics> </math>), expressed in their variance and entropy formulations, are computed as a function of the parameter <span class="html-italic">c</span> for the two simulations.</p>
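In the Gaussian case, the entropy formulation of the conditional transfer terms above (e.g., T_{V→Y|W}) reduces to half the log-ratio of the residual variances of two nested linear regressions, by the known equivalence between transfer entropy and Granger causality. A compact sketch under that assumption (function names are illustrative):

```python
import numpy as np

def _residual_var(target, regressors):
    """Variance of the ordinary least-squares residuals."""
    beta, *_ = np.linalg.lstsq(regressors, target, rcond=None)
    return np.var(target - regressors @ beta)

def transfer_entropy(y, x, k=2, cond=None):
    """T_{X->Y} (optionally conditioned on another series) as half the log
    ratio of restricted-to-full regression residual variances (nats)."""
    n = len(y)
    # lag matrix: columns are s[t-1], ..., s[t-k], aligned with y[k:]
    lags = lambda s: np.column_stack([s[k - j:n - j] for j in range(1, k + 1)])
    restricted = np.column_stack([np.ones(n - k), lags(y)])
    if cond is not None:
        restricted = np.column_stack([restricted, lags(cond)])
    full = np.column_stack([restricted, lags(x)])
    return 0.5 * np.log(_residual_var(y[k:], restricted) /
                        _residual_var(y[k:], full))
```

Because the two regressions are nested, the full model never fits worse than the restricted one, so this estimate is nonnegative by construction.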
Figure 5. <p>Examples of computation of interaction information transfer <math display="inline"> <semantics> <mrow> <msubsup> <mi>I</mi> <mrow> <mi>V</mi> <mo>;</mo> <mi>W</mi> <mo stretchy="false">|</mo> <mi>Y</mi> </mrow> <mi>Y</mi> </msubsup> </mrow> </semantics> </math> for exemplary cases of jointly Gaussian processes <span class="html-italic">V</span>, <span class="html-italic">W</span> (sources) and <span class="html-italic">Y</span> (target): (<b>a</b>–<b>c</b>) uncorrelated sources; (<b>d</b>–<b>f</b>) positively correlated sources. Panels show the logarithmic dependence between variance and entropy measures of conditional information (<b>a</b>,<b>d</b>) and Venn diagrams of the information measures based on variance computation (<b>b</b>,<b>e</b>) and entropy computation (<b>c</b>,<b>f</b>). In (<b>a</b>–<b>c</b>), the variance-based interaction transfer is zero, suggesting no source interaction, while the entropy-based transfer is negative, denoting synergy. In (<b>d</b>–<b>f</b>), the variance-based interaction transfer is positive, suggesting redundancy, while the entropy-based transfer is negative, denoting synergy.</p>
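The contrast illustrated in these panels can be reproduced in a static Gaussian setting: with uncorrelated unit-variance sources, the reductions in partial variance produced by V and W add up exactly to the joint reduction, so the variance-based interaction term is zero, while the corresponding half-log-ratios do not add up, leaving a negative (synergistic) entropy-based term. A sketch under these assumptions (function names are illustrative):

```python
import numpy as np

def cond_var(cov, y, given):
    """Partial variance of variable y given the index list `given` (linear/Gaussian)."""
    if not given:
        return float(cov[y, y])
    Syg = cov[np.ix_([y], given)]
    Sgg = cov[np.ix_(given, given)]
    # variance reduction achieved by the best linear predictor from `given`
    red = (Syg @ np.linalg.solve(Sgg, Syg.T)).item()
    return float(cov[y, y]) - red

def interaction_terms(cov, y=0, v=1, w=2):
    """Variance-based vs entropy-based source interaction about target y."""
    vy = cond_var(cov, y, [])
    v_v, v_w, v_vw = (cond_var(cov, y, g) for g in ([v], [w], [v, w]))
    # variance-based: sum of individual variance reductions minus joint reduction
    var_int = (vy - v_v) + (vy - v_w) - (vy - v_vw)
    # entropy-based: same combination of the corresponding half-log ratios (nats)
    ent_int = 0.5 * (np.log(vy / v_v) + np.log(vy / v_w) - np.log(vy / v_vw))
    return var_int, ent_int
```

With uncorrelated sources the first value is exactly zero and the second is negative; adding a positive source correlation makes the variance-based term positive (redundancy), while the sign of the entropy-based term depends on the full correlation structure, which is the ambiguity the caption highlights.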
Figure 6. <p>Information decomposition of the heart period (process <span class="html-italic">H</span>), analyzed as the target of the physiological network that includes respiration (process <span class="html-italic">R</span>) and systolic pressure (process <span class="html-italic">S</span>) as source processes. Plots depict the values of the (<b>a</b>,<b>d</b>) predictive information decomposition (PID), (<b>b</b>,<b>e</b>) information storage decomposition (ISD) and (<b>c</b>,<b>f</b>) information transfer decomposition (ITD) computed using entropy measures (<b>a</b>–<b>c</b>) and prediction measures (<b>d</b>–<b>f</b>) and expressed as mean + standard deviation over 61 subjects in the resting baseline condition (B), during head-up tilt (T), during recovery in the supine position (R), and during mental arithmetics (M). Statistically significant differences between pairs of distributions are marked with * (T vs. B, M vs. B), with # (T vs. R, M vs. R), and with § (T vs. M).</p>
Figure 7. <p>Information decomposition of systolic pressure (process <span class="html-italic">S</span>), analyzed as the target of the physiological network that includes respiration (process <span class="html-italic">R</span>) and heart period (process <span class="html-italic">H</span>) as source processes. Plots depict the values of the (<b>a</b>,<b>d</b>) predictive information decomposition (PID), (<b>b</b>,<b>e</b>) information storage decomposition (ISD) and (<b>c</b>,<b>f</b>) information transfer decomposition (ITD) computed using entropy measures (<b>a</b>–<b>c</b>) and prediction measures (<b>d</b>–<b>f</b>) and expressed as mean + standard deviation over 61 subjects in the resting baseline condition (B), during head-up tilt (T), during recovery in the supine position (R), and during mental arithmetics (M). Statistically significant differences between pairs of distributions are marked with * (T vs. B, M vs. B), with # (T vs. R, M vs. R), and with § (T vs. M).</p>
Figure 8. <p>Graphical representation of the variance-based (red) and entropy-based (blue) measures of information content (<math display="inline"> <semantics> <mrow> <msub> <mi>H</mi> <mi>H</mi> </msub> </mrow> </semantics> </math>), storage (<math display="inline"> <semantics> <mrow> <msub> <mi>S</mi> <mi>H</mi> </msub> </mrow> </semantics> </math>), transfer (<math display="inline"> <semantics> <mrow> <msub> <mi>T</mi> <mrow> <mi>S</mi> <mo>→</mo> <mi>H</mi> <mo stretchy="false">|</mo> <mi>R</mi> </mrow> </msub> </mrow> </semantics> </math>, <math display="inline"> <semantics> <mrow> <msub> <mi>T</mi> <mrow> <mi>R</mi> <mo>→</mo> <mi>H</mi> <mo stretchy="false">|</mo> <mi>S</mi> </mrow> </msub> </mrow> </semantics> </math>) and new information (<math display="inline"> <semantics> <mrow> <msub> <mi>N</mi> <mi>H</mi> </msub> </mrow> </semantics> </math>) relevant to the information decomposition of the heart period variability during baseline (dark colors) and during tilt (light colors), according to the results of <a href="#entropy-19-00005-f006" class="html-fig">Figure 6</a>. The logarithmic relation explains why opposite variations can be obtained by variance-based measures and entropy-based measures moving from baseline to tilt.</p>
Abstract
1. Introduction
2. Information Decomposition in Multivariate Processes
2.1. Information Measures for Random Variables
2.1.1. Variance-Based and Entropy-Based Measures of Information
2.1.2. Variance-Based and Entropy-Based Measures of Information for Gaussian Variables
2.1.3. Measures Derived from Information and Conditional Information
2.2. Information Measures for Networks of Dynamic Processes
2.2.1. New Information and Predictive Information
2.2.2. Predictive Information Decomposition (PID)
2.2.3. Information Storage Decomposition (ISD)
2.2.4. Information Transfer Decomposition (ITD)
2.2.5. Summary of Information Decomposition
2.3. Computation for Multivariate Gaussian Processes
3. Simulation Study
3.1. Simulated VAR Processes
3.2. Information Decomposition
3.3. Interpretation of Interaction Information
4. Application to Physiological Networks
4.1. Experimental Protocol and Data Analysis
4.2. Results and Discussion
4.2.1. Information Decomposition of Heart Period Variability during Head-Up Tilt
4.2.2. Information Decomposition of Heart Period Variability during Mental Arithmetics
4.2.3. Information Decomposition of Systolic Arterial Pressure Variability during Head-Up Tilt
4.2.4. Information Decomposition of Systolic Arterial Pressure Variability during Mental Arithmetics
4.2.5. Different Profiles of Variance-Based and Entropy-Based Information Measures
5. Summary of Main Findings
- Information decomposition methods are recommended for the analysis of multivariate processes, to dissect the general concepts of predictive information, information storage and information transfer into basic elements of computation that are sensitive to changes in specific network properties;
- The combined evaluation of several information measures is recommended to unambiguously characterize changes in the network across conditions;
- Entropy-based measures are appropriate for the analysis of information transfer thanks to their intrinsic normalization to the complexity of the target dynamics, but are prone to detecting net synergy in the analysis of information modification;
- Variance-based measures are recommended for the analysis of information modification, since they yield zero synergy/redundancy for uncorrelated sources, but can return estimates of information transfer that are biased by changes in the complexity of the target dynamics;
- The physiological stress induced by head-up tilt brings about a decrease of the complexity of the short-term variability of heart period, reflected by higher information storage and internal information, lower cardiorespiratory and higher cardiovascular information transfer, physiologically associated with sympathetic activation and vagal withdrawal;
- Head-up tilt does not alter the information stored in and transferred to systolic arterial pressure variability, but information decompositions reveal an enhancement during tilt of respiratory effects on systolic pressure independent of heart period dynamics;
- The mental stress induced by the arithmetic task does not alter the complexity of heart period variability, but leads to a decrease in the cardiorespiratory information transfer, physiologically associated with vagal withdrawal;
- Mental arithmetics increases the complexity of systolic arterial pressure variability, likely associated with the action of physiological mechanisms unrelated to respiration and heart period variability.
6. Conclusions
Supplementary Materials
Acknowledgments
Author Contributions
Conflicts of Interest
References
© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Faes, L.; Porta, A.; Nollo, G.; Javorka, M. Information Decomposition in Multivariate Systems: Definitions, Implementation and Application to Cardiovascular Networks. Entropy 2017, 19, 5. https://doi.org/10.3390/e19010005