
Interpreting the asymptotic increment of Jeffrey's divergence between some random processes

Published: 01 April 2018

Abstract

In signal and image processing, Jeffrey's divergence (JD) is used in many applications, such as classification and change detection. Previous studies on the JD between ergodic wide-sense stationary (WSS) autoregressive (AR) and/or moving-average (MA) processes state that the asymptotic JD increment, i.e. the difference between the JDs based on k- and (k−1)-dimensional random vectors as k becomes large, tends to a constant value, except for JDs involving a 1st-order MA process whose power spectral density (PSD) is null at one frequency. In this paper, our contribution is threefold. We first propose an interpretation of the asymptotic JD increment for ergodic WSS ARMA processes: it amounts to computing the power of the first process filtered by the inverse filter associated with the second process, and conversely. This explains the atypical cases identified in previous works and generalizes them to any ergodic WSS ARMA process of any order whose PSD is null at one or more frequencies. We then suggest comparing other random processes, such as noisy sums of complex exponentials (NSCE), by using the JD. In this case, the asymptotic JD increment and the convergence speed towards the asymptotic JD are useful to compare the processes. Finally, NSCE processes and pth-order AR processes are compared. The parameters of the processes, especially their powers, have a strong influence on the asymptotic JD increment.
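The filtering interpretation described above can be sketched numerically for two Gaussian zero-mean AR(1) processes. The following is a minimal illustration, not the authors' implementation: the AR coefficients and driving-noise variances (`a1`, `s1`, `a2`, `s2`) are arbitrary example values. The JD between k-dimensional Gaussian vectors is computed from the Toeplitz covariance matrices, and its increment is compared with the cross-filtered powers.

```python
import numpy as np

def ar1_cov(a, s2, k):
    # Toeplitz covariance of a stationary AR(1): x[n] = a*x[n-1] + e[n], var(e) = s2
    idx = np.arange(k)
    return s2 / (1 - a**2) * a ** np.abs(idx[:, None] - idx[None, :])

def jd(S1, S2):
    # Jeffrey's divergence between N(0, S1) and N(0, S2);
    # the log-det terms cancel in the symmetrized sum
    k = S1.shape[0]
    return 0.5 * (np.trace(np.linalg.solve(S2, S1))
                  + np.trace(np.linalg.solve(S1, S2))) - k

a1, s1 = 0.5, 1.0    # first AR(1) process (arbitrary example values)
a2, s2 = -0.3, 2.0   # second AR(1) process

# JD increment between k- and (k-1)-dimensional random vectors
k = 50
inc = (jd(ar1_cov(a1, s1, k), ar1_cov(a2, s2, k))
       - jd(ar1_cov(a1, s1, k - 1), ar1_cov(a2, s2, k - 1)))

# Filtering interpretation: power of process 1 filtered by the inverse
# filter (1 - a2*z^-1) of process 2, normalized by s2, and conversely
p12 = s1 / (1 - a1**2) * (1 + a2**2 - 2 * a1 * a2)
p21 = s2 / (1 - a2**2) * (1 + a1**2 - 2 * a1 * a2)
delta = 0.5 * (p12 / s2 + p21 / s1) - 1

print(inc, delta)  # the two values agree
```

For AR(1) processes the inverse covariance is essentially tridiagonal, so the increment settles to the constant `delta` almost immediately; for the atypical cases discussed in the paper (PSD null at some frequency) the normalizing filtered power diverges and no constant increment exists.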

Highlights

Providing an interpretation of Jeffrey's divergence (JD) for ARMA processes.
Illustrating this interpretation with several cases.
Analyzing the JD between sums of exponentials disturbed by noise.
Comparing an AR process with a noisy sum of exponentials by using the JD.



Published In

Digital Signal Processing, Volume 75, Issue C, April 2018, 266 pages
Publisher: Academic Press, Inc., United States


          Author Tags

          1. Jeffrey's divergence
          2. Kullback–Leibler divergence
          3. Model comparison
          4. ARMA process
          5. Asymptotic analysis
          6. Physical meaning
