TY - JOUR

T1 - Minimum mean-squared error estimation of stochastic processes by mutual entropy

AU - Wu, Bing-Fei

PY - 1996/1/1

Y1 - 1996/1/1

N2 - An upper bound on the correlation coefficients in terms of the mutual entropy is developed for a general estimation problem. This upper bound can be attained by means of a nonlinear transformation such that, after the transformation, the processes are jointly Gaussian. To minimize the mean-squared estimation error globally, we use an approach based on the calculus of variations to find the vector nonlinear functions, whose elements turn out to be the eigenfunctions of two vector integral operators that can be solved concurrently from two vector integral equations. The relationship between the minimum mean-squared error (MMSE) and the mutual entropy is discussed. Since the MMSE is linear in the number of experimental data, we consider the mutual entropy rate, which is the average of the mutual entropy, to evaluate the average MMSE. This rate is related to the average MMSE, and it is shown that a mutual entropy rate of 0.5 is the critical threshold for the MMSE problem. Moreover, given a correlation coefficient, ergodic and jointly Gaussian signals can be generated easily by computer. An approach to generating these signals is also presented.

AB - An upper bound on the correlation coefficients in terms of the mutual entropy is developed for a general estimation problem. This upper bound can be attained by means of a nonlinear transformation such that, after the transformation, the processes are jointly Gaussian. To minimize the mean-squared estimation error globally, we use an approach based on the calculus of variations to find the vector nonlinear functions, whose elements turn out to be the eigenfunctions of two vector integral operators that can be solved concurrently from two vector integral equations. The relationship between the minimum mean-squared error (MMSE) and the mutual entropy is discussed. Since the MMSE is linear in the number of experimental data, we consider the mutual entropy rate, which is the average of the mutual entropy, to evaluate the average MMSE. This rate is related to the average MMSE, and it is shown that a mutual entropy rate of 0.5 is the critical threshold for the MMSE problem. Moreover, given a correlation coefficient, ergodic and jointly Gaussian signals can be generated easily by computer. An approach to generating these signals is also presented.

UR - http://www.scopus.com/inward/record.url?scp=0030384675&partnerID=8YFLogxK

U2 - 10.1080/00207729608929345

DO - 10.1080/00207729608929345

M3 - Article

AN - SCOPUS:0030384675

VL - 27

SP - 1391

EP - 1402

JO - International Journal of Systems Science

JF - International Journal of Systems Science

SN - 0020-7721

IS - 12

ER -