Article

A Robust Adaptive Filter for a Complex Hammerstein System

Guobing Qian, Dan Luo and Shiyuan Wang *

1 College of Electronic and Information Engineering, Chongqing Key Laboratory of Nonlinear Circuits and Intelligent Information Processing, Southwest University, Chongqing 400715, China
2 School of Mathematics and Statistics, Southwest University, Chongqing 400715, China
* Authors to whom correspondence should be addressed.
Entropy 2019, 21(2), 162; https://doi.org/10.3390/e21020162
Submission received: 22 December 2018 / Revised: 2 February 2019 / Accepted: 6 February 2019 / Published: 9 February 2019
(This article belongs to the Special Issue Information Theory in Complex Systems)

Abstract

The Hammerstein adaptive filter using the maximum correntropy criterion (MCC) has been shown to be more robust to outliers than filters based on the traditional mean square error (MSE) criterion. However, robust Hammerstein adaptive filters have not yet been reported in the complex domain. In this paper, we therefore extend the robust Hammerstein adaptive filter under MCC to the complex domain and propose the Hammerstein maximum complex correntropy criterion (HMCCC) algorithm, so that the new Hammerstein adaptive filter can directly handle complex-valued data. Additionally, we analyze the stability and steady-state mean square performance of HMCCC. Simulations illustrate that the proposed HMCCC algorithm converges in impulsive noise environments, and achieves higher accuracy and a faster convergence speed than the Hammerstein complex least mean square (HCLMS) algorithm.

1. Introduction

Since algorithms derived from the traditional mean square error (MSE) criterion are sensitive to outliers, they cannot deal with impulsive noise effectively [1,2]. However, impulsive noise commonly exists in practice. To address this problem, a higher-order statistic named correntropy was proposed [3,4]. It has been proven that the maximum correntropy criterion (MCC) algorithm is robust to impulsive noise and clearly outperforms traditional MSE algorithms when the noise is non-Gaussian. Thus, the MCC algorithm [5,6] and its variants [7,8,9,10], such as the generalized maximum correntropy criterion (GMCC) [7], are widely used in practice.
Unlike the well-known Wiener adaptive filter [11], the Hammerstein adaptive filter consists of two parts: a nonlinear memoryless polynomial function and a linear finite impulse response (FIR) filter [12,13,14]. The Hammerstein system has been widely applied in signal processing [15,16,17,18] as well as in other fields [19,20]. Considering that the performance of the Hammerstein adaptive filter under the MSE criterion degrades dramatically in the presence of impulsive noise, Wu et al. applied the MCC to the Hammerstein adaptive filter and developed a robust Hammerstein adaptive filtering algorithm [21]. This adaptive filter is insensitive to outliers and behaves better than traditional Hammerstein adaptive filters, especially in the presence of impulsive noise.
However, Hammerstein adaptive filters under the traditional MSE criterion and the MCC are defined in the field of real numbers and cannot be directly employed to handle complex-valued data. In fact, many signals in practical applications are defined in the complex domain [22,23,24,25]. Thus, in this work, we propose a Hammerstein maximum complex correntropy criterion (HMCCC) algorithm, which extends the Hammerstein adaptive filter under MCC to the complex domain. HMCCC can handle complex-valued data directly while remaining robust to impulsive noise. We analyze the stability and derive the steady-state mean square performance of the HMCCC algorithm. Simulations show that HMCCC is robust to outliers, and achieves higher accuracy and a faster convergence speed than the Hammerstein complex least mean square (HCLMS) algorithm.
The rest of the paper is organized as follows. The complex Hammerstein adaptive filter under the maximum complex correntropy criterion (MCCC) is developed in Section 2. In Section 3, we analyze the stability and derive the steady-state mean square performance of the HMCCC algorithm. In Section 4, several simulations are presented to verify the superior performance of the HMCCC algorithm. Finally, conclusions are drawn in Section 5.

2. Hammerstein Adaptive Filter under the Maximum Complex Correntropy Criterion

2.1. Complex Correntropy

Consider two complex random variables $C_1 = X_1 + jY_1$ and $C_2 = X_2 + jY_2$, where $X_1$, $Y_1$, $X_2$, and $Y_2$ are real random variables. The complex correntropy is defined as [22]:

$$V^c(C_1, C_2) = E\left[\kappa_\sigma^c(C_1 - C_2)\right] \tag{1}$$

where $\kappa_\sigma^c(\cdot)$ denotes the kernel function. A Gaussian kernel is adopted in this paper, expressed as:

$$\kappa_\sigma^c(C_1 - C_2) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{(C_1 - C_2)(C_1 - C_2)^*}{2\sigma^2}\right) \tag{2}$$

where $\sigma$ is the kernel width and $(\cdot)^*$ denotes complex conjugation.
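To make the definition concrete, the expectation in Equation (1) can be replaced by a sample mean. The following NumPy sketch (our illustration; the paper provides no code) estimates the complex correntropy with the Gaussian kernel of Equation (2):

```python
import numpy as np

def complex_gaussian_kernel(diff, sigma):
    # Gaussian kernel of Equation (2); note (C1 - C2)(C1 - C2)^* = |C1 - C2|^2.
    return np.exp(-np.abs(diff) ** 2 / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2)

def complex_correntropy(c1, c2, sigma):
    # Sample estimate of Equation (1): the expectation becomes a mean.
    return np.mean(complex_gaussian_kernel(np.asarray(c1) - np.asarray(c2), sigma))

# Example: correntropy between a complex signal and a noisy copy of it.
rng = np.random.default_rng(0)
c1 = rng.standard_normal(1000) + 1j * rng.standard_normal(1000)
c2 = c1 + 0.1 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
print(complex_correntropy(c1, c2, sigma=1.0))
```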

2.2. Adaptive Filter for Complex Hammerstein System

2.2.1. Cost Function

Consider a complex Hammerstein system. The output of the polynomial nonlinear part is:

$$s(k) = \mathbf{p}^H \mathbf{x}_p(k) \tag{3}$$

where $\mathbf{p} = [p_1, p_2, \ldots, p_M]^T$ denotes the complex polynomial coefficient vector, $M$ is the polynomial order, $\mathbf{x}_p(k) = [x(k), x^2(k), \ldots, x^M(k)]^T$ is the complex polynomial regressor vector, and $(\cdot)^T$ and $(\cdot)^H$ denote the transpose and conjugate transpose, respectively.
The cost function of the complex Hammerstein filtering algorithm under the MCCC is:

$$J_{HMCCC} = E\left[\kappa_\sigma^c(e(k))\right] \tag{4}$$

where $e(k) = d(k) - \mathbf{w}^H \mathbf{s}(k)$ is the error at the $k$-th iteration, $\mathbf{w} = [w_1, w_2, \ldots, w_N]^T$ denotes the estimated weight vector, and $\mathbf{s}(k) = [s(k), s(k-1), \ldots, s(k-N+1)]^T$ is the input vector of the complex finite impulse response (FIR) filter. Here, $d(k) = \mathbf{w}_0^H \mathbf{s}_0(k) + v(k)$ is the desired signal at the $k$-th iteration, with $\mathbf{s}_0(k) = [s_0(k), s_0(k-1), \ldots, s_0(k-N+1)]^T$ and $s_0(k) = \mathbf{p}_0^H \mathbf{x}_p(k)$; $N$ denotes the length of the linear FIR filter; $\mathbf{w}_0$ and $\mathbf{p}_0$ are the unknown system parameters to be estimated, i.e., the optimum solutions for $\mathbf{w}$ and $\mathbf{p}$; and $v(k)$ is the observation noise.
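For readers implementing the model, the sketch below computes the Hammerstein output and the error $e(k)$ entering Equation (4). The helper names polynomial_regressor and hammerstein_output are ours, not the paper's:

```python
import numpy as np

def polynomial_regressor(x, k, M):
    # x_p(k) = [x(k), x^2(k), ..., x^M(k)]^T, as in Equation (3).
    return np.array([x[k] ** m for m in range(1, M + 1)])

def hammerstein_output(x, k, p, w):
    # Build s(k) = [s(k), ..., s(k-N+1)]^T with s(k-n) = p^H x_p(k-n),
    # then form the filter output w^H s(k); requires k >= len(w) - 1.
    N = len(w)
    s = np.array([np.conj(p) @ polynomial_regressor(x, k - n, len(p))
                  for n in range(N)])
    return np.conj(w) @ s

# The error of Equation (4) is then e(k) = d(k) - hammerstein_output(x, k, p, w).
```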

2.2.2. Adaptive Algorithm

Based on Wirtinger calculus [26,27,28], the stochastic gradient of $J_{HMCCC}$ with respect to $\mathbf{p}$ is:

$$\frac{\partial J_{HMCCC}}{\partial \mathbf{p}} = \frac{1}{2\pi\sigma^2}\exp\left[-\frac{|e(k)|^2}{2\sigma^2}\right]\frac{1}{2\sigma^2}\,e(k)\,\mathbf{X}(k)\mathbf{w}(k) \tag{5}$$

and with respect to $\mathbf{w}$:

$$\frac{\partial J_{HMCCC}}{\partial \mathbf{w}} = \frac{1}{2\pi\sigma^2}\exp\left[-\frac{|e(k)|^2}{2\sigma^2}\right]\frac{1}{2\sigma^2}\,e(k)\,\mathbf{X}^T(k)\mathbf{p}(k) \tag{6}$$

where $\mathbf{X}(k) = [\mathbf{x}_p(k), \mathbf{x}_p(k-1), \ldots, \mathbf{x}_p(k-N+1)]$.

The updates for $\mathbf{p}$ and $\mathbf{w}$ are then:

$$\mathbf{p}(k+1) = \mathbf{p}(k) + \eta_p f(e(k))\,\mathbf{X}(k)\mathbf{w}(k) \tag{7}$$

$$\mathbf{w}(k+1) = \mathbf{w}(k) + \eta_w f(e(k))\,\mathbf{X}^T(k)\mathbf{p}(k) \tag{8}$$

where $f(e(k)) = \exp\left[-\frac{|e(k)|^2}{2\sigma^2}\right]e(k)$ and the constant $\frac{1}{2\pi\sigma^2}$ is absorbed into the step-size parameters $\eta_p$ and $\eta_w$.
Finally, Table 1 summarizes the HMCCC algorithm.
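A direct transcription of Table 1 into NumPy might look as follows. This is a minimal sketch, not the authors' code: the initializations are assumptions, and both updates use the iteration-$k$ quantities, as in Equations (7) and (8).

```python
import numpy as np

def hmccc(x, d, M, N, sigma, eta_p, eta_w):
    """Sketch of the HMCCC recursion in Table 1.
    x, d : complex input and desired sequences; M : polynomial order;
    N : FIR length; sigma : kernel width; eta_p, eta_w : step sizes."""
    p = 0.1 * np.ones(M, dtype=complex)   # p(0), assumed initialization
    w = 0.1 * np.ones(N, dtype=complex)   # w(0), assumed initialization
    for k in range(N - 1, len(x)):
        # X(k) = [x_p(k), x_p(k-1), ..., x_p(k-N+1)], an M x N matrix
        X = np.column_stack(
            [[x[k - n] ** m for m in range(1, M + 1)] for n in range(N)])
        s = X.T @ p                                          # step 5: s(k)
        e = d[k] - np.conj(w) @ s                            # step 6: e(k)
        f = np.exp(-np.abs(e) ** 2 / (2 * sigma ** 2)) * e   # step 7: f(e(k))
        p_new = p + eta_p * f * (X @ w)                      # Equation (7)
        w_new = w + eta_w * f * (X.T @ p)                    # Equation (8)
        p, w = p_new, w_new
    return p, w
```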

3. Convergence Analysis

To begin the convergence analysis, the following widely used assumptions are adopted:

(A1) $v(k)$ is independently and identically distributed (i.i.d.), zero-mean, circular, and independent of $e_p(k)$, $e_w(k)$, and $x(k)$;

(A2) Both $\|\mathbf{X}^T(k)\mathbf{p}(k)\|^2$ and $\|\mathbf{X}(k)\mathbf{w}(k)\|^2$ are uncorrelated with $|f(e(k))|^2$ when $k \to \infty$.

Remark 1:
(1) A1 is reasonable in practical cases and is widely used in the theoretical analysis of adaptive filters [21,29,30];
(2) When $k \to \infty$, $\mathbf{p}(k) \to \mathbf{p}_0$ and $\mathbf{w}(k) \to \mathbf{w}_0$. Additionally, $\mathbf{X}(k)$ is independent of $e(k)$ based on A1. Thus, both $\|\mathbf{X}^T(k)\mathbf{p}(k)\|^2$ and $\|\mathbf{X}(k)\mathbf{w}(k)\|^2$ are uncorrelated with $|f(e(k))|^2$, so A2 is also a reasonable assumption.

3.1. Stability Analysis

Since $|e(k)|^2$ is a real-valued function of $[\mathbf{p}^T(k), \mathbf{w}^T(k)]^T$, taking the Taylor series expansion of $|e(k+1)|^2$ at $[\mathbf{p}^T(k), \mathbf{w}^T(k)]^T$ yields:

$$|e(k+1)|^2 = |e(k)|^2 + 2\,\mathrm{Re}\left\{\left.\frac{\partial |e(k)|^2}{\partial \mathbf{p}(k)}\right|_{\mathbf{w}(k)=\mathrm{const}}\Delta\mathbf{p}(k)\right\} + 2\,\mathrm{Re}\left\{\left.\frac{\partial |e(k)|^2}{\partial \mathbf{w}(k)}\right|_{\mathbf{p}(k)=\mathrm{const}}\Delta\mathbf{w}(k)\right\} + \mathrm{h.o.t.} \tag{9}$$

where

$$\frac{\partial |e(k)|^2}{\partial \mathbf{p}(k)} = -\mathbf{X}(k)\mathbf{w}(k)\,e(k) \tag{10}$$

$$\frac{\partial |e(k)|^2}{\partial \mathbf{w}(k)} = -\mathbf{X}^T(k)\mathbf{p}(k)\,e(k) \tag{11}$$

$$\Delta\mathbf{p}(k) = \eta_p f(e(k))\,\mathbf{X}(k)\mathbf{w}(k) \tag{12}$$

$$\Delta\mathbf{w}(k) = \eta_w f(e(k))\,\mathbf{X}^T(k)\mathbf{p}(k) \tag{13}$$

and h.o.t. denotes the higher-order infinitesimal terms.

Then,

$$E\{|e(k+1)|^2\} = E\left\{\left[1 - 2\eta_p \exp\left(-\frac{|e(k)|^2}{2\sigma^2}\right)\|\mathbf{X}(k)\mathbf{w}(k)\|^2 - 2\eta_w \exp\left(-\frac{|e(k)|^2}{2\sigma^2}\right)\|\mathbf{X}^T(k)\mathbf{p}(k)\|^2\right]|e(k)|^2\right\}$$
$$\approx E\left\{1 - 2\eta_p \exp\left(-\frac{|e(k)|^2}{2\sigma^2}\right)\|\mathbf{X}(k)\mathbf{w}(k)\|^2 - 2\eta_w \exp\left(-\frac{|e(k)|^2}{2\sigma^2}\right)\|\mathbf{X}^T(k)\mathbf{p}(k)\|^2\right\}E\{|e(k)|^2\} \tag{14}$$

Thus, the sequence $|e(k)|$ will decrease in the mean sense if

$$E\left\{\left|1 - 2\eta_p \exp\left(-\frac{|e(k)|^2}{2\sigma^2}\right)\|\mathbf{X}(k)\mathbf{w}(k)\|^2 - 2\eta_w \exp\left(-\frac{|e(k)|^2}{2\sigma^2}\right)\|\mathbf{X}^T(k)\mathbf{p}(k)\|^2\right|\right\} \le 1 \tag{15}$$

that is,

$$0 \le E\left\{\left|\eta_p\|\mathbf{X}(k)\mathbf{w}(k)\|^2 + \eta_w\|\mathbf{X}^T(k)\mathbf{p}(k)\|^2\right|\right\} \le \frac{1}{E\left\{\exp\left(-\frac{|e(k)|^2}{2\sigma^2}\right)\right\}} \tag{16}$$

Considering the fact that $\exp\left(-\frac{|e(k)|^2}{2\sigma^2}\right) \le 1$, the sequence $|e(k)|$ will decrease in the mean sense if

$$0 \le E\left[\left|\eta_p\|\mathbf{X}(k)\mathbf{w}(k)\|^2 + \eta_w\|\mathbf{X}^T(k)\mathbf{p}(k)\|^2\right|\right] \le 1 \tag{17}$$

In this case, the HMCCC algorithm converges in the mean sense.
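Although the paper does not prescribe it, condition (17) suggests a simple runtime check: a running average of the quantity below remaining within $[0, 1]$ is a heuristic indicator that the chosen step sizes satisfy the mean-convergence condition. A minimal sketch (our illustration):

```python
import numpy as np

def step_size_metric(X, p, w, eta_p, eta_w):
    # Instantaneous sample of the quantity bounded in Equation (17):
    # eta_p * ||X(k) w(k)||^2 + eta_w * ||X^T(k) p(k)||^2.
    return (eta_p * np.linalg.norm(X @ w) ** 2
            + eta_w * np.linalg.norm(X.T @ p) ** 2)
```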

3.2. Steady-State Excess Mean Square Error

We define $H_p = \lim_{k\to\infty} E[|e_p(k)|^2]$ as the steady-state excess mean square error (EMSE) of the nonlinear part, $H_w = \lim_{k\to\infty} E[|e_w(k)|^2]$ as the steady-state EMSE of the linear filter, and $H_{pw} = \lim_{k\to\infty} E[|e_{pw}(k)|^2]$ as the steady-state EMSE of the whole Hammerstein system, where $e_p(k) = \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}_0 - \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}(k)$, $e_w(k) = \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}_0 - \mathbf{w}^H(k)\mathbf{X}^T(k)\mathbf{p}_0$, and $e_{pw}(k) = \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}_0 - \mathbf{w}^H(k)\mathbf{X}^T(k)\mathbf{p}(k)$.

When the algorithm reaches the steady state, the error of the whole Hammerstein system can be approximately decomposed into two parts:

$$\begin{aligned} e_{pw}(k) &= \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}_0 - \mathbf{w}^H(k)\mathbf{X}^T(k)\mathbf{p}(k)\\ &= \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}_0 - \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}(k) + \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}(k) - \mathbf{w}^H(k)\mathbf{X}^T(k)\mathbf{p}(k)\\ &\approx \mathbf{w}_0^H\mathbf{X}^T(k)\tilde{\mathbf{p}}(k) + \tilde{\mathbf{w}}^H(k)\mathbf{X}^T(k)\mathbf{p}_0\\ &= e_p(k) + e_w(k) \end{aligned} \tag{18}$$

where $\tilde{\mathbf{p}}(k) = \mathbf{p}_0 - \mathbf{p}(k)$ and $\tilde{\mathbf{w}}(k) = \mathbf{w}_0 - \mathbf{w}(k)$.

When only the nonlinear part is considered, we have

$$e_p(k) = \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}_0 - \mathbf{w}_0^H\mathbf{X}^T(k)\mathbf{p}(k) = \mathbf{w}_0^H\mathbf{X}^T(k)\tilde{\mathbf{p}}(k) \tag{19}$$

and

$$\tilde{\mathbf{p}}(k+1) = \tilde{\mathbf{p}}(k) - \eta_p f(e(k))\,\mathbf{X}(k)\mathbf{w}_0 \tag{20}$$

Multiplying each side of Equation (20) by its conjugate transpose and taking the expectation, we obtain:

$$E\{\|\tilde{\mathbf{p}}(k+1)\|^2\} = E\{\|\tilde{\mathbf{p}}(k)\|^2\} - 2\eta_p E\{\mathrm{Re}[e_p^*(k)f(e(k))]\} + \eta_p^2 E\{\|\mathbf{X}(k)\mathbf{w}_0\|^2\,|f(e(k))|^2\} \tag{21}$$

Since $\lim_{k\to\infty} E\{\|\tilde{\mathbf{p}}(k+1)\|^2\} = \lim_{k\to\infty} E\{\|\tilde{\mathbf{p}}(k)\|^2\}$, we further obtain:

$$2\lim_{k\to\infty} E\{\mathrm{Re}[e_p^*(k)f(e(k))]\} = \lim_{k\to\infty} \eta_p E\{\|\mathbf{X}(k)\mathbf{w}_0\|^2\,|f(e(k))|^2\} \tag{22}$$

Based on the results of Equations (38) and (46) in [23], we similarly obtain the following expressions by replacing $\alpha$ and $\lambda$ with $1$ and $1/(2\sigma^2)$, respectively:

$$E\{\mathrm{Re}[e_p^*(k)f(e(k))]\} = H_p\,E\left\{\exp\left(-\frac{|v(k)|^2}{2\sigma^2}\right)\left[1 - \frac{|v(k)|^2}{2\sigma^2}\right]\right\} \tag{23}$$

$$E\{|f(e(k))|^2\} = E\left\{\exp\left(-\frac{|v(k)|^2}{\sigma^2}\right)|v(k)|^2\right\} + H_p R_1 \tag{24}$$

where

$$R_1 = E\left\{\exp\left(-\frac{|v(k)|^2}{\sigma^2}\right)\left[\frac{|v(k)|^4}{\sigma^4} - \frac{3|v(k)|^2}{\sigma^2} + 1\right]\right\} \tag{25}$$

Furthermore, based on the result of Equation (47) in [23], we obtain the EMSE of the nonlinear part by replacing $\mathrm{Tr}(\mathbf{R}_{xx^H})$ (i.e., $E\{\|\mathbf{x}(i)\|^2\}$) with $E\{\|\mathbf{X}(k)\mathbf{w}_0\|^2\}$:

$$H_p = \frac{\eta_p E\{\|\mathbf{X}(k)\mathbf{w}_0\|^2\}\,E\{\exp(-|v(k)|^2/\sigma^2)\,|v(k)|^2\}}{2E\{\exp(-|v(k)|^2/(2\sigma^2))[1 - |v(k)|^2/(2\sigma^2)]\} - \eta_p E\{\|\mathbf{X}(k)\mathbf{w}_0\|^2\}\,R_1} \tag{26}$$

When only the FIR filter is considered, we similarly derive:

$$H_w = \frac{\eta_w E\{\|\mathbf{X}^T(k)\mathbf{p}_0\|^2\}\,E\{\exp(-|v(k)|^2/\sigma^2)\,|v(k)|^2\}}{2E\{\exp(-|v(k)|^2/(2\sigma^2))[1 - |v(k)|^2/(2\sigma^2)]\} - \eta_w E\{\|\mathbf{X}^T(k)\mathbf{p}_0\|^2\}\,R_1} \tag{27}$$

Furthermore, when both the nonlinear part and the FIR filter are considered, we have

$$H_{pw} = \lim_{k\to\infty} E[|e_{pw}(k)|^2] \approx \lim_{k\to\infty} E[|e_p(k) + e_w(k)|^2] = \lim_{k\to\infty} E[|e_p(k)|^2] + \lim_{k\to\infty} E[|e_w(k)|^2] + \lim_{k\to\infty} E[2\,\mathrm{Re}(e_p(k)e_w^*(k))] = H_p + H_w + H_{cross} \tag{28}$$

Remark 2:
$H_{cross} = \lim_{k\to\infty} E[2\,\mathrm{Re}(e_p(k)e_w^*(k))]$ is the cross EMSE of the Hammerstein system, and it equals zero when $e_p(k)$ and $e_w(k)$ are both zero-mean and independent.
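Equations (26) and (27) involve expectations over the noise only, so they can be evaluated by Monte Carlo once the input powers are estimated. The following sketch is our illustration; the function name and interface are assumptions:

```python
import numpy as np

def theoretical_emse(eta, power, v, sigma):
    """Monte Carlo evaluation of Equation (26) or (27).
    eta   : step size (eta_p for (26), eta_w for (27))
    power : E{||X(k) w_0||^2} for (26), or E{||X^T(k) p_0||^2} for (27)
    v     : complex noise samples used to estimate the expectations"""
    a = np.abs(v) ** 2
    num = eta * power * np.mean(np.exp(-a / sigma ** 2) * a)
    r1 = np.mean(np.exp(-a / sigma ** 2)
                 * (a ** 2 / sigma ** 4 - 3 * a / sigma ** 2 + 1))  # Eq. (25)
    den = (2 * np.mean(np.exp(-a / (2 * sigma ** 2))
                       * (1 - a / (2 * sigma ** 2)))
           - eta * power * r1)
    return num / den
```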

4. Simulation

In this section, we provide several simulations to illustrate the superior performance of HMCCC. The weight vector was chosen as $\mathbf{w}_0 = [1+0.6j,\ 0.6+j,\ 0.1+0.2j,\ 0.2+0.1j,\ 0.06+0.04j,\ 0.04+0.06j]^T$ and the complex polynomial coefficient vector as $\mathbf{p}_0 = [1+0.6j,\ 0.6+j]^T$. An additive complex noise $v = v_R + jv_I$ was considered in the simulations, with $v_R$ and $v_I$ being its real and imaginary parts. We compared the performance of HMCCC with that of HCLMS (the extension of HLMS [17] to the complex domain, summarized in Appendix A), and the parameters of both algorithms were chosen by trial to ensure a desirable solution. Simulation results were obtained by averaging 100 Monte Carlo trials. The input signal $x(k)$ was generated by a first-order autoregressive process:

$$x(k) = a\,x(k-1) + \sqrt{1-a^2}\,\xi(k) \tag{29}$$

where $x(k) = x_R(k) + jx_I(k)$, with $x_R(k)$ and $x_I(k)$ being the real and imaginary parts of $x(k)$; $a = 0.95$; $\xi(k) = \xi_R(k) + j\xi_I(k)$, with $\xi_R(k), \xi_I(k) \sim N(0, 1)$; and $N(\mu, \sigma^2)$ denotes the Gaussian distribution with mean $\mu$ and variance $\sigma^2$.
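A sketch of the input generator of Equation (29); the seed handling is our assumption:

```python
import numpy as np

def ar1_input(n, a=0.95, seed=0):
    # Complex first-order autoregressive process of Equation (29),
    # with xi_R(k), xi_I(k) ~ N(0, 1).
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    x = np.zeros(n, dtype=complex)
    for k in range(1, n):
        x[k] = a * x[k - 1] + np.sqrt(1 - a ** 2) * xi[k]
    return x
```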
First, the superiority of HMCCC was verified in a complex alpha-stable noise environment. The noise was generated as $v_R, v_I \sim \sigma_v\,S(\alpha, \beta, \gamma, \delta)$ with $\sigma_v^2 = 0.1$, where $\alpha = 1.2$ is the characteristic factor, $\beta = 0$ is the symmetry parameter, $\gamma = 0.6$ is the dispersion parameter, and $\delta = 0$ is the location parameter. Figure 1 shows the time sequence and histogram of the real and imaginary parts of the complex alpha-stable noise. Note that HCLMS may diverge in this environment; we therefore discarded a trial for HCLMS if $\|\mathbf{w}\|^2 \ge 100$. In the 100 trials, HCLMS diverged twice, while HMCCC never diverged. The performance of the algorithms in terms of the normalized testing mean square error (MSE) is shown in Figure 2, where the testing MSE was obtained from a test set of 100 samples and the divergent trials of HCLMS were omitted. Clearly, HMCCC has better filtering performance than HCLMS in the presence of complex alpha-stable noise.
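The complex alpha-stable noise can be reproduced with SciPy's levy_stable distribution. Note that the mapping between SciPy's (alpha, beta, loc, scale) arguments and the paper's $(\alpha, \beta, \gamma, \delta)$ convention is our assumption; parametrizations of stable laws vary:

```python
import numpy as np
from scipy.stats import levy_stable

def complex_alpha_stable(n, alpha=1.2, beta=0.0, gamma=0.6, delta=0.0,
                         sigma_v=np.sqrt(0.1), seed=0):
    # Independent alpha-stable draws for the real and imaginary parts,
    # scaled by sigma_v as in the paper's setup.
    rng = np.random.default_rng(seed)
    vr = levy_stable.rvs(alpha, beta, loc=delta, scale=gamma, size=n,
                         random_state=rng)
    vi = levy_stable.rvs(alpha, beta, loc=delta, scale=gamma, size=n,
                         random_state=rng)
    return sigma_v * (vr + 1j * vi)
```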
Then, we compared the steady-state testing MSE of HMCCC under different noise parameters. We ran 15,000 iterations to ensure that HMCCC reached the steady state, and calculated the steady-state testing MSE by averaging over the next 1000 iterations. Figure 3 shows the steady-state normalized testing MSEs under different characteristic factors and dispersion parameters, illustrating that HMCCC performs well over a range of alpha-stable noise parameters.
Next, the superiority of HMCCC was verified in a contaminated Gaussian (CG) noise environment, where $v(k) = (1 - c(k))v_1(k) + c(k)v_2(k)$, $v_1(k) = v_{1R}(k) + jv_{1I}(k)$, $v_2(k) = v_{2R}(k) + jv_{2I}(k)$, $v_{1R}, v_{1I} \sim N(0, 0.1)$, and $v_{2R}, v_{2I} \sim N(0, 20)$ represent the outliers, with $P(c(k) = 1) = 0.06$ and $P(c(k) = 0) = 0.94$. Figure 4 shows the time sequence and histogram of the real and imaginary parts of the CG noise. The performance of the algorithms in terms of the normalized testing MSE is shown in Figure 5, where the testing MSE was again obtained from a test set of 100 samples. One can clearly see that HMCCC has better filtering performance than HCLMS in the presence of CG noise.
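A sketch of the CG noise generator (function name and seed handling are ours):

```python
import numpy as np

def cg_noise(n, p_out=0.06, var1=0.1, var2=20.0, seed=0):
    # Contaminated Gaussian noise: v = (1 - c) v1 + c v2, P(c = 1) = p_out,
    # with real and imaginary parts drawn independently with variance
    # var1 (nominal) or var2 (outlier).
    rng = np.random.default_rng(seed)
    c = rng.random(n) < p_out
    std = np.where(c, np.sqrt(var2), np.sqrt(var1))
    return std * rng.standard_normal(n) + 1j * std * rng.standard_normal(n)
```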
Furthermore, we tested the robustness of the HMCCC algorithm to outliers. CG noise was again used, with $v_{1R}, v_{1I} \sim N(0, 0.1)$, $P(c(k) = 1) = p$, $v_{2R}, v_{2I} \sim N(0, \sigma_B^2)$, and $P(c(k) = 0) = 1 - p$. Figure 6 depicts the steady-state normalized testing MSE of HMCCC for different outlier probabilities $p$ (with $\sigma_B^2 = 20$) and outlier variances $\sigma_B^2$ (with $p = 0.06$), where 15,000 iterations were run to ensure that HMCCC reached the steady state and the steady-state normalized testing MSEs were averaged over the next 1000 iterations. One can observe that the proposed HMCCC algorithm is robust to outliers and behaves well under different $p$ and $\sigma_B^2$. Moreover, HMCCC attains a slightly smaller steady-state testing MSE for larger $\sigma_B^2$, which is somewhat surprising but consistent with Chen's work [9]; the reason is that the convergence rates differ slightly under different $\sigma_B^2$, even with the same learning rate.
Afterward, we investigated the influence of the kernel width $\sigma$ on the performance of HMCCC. CG noise was again employed, with $v_{1R}, v_{1I} \sim N(0, 0.1)$, $P(c(k) = 1) = 0.06$, $v_{2R}, v_{2I} \sim N(0, 20)$, and $P(c(k) = 0) = 0.94$. Figure 7 presents the normalized testing MSE of HMCCC for three different kernel widths. The kernel width plays a vital role in both the learning rate and the steady-state value of HMCCC: with a small kernel width, HMCCC converges slowly but achieves a small steady-state value; conversely, with a large kernel width, HMCCC converges quickly but reaches a higher steady-state value.
Finally, we compared the simulated steady-state testing MSEs with the theoretical ones. Gaussian noise was used in this simulation, where $v(k) = v_R(k) + jv_I(k)$ and $v_R, v_I \sim N(0, \sigma_v^2)$. Note that the testing MSEs were not normalized here and were obtained from a test set of 1000 samples. The theoretical values for the nonlinear part and the FIR part were calculated by Equations (26) and (27), respectively. Figure 8 shows the simulated and theoretical steady-state testing MSEs under different $\sigma_v^2$, where 40,000 iterations were run to ensure that the algorithm reached the steady state and the steady-state testing MSEs were averaged over the next 1000 iterations. The simulated values closely match the theoretical ones for the nonlinear and FIR parts. Moreover, there is a small gap between the simulated whole-system MSE and the sum of the theoretical nonlinear and FIR parts, which corresponds to the cross EMSE.

5. Conclusions

Since the Hammerstein adaptive filter under the maximum correntropy criterion (MCC) could previously only deal with real-valued data, in this paper we extended it to the complex domain and developed a new algorithm, the Hammerstein maximum complex correntropy criterion (HMCCC) algorithm. We also analyzed the stability of HMCCC and derived theoretical results for its steady-state performance. The simulations illustrated that HMCCC is always convergent and performs better than the traditional Hammerstein complex LMS (HCLMS) algorithm in the presence of impulsive noise. Additionally, the kernel width has an important impact on the performance of HMCCC, and the algorithm behaves well under different probabilities and variances of outliers.

Author Contributions

Conceptualization, G.Q., D.L., and S.W.; methodology, G.Q. and S.W.; software, D.L.; validation, G.Q.; formal analysis, G.Q.; investigation, G.Q., D.L., and S.W.; resources, G.Q.; data curation, D.L.; writing (original draft preparation), G.Q. and D.L.; writing (review and editing), S.W.; visualization, D.L.; supervision, G.Q., D.L., and S.W.; project administration, G.Q.; funding acquisition, G.Q.

Funding

This research was funded by the China Postdoctoral Science Foundation Funded Project under grant 2017M610583, and the Fundamental Research Funds for the Central Universities under grant XDJK2018C020.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

A summary of the HCLMS algorithm:

Input: $\eta_p$, $\eta_w$, $M$, $N$, $d(k)$, $x(k)$
1. Initialize $\mathbf{p}(0)$, $\mathbf{w}(0)$.
2. While $\{d(k), x(k)\}$ available, do
3.  $\mathbf{x}_p(k) = [x(k), x^2(k), \ldots, x^M(k)]^T$
4.  $\mathbf{X}(k) = [\mathbf{x}_p(k), \mathbf{x}_p(k-1), \ldots, \mathbf{x}_p(k-N+1)]$
5.  $\mathbf{s}(k) = \mathbf{X}^T(k)\mathbf{p}(k)$
6.  $e(k) = d(k) - \mathbf{w}^H(k)\mathbf{s}(k)$
7.  $\mathbf{p}(k+1) = \mathbf{p}(k) + \eta_p e(k)\,\mathbf{X}(k)\mathbf{w}(k)$
8.  $\mathbf{w}(k+1) = \mathbf{w}(k) + \eta_w e(k)\,\mathbf{X}^T(k)\mathbf{p}(k)$
9. End while
10. $\hat{\mathbf{w}}_0 = \mathbf{w}(k+1)$, $\hat{\mathbf{p}}_0 = \mathbf{p}(k+1)$
Output: Estimated polynomial coefficient vector $\hat{\mathbf{p}}_0$ and filter weight vector $\hat{\mathbf{w}}_0$.
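In code, HCLMS differs from the HMCCC sketch in Section 2 only in the error weighting: the Gaussian factor $\exp[-|e(k)|^2/(2\sigma^2)]$ is dropped, i.e., $f(e(k)) = e(k)$. A minimal sketch, with the same assumed initializations as before:

```python
import numpy as np

def hclms(x, d, M, N, eta_p, eta_w):
    # Same structure as the HMCCC sketch, with f(e(k)) = e(k).
    p = 0.1 * np.ones(M, dtype=complex)
    w = 0.1 * np.ones(N, dtype=complex)
    for k in range(N - 1, len(x)):
        X = np.column_stack(
            [[x[k - n] ** m for m in range(1, M + 1)] for n in range(N)])
        e = d[k] - np.conj(w) @ (X.T @ p)
        # Both updates use the iteration-k quantities (steps 7-8).
        p, w = p + eta_p * e * (X @ w), w + eta_w * e * (X.T @ p)
    return p, w
```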

References

  1. Principe, J.C. Information Theoretic Learning: Renyi’s Entropy and Kernel Perspectives; Springer: New York, NY, USA, 2010. [Google Scholar]
  2. Chen, B.; Zhu, Y.; Hu, J.; Príncipe, J.C. System Parameter Identification: Information Criteria and Algorithms; Newnes: Oxford, UK, 2013. [Google Scholar]
  3. Liu, W.; Pokharel, P.P.; Príncipe, J. Correntropy: A localized similarity measure. In Proceedings of the 2006 IEEE International Joint Conference on Neural Network (IJCNN), Vancouver, BC, Canada, 16–21 July 2006; pp. 4919–4924. [Google Scholar]
  4. Liu, W.; Pokharel, P.P.; Príncipe, J. Correntropy: Properties and applications in non-Gaussian signal processing. IEEE Trans. Signal Process. 2007, 55, 5286–5298. [Google Scholar] [CrossRef]
  5. Singh, A.; Príncipe, J.C. Using correntropy as a cost function in linear adaptive filters. In Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN), Atlanta, GA, USA, 14–19 June 2009; pp. 2950–2955. [Google Scholar]
  6. Singh, A.; Príncipe, J.C. A loss function for classification based on a robust similarity metric. In Proceedings of the 2010 International Joint Conference on Neural Networks (IJCNN), Barcelona, Spain, 18–23 July 2010; pp. 1–6. [Google Scholar]
  7. Chen, B.; Xing, L.; Zhao, H.; Zheng, N.; Príncipe, J.C. Generalized correntropy for robust adaptive filtering. IEEE Trans. Signal Process. 2016, 64, 3376–3387. [Google Scholar] [CrossRef]
  8. Zhao, J.; Zhang, H. Kernel recursive generalized maximum correntropy. IEEE Signal Process. Lett. 2017, 24, 1832–1836. [Google Scholar] [CrossRef]
  9. Chen, B.; Xing, L.; Xu, B.; Zhao, H.; Zheng, N.; Príncipe, J.C. Kernel Risk-Sensitive Loss: Definition, Properties and Application to Robust Adaptive Filtering. IEEE Trans. Signal Process. 2017, 65, 2888–2901. [Google Scholar] [CrossRef]
  10. Wang, S.; Dang, L.; Wang, W.; Qian, G.; Tse, C.K. Kernel Adaptive Filters With Feedback Based on Maximum Correntropy. IEEE Access. 2018, 6, 10540–10552. [Google Scholar] [CrossRef]
  11. Diniz, P.S.R. Adaptive Filtering: Algorithms and Practical Implementation, 4th ed.; Springer: New York, NY, USA, 2013. [Google Scholar]
  12. Bai, E.W.; Li, D. Convergence of the iterative Hammerstein system identification algorithm. IEEE Trans. Autom. Control. 2004, 49, 1929–1940. [Google Scholar] [CrossRef]
  13. Jeraj, J.; Mathews, V.J. Stochastic mean-square performance analysis of an adaptive Hammerstein filter. IEEE Trans. Signal Process. 2006, 54, 2168–2177. [Google Scholar] [CrossRef]
  14. Ding, F.; Liu, X.P.; Liu, G. Identification methods for Hammerstein nonlinear systems. Digit. Signal Process. 2011, 21, 215–238. [Google Scholar] [CrossRef]
  15. Scarpiniti, M.; Comminiello, D.; Parisi, R.; Uncini, A. Comparison of Hammerstein and Wiener systems for nonlinear acoustic echo cancelers in reverberant environments. In Proceedings of the 17th International Conference on Digital Signal Processing (DSP), Corfu, Greece, 6–8 July 2011; pp. 1–6. [Google Scholar]
  16. Stenger, A.; Kellermann, W. Adaptation of a memoryless preprocessor for nonlinear acoustic echo cancelling. Signal Process. 2000, 80, 1747–1760. [Google Scholar] [CrossRef]
  17. Scarpiniti, M.; Comminiello, D.; Parisi, R.; Uncini, A. Hammerstein uniform cubic spline adaptive filters: Learning and convergence properties. Signal Process. 2014, 100, 112–123. [Google Scholar] [CrossRef]
  18. Dang, L.; Wang, W.; Sun, Q.; Long, Z.; Qian, G.; Wang, S. Hammerstein adaptive filter with single feedback under minimum mean square error. In Proceedings of the 20th International Conference on Information Fusion (Fusion), Xi’an, China, 10–13 July 2017; pp. 1–7. [Google Scholar]
  19. Hunt, K.J.; Munih, M.; Donaldson, N.N.; Barr, F.M.D. Investigation of the Hammerstein hypothesis in the modeling of electrically stimulated muscle. IEEE Trans. Biomed. Eng. 1998, 45, 998–1009. [Google Scholar] [CrossRef] [PubMed]
  20. Su, S.W.; Wang, L.; Celler, B.G.; Savkin, A.V.; Guo, Y. Identification and control of heart rate regulation during treadmill exercise. IEEE Trans. Biomed. Eng. 2007, 54, 1238–1246. [Google Scholar] [CrossRef] [PubMed]
  21. Wu, Z.; Peng, S.; Chen, B.; Zhao, H. Robust Hammerstein adaptive filtering under maximum correntropy criterion. Entropy 2015, 17, 7149–7166. [Google Scholar] [CrossRef]
  22. Guimaraes, J.P.F.; Fontes, A.I.R.; Rego, J.B.A.; Martins, A.M.; Príncipe, J.C. Complex correntropy: Probabilistic interpretation and application to complex-valued data. IEEE Signal Process. Lett. 2017, 24, 42–45. [Google Scholar] [CrossRef]
  23. Qian, G.; Wang, S. Generalized Complex Correntropy: Application to Adaptive Filtering of Complex Data. IEEE Access. 2018, 6, 19113–19120. [Google Scholar] [CrossRef]
  24. Guimaraes, J.P.F.; Fontes, A.I.R.; Rego, J.B.A.; Martins, A.M.; Príncipe, J.C. Complex Correntropy Function: Properties, and application to a channel equalization problem. Expert Syst. Appl. 2018, 107, 173–181. [Google Scholar] [CrossRef]
  25. Qian, G.; Luo, D.; Wang, S. Recursive Minimum Complex Kernel Risk Sensitive Loss Algorithm. Entropy 2018, 20, 902. [Google Scholar] [CrossRef]
  26. Wirtinger, W. Zur formalen Theorie der Funktionen von mehr komplexen Veränderlichen. Math. Ann. 1927, 97, 357–375. (In German) [Google Scholar] [CrossRef]
  27. Bouboulis, P.; Theodoridis, S. Extension of Wirtinger’s calculus to reproducing Kernel Hilbert spaces and the complex kernel LMS. IEEE Trans. Signal Process. 2011, 59, 964–978. [Google Scholar] [CrossRef]
  28. Zhang, X. Matrix Analysis and Application, 2nd ed.; Tsinghua University Press: Beijing, China, 2013. [Google Scholar]
  29. Chen, B.; Xing, L.; Liang, J.; Zheng, N.; Príncipe, J.C. Steady-state mean-square error analysis for adaptive filtering under the maximum correntropy criterion. IEEE Signal Process. Lett. 2014, 21, 880–884. [Google Scholar]
  30. Picinbono, B. On circularity. IEEE Trans. Signal Process. 1994, 42, 3473–3482. [Google Scholar] [CrossRef]
Figure 1. Time sequence and histogram of the complex alpha-stable noise. (a) Time sequence; (b) histogram.
Figure 2. Learning curves of different algorithms. HCLMS: Hammerstein complex least mean square; HMCCC: Hammerstein maximum complex correntropy criterion.
Figure 3. Steady-state testing mean square errors (MSEs) under different characteristic factors and dispersion parameters.
Figure 4. Time sequence and histogram of the contaminated Gaussian (CG) noise. (a) Time sequence; (b) histogram.
Figure 5. Learning curves of different algorithms.
Figure 6. Influence of the probability and variance of outliers.
Figure 7. Influence of $\sigma$.
Figure 8. Steady-state testing MSEs with different noise variances ($\sigma = 5$).
Table 1. The Hammerstein maximum complex correntropy criterion (HMCCC) algorithm.

Input: $\sigma$, $\eta_p$, $\eta_w$, $M$, $N$, $d(k)$, $x(k)$
1. Initialize $\mathbf{p}(0)$, $\mathbf{w}(0)$.
2. While $\{d(k), x(k)\}$ available, do
3.  $\mathbf{x}_p(k) = [x(k), x^2(k), \ldots, x^M(k)]^T$
4.  $\mathbf{X}(k) = [\mathbf{x}_p(k), \mathbf{x}_p(k-1), \ldots, \mathbf{x}_p(k-N+1)]$
5.  $\mathbf{s}(k) = \mathbf{X}^T(k)\mathbf{p}(k)$
6.  $e(k) = d(k) - \mathbf{w}^H(k)\mathbf{s}(k)$
7.  $f(e(k)) = \exp\left[-|e(k)|^2/(2\sigma^2)\right]e(k)$
8.  $\mathbf{p}(k+1) = \mathbf{p}(k) + \eta_p f(e(k))\,\mathbf{X}(k)\mathbf{w}(k)$
9.  $\mathbf{w}(k+1) = \mathbf{w}(k) + \eta_w f(e(k))\,\mathbf{X}^T(k)\mathbf{p}(k)$
10. End while
11. $\hat{\mathbf{w}}_0 = \mathbf{w}(k+1)$, $\hat{\mathbf{p}}_0 = \mathbf{p}(k+1)$
Output: Estimated polynomial coefficient vector $\hat{\mathbf{p}}_0$ and filter weight vector $\hat{\mathbf{w}}_0$.
