TY - GEN
T1 - Sparsity-aware estimation of nonlinear Volterra kernels
AU - Kekatos, Vassilis
AU - Angelosante, Daniele
AU - Giannakis, Georgios B.
PY - 2009/12/1
Y1 - 2009/12/1
AB - The Volterra series expansion has well-documented merits for modeling smooth nonlinear systems. Given that nature itself is parsimonious and models with minimal degrees of freedom are attractive from a system identification viewpoint, estimating sparse Volterra models is of paramount importance. Based on input-output data, existing estimators of Volterra kernels are sparsity agnostic because they rely on standard (possibly recursive) least-squares approaches. Instead, the present contribution develops batch and recursive algorithms for estimating sparse Volterra kernels using the least-absolute shrinkage and selection operator (Lasso) along with its recent weighted and online variants. Analysis and simulations demonstrate that weighted (recursive) Lasso has the potential to obviate the "curse of dimensionality," especially in the under-determined case, where the input-output data are fewer than the number of unknowns dictated by the order of the expansion and the memory of the kernels.
UR - http://www.scopus.com/inward/record.url?scp=77951120434&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77951120434&partnerID=8YFLogxK
U2 - 10.1109/CAMSAP.2009.5413323
DO - 10.1109/CAMSAP.2009.5413323
M3 - Conference contribution
AN - SCOPUS:77951120434
SN - 9781424451807
T3 - CAMSAP 2009 - 2009 3rd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing
SP - 129
EP - 132
BT - CAMSAP 2009 - 2009 3rd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing
T2 - 2009 3rd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2009
Y2 - 13 December 2009 through 16 December 2009
ER -
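
As a companion to the abstract above, the following is a minimal sketch of the batch sparsity-aware idea it describes: a truncated second-order Volterra model is linear in its kernel coefficients, so the kernels can be estimated from input-output data with an off-the-shelf Lasso solver. This is not the authors' code; the memory length, model order, scikit-learn solver, regularization weight, and deliberately under-determined sample size are illustrative assumptions.

import numpy as np
from sklearn.linear_model import Lasso

def volterra_regressor(x, n, M):
    # Regressor at time n: constant term, linear taps x[n-m], and
    # quadratic products x[n-i] * x[n-j] for 0 <= i <= j < M.
    taps = x[n - M + 1:n + 1][::-1]               # x[n], x[n-1], ..., x[n-M+1]
    quad = [taps[i] * taps[j] for i in range(M) for j in range(i, M)]
    return np.concatenate(([1.0], taps, quad))

rng = np.random.default_rng(0)
M, N = 8, 30                                      # memory and sample size (assumed values)
x = rng.standard_normal(N)

P = 1 + M + M * (M + 1) // 2                      # 45 unknowns for order 2, memory 8
h_true = np.zeros(P)
h_true[[1, 4, 12]] = [1.0, -0.5, 0.8]             # sparse ground-truth kernel coefficients

Phi = np.array([volterra_regressor(x, n, M) for n in range(M - 1, N)])   # 23 x 45: under-determined
y = Phi @ h_true + 0.01 * rng.standard_normal(Phi.shape[0])

# Batch sparsity-aware estimate via Lasso; alpha is an assumed tuning value.
h_hat = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000).fit(Phi, y).coef_
print("recovered support:", np.flatnonzero(np.abs(h_hat) > 1e-3))

The weighted and recursive variants discussed in the paper would, respectively, place data-dependent weights on the l1 penalty and replace the single batch fit with online updates as new input-output samples arrive.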