Improved VC-based signal denoising

J. Shao, V. Cherkassky

Research output: Contribution to conference › Paper › peer-review

1 Scopus citation


Signal denoising is closely related to function estimation from noisy samples, a problem also addressed in statistics (nonlinear regression) and in neural network learning. Vapnik-Chervonenkis (VC) theory provides a general framework for estimating dependencies from finite samples. The theory emphasizes model complexity control according to the Structural Risk Minimization (SRM) inductive principle, which considers a nested set of models of increasing complexity (called a structure) and then selects the model complexity providing minimum error for future samples. Cherkassky and Shao [4] recently applied VC theory to signal denoising and estimation. This paper extends the original VC-based signal denoising to practical settings where a (noisy) signal is oversampled. We show that in such settings the analytical VC bounds must be modified for optimal signal denoising. We also present empirical comparisons between the proposed methodology and standard VC-based denoising for univariate signals. These comparisons indicate that the proposed methodology yields superior estimation accuracy and a more compact signal representation for various univariate signals.
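The SRM scheme described above can be sketched in code: order orthogonal transform coefficients by magnitude to form a nested structure, then pick the number of retained coefficients that minimizes a VC-penalized empirical risk. This is a minimal illustration, not the paper's exact method: the penalization factor below follows the practical VC bound popularized by Cherkassky and Mulier, and using the number of kept coefficients `m` as the VC-dimension estimate is an assumption made for the sketch.

```python
import numpy as np

def vc_penalty(h, n):
    # Practical VC penalization factor (Cherkassky-Mulier form):
    # multiply empirical risk by 1 / (1 - sqrt(p - p ln p + ln(n)/(2n)))_+
    # where p = h/n. Returns inf when the bound degenerates (h too large).
    p = h / n
    denom = 1.0 - np.sqrt(max(p - p * np.log(p) + np.log(n) / (2 * n), 0.0))
    return np.inf if denom <= 0 else 1.0 / denom

def denoise_vc(y):
    # Denoise a 1-D signal by keeping the m largest DCT coefficients,
    # with m chosen by minimizing the VC-penalized empirical risk.
    # (m is used as a proxy for VC dimension -- an assumption.)
    n = len(y)
    # Orthonormal DCT-II basis, built explicitly to stay self-contained.
    t = np.arange(n)
    B = np.cos(np.pi * np.outer(np.arange(n), 2 * t + 1) / (2 * n)) * np.sqrt(2.0 / n)
    B[0] /= np.sqrt(2.0)
    coeffs = B @ y
    order = np.argsort(-np.abs(coeffs))  # nested structure: largest magnitude first
    best_m, best_risk, best_est = 1, np.inf, y
    for m in range(1, n):
        kept = np.zeros(n)
        kept[order[:m]] = coeffs[order[:m]]
        est = B.T @ kept                 # inverse transform (B is orthonormal)
        r_emp = np.mean((y - est) ** 2)  # empirical risk of this model
        risk = r_emp * vc_penalty(m, n)  # VC-penalized risk estimate
        if risk < best_risk:
            best_m, best_risk, best_est = m, risk, est
    return best_est, best_m

# Usage: a smooth signal corrupted by Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 128)
clean = np.sin(2 * np.pi * 3 * x)
noisy = clean + 0.3 * rng.standard_normal(128)
est, m = denoise_vc(noisy)
print("kept coefficients:", m)
```

Because the penalty grows quickly with m, the selected model typically keeps only a handful of coefficients, which is also what gives the compact signal representation mentioned in the abstract.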

Original language: English (US)
Number of pages: 6
State: Published - Jan 1 2001
Event: International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States
Duration: Jul 15 2001 - Jul 19 2001



