Improved VC-based signal denoising

J. Shao, V. Cherkassky

Research output: Contribution to conference › Paper › peer-review


Abstract

Signal denoising is closely related to function estimation from noisy samples. A similar problem is also addressed in statistics (nonlinear regression) and in neural network learning. Vapnik-Chervonenkis (VC) theory provides a general framework for estimating dependencies from finite samples. This theory emphasizes model complexity control according to the Structural Risk Minimization (SRM) inductive principle, which considers a nested set of models of increasing complexity (called a structure) and then selects the model complexity that minimizes the error on future samples. Cherkassky and Shao [4] recently applied VC theory to signal denoising and estimation. This paper extends the original VC-based signal denoising to practical settings where a (noisy) signal is oversampled. We show that in such settings the analytical VC bounds must be modified for optimal signal denoising. We also present empirical comparisons between the proposed methodology and standard VC-based denoising for univariate signals. These comparisons indicate that the proposed methodology yields superior estimation accuracy and a more compact signal representation for various univariate signals.
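To make the SRM-based denoising scheme described in the abstract concrete, the sketch below illustrates one common way such complexity control is carried out: expand the signal in an orthogonal basis, order the coefficients by magnitude, and pick the number of retained coefficients m that minimizes a VC-penalized empirical risk. The DCT basis, the use of m as the effective VC dimension, and the "practical" VC penalization factor are assumptions for illustration; the paper's modified bound for oversampled signals is not reproduced here.

```python
# Hedged sketch of VC-based signal denoising via SRM-style complexity control.
# Assumptions (not taken from the paper): DCT basis, coefficients ordered by
# magnitude, number of kept coefficients m used as the effective VC dimension,
# and the widely used "practical" VC penalization factor for regression.
import numpy as np
from scipy.fft import dct, idct


def vc_penalty(h, n):
    """Practical VC penalization factor; returns inf when the bound degenerates."""
    p = h / n
    inner = 1.0 - np.sqrt(p - p * np.log(p) + np.log(n) / (2.0 * n))
    return np.inf if inner <= 0 else 1.0 / inner


def vc_denoise(y):
    """Keep the m largest DCT coefficients minimizing the VC-penalized risk."""
    n = len(y)
    coeffs = dct(y, norm="ortho")
    order = np.argsort(np.abs(coeffs))[::-1]  # largest coefficients first
    best_risk, best_m = np.inf, 1
    for m in range(1, n):
        kept = np.zeros_like(coeffs)
        kept[order[:m]] = coeffs[order[:m]]
        emp_risk = np.mean((y - idct(kept, norm="ortho")) ** 2)
        risk = emp_risk * vc_penalty(m, n)
        if risk < best_risk:
            best_risk, best_m = risk, m
    kept = np.zeros_like(coeffs)
    kept[order[:best_m]] = coeffs[order[:best_m]]
    return idct(kept, norm="ortho"), best_m


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 128)
    clean = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sign(np.sin(2 * np.pi * 7 * t))
    noisy = clean + 0.3 * rng.standard_normal(len(t))
    denoised, m = vc_denoise(noisy)
    print(f"kept {m} of {len(t)} coefficients, "
          f"MSE vs. clean signal: {np.mean((denoised - clean) ** 2):.4f}")
```

The key design point is that model selection is done purely from the penalized empirical risk, without a separate validation set; the paper's contribution concerns how the penalization should change when the sampling rate greatly exceeds the signal bandwidth.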

Original language: English (US)
Pages: 2439-2444
Number of pages: 6
State: Published - 2001
Event: International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States
Duration: Jul 15 2001 - Jul 19 2001

Other

Other: International Joint Conference on Neural Networks (IJCNN'01)
Country/Territory: United States
City: Washington, DC
Period: 7/15/01 - 7/19/01
