## Abstract

We pose the following optimization: given y = {y(n)}_{n=0}^{N-1} ∈ R^{N}, find a finite-alphabet x̂ = {x̂(n)}_{n=0}^{N-1} ∈ A^{N} that minimizes d(x, y) + g(x) subject to a hard structural (syntactic) constraint on x, e.g., that x is piecewise constant with plateau run-length ≥ M, or locally monotonic of lomo-degree α. Here, d(x, y) = ∑_{n=0}^{N-1} d_{n}(y(n), x(n)) measures fidelity to the data and is known as the noise term, while g(x) = ∑_{n=1}^{N-1} g_{n}(x(n), x(n-1)) measures the smoothness/complexity of the solution. This optimization represents the unification and outgrowth of several digital nonlinear filtering schemes, including, in particular, digital counterparts of Weak Continuity (WC) [6, 7, 2] and Minimum Description Length (MDL) [4] on one hand, and nonlinear regression, e.g., VORCA filtering [11] and Digital Locally Monotonic Regression [10], on the other. It is shown that the proposed optimization admits an efficient Viterbi-type solution and, in terms of performance, combines the best of both worlds.
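The Viterbi-type solution mentioned above can be sketched as a trellis search over the alphabet A: each stage n scores every candidate symbol by its noise term d_n and every transition by its smoothness term g_n, and backtracking recovers the global minimizer. The following is a minimal illustrative sketch, not the paper's implementation: it assumes squared-error fidelity for d_n, an absolute-difference jump penalty weighted by a hypothetical parameter `lam` for g_n, and omits the hard structural constraint (which could be enforced by augmenting each trellis state with, e.g., plateau run-length information).

```python
def viterbi_regression(y, A, lam=1.0):
    """Return x in A^N minimizing sum_n (y[n]-x[n])^2 + lam * sum_n |x[n]-x[n-1]|.

    Illustrative sketch only: d_n is squared error, g_n is a lam-weighted
    jump penalty; the hard structural constraint is not enforced here.
    """
    N, K = len(y), len(A)
    INF = float("inf")
    # cost[k]: best cost of any path ending in symbol A[k] at the current stage
    cost = [(y[0] - a) ** 2 for a in A]
    back = []  # back[n-1][k]: best predecessor state for symbol A[k] at stage n
    for n in range(1, N):
        new_cost, ptr = [INF] * K, [0] * K
        for k, a in enumerate(A):
            d = (y[n] - a) ** 2                      # noise (fidelity) term d_n
            for j, b in enumerate(A):
                c = cost[j] + d + lam * abs(a - b)   # smoothness term g_n
                if c < new_cost[k]:
                    new_cost[k], ptr[k] = c, j
        cost = new_cost
        back.append(ptr)
    # backtrack the minimizing path through the trellis
    k = min(range(K), key=cost.__getitem__)
    x = [A[k]]
    for ptr in reversed(back):
        k = ptr[k]
        x.append(A[k])
    return x[::-1]
```

Each stage costs O(|A|^2) transitions, so the overall search is O(N |A|^2); with `lam = 0` the search reduces to per-sample quantization onto A, while a large `lam` drives the output toward a constant sequence.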

| Original language | English (US) |
| --- | --- |
| Pages | 398-401 |
| Number of pages | 4 |
| State | Published - Jan 1 1996 |
| Event | Proceedings of the 1996 8th IEEE Signal Processing Workshop on Statistical Signal and Array Processing, SSAP'96 - Corfu, Greece. Duration: Jun 24 1996 → Jun 26 1996 |

### Other

| Other | Proceedings of the 1996 8th IEEE Signal Processing Workshop on Statistical Signal and Array Processing, SSAP'96 |
| --- | --- |
| City | Corfu, Greece |
| Period | 6/24/96 → 6/26/96 |