### Abstract

The recursive least square lattice (LSL) algorithm based on the newly developed scaled tangent rotations (STAR) is derived. Like other recursive least square lattice algorithms for adaptive filtering, this algorithm requires only O(N) operations. It also preserves the desired properties of the STAR recursive least square (STAR-RLS) algorithm; in particular, it can be pipelined at a fine-grain level. To this end, a pipelined version of STAR-LSL (referred to as PSTAR-LSL) is also developed. Computer simulations show that the performance of the STAR-LSL algorithm is as good as that of the QRD-LSL algorithm. The finite-precision error properties of the STAR-LSL algorithm are also analyzed. The mean square error expressions show that the numerical error propagates from stage to stage in the lattice, and that the numerical error of different quantities in the algorithm varies differently with the forgetting factor λ. This suggests that different word lengths should be assigned to different variables in the algorithm for best performance. Finally, finite word length simulations are carried out to compare the performances of different topologies.
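The O(N) cost cited in the abstract comes from the lattice structure itself: each of the N stages performs a constant number of operations per sample. As a rough orientation only, the sketch below implements a *generic* gradient adaptive lattice with an exponentially weighted power estimate, not the paper's STAR-LSL recursions; all names and parameter values (`lam`, `mu`, `gal_filter`) are illustrative assumptions.

```python
import numpy as np

def gal_filter(x, n_stages=2, lam=0.99, mu=0.05):
    """Generic gradient adaptive lattice (illustrative sketch, NOT the
    paper's STAR-LSL).  Each stage does O(1) work per sample, so the whole
    filter costs O(n_stages) per sample -- the O(N) property the abstract cites."""
    k = np.zeros(n_stages)      # reflection coefficients, one per stage
    E = np.ones(n_stages)       # exponentially weighted power estimates
    b_del = np.zeros(n_stages)  # delayed backward errors b_{m-1}(n-1)
    f_last = np.zeros_like(x)
    for n, xn in enumerate(x):
        f = b = xn              # stage 0: both prediction errors equal the input
        for m in range(n_stages):
            f_out = f - k[m] * b_del[m]   # forward prediction error
            b_out = b_del[m] - k[m] * f   # backward prediction error
            # lam is the forgetting factor (the abstract's lambda): it sets the
            # memory of the power estimate and thus how quantization noise
            # accumulates differently in each recursively updated quantity
            E[m] = lam * E[m] + (1 - lam) * (f**2 + b_del[m]**2)
            k[m] = np.clip(k[m] + (mu / E[m]) * (f_out * b_del[m] + b_out * f),
                           -0.999, 0.999)
            b_del[m] = b        # becomes b_{m-1}(n-1) at the next sample
            f, b = f_out, b_out
        f_last[n] = f
    return k, f_last

# Demo on an AR(1) signal: the first reflection coefficient settles near the
# AR coefficient, and the lattice whitens (reduces the power of) the input.
rng = np.random.default_rng(0)
w = rng.standard_normal(20000)
x = np.zeros_like(w)
for n in range(1, len(x)):
    x[n] = 0.9 * x[n - 1] + w[n]
k, f = gal_filter(x, n_stages=2)
```

The actual STAR-LSL replaces the Givens rotations of QRD-LSL with scaled tangent rotations to enable fine-grain pipelining; those recursions are given in the paper itself.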

| Original language | English (US) |
|---|---|
| Pages (from-to) | 1040-1054 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing |
| Volume | 44 |
| Issue number | 12 |
| DOIs | 10.1109/82.644588 |
| State | Published - Dec 1 1997 |

### Keywords

- Finite word length analysis
- Givens rotation
- High-speed
- Lattice structures
- Low-power
- Pipelining
- RLS adaptive filtering
- STAR rotation

### Cite this

**STAR recursive least square lattice adaptive filters.** / Li, Yuet; Parhi, Keshab K.

Research output: Contribution to journal › Article

*IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing*, vol. 44, no. 12, pp. 1040-1054. https://doi.org/10.1109/82.644588


TY - JOUR

T1 - STAR recursive least square lattice adaptive filters

AU - Li, Yuet

AU - Parhi, Keshab K.

PY - 1997/12/1

Y1 - 1997/12/1

N2 - The recursive least square lattice (LSL) algorithm based on the newly developed scaled tangent rotations (STAR) is derived. Like other recursive least square lattice algorithms for adaptive filtering, this algorithm requires only O(N) operations. It also preserves the desired properties of the STAR recursive least square (STAR-RLS) algorithm; in particular, it can be pipelined at a fine-grain level. To this end, a pipelined version of STAR-LSL (referred to as PSTAR-LSL) is also developed. Computer simulations show that the performance of the STAR-LSL algorithm is as good as that of the QRD-LSL algorithm. The finite-precision error properties of the STAR-LSL algorithm are also analyzed. The mean square error expressions show that the numerical error propagates from stage to stage in the lattice, and that the numerical error of different quantities in the algorithm varies differently with the forgetting factor λ. This suggests that different word lengths should be assigned to different variables in the algorithm for best performance. Finally, finite word length simulations are carried out to compare the performances of different topologies.

AB - The recursive least square lattice (LSL) algorithm based on the newly developed scaled tangent rotations (STAR) is derived. Like other recursive least square lattice algorithms for adaptive filtering, this algorithm requires only O(N) operations. It also preserves the desired properties of the STAR recursive least square (STAR-RLS) algorithm; in particular, it can be pipelined at a fine-grain level. To this end, a pipelined version of STAR-LSL (referred to as PSTAR-LSL) is also developed. Computer simulations show that the performance of the STAR-LSL algorithm is as good as that of the QRD-LSL algorithm. The finite-precision error properties of the STAR-LSL algorithm are also analyzed. The mean square error expressions show that the numerical error propagates from stage to stage in the lattice, and that the numerical error of different quantities in the algorithm varies differently with the forgetting factor λ. This suggests that different word lengths should be assigned to different variables in the algorithm for best performance. Finally, finite word length simulations are carried out to compare the performances of different topologies.

KW - Finite word length analysis

KW - Givens rotation

KW - High-speed

KW - Lattice structures

KW - Low-power

KW - Pipelining

KW - RLS adaptive filtering

KW - STAR rotation

UR - http://www.scopus.com/inward/record.url?scp=0031378163&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0031378163&partnerID=8YFLogxK

U2 - 10.1109/82.644588

DO - 10.1109/82.644588

M3 - Article

VL - 44

SP - 1040

EP - 1054

JO - IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing

JF - IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing

SN - 1057-7130

IS - 12

ER -