Abstract
Designing a non-ideal delay line (DL) with phase distortion for a transmitted-reference ultra-wideband system with an autocorrelation receiver is a significant technical challenge. In contrast to current empirical DL design methods, a semi-analytic approach is proposed that applies a Gaussian approximation to the expression for the conditional bit error rate (BER), based on an investigation of the average-BER degradation caused by the group delay ripple range (GDRR) over independent Nakagami-m fading channels. This GDRR-based design method directly evaluates the effect of phase distortion on system performance and determines the acceptable distortion level, trading off BER performance against system complexity. © 2011 The Institution of Engineering and Technology.
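As a rough illustration of the semi-analytic idea (not the paper's actual derivation), the sketch below approximates the conditional BER with a Gaussian Q-function whose effective SNR is reduced by a hypothetical GDRR-dependent penalty, and then averages over Nakagami-m fading by drawing channel power gains from the equivalent Gamma distribution. The penalty model `gdrr_penalty`, the antipodal-signalling BER form, and all parameter values are illustrative assumptions, not the expressions derived in the paper.

```python
import numpy as np
from scipy.special import erfc


def q_func(x):
    """Gaussian Q-function, Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / np.sqrt(2.0))


def gdrr_penalty(gdrr_ns, alpha=0.5):
    """HYPOTHETICAL penalty model: fraction of correlator energy retained
    as the group delay ripple range (in ns) grows. The true dependence is
    derived analytically in the paper; this exponential form is an
    illustrative stand-in only."""
    return np.exp(-alpha * gdrr_ns)


def avg_ber_nakagami(snr_db, gdrr_ns, m=2.0, omega=1.0,
                     n_trials=200_000, seed=None):
    """Monte Carlo average of the Gaussian-approximated conditional BER
    over independent Nakagami-m fading.

    The instantaneous power gain of a Nakagami-m channel follows a
    Gamma(m, omega/m) distribution, so gains are drawn from it directly.
    """
    rng = np.random.default_rng(seed)
    snr_lin = 10.0 ** (snr_db / 10.0)
    gains = rng.gamma(shape=m, scale=omega / m, size=n_trials)
    # Effective SNR, degraded by the (assumed) GDRR penalty factor.
    eff_snr = snr_lin * gains * gdrr_penalty(gdrr_ns)
    # Gaussian approximation of the conditional BER (antipodal signalling).
    cond_ber = q_func(np.sqrt(2.0 * eff_snr))
    return cond_ber.mean()


if __name__ == "__main__":
    # Larger ripple range -> larger penalty -> worse average BER.
    for gdrr in (0.0, 0.5, 1.0):
        ber = avg_ber_nakagami(snr_db=10.0, gdrr_ns=gdrr, m=2.0, seed=1)
        print(f"GDRR = {gdrr:.1f} ns -> average BER ~ {ber:.3e}")
```

Sweeping `gdrr_ns` in a loop like this mirrors how a GDRR-based design rule could trade off an acceptable BER degradation against DL complexity, under the stated assumptions.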
| Original language | English |
|---|---|
| Pages (from-to) | 2578-2585 |
| Journal | IET Communications |
| Volume | 5 |
| Issue number | 17 |
| DOIs | |
| Publication status | Published - 25 Nov 2011 |