Arthur D. Spaulding

Abstract: Since the normally assumed white Gaussian interference is the most destructive in terms of minimizing channel capacity, substantial improvement can usually be obtained if the real-world (non-Gaussian) interference environment is properly taken into account. In this report, the performance of the locally optimum Bayes detector (LOBD) is compared with the performance of various ad hoc nonlinear detection schemes. The theoretical results may be misleading due to the assumptions that are required in order to derive them analytically. For a particular type of broadband impulsive noise, the critical assumptions of "sufficiently" small signal level and a large number of samples (large time-bandwidth product, so that the Central Limit Theorem applies) are removed; the first analytically, and the second by computer simulation. The performance characteristics thus derived are then compared, especially as the signal level increases. One result is that there are situations where the bandpass limiter outperforms the LOBD as the signal level increases; that is, the locally optimum detector may not remain "near optimum" in actual operational situations.
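The sketch below is a minimal Monte Carlo illustration of the kind of comparison the abstract describes, not the report's original simulation. It detects an assumed weak known signal in an epsilon-mixture model of broadband impulsive noise and compares three detectors: a linear correlator, a locally optimum nonlinearity g(x) = -f'(x)/f(x) derived from the mixture density, and a simple hard limiter standing in for the bandpass limiter. All parameter values (mixture fraction, variance ratio, signal level, false-alarm rate) are assumptions chosen only for illustration.

```python
"""Illustrative sketch: weak-signal detection in impulsive (epsilon-mixture) noise."""
import numpy as np

rng = np.random.default_rng(0)

# Assumed epsilon-mixture noise model: mostly background Gaussian samples,
# with an occasional much larger-variance "impulsive" component.
EPS, SIG_BG, SIG_IMP = 0.05, 1.0, 10.0

def impulsive_noise(size):
    impulses = rng.random(size) < EPS
    sigma = np.where(impulses, SIG_IMP, SIG_BG)
    return rng.normal(0.0, sigma)

def gauss(x, s):
    return np.exp(-0.5 * (x / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def lo_nonlinearity(x):
    # Locally optimum memoryless nonlinearity g(x) = -f'(x)/f(x)
    # for the assumed mixture density f.
    f = (1 - EPS) * gauss(x, SIG_BG) + EPS * gauss(x, SIG_IMP)
    fp = -(1 - EPS) * gauss(x, SIG_BG) * x / SIG_BG**2 \
         - EPS * gauss(x, SIG_IMP) * x / SIG_IMP**2
    return -fp / f

def detection_prob(stat_fn, signal, n_trials=20000, pfa=0.01):
    """Estimate detection probability at a fixed false-alarm rate."""
    n = signal.size
    t0 = np.array([stat_fn(impulsive_noise(n), signal) for _ in range(n_trials)])
    t1 = np.array([stat_fn(impulsive_noise(n) + signal, signal) for _ in range(n_trials)])
    thresh = np.quantile(t0, 1 - pfa)   # threshold set from noise-only runs
    return np.mean(t1 > thresh)

N = 100                                                   # samples per decision
signal = 0.3 * np.cos(2 * np.pi * 0.1 * np.arange(N))     # assumed weak known signal

detectors = {
    "linear correlator": lambda x, s: np.sum(x * s),
    "locally optimum":   lambda x, s: np.sum(lo_nonlinearity(x) * s),
    "hard limiter":      lambda x, s: np.sum(np.sign(x) * s),
}
for name, fn in detectors.items():
    print(f"{name:18s} Pd ~ {detection_prob(fn, signal):.3f}")
```

Raising the assumed signal amplitude in this sketch is one way to probe the abstract's point that a locally optimum detector, derived under a small-signal assumption, need not stay near optimum as the signal level grows.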

Keywords: non-Gaussian noise; optimum detection; communication system simulation; parametric signal detection

Disclaimer: Certain commercial equipment, components, and software may be identified in this report to specify adequately the technical aspects of the reported results. In no case does such identification imply recommendation or endorsement by the National Telecommunications and Information Administration, nor does it imply that the equipment or software identified is necessarily the best available for the particular application or uses.

For questions or information on this or any other NTIA scientific publication, contact the ITS Publications Office at ITSinfo@ntia.gov or 303-497-3572.
