RANDOM SIGNAL ANALYSIS AND PROCESSING
A Five-Day Intensive Course
Randomness is a primary feature of all information-bearing signals and of the noise which accompanies their measurement and utilization. System development routinely requires that practical compromises be made to deal with randomness, but all too often the analytical basis for these measures is ill-understood and the opportunity to employ even better strategies is lost. This intensive course furnishes the needed theoretical framework alongside frequent hands-on computer-supported exercises. It promotes a digital instrumentation viewpoint, presenting various ways of characterizing random discrete-time signals and a range of appropriate processing strategies, with strong emphasis on digital filtering and adaptive schemes.
Course treatment relies heavily on expectation as an operator, similar to Fourier transformation, and focuses on wide-sense stationary processes as interpreted in finite record-length scenarios. Thus the tasks familiar to DSP engineers - such as signal representation, transformation to the frequency and Z domains, and convolution - are joined by simple ensemble-averaging considerations and related to time averages. Thereafter, time- and frequency-domain quantities become deterministic, allowing familiar analysis procedures to be employed. Traditional emphasis on a full-scale buildup of probability theory is largely circumvented, with focus directed mainly at the key concepts needed in engineering practice.
Most signal treatment centres on Gaussian processes, due to their prominent appearance in linear systems and their tractability for the study of nonlinear processing. The effects of filters, modulation and memoryless nonlinearities are characterized at the raw signal level and also at a mean-equivalent level, where correlation again allows randomness to be supplanted by convenient deterministic equivalents. A premium is placed on optimal processing strategies, especially as regards Wiener filters and real-time LMS adaptive filtering.
2. Structure of the Five Days of Study
Each day of study is built around three core lecture sessions which are peppered with frequent hands-on computer-based confirmation and extension activities. These are followed by small analysis tasks aimed at consolidation of each session’s material. At least one Laboratory Investigation is undertaken daily to promote both individual reflection and small-group discussion of analysis and design issues. In total there is structured study time of about 8 hours per day.
(a) DAY 1: Stochastic Signals
One-dimensional random variable concepts underlying random signals; expectation; histograms and probability density functions; stochastic signals, ensemble averages, stationarity and ergodicity; wide-sense stationarity and implications for instrumentation.
Gaussian RVs and moments; Gaussian RVs under elementary arithmetic operations; random number generators; the Central Limit Theorem; sample records and moments; covariance, a.c. and d.c. power; independence, orthogonality and uncorrelatedness in the Gaussian case; bivariate Gaussian RVs.
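The Central Limit Theorem item above lends itself to quick numerical confirmation. The course itself works in MATLAB/Simulink; as a minimal illustrative analogue, a NumPy sketch can sum twelve independent Uniform(0,1) variates per sample, giving an approximately Gaussian result with mean 6 and variance 12 × (1/12) = 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sum 12 independent Uniform(0,1) variates per output sample: by the
# Central Limit Theorem the sum is approximately Gaussian with
# mean 12 * 0.5 = 6 and variance 12 * (1/12) = 1.
n_samples = 100_000
sums = rng.uniform(0.0, 1.0, size=(n_samples, 12)).sum(axis=1)

sample_mean = sums.mean()   # close to 6.0
sample_var = sums.var()     # close to 1.0
```

A histogram of `sums` overlaid on the N(6, 1) density makes the convergence visible even at this modest number of summands.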
Multi-dimensional Gaussian RVs, correlation matrices; Gaussian processes and white processes; Gaussian white noise; variability limitations on time-separated samples of a Gaussian process; introduction to the Autocorrelation Function and Power Spectral Density; wide-sense stationary sinusoids in additive noise; setting signal-to-noise ratios.
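Setting a signal-to-noise ratio for a sinusoid in additive noise reduces to simple power bookkeeping: a sinusoid of amplitude A carries mean power A²/2, and a phase uniform on [0, 2π) makes the process wide-sense stationary. A NumPy sketch with purely illustrative values (the amplitude, frequency and 10 dB target are arbitrary choices, not course figures):

```python
import numpy as np

rng = np.random.default_rng(1)

A, f0, N = 1.0, 0.05, 4096       # amplitude, normalized frequency, record length
snr_db = 10.0                    # target signal-to-noise ratio in dB

sig_power = A**2 / 2             # mean power of a sinusoid of amplitude A
noise_var = sig_power / 10**(snr_db / 10)   # noise power for the target SNR

n = np.arange(N)
phase = rng.uniform(0.0, 2 * np.pi)         # random phase => WSS process
x = A * np.sin(2 * np.pi * f0 * n + phase)
x += rng.normal(0.0, np.sqrt(noise_var), N) # additive white Gaussian noise
```

The measured mean power of `x` sits near `sig_power + noise_var`, confirming the bookkeeping.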
(b) DAY 2: Filtering and Power Spectral Density Shaping
Difficulties with DTFTs of random signals; imposing determinism for the statistical ACF; the crosscorrelation function and its interpretation; correlation properties and power spectrum properties.
Chebyshev’s Inequality; confidence intervals; sample second moment distribution and power; restriction to finite-energy records; estimating sample-record ACFs and CCFs; random properties of correlation measurements; estimating PSDs by periodograms; problems with persistent high measurement variance.
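The persistent-variance problem with periodograms is easy to demonstrate numerically: the raw periodogram of white noise is unbiased, but its per-bin variance does not fall as the record grows, whereas averaging periodograms of disjoint segments (Bartlett-style averaging) cuts the variance roughly by the number of segments. A NumPy sketch, with segment sizes chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# White Gaussian noise: the true PSD is flat at sigma^2 = 1 in every bin.
x = rng.normal(0.0, 1.0, 64 * 128)

def periodogram(seg):
    # |DFT|^2 / N : a raw, unaveraged PSD estimate.
    return np.abs(np.fft.fft(seg))**2 / len(seg)

# A single periodogram: unbiased, but its per-bin spread stays
# comparable to the level being estimated (persistent high variance).
single = periodogram(x[:128])

# Bartlett averaging: 64 disjoint segments of 128 samples each;
# averaging the 64 periodograms cuts the variance ~64-fold.
avged = np.mean([periodogram(s) for s in x.reshape(64, 128)], axis=0)

std_single = single.std()   # spread across bins, order of 1
std_avged = avged.std()     # roughly 1/8 of std_single
```

The averaged estimate hugs the true flat level of 1, while the single periodogram fluctuates wildly about it.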
Linear time-invariant system effects on random signals; raw voltage-level input/output relationships; the time CCF and its relation to the statistical CCF; review of discrete convolution; filter input-output correlations and mean-equivalent models of filters; covariance-equivalent models; difficulties with general filter output PDFs; Gaussianization tendency with filters.
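The mean- and covariance-equivalent models make filter output statistics deterministic: for a white input, the output mean is the input mean times the DC gain Σh[k], and the output variance is the input variance times Σh[k]². A NumPy sketch with an arbitrary 3-tap example filter (not a course design):

```python
import numpy as np

rng = np.random.default_rng(3)

h = np.array([0.5, 0.3, 0.2])        # illustrative FIR impulse response
mu_in, var_in = 2.0, 1.0             # white input: mean 2, variance 1

x = rng.normal(mu_in, np.sqrt(var_in), 200_000)
y = np.convolve(x, h, mode="valid")  # filter a long record

# Deterministic predictions from the equivalent models:
mu_out_pred = mu_in * h.sum()        # mean passes through the DC gain
var_out_pred = var_in * np.sum(h**2) # white input: variance scales by sum(h^2)
```

The sample mean and variance of `y` match `mu_out_pred` and `var_out_pred` to within estimation error, with no per-sample randomness left in the analysis.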
(c) DAY 3: Optimality and Adaptive Filtering
Noise-power normalization of cascaded-pair filters; “bandlimited white” noise; noise colouration; pink noise; fading-memory averaging contrasted with N-point moving-window averaging; use of a time-varying processor to achieve “build-and-slide” averaging; un-pinking; system identification by I/O crosscorrelation.
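System identification by I/O crosscorrelation exploits the fact that, for a white probe of variance σ², the input-output crosscorrelation satisfies r_xy(k) = σ²h(k). A minimal NumPy sketch (the "unknown" impulse response is an arbitrary stand-in, not a course example):

```python
import numpy as np

rng = np.random.default_rng(4)

h_true = np.array([1.0, -0.6, 0.25])    # hypothetical unknown system
N = 500_000
x = rng.normal(0.0, 1.0, N)              # unit-variance white probe
y = np.convolve(x, h_true, mode="full")[:N]   # system output

# With a white unit-variance input, r_xy(k) = h(k), so averaging
# x[n] * y[n + k] recovers the impulse response tap by tap.
h_est = np.array([np.mean(x[:N - k] * y[k:N])
                  for k in range(len(h_true))])
```

Longer records shrink the estimation error on each tap, tying the method back to the variability results of Day 2.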
The Orthogonality Principle in estimation; general Wiener Filter derivation; simplification for orthogonal signal and noise; matrix formulation for the FIR Wiener filter; error obtained with the Wiener filter; additional error due to FIR compromise.
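The matrix formulation of the FIR Wiener filter reduces to the normal equations Rw = p, with R the Toeplitz autocorrelation matrix of the observation and p the desired/observation crosscorrelation vector; the resulting MSE is r_d(0) − pᵀw. A NumPy sketch on a stylized denoising problem (the colouring filter and noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)

# Stylized problem: signal s = unit white noise coloured by g = [1, 0.5],
# observed as x = s + v with independent unit-variance white noise v.
r_s = np.array([1.25, 0.5, 0.0])        # ACF of s at lags 0, 1, 2
r_x = r_s + np.array([1.0, 0.0, 0.0])   # independent noise adds at lag 0 only

# Normal equations R w = p for a 3-tap FIR Wiener filter:
R = np.array([[r_x[abs(i - j)] for j in range(3)] for i in range(3)])
p = r_s                                  # crosscorrelation of desired s with x
w = np.linalg.solve(R, p)
mmse_pred = r_s[0] - p @ w               # theoretical Wiener-filter MSE

# Empirical confirmation on a long record:
N = 200_000
white = rng.normal(0.0, 1.0, N + 1)
s = white[1:] + 0.5 * white[:-1]         # coloured signal
x = s + rng.normal(0.0, 1.0, N)          # noisy observation
y = np.convolve(x, w, mode="full")[:N]   # FIR Wiener estimate of s
mse_emp = np.mean((s - y)**2)
```

The empirical MSE lands on the predicted minimum, well below the r_s(0) = 1.25 incurred by ignoring the data altogether; lengthening the filter beyond 3 taps would recover part of the remaining FIR-compromise error.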
The adaptive linear combiner; the LMS algorithm; structure of the LMS adaptive FIR filter; stylized problems suitable for configuring LMS adaptive solutions; eigenvalue spread; step sizing; coefficient initialization effects; learning curves; excess mean-square error; practice with system identification and equalization of varying channel effects.
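The LMS update itself is only a few lines: each new sample shifts the tap-delay line, the a-priori error is formed against the desired response, and the coefficients move along the instantaneous gradient estimate. A NumPy sketch configured for system identification (plant taps and step size are illustrative; with no measurement noise there is no excess MSE, and the taps converge to the plant):

```python
import numpy as np

rng = np.random.default_rng(6)

h_true = np.array([0.8, -0.4, 0.2])   # hypothetical plant to identify
M, mu = 3, 0.01                       # filter length and LMS step size

w = np.zeros(M)                       # zero coefficient initialization
x_buf = np.zeros(M)                   # tap-delay line, most recent first
for n in range(20_000):
    x_n = rng.normal()                            # white excitation
    x_buf = np.concatenate(([x_n], x_buf[:-1]))   # shift the tap line
    d = h_true @ x_buf                # desired response: plant output
    e = d - w @ x_buf                 # a-priori error
    w = w + mu * e * x_buf            # LMS coefficient update
```

White excitation gives unit eigenvalue spread, so all modes converge at the same rate; colouring the input or adding noise to `d` exposes the eigenvalue-spread and excess-MSE effects listed above.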
(d) DAY 4: Nonlinear Processing
Structurally-constrained special parameter-adaptive filters; adaptive notch filters for frequency tracking and narrowband interference elimination; adaptive delay estimation.
Review of functions of RVs; the automatic root formula; zero-memory nonlinearities for Probability Density Function control of computer-generated random signals; shaping the PDF of a white process.
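PDF shaping of a white process amounts to a one-line zero-memory nonlinearity: applying the target distribution's inverse CDF sample-by-sample to Uniform(0,1) white noise yields a white process with the target first-order PDF, and memorylessness preserves the whiteness. A NumPy sketch mapping uniform white noise to a unit-mean exponential process (the target is chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# Zero-memory nonlinearity g(u) = -ln(1 - u): the inverse CDF of the
# unit-mean exponential density, applied sample by sample.  Because g
# has no memory, the output process remains white.
u = rng.uniform(0.0, 1.0, 100_000)    # white Uniform(0,1) process
x = -np.log(1.0 - u)                  # white exponential(1) process

sample_mean = x.mean()                # exponential(1): mean 1
sample_var = x.var()                  # exponential(1): variance 1
```

Any target with a computable inverse CDF can be substituted for g, which is the basis of PDF control for computer-generated random signals.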
Second-order PDFs from nonlinearities; ACF of a nonlinearity fed by a Gaussian process; Bussgang’s Theorem for crosscorrelations of nonlinearity outputs; systematic mean-equivalent models of nonlinearities.
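Bussgang's Theorem is easy to confirm numerically: for a stationary Gaussian input x through a memoryless nonlinearity g, the crosscorrelation obeys r_xy(k) = c·r_xx(k) at every lag, with c = E[x g(x)]/E[x²]. For a hard limiter g(x) = sgn(x) this gives c = √(2/π)/σ. A NumPy sketch with a Gaussian AR(1) input (the pole at 0.8 is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(8)

# Gaussian AR(1) input: x[n] = a x[n-1] + w[n], so r_xx(k) = sigma^2 a^|k|.
a, N = 0.8, 500_000
w = rng.normal(0.0, 1.0, N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

y = np.sign(x)                        # hard limiter: memoryless nonlinearity

sigma2 = 1.0 / (1.0 - a**2)           # stationary variance of x
c_theory = np.sqrt(2.0 / np.pi) / np.sqrt(sigma2)   # E[x g(x)] / E[x^2]

# Bussgang: r_xy(k) / r_xx(k) should be flat at c_theory for all lags.
ratios = []
for k in range(4):
    r_xx = np.mean(x[:N - k] * x[k:])
    r_xy = np.mean(x[:N - k] * y[k:])
    ratios.append(r_xy / r_xx)
```

The lag-by-lag ratios cluster around the predicted constant, which is exactly the gain used in the mean-equivalent model of the limiter.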
(e) DAY 5: Time-Varying and More Nonlinear Processing
Characteristic function method for autocorrelation of memoryless nonlinearity outputs; Price’s Theorem for output ACFs; Shutterly’s expansion for PSDs and spectral occupancy at outputs of nonlinearities; soft limiting and Baum’s tunable-PDF random process; generalizations of Bussgang’s and Price’s Theorems.
Running off the ends of data records; transient randomness in filters; causalization and time truncation effects; multiplicative processing of stochastic signals; correlation and PDFs for Amplitude Modulation of baseband random signals; Woodward’s Theorem for Frequency Modulation.
Wide-sense cyclostationary signals; ACFs and PSDs when zero-insertion interpolation is performed; zero-order hold interpolation; decimation of white and coloured discrete-time processes. Consolidation exercises.
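The interpolation results reward a quick check: inserting L−1 zeros between samples scales the mean power by 1/L (while creating spectral images), whereas a zero-order hold preserves the power and introduces correlation between repeated samples. A NumPy sketch for L = 2 with a white input (parameters illustrative); both outputs are cyclostationary, so the figures computed are time averages:

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.normal(0.0, 1.0, 100_000)   # unit-variance white input

# Zero-insertion interpolation by L = 2: mean power drops to 1/L.
up = np.zeros(2 * len(x))
up[::2] = x

# Zero-order hold by L = 2: power is preserved, but repeated samples
# introduce lag-1 correlation.
zoh = np.repeat(x, 2)

p_up = np.mean(up**2)                  # near 0.5
p_zoh = np.mean(zoh**2)                # near 1.0
r1_zoh = np.mean(zoh[:-1] * zoh[1:])   # near 0.5: time-averaged lag-1 ACF
```

The lag-1 value of 0.5 arises because half of all adjacent pairs in the held signal are identical samples and half are independent, a direct picture of the time-averaged cyclostationary ACF.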
3. Makeup of Course Notes
Course notes are PowerPoint slide images, with consistent notation and careful referencing of the relevant DSP literature. These bound notes are self-contained. Students receive a DVD containing all course exercises and videoclips of indicative solution approaches.
4. Computation Support
This course has an exceptionally strong hands-on flavour. All student work relies on the ready availability of the DSP Creations’ toolset (Slifer, Sketch-a-Filt and DSP_Speedster), which is built around the MATLAB/Simulink environment from The MathWorks Ltd. Learning motivation is sharpened by a rich blend of user-friendly graphics displays, interactive instrument-control features and numerous illustrations of random signal shaping through appropriate digital filter design. The special Simulink Labkit DSP_Speedster is central to the course’s activities that reinforce key ideas, making use of specially-configured virtual instruments to quickly cement theory and practice. Students leave the course with their own personal copies of Slifer Lite and Sketch-a-Filt Lite for fuller absorption of, and later reflection upon, their course experience.
5. Course Pre-requisites
Engineers, mathematicians, physicists and other technical personnel with a ready grasp of mathematical analysis (including integral calculus and fundamentals of complex variables) are the intended participants. Familiarity with basic probability, filtering terminology, DSP concepts, or basics of MATLAB - while not required - would be an added advantage.