
Real-Time Digital Signal Processing. Sen M. Kuo, Bob H. Lee. Copyright 2001 John Wiley & Sons Ltd. ISBNs: 0-470-84137-0 (Hardback); 0-470-84534-1 (Electronic)

8 Adaptive Filtering

As discussed in previous chapters, filtering refers to a linear process designed to alter the spectral content of an input signal in a specified manner. In Chapters 5 and 6, we introduced techniques for designing and implementing FIR and IIR filters for given specifications. Conventional FIR and IIR filters are time-invariant. They perform linear operations on an input signal to generate an output signal based on fixed coefficients. Adaptive filters are time varying; filter characteristics such as bandwidth and frequency response change with time. Thus the filter coefficients cannot be determined when the filter is implemented. The coefficients of the adaptive filter are adjusted automatically by an adaptive algorithm based on incoming signals. This has the important effect of enabling adaptive filters to be applied in areas where the exact filtering operation required is unknown or is non-stationary.

In Section 8.1, we will review the concepts of random processes that are useful in the development and analysis of various adaptive algorithms. The most popular least-mean-square (LMS) algorithm will be introduced in Section 8.2. Its important properties will be analyzed in Section 8.3. Two widely used modified adaptive algorithms, the normalized and leaky LMS algorithms, will be introduced in Section 8.4. In this chapter, we introduce and analyze the LMS algorithm following the derivation and analysis given in [8]. In Section 8.5, we will briefly introduce some important applications of adaptive filtering. The implementation considerations will be discussed in Section 8.6, and the DSP implementations using the TMS320C55x will be presented in Section 8.7.

8.1 Introduction to Random Processes

A signal is called a deterministic signal if it can be described precisely and be reproduced exactly and repeatedly.
However, the signals encountered in practice are not necessarily of this type. A signal that is generated in a random fashion and cannot be described by mathematical expressions or rules is called a random (or stochastic) signal. Signals in the real world are often random in nature. Some common examples of random signals are speech, music, and noise. These signals cannot be reproduced and need to be modeled and analyzed using statistical techniques. We briefly introduced probability and random variables in Section 3.3. In this section, we will review the important properties of random processes and introduce fundamental techniques for processing and analyzing them.

A random process may be defined as a set of random variables. We associate a time function x(n) = x(n, A) with every possible outcome A of an experiment. Each time function is called a realization of the random process, or a random signal. The ensemble of all these time functions (called sample functions) constitutes the random process x(n). If we sample this process at some particular time n0, we obtain a random variable. Thus a random process is a family of random variables.

We may consider the statistics of a random process in two ways. If we fix the time n at n0 and consider the random variable x(n0), we obtain statistics over the ensemble. For example, E[x(n0)] is the ensemble average, where E[.] is the expectation operator introduced in Chapter 3. If we fix A and consider a particular sample function, we have a time function, and the statistics we obtain are temporal. For example, the average of x(n, Ai) over time is the time average. If the time average is equal to the ensemble average, we say that the process is ergodic. The property of ergodicity is important because in practice we often have access to only one sample function. Since we generally work only with temporal statistics, it is important to be sure that the temporal statistics we obtain are a true representation of the process as a whole.
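The distinction between ensemble and time averages can be illustrated numerically. The following sketch (in Python with NumPy, rather than the MATLAB used later in this chapter) draws many realizations of a zero-mean white Gaussian process and compares the two averages; the seed, number of realizations, and lengths are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# 500 realizations (sample functions) of a zero-mean white Gaussian
# process, each 1000 samples long: rows index the outcome A, columns
# index the time n.
ensemble = rng.normal(loc=0.0, scale=1.0, size=(500, 1000))

# Ensemble average: fix a time n0 and average x(n0) across realizations.
ensemble_avg = ensemble[:, 0].mean()

# Time average: fix one realization x(n, A0) and average over time n.
time_avg = ensemble[0, :].mean()

# For this ergodic process both estimates approach the true mean of 0.
print(ensemble_avg, time_avg)
```

Both printed values are small and shrink as the number of realizations or the record length grows, which is exactly the ergodicity argument made above.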
8.1.1 Correlation Functions

For many applications, one signal is compared with another in order to determine the similarity between the pair, and to extract additional information based on that similarity. Autocorrelation is used to quantify the similarity between two segments of the same signal. The autocorrelation function of the random process x(n) is defined as

   r_xx(n, k) = E[x(n)x(k)].    (8.1.1)

This function specifies the statistical relation of two samples at different time indices n and k, and gives the degree of dependence between two random variables (n − k) units apart. For example, consider a digital white noise x(n) whose samples are uncorrelated random variables with zero mean and variance σx^2. The autocorrelation function is

   r_xx(n, k) = E[x(n)x(k)] = E[x(n)]E[x(k)] = 0,  n ≠ k,
   r_xx(n, k) = σx^2,  n = k.    (8.1.2)

If we subtract the means in (8.1.1) before taking the expected value, we have the autocovariance function

   γ_xx(n, k) = E{[x(n) − m_x(n)][x(k) − m_x(k)]} = r_xx(n, k) − m_x(n)m_x(k).    (8.1.3)

The objective in computing the correlation between two different random signals is to measure the degree to which the two signals are similar. The crosscorrelation and crosscovariance functions between two random processes x(n) and y(n) are defined as

   r_xy(n, k) = E[x(n)y(k)]    (8.1.4)

and

   γ_xy(n, k) = E{[x(n) − m_x(n)][y(k) − m_y(k)]} = r_xy(n, k) − m_x(n)m_y(k).    (8.1.5)

Correlation is a very useful DSP tool for detecting signals that are corrupted by additive random noise, measuring the time delay between two signals, determining the impulse response of a system (such as obtaining the room impulse response used in Section 4.5.2), and many others. Signal correlation is often used in radar, sonar, digital communications, and other engineering areas. For example, in CDMA digital communications, data symbols are represented with a set of unique key sequences. If one of these sequences is transmitted, the receiver compares the received signal with every possible sequence from the set to determine which sequence has been received.
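As a numerical illustration of the white-noise case in (8.1.2), the following NumPy sketch estimates r_xx(k) by a time average over a single realization, which ergodicity justifies for this process; the variance σx^2 = 2 and the lag values are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 2.0                       # assumed variance, chosen for illustration
N = 100_000
x = rng.normal(0.0, np.sqrt(sigma2), N)

def autocorr(x, k):
    """Estimate r_xx(k) = E[x(n+k)x(n)] by a time average over one
    realization of the process."""
    return float(np.mean(x[k:] * x[: len(x) - k]))

r0 = autocorr(x, 0)   # should approach sigma_x^2, the n = k case of (8.1.2)
r5 = autocorr(x, 5)   # should approach 0, the n != k case of (8.1.2)
print(r0, r5)
```

With 100 000 samples the zero-lag estimate lands close to 2 and the lag-5 estimate close to 0, matching the uncorrelated-samples model.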
In radar and sonar applications, the received signal reflected from the target is a delayed version of the transmitted signal. By measuring the round-trip delay, one can determine the location of the target.

Both correlation functions and covariance functions are extensively used in analyzing random processes. In general, the statistical properties of a random signal such as the mean, variance, and autocorrelation and autocovariance functions are time-varying functions. A random process is said to be stationary if its statistics do not change with time. The most useful and relaxed form of stationarity is the wide-sense stationary (WSS) process. A random process is called WSS if the following two conditions are satisfied:

1. The mean of the process is independent of time. That is,

   E[x(n)] = m_x,    (8.1.6)

   where m_x is a constant.

2. The autocorrelation function depends only on the time difference. That is,

   r_xx(k) = E[x(n + k)x(n)].    (8.1.7)

Equation (8.1.7) indicates that the autocorrelation function of a WSS process is independent of the time shift, and r_xx(k) denotes the autocorrelation function for a time lag of k samples.

The autocorrelation function r_xx(k) of a WSS process has the following important properties:

1. The autocorrelation function is an even function of the time lag k. That is,

   r_xx(−k) = r_xx(k).    (8.1.8)

2. The autocorrelation function is bounded by the mean-squared value of the process, expressed as

   |r_xx(k)| ≤ r_xx(0),    (8.1.9)

   where r_xx(0) = E[x^2(n)] is equal to the mean-squared value, or the power, of the random process. In addition, if x(n) is a zero-mean random process, we have

   r_xx(0) = E[x^2(n)] = σx^2.    (8.1.10)

Thus the autocorrelation function of a signal has its maximum value at zero lag. If x(n) has a periodic component, then r_xx(k) will contain the same periodic component.
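The properties above, including the remark that a periodic component of x(n) survives in r_xx(k), can be checked numerically. A minimal NumPy sketch, assuming an arbitrarily chosen sinusoid with a 20-sample period buried in unit-variance white noise:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 50_000
n = np.arange(N)
# Sinusoid plus unit-variance white noise (an arbitrary test signal);
# the sinusoid's period is 20 samples, since 0.1*pi*20 = 2*pi.
x = np.cos(0.1 * np.pi * n) + rng.normal(0.0, 1.0, N)

def r(k):
    k = abs(k)                     # even symmetry (8.1.8) lets us use |k|
    return float(np.mean(x[k:] * x[: N - k]))

r0 = r(0)                          # power: 0.5 (sinusoid) + 1.0 (noise)
# Property (8.1.9): every lag estimate is bounded by the zero-lag value.
assert all(abs(r(k)) <= r0 for k in range(1, 100))
# The periodic component survives in r_xx(k): near +0.5 at a full period
# (k = 20) and near -0.5 at a half period (k = 10).
print(r0, r(20), r(10))
```

The noise contributes only at zero lag, so away from k = 0 the estimate traces out the sinusoid's own (periodic) autocorrelation, which is the basis of correlation-based detection of periodicities in noise.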
Example 8.1: Given the sequence x(n) = a^n u(n), 0 < a < 1, the autocorrelation function can be computed as

   r_xx(k) = Σ_n x(n + k)x(n) = Σ_{n=0}^{∞} a^{n+k} a^n = a^k Σ_{n=0}^{∞} a^{2n}.

Since a < 1, we obtain

   r_xx(k) = a^k / (1 − a^2).

Example 8.2: Consider the sinusoidal signal expressed as x(n) = cos(ωn); find the mean and the autocorrelation function of x(n).

(a) m_x = E[cos(ωn)] = 0.

(b) r_xx(k) = E[x(n + k)x(n)] = E[cos(ωn + ωk)cos(ωn)]
           = (1/2)E[cos(2ωn + ωk)] + (1/2)cos(ωk) = (1/2)cos(ωk).

The crosscorrelation function of two WSS processes x(n) and y(n) is defined as

   r_xy(k) = E[x(n + k)y(n)].    (8.1.11)

This crosscorrelation function has the property

   r_xy(−k) = r_yx(k).    (8.1.12)

Therefore r_yx(k) is simply the folded version of r_xy(k). Hence, r_yx(k) provides exactly the same information as r_xy(k) with respect to the similarity of x(n) to y(n).

In practice, we only have one sample sequence x(n) available for analysis. As discussed earlier, a stationary random process x(n) is ergodic if all its statistics can be determined from a single realization of the process, provided that the realization is long enough. Therefore time averages are equal to ensemble averages when the record length is infinite. Since we do not have data of infinite length, the averages we compute differ from the true values. In dealing with a finite-duration sequence, the sample mean of x(n) is defined as

   m̄_x = (1/N) Σ_{n=0}^{N−1} x(n),    (8.1.13)

where N is the number of samples in the short-time analysis interval. The sample variance is defined as

   σ̄x^2 = (1/N) Σ_{n=0}^{N−1} [x(n) − m̄_x]^2.    (8.1.14)

The sample autocorrelation function is defined as

   r̄_xx(k) = (1/N) Σ_{n=0}^{N−k−1} x(n + k)x(n),  k = 0, 1, ..., N − 1,    (8.1.15)

where N is the length of the sequence x(n). Note that for a given sequence of length N, Equation (8.1.15) generates values for up to N different lags. In practice, we can only expect good results for lags of no more than 5 to 10 percent of the length of the signals. The autocorrelation and crosscorrelation functions introduced in this section can be computed using the MATLAB function xcorr in the Signal Processing Toolbox.
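The closed form in Example 8.1 can be verified numerically. The NumPy sketch below uses an arbitrarily chosen a = 0.8 and truncates the geometric sum at 200 terms, which makes the neglected tail vanishingly small:

```python
import numpy as np

a = 0.8                     # arbitrary choice satisfying 0 < a < 1
n = np.arange(200)          # a^(2n) decays fast, so 200 terms approximate
                            # the infinite sum extremely well
x = a ** n                  # x(n) = a^n u(n)

def rxx(k):
    # Deterministic autocorrelation: sum over n of x(n + k) x(n)
    return float(np.sum(x[k:] * x[: len(x) - k]))

for k in range(5):
    closed_form = a ** k / (1 - a ** 2)
    # Matches r_xx(k) = a^k / (1 - a^2) from Example 8.1
    assert abs(rxx(k) - closed_form) < 1e-6
```

The agreement at several lags confirms the geometric-series evaluation carried out in the example.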
The crosscorrelation function r_xy(k) of the two sequences x(n) and y(n) can be computed using the statement

   c = xcorr(x, y);

where x and y are length-N vectors and the crosscorrelation vector c has length 2N − 1. The autocorrelation function r_xx(k) of the sequence x(n) can be computed using the statement

   c = xcorr(x);

In addition, the crosscovariance function can be estimated using

   v = xcov(x, y);

and the autocovariance function can be computed with

   v = xcov(x);

See the Signal Processing Toolbox User's Guide for details.
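Outside MATLAB, a roughly equivalent computation is available through NumPy's correlate function, shown here as an assumed substitute (the ordering convention of the lag axis differs between xcorr and np.correlate, so check both sets of documentation before interpreting the sign of a lag). It reproduces the 2N − 1 output length and the zero-lag value quoted above:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 64
x = rng.normal(size=N)
y = rng.normal(size=N)

# Full cross-correlation of two length-N vectors; like xcorr(x, y), the
# result has length 2N - 1, with zero lag at index N - 1.
c = np.correlate(x, y, mode="full")
print(len(c))

# Autocorrelation, analogous to xcorr(x): the zero-lag sample equals the
# energy sum over n of x(n)^2, the unnormalized counterpart of r_xx(0).
r = np.correlate(x, x, mode="full")
print(r[N - 1], np.sum(x * x))
```

Dividing such raw correlation sums by N yields the sample estimators of (8.1.13)-(8.1.15).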